Archive for June, 2012
Wednesday, June 27th, 2012
By William Tucker
A few years ago the San Juan County Historical Society in Silverton, Colorado, had a wonderful idea for reducing the $600-a-month electric bill at its historic Mayflower Mill, an 83-year-old gold-and-silver refinery. Why not install a small hydroelectric turbine so the National Historic Landmark could generate its own electricity?
The least populated county in Colorado, San Juan has a proud history in hydro. The nation’s first alternating-current hydroelectric plants were built at nearby Ames and Tacoma in the late 19th century, several years before George Westinghouse and Nikola Tesla achieved success at Niagara Falls. Both mountain plants are still operating.
“A century ago, mills all over the San Juan Mountains were powered by hydroelectricity,” says Bev Rich, chairman of the Historical Society. “It’s a fortunate result of our geography and an abundant water supply. We proposed using this historic technology to carry our organization into the future.”
The initial steps proved surprisingly easy. The Society secured a $105,000 grant from History Colorado. It acquired a 300-square-foot shed in nearby Eureka, once used to shelter miners waiting to make the trip up the mountain, and hauled it up to the mill to house the turbine. An existing pipeline already diverted water from Arastra Gulch higher up in the mountains. The whole complex would generate 8 kilowatts, enough to power Mayflower through the tourist season. During the winter the surplus electricity could be sold to the local San Miguel Power Association.
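For scale, the governing relation is the standard hydro-power formula P = ρgQhη. Here is a minimal sketch of the sizing arithmetic, assuming a plausible head and efficiency – the article gives only the 8-kilowatt output and the pipeline, so those two numbers are illustrative:

```python
# Back-of-the-envelope hydro sizing: P = rho * g * Q * h * eta.
# Head and efficiency are illustrative assumptions; the article
# specifies only the 8 kW output and the Arastra Gulch pipeline.
rho = 1000.0    # density of water, kg/m^3
g = 9.81        # gravitational acceleration, m/s^2
head_m = 100.0  # assumed head from the pipeline intake, m
eta = 0.80      # assumed combined turbine-generator efficiency

target_watts = 8_000.0
flow_m3_per_s = target_watts / (rho * g * head_m * eta)

print(f"Flow required for 8 kW: {flow_m3_per_s:.4f} m^3/s "
      f"(~{flow_m3_per_s * 1000:.0f} liters per second)")
# ~0.0102 m^3/s, about 10 liters per second -- a very small diversion.
```

Under these assumptions the whole project runs on roughly ten liters of water per second, which is part of why the regulatory burden described below seems so disproportionate.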
At this point, the Historical Society encountered the Federal Energy Regulatory Commission.
It seems that generating any kind of electricity with water anywhere in the United States is a federal matter. And being a federal matter, it requires all kinds of environmental, architectural, biological, archaeological and anthropological review.
“First they required us to produce detailed architectural drawings of the shed housing the generator,” says Rich. “Then they needed a new survey determining exactly where the shed sits on the property. Next we had to open a 30-day comment period for every federal agency you can think of, including responses from downstream Indian tribes. The whole process would cost us an additional $25,000 and take months to complete.”
Fortunately, Kurt Johnson, an attorney with Telluride Energy in nearby Telluride and President of the Colorado Small Hydro Association, volunteered to help the Society thread the bureaucratic obstacle course pro bono. Still, the process has taken the better part of a year with no end in sight. “The generator was ready to go into operation last fall,” says Johnson. “We are still waiting.”
The Mayflower Mill became Exhibit A in a Congressional hearing earlier this month aimed at shortening the regulatory process for small energy projects. “It’s the same with putting solar collectors on your house,” says Johnson. “You have to get approval from both state and federal agencies. A person ends up spending more money on regulatory approval than on the capital investment itself.”
But carving out an exception for small projects like Mayflower only ignores the whole problem of excessive and redundant regulation of all energy projects. Executives in the nuclear industry now complain that one of their biggest problems is having project directors at the Nuclear Regulatory Commission retire before the regulatory review can be completed.
So if it seems difficult to get approval for an 8-kilowatt hydroelectric project, think of what it takes to build a 1,500-megawatt nuclear reactor.
Wednesday, June 20th, 2012
Can Natural-Gas Methanol Replace Foreign Oil?
By William Tucker
In my travels in the world of energy I have many interesting encounters, but one that stands out in particular was a recent meeting with California entrepreneur Yossie Hollander. Hollander had a big success with a software company and has now turned his talents to finding a way for the nation to stop spending a third of a trillion dollars a year on foreign oil.
Hollander has founded a group called the Fuel Freedom Foundation whose aim is to push any and all alternatives to foreign oil – diesel, ethanol, natural gas or anything else. But his favorite is methanol.
Why methanol? The argument is pretty straightforward. We already know that methanol can power internal combustion engines. The Indianapolis 500 and many other racing circuits used it for decades. Indianapolis officials originally mandated the conversion after two leading drivers were killed in a horrible gasoline fire on the second lap of the 1964 race. But drivers eventually came to appreciate methanol’s clean burn and relatively high octane rating as well. It was only recently supplanted by corn ethanol after a huge lobbying effort from the ethanol industry.
The problem with methanol has always been the lack of feedstock. Methanol was originally distilled from wood, which gave it the name “wood alcohol.” But we don’t want to go cutting down forests, and there would be a terrific problem with the waste. Municipal garbage has been another feedstock, but there’s not nearly enough of it. Now, however, we have hit the jackpot with shale gas, and for the first time in history we have enough raw material to think on a grand scale. Besides, there’s a huge glut, and everybody in the industry is trying to figure out what to do with it.
Hollander has a simple solution. Methanol can be synthesized from natural gas by “reforming” it with water vapor. “It’s early 20th century chemistry,” he says. “You don’t need distillation or the refining required by the oil industry. It’s a low-energy, low-cost process.” He suggests the industry build reforming plants next to natural gas pipelines, then use existing tanker trucks to distribute to local gas stations. “You wouldn’t have to change the pumps or anything,” he says. “You could use the infrastructure we have now.”
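For reference, what Hollander describes is the standard two-step syngas route: steam reforming of methane followed by catalytic methanol synthesis. The equations below show only the overall stoichiometry; operating temperatures, pressures, and catalysts vary by plant:

```latex
% Two-step syngas route from natural gas to methanol.
\mathrm{CH_4 + H_2O \;\longrightarrow\; CO + 3\,H_2}
  \qquad \text{(steam reforming)}
\mathrm{CO + 2\,H_2 \;\longrightarrow\; CH_3OH}
  \qquad \text{(methanol synthesis)}
```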
So why isn’t anybody picking up on this? That’s the great mystery to me. On Monday The Wall Street Journal did an entire supplement on converting our transport sector to natural gas, yet it never mentioned methanol once. Instead it concentrated on compressed natural gas (CNG) and diesel. CNG is an extremely awkward technology, and not even the Journal was very enthusiastic:
The big issue with building [compressed] natural-gas vehicles is the fuel tank. Gasoline and natural gas engines are relatively similar. But natural gas must be stored under high pressure – so the tanks must be stronger, heavier and larger. And that drives up the price. The only natural-gas passenger car sold in the U.S., the Honda Civic GX, costs about $5,200 more than a comparable gasoline vehicle and $3,600 more than the gasoline/electric hybrid Civic. (emphasis in original)
Methanol can go in your gas tank just the way it is now. Putting it in your engine would require a few adjustments to the fuel injection system but the cost is no more than $100 at the factory and $200 for conversion of an older vehicle by the local mechanic.
Next the Journal talked about “Reinventing the Fuel,” yet here it mentioned everything except methanol.
The technology to turn natural gas into a low-sulfur diesel fuel was developed long ago in Nazi Germany, but it continues to be an expensive process that has limited its success.
Last year Royal Dutch Shell PLC opened its massive Pearl Gas-to-Liquids project in Qatar. . . The project now produces enough diesel from natural gas per day to fuel 160,000 cars . . .
Shell is considering a similar plant in Louisiana, where it hopes to draw upon the abundance of U.S. natural gas . . . The project could cost up to $10 billion, but the company hopes lessons learned from building Pearl will help keep those costs down.
Dallas-based chemical firm Celanese Corp. has started to produce fuel-grade ethanol as a substitute for corn-based ethanol at a plant in Clear Lake, Texas. But the company doesn’t expect commercial-scale production in the near future.
In Silicon Valley, Siluria Technologies Inc. has figured out how to turn natural gas into ethylene, which can be used to make a wide range of fuels and other products. The technique involves a genetically engineered virus that coats itself with a metal that serves as a catalyst.
Am I missing something here? If producing methanol is as simple as Hollander says, why aren’t Royal Dutch Shell, Celanese and Silicon Valley looking at it instead of fooling with these much more complicated and costly technologies? I can’t figure it out.
We already have a methanol industry. There are 18 major plants producing 2.6 billion gallons a year. Methanol is used in a variety of manufacturing technologies and makes up 30 percent of your windshield fluid. It sells for less than $2 per gallon. Methanol only has 2/3 the energy value of gasoline but that still makes it cheaper. Of course it would be a long slog to ramp the industry up to the 136 billion gallons we would need to replace half our gasoline, but the price would be as likely to come down as go up with economies of scale.
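A quick energy-adjusted check, using the article’s own price and energy figures (taken from the paragraph above, not independently verified), shows what methanol costs in gasoline-equivalent terms:

```python
# Energy-adjusted price of methanol, using the article's own figures.
methanol_price = 2.00      # $/gallon (article figure)
energy_ratio = 2.0 / 3.0   # methanol energy per gallon vs. gasoline (article figure)

gasoline_equivalent = methanol_price / energy_ratio
print(f"Methanol at ${methanol_price:.2f}/gal delivers gasoline-equivalent "
      f"energy at ${gasoline_equivalent:.2f}/gal")  # $3.00
```

Three dollars per gasoline-equivalent gallon was comfortably below pump prices when this was written.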
Redirecting a large portion of our gas supplies into powering cars would do wonders for nuclear energy. It would take away the illusion that gas is “too cheap to meter” and that we can throw it away by using it to generate electricity. (People in the industry call it “washing dishes with champagne.”) If anybody can see a flaw in Yossie Hollander’s argument, please let me know.
Monday, June 18th, 2012
Is China Really Giving Up Solar for Nuclear?
By William Tucker
“China to Drop Solar Energy to Focus on Nuclear Power.” That’s the headline of a story sent to me by an old college friend. Wow, that’s big stuff! Where did he get it?
Sure enough, the story appears on Electric Light & Power, a reasonably reliable source. Now I spend every morning reading headlines about energy and I don’t ever recall seeing anything about this. How did the mainstream press miss such a big story?
The text in Electric Light & Power appears under the byline “Asia Pulse” and provides more details:
China will accelerate the use of new-energy sources such as nuclear energy and put an end to blind expansion in industries such as solar energy and wind power in 2012, Chinese Premier Wen Jiabao says in a government report published on March 5.
China will instead develop nuclear power in 2012, actively develop hydroelectric power, tackle key problems more quickly in the exploration and development of shale gas, and increase the share of new energy and renewable energy in total energy consumption.
The news has been picked up on the blog, “WUWT” (“Watts Up With That?”), where former television meteorologist Anthony Watts expresses his skepticism about global warming. Watts says he got the story from John Droz’s newsletter via Dr. Roger Pielke, Jr., who comments, “I can hear Joseph Romm’s head exploding.” It is indeed an extraordinary development. It smacks of an admission that China has just been turning out windmills and solar collectors to sell to the West while taking care of business with nuclear at home.
Among the comments on Watts’ blog, however, is one from a “Lord Timothy of Edsion,” who remarks:
Gee, post a link to Wen Jiabao’s full speech why don’cha. Decoded, all it means is that China will result [resume?] construction on the four plants on which work was suspended following Fukushima.
As for the phrase “China to drop solar energy” … neither this phrase, nor any phrase like it, appears anywhere in Wen Jiabao’s speech.
So the mystery deepens. I go to the link and am confronted with a headline, “Full Text: Report on the Work of the Government,” over a bifurcated text reminiscent of the Rosetta Stone. On the left are Chinese characters, short, pithy, occupying only nine lines. On the right is the matching English translation running more than 25 lines. Boy, those Chinese sure know how to write succinctly, although I hear it takes almost a lifetime to learn those characters. The translation sounds something like one of those 12-hour rants from Kim Jong-il or Fidel Castro:
We will use economic, legal, and the necessary administrative means to conserve energy and reduce emissions in key areas such as manufacturing, transportation, construction, public institutions, and people's homes, and in 1,000 key energy-intensive enterprises; and close down more outdated production facilities. We will tighten supervision of energy use, develop smart power grids and ensure the proper distribution of energy supplies, and implement effective administrative practices such as efficient electricity generation and distribution, energy performance contracting, and government procurement of energy-efficient goods and services.
Does anyone ever pay any attention to this stuff?
Anyway, I do a keyword search and sure enough, there it is in Block #4, section 2:
We will prevent blind expansion in our capacity to manufacture solar energy and wind power equipment.
We will optimize the energy structure, promote clean and efficient use of traditional energy, safely and effectively develop nuclear power, actively develop hydroelectric power, tackle key problems more quickly in the exploration and development of shale gas, and increase the share of new energy and renewable energy in total energy consumption.
It’s on these two sentences that the story is based.
I feel like I’m playing a game of “telephone.” This hardly seems like the earth-shattering announcement portrayed in Electric Light & Power. On the Watts site opinion is split. Some people say the Chinese are actually going to cut back on solar (but what about that “increase renewables” in the second sentence?); others say the premier means he wants to cut back on manufacturing windmills and solar collectors for Western consumption, since the U.S. is about to throw up tariffs and the market is getting saturated. Now I know what it was like to be one of those old Kremlin watchers who tried to decode the power rankings in the Soviet Union’s Politburo by interpreting who was standing next to whom at the May Day parade.
So this is how news gets around the world. It does indeed look like the Chinese are cutting back on “blind expansion” of wind and solar. And they are definitely going ahead with nuclear. But does that mean they have decided to develop nuclear instead of going ahead with solar? I’ll let you decide.
Wednesday, June 13th, 2012
Low-Level Radiation: Is There a Hormetic Effect?
By William Tucker
In the early 1980s, a Taiwanese steel company accidentally mixed a quantity of highly radioactive cobalt-60 into a batch of steel rebar. The radioactive rods were then used in the construction of 1,700 apartments. As a result, people living in these buildings were subjected to radiation up to 30 times the normal amount received from the natural background.
When dismayed Taiwanese officials discovered this enormous error fifteen years later, they surveyed past and present apartment dwellers expecting to find an epidemic of cancer. Normal incidence would have predicted 160 cancers among the 10,000 residents. To their astonishment, the researchers discovered only five cases of cancer – a 97 percent reduction from the anticipated amount. Birth defects were also 94 percent below the anticipated rate.
These findings were published in the Journal of American Physicians and Surgeons in 2004. As one researcher phrased it, exposure to high levels of background radiation had apparently bestowed upon residents “an effective immunity from cancer.” [W.L. Chen et al., “Is Chronic Radiation an Effective Prophylaxis Against Cancer?” Journal of American Physicians and Surgeons, Spring 2004. Figure caption: the upper line is the expected rate of cancer over a 20-year period among 10,000 residents; the bottom line is the actual rate.]
The Taiwan apartment incident is just one of many examples that have convinced a wide cohort of radiation scientists that the dangers of low-level exposures have been wildly exaggerated and that there may actually be a “hormetic” effect – a word that still doesn’t appear in most dictionaries – meaning that low-level exposure may actually be beneficial.
The whole thing makes a certain amount of sense. First, there is ample evidence that the body has repair mechanisms that act almost immediately to repair genetic damage caused by radiation. Last December, researchers at Berkeley observed the repair mechanism at work in human breast cells and even managed to film it. “Our data show that at lower doses of ionizing radiation, DNA repair mechanisms work much better than at higher doses,” said Mina Bissell, a world-renowned breast cancer researcher with Berkeley Lab’s Life Sciences Division. “This non-linear DNA damage response casts doubt on the general assumption that any amount of ionizing radiation is harmful and additive.”
As a press report in R&D Magazine later expressed it: “This contradicts the standard model for predicting biological damage from ionizing radiation—the linear no-threshold hypothesis, or LNT—which holds that risk is directly proportional to dose at all levels of irradiation.”
In April researchers at MIT reported they had exposed mice to 400 times natural background radiation over a period of five weeks without detecting any genetic damage. “Almost all radiation studies are done with one quick hit of radiation. That would cause a totally different biological outcome compared to long-term conditions,” reported Bevin Engelward, an associate professor of biological engineering at MIT and one of the paper’s authors. “Exposure to low-dose-rate radiation is natural, and some people may even say essential for life,” added co-author Jacquelyn Yanch, a senior lecturer in MIT’s department of nuclear science and engineering. “The question is, how high does the rate need to get before we need to worry about ill effects on our health?”
All this is quite contradictory to the opinions expressed in the recent special issue of the Bulletin of the Atomic Scientists, where the authors not only claim that there is “no safe dose” of radiation but argue that exposure must be measured cumulatively over a period of decades. When this approach is taken, natural background and medical exposures quickly add up, so that after 40 years every American is approaching the danger zone of 10 rems, above which cancer incidence begins to show up. As host editor Jan Beyea, former energy director at the Audubon Society, put it:
In developed countries, the average accumulated dose for medical procedures is now so high that a significant percentage of the population in these countries will be above 0.1 Sv [10 rem]. Therefore this population will be primed for radiation-induced cancers from releases from nuclear reactors or dirty bombs, even using the hypothetical dose-response models of the LNT dissenters. There is no longer a convenient excuse for avoiding using the LNT to estimate consequences from real or projected releases of radioactive materials, even when the dose of concern is below 0.1 Sv.
The implications of this debate are enormous. If in fact there is no danger from radiation at the level of 400 times background – 120 rems spread out over the course of a year – then the entire Fukushima evacuation zone becomes habitable. The “Land of Wolves” surrounding Chernobyl – which is now thriving with animal life – could be fit for human habitation again as well.
In fact, as hormesis supporters point out, life on earth evolved in an environment that was much more intensely radioactive than it is today. It would be surprising if we had not developed mechanisms to deal with routine radiation damage, even though they may have atrophied to some degree. As Manhattan Project veteran Ted Rockwell expressed it: “If radiation really posed a serious danger to living creatures, we would have developed sensory organs to detect it a long, long time ago.”
Re-evaluating the presumed dangers of low-level radiation may be one of those paradigm shifts that takes the scientific community a generation to absorb. A whole worldview – and an entire industry – is now dedicated to the premise that radiation is an invisible killer against which huge resources must be deployed – even entire technologies abandoned – in order to provide ourselves with adequate protection.
Change will come slowly; it won’t happen overnight.
Monday, June 11th, 2012
Low-Level Radiation: Is There a 'Bystander Effect'?
By William Tucker
Probably no issue hangs more heavily over nuclear energy than the enigma of the effects of low doses of ionizing radiation. The whole future of the technology may depend on whether the effects are less or more dangerous than currently predicted.
The question revolves around what happens below exposures of 10 rem – about the annual exposure that would come from living in the worst-contaminated areas of the evacuation zone around Fukushima at this moment. Data from atomic-bomb survivors clearly indicate a linear dose-response curve for cancer incidence down to 10 rem. Below that, the figures disappear into the general incidence of cancer in any large population.
Is there a threshold below which the body’s natural defense mechanisms act to reduce the rate? Is there even a “hormetic” effect, where low-level exposures stimulate the body’s defenses so they are primed to repair even greater exposures? Or, on the contrary, is there a “bystander effect” by which the genetic damage to one cell from radiation can be transmitted to another? If this is true, then the dangers from low-level radiation could actually increase beyond what would be anticipated from the linear dose-response effect.
In this column we will review the bystander effect.
The phenomenon was first suggested in 1992 by John Little and Hatsumi Nagasawa of the Harvard School of Public Health, who found that when cells not damaged by radiation were grown in the same dish with cells damaged by radiation, the non-damaged cells also showed signs of DNA damage. The finding was confirmed in 1999 by a group at Columbia University that demonstrated it with alpha particles. Subsequent researchers speculated that either there is direct communication between cells or some kind of signaling molecule carries the information between them.
In the last decade, researchers at the London Regional Cancer Center in Ontario suggested that radiation damage may release free radicals, which cause genetic damage in neighboring cells. Finally, in 2006 the Department of Energy released a report of the work of biologist Bruce Lehnert at Los Alamos, which confirmed that the induction of genetic mutations can occur in nearby cells, even when they have not been directly exposed to radiation.
So the phenomenon is now generally confirmed. As Colin Hill of the Norris Comprehensive Cancer Center at the University of Southern California writes in the current issue of the Bulletin of the Atomic Scientists:
It is now clear that bystander effects do occur and are a general phenomenon induced by all types of radiation. The development of studies in intact tissues, as opposed to cell culture, clarifies that bystander effects cause changes in cells in complete tissues, as well as in the surrounding tissue that was not hit by the radiation. Scientists have noted that, for some types of tissue and types of radiation delivery, there is accumulating evidence that there is an increase in cancer induction – that is, above the linear response – in low doses. (Emphasis added)
In other words, not only are small doses of radiation as dangerous, rem for rem, as those received in the vicinity of an atomic blast; because of the bystander effect, they are actually more dangerous.
What to make of all this? Well, one thing that colors the argument of nuclear energy opponents is their insistence that exposures must be measured in cumulative doses and that there are absolutely no mechanisms for repairing the genetic damage done by radiation, even at the lowest levels. Using this kind of logic, anti-radiation crusaders are turning their attention to medical X-rays and CAT scans, arguing that we are approaching the point where these exposures constitute a health danger. Starting at this point, any additional exposure from the operation of nuclear reactors is regarded as an unnecessary risk, no matter how small.
But the idea that the 350 millirems from the natural background plus the average of 350 millirems that now come from medical procedures is pushing up cancer rates is purely hypothetical. There is no evidence whatsoever from demographic surveys and considerable evidence to the contrary. The residents of the Rocky Mountain region, for instance, suffer the highest levels of background radiation in the United States from a combination of high altitude and plenty of radioactive material in the granite. Yet they have the lowest incidence of cancer in the country. Meanwhile, residents of the Mississippi Delta have the lowest levels of background radiation and suffer the highest cancer incidence in the country.
If indeed there is a bystander effect, it is also hard to see why it would operate more damagingly at low exposures than at high exposures. Most of the results in the laboratory, in fact, have been achieved at relatively high doses of radiation. And if the effect were the same at low and high doses, then the bystander effect would already be incorporated into the linear dose-response model. As former Audubon Society energy expert Jan Beyea concludes in the Bulletin forum:
It should be noted that all of these cellular effects – including bystander effect, genomic instability, and adaptive response, some of which are thought to have effects working in opposite directions – could already be incorporated into the linear human dose-response curve, making the debate much ado about nothing.
But one other possibility is that there is actually a threshold and even a hormetic effect – and that somehow the mysterious bystander effect may even be part of these. That’s the final possibility we’ll consider in the next column.
Friday, June 8th, 2012
Low-Level Radiation: Are We Already Overexposed?
By William Tucker
(This is the second in a series of columns considering the problem of low-level radiation.)
In the landmark book The Structure of Scientific Revolutions, Thomas Kuhn described the process whereby new ideas are incorporated into the scientific community. He noted that “paradigm shifts” occur when an older idea that is proving more and more untenable goes through a series of greater and greater embellishments until it finally collapses of its own weight. Even then, a wholesale conversion of the scientific community rarely occurs. Instead, a new generation grows up accepting the new paradigm as a given.
We seem to be going through such a process with regard to the dangers of low-level ionizing radiation.
Ever since the dropping of the atomic bomb in 1945, it has been widely known that nuclear radiation in large doses can cause cancer. There had been isolated incidents with radium during the 1920s and 1930s before Hiroshima, Nagasaki and subsequent nuclear testing showed how exposure could occur on a large scale.
When it became clear that large numbers of industrial workers would now become exposed to low levels of radiation, the first impulse was to say “make it as low as possible.” As it became clear how expensive it would be to eliminate every trace of radiation, however, standards were set. Those standards have been debated ever since.
The old paradigm is that the known impact of radiation at levels above 10 rems can be extrapolated right down to zero under the assumption that there is “no safe dose” of radiation. Gordon Thompson, executive director of the Institute for Resource and Security Studies, sets this case forward in a straightforward manner in the current issue of the Bulletin of the Atomic Scientists. He says that since nothing yet stands in its place, the linear no-threshold (LNT) hypothesis must be accepted, along with its implications, which are that nuclear activities are causing cancer. Moreover, public officials should also be cognizant that anyone who challenges this hypothesis is probably doing so only “for profit.”
For contemporary policy purposes, the LNT hypothesis can be regarded as well-established science. . . . In the United States, an average individual dose of 10 mSv would cause about 500 excess cancer deaths over time across a population of one million. This statement can be extended to any level of low dose, or any large population, by simple proportion. . . . [I]f the LNT model is valid, the excess deaths from radiation are real events; they are just masked by a greater number of other cancer deaths, so that they are not directly observable. . . . As long as the LNT model remains the prevailing hypothesis, this balance should explicitly recognize the existence of real, but masked, health effects. Also, the balance should take a politically mature view of the financial incentives related to human-made radiation exposure. As shown above, exposure is often linked to industrial activities that are, in part, profit driven. (Emphasis added)
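The proportionality Thompson invokes is easy to make explicit. Here is a minimal sketch, assuming the 5-percent-per-sievert risk coefficient implied by his own numbers (the quote itself never states the coefficient):

```python
# LNT excess-death arithmetic implied by Thompson's figures:
# a 10 mSv average dose across one million people -> ~500 excess deaths.
risk_per_sv = 0.05       # implied coefficient: 5% fatal-cancer risk per sievert
population = 1_000_000
avg_dose_sv = 0.010      # 10 mSv

excess_deaths = population * avg_dose_sv * risk_per_sv
print(f"Excess deaths at 10 mSv: {excess_deaths:.0f}")  # 500

# "By simple proportion," the same line extends to any dose or population:
print(f"Excess deaths at 1 mSv: {population * 0.001 * risk_per_sv:.0f}")  # 50
```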
What makes this house of cards even more presumptuous is the notion that radiation exposure must also be considered cumulative.
The concept of collective dose means that individual doses are aggregated across the population, regardless of whether an individual’s exposure is a one-time event, episodic, or continuous.
Thus, while the figure of 0.1 Sv might seem far out of reach for ordinary levels of exposure, when a lifetime dose is measured, the numbers eventually add up. This leads Thompson to argue that medical screening devices are being overutilized and to identify “a growing concern within the health professions that disease screening has been overemphasized as well as a recognition that profit has influenced this trend.” As recounted last time, it also leads the forum moderator, former Audubon energy analyst Jan Beyea, to conclude that even negligible doses from the operation of nuclear reactors may be critical because they must be added to the already dangerous exposure from background and medical uses that we already experience.
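The bookkeeping behind “the numbers eventually add up” is simple accumulation. A sketch using per-year rates consistent with the forum’s equivalence of 0.1 Sv to roughly 40 years of background (or of medical) exposure; the annual figures are rounded assumptions, not values from the Bulletin:

```python
# Lifetime-dose accumulation under the cumulative-exposure framing.
# Annual rates are rounded assumptions consistent with "0.1 Sv ~ 40 years."
background_sv_per_year = 0.0025   # ~2.5 mSv/yr natural background
medical_sv_per_year = 0.0025      # ~2.5 mSv/yr average medical diagnostics

years = 40
from_background = years * background_sv_per_year
from_medical = years * medical_sv_per_year

print(f"After {years} years: {from_background:.2f} Sv background + "
      f"{from_medical:.2f} Sv medical = {from_background + from_medical:.2f} Sv")
# 0.10 + 0.10 = 0.20 Sv -- double the 0.1 Sv "dividing line."
```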
But all this assumes that the body has no defenses against even the smallest doses of radiation and that being exposed to 0.1 Sv over the course of 40 years is the same as being exposed to 0.1 Sv from a nuclear accident or an atomic blast. Isn’t this a rather large assumption upon which to be basing such alarming predictions? We’ll look at this further next time.
Monday, June 4th, 2012
Bulletin of the Atomic Scientists Returns to Low-Level Radiation
By William Tucker
Nothing energizes the anti-nuclear movement more than the concern that exposure to low-level radiation is harmful and that nuclear power plants are emitting death rays that quietly spread cancer throughout the population.
Those who remember the 1970s will not have forgotten Dr. John Gofman fulminating on late-night television that "for every reactor that is built, babies die," or the ubiquitous Dr. Ernest Sternglass informing the public that every blip in cancer incidence that occurred around the country was the result of a radioactive cloud passing overhead months before.
With the current revival of nuclear power, then, it is not surprising to find that the Bulletin of the Atomic Scientists has decided to return to the issue of low-level exposures.
The May/June edition contains a special forum of eight essays ranging across fields from the biology of the hypothesized "bystander" effect to the communications theory of why people fear nuclear reactors more than routine medical diagnoses. Guest editor Jan Beyea, former director of energy programs at the Audubon Society, moderates the discussion.
In his introductory essay, Beyea begins with a simple but indisputable observation: No one knows the effects of exposure to low-level radiation:
"Though the debate takes on many shapes, it always revolves around one magical number: 0.1 sieverts (Sv), the dividing line between what is considered high and low exposure today. It is equivalent to about 40 cumulative years of the average unavoidable background radiation and to about 40 years of average medical diagnostic radiation in the United States. And from this magical number, more disputes spring, specifically on the radiation risks below 0.1 Sv, as well as the risks from protracted radiation exposure above and below this number. The debates can be brutal – so much so that, at times, they make the spats between William Jennings Bryan and Clarence Darrow look lame."
Long-term data from Japanese victims at Hiroshima and Nagasaki have confirmed a very clear, linear dose-response curve for cancer incidence down to 0.1 Sv – the equivalent of 10 rems. Below that, however, cancer incidence disappears into the general background levels in the population. You might think the invisibility of any statistics would set minds at ease, but with a question as important as whether to build more nuclear plants, nothing can be overlooked. And so critics extrapolate the numbers right down to the lowest levels. When these hypothetical rates are then projected across huge populations – the size of Europe, for instance – numbers can emerge that make for newspaper headlines.
Beyea outlines four ways in which cancer incidence can be projected below the 0.1 Sv level (sketched in code after this list):
- The ratio of dose-to-response proceeds in a straight line down to zero, so there is "no safe dose of radiation," even though the effects may be so small as to be undetectable.
- The dose-response curve may proceed in a straight line until it reaches some "threshold," below which there is no danger.
- The adverse response may actually increase at lower levels due to a hypothetical phenomenon called the "bystander effect."
- Exposures at the lowest level may be beneficial, in that they act like a vaccine and stimulate the body's defense mechanisms against further radiation damage. This is the “hormetic” effect – a word that still does not appear in most dictionaries.
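To make the four hypotheses concrete, here is a minimal sketch of each as a dose-response function. The curve shapes and every numeric parameter are illustrative assumptions of mine, not values from the Bulletin forum; only the qualitative behavior tracks Beyea's four cases:

```python
# Four candidate dose-response models below the 0.1 Sv line.
# All parameters are illustrative assumptions.

RISK_PER_SV = 0.05   # assumed linear risk coefficient (excess risk per Sv)
KNEE_SV = 0.1        # the 0.1 Sv "dividing line"

def linear_no_threshold(dose_sv):
    """(1) Straight line to zero: 'no safe dose.'"""
    return RISK_PER_SV * dose_sv

def threshold(dose_sv):
    """(2) Linear above a threshold, zero risk below it."""
    return RISK_PER_SV * max(0.0, dose_sv - KNEE_SV)

def bystander(dose_sv):
    """(3) Supralinear at low doses: damaged cells signal undamaged neighbors."""
    if dose_sv < KNEE_SV:
        return RISK_PER_SV * dose_sv * (2.0 - dose_sv / KNEE_SV)
    return RISK_PER_SV * dose_sv

def hormetic(dose_sv):
    """(4) Net benefit at low doses: exposure primes the body's repair systems."""
    if dose_sv < KNEE_SV:
        return RISK_PER_SV * dose_sv - 0.2 * dose_sv * (1.0 - dose_sv / KNEE_SV)
    return RISK_PER_SV * dose_sv

for model in (linear_no_threshold, threshold, bystander, hormetic):
    print(f"{model.__name__:20s} excess risk at 0.05 Sv: {model(0.05):+.4f}")
```

Running the sketch at 0.05 Sv shows the four regimes side by side: a small positive risk (LNT), zero (threshold), a risk above the linear line (bystander), and a net negative, i.e. protective, effect (hormesis).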
Beyea gives attention to all four possibilities but in the end decides that it doesn’t much matter. Existing levels of exposure to radiation are already so high that any increment from nuclear sources becomes unacceptable under any circumstances:
Given the increase in radiation from medical diagnostics and the interest in protracted exposure, the possible existence of a threshold or hormetic effect for public policy appears to be a moot issue for developed countries when it comes to future exposure. Even if the level of medical diagnostic exposures does not increase in the future, over the course of 40 years most people in developed countries will receive an average of 0.1 Sv from medical procedures alone. With this in mind as a dose starting point for millions of people, it is fair to say that any exposure to radioactive elements from a nuclear accident or a dirty bomb would definitely contribute to their delayed cancer risk (emphasis added).
The assumption behind this diagnosis, of course, is that the dangers of radiation exposure are cumulative and that being exposed to 10 rems from 40 years of background and medical exposures is the same as being exposed to 10 rems in a single burst from a nuclear accident or nuclear weapon.
Whether this assumption can be justified and where it leads in terms of public policy implications is something we’ll discuss in future columns.