The News That Matters about the Nuclear Industry Fukushima Chernobyl Mayak Three Mile Island Atomic Testing Radiation Isotope

There’s money in denying the science about ionising radiation – it’s useful nuclear lobby spin

Why radiation protection experts are concerned over EPA proposal  Ferenc Dalnoki-Veress
Scientist-in-Residence and Adjunct Professor, Middlebury Institute of International Studies at Monterey, October 19, 2018

The Takata Corporation sold defective air bag inflators that resulted in the deaths of 16 people in the United States and a massive recall of cars. While it was rare for the air bags to fail, the brutal consequences of this defective device in even minor collisions were easy to recognize. But the effects of low-dose ionizing radiation – high-energy waves or particles that can strip electrons from atoms and physically damage cells and the DNA within – on people’s health are much harder to see, and prove.
When the Associated Press reported that the Trump administration’s Environmental Protection Agency solicited the advice of a controversial toxicologist, Edward Calabrese, to consider changes to how it regulates radiation, it sent shock waves through the radiation protection community. Calabrese is well known for his unconventional and outlying view that low-dose radiation is not dangerous.

It is important to note that the health effects of high doses of radiation are well established. We all know about the horrific effects based on studies of the populations of Hiroshima and Nagasaki after the atomic bombs were dropped. Then there was also the recent case of Russian defector Alexander Litvinenko, who quickly sickened and died 23 days after being poisoned with the radioactive isotope polonium-210 in 2006.

However, the effects of low doses of radiation are not well understood. Part of the reason is that these low doses are difficult to measure.

Current understanding of the health effect of radiation relies primarily on a decades-long study of the survivors of the Hiroshima and Nagasaki atomic bomb attacks. That population was exposed to a one-time large dose of radiation, with individual exposure dependent on where they were at the time of the explosion.
In those high-dose radiation studies, researchers found a proportionate relationship between dose and effect. The way the EPA gauges the effect of low doses of radiation draws from these studies as well as studies following other incidents. The current EPA guidelines adhere to what is called the linear no-threshold (LNT) model, which implies that even low doses of radiation have an effect across a population. Some scientists have dubbed it a “reverse lottery,” where an unlucky few within a given population will get cancer during their lifetime due to their exposure to radiation.
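The “reverse lottery” arithmetic implied by the LNT model can be sketched in a few lines. This is an illustration only: the risk coefficient is the ICRP’s nominal value of roughly 5.5% per sievert, and the population and dose figures are invented for the example, not real data.

```python
# Illustrative sketch of the linear no-threshold (LNT) model: expected excess
# cancers scale linearly with collective dose, with no safe threshold.
# RISK_PER_SV is the ICRP's nominal lifetime risk coefficient (~5.5%/Sv);
# the population and dose below are made-up illustrative figures.

RISK_PER_SV = 0.055  # nominal lifetime cancer risk per sievert (ICRP 103)

def expected_excess_cancers(population: int, avg_dose_sv: float) -> float:
    """Under LNT, expected excess cancers = population x dose x risk slope."""
    return population * avg_dose_sv * RISK_PER_SV

# A "reverse lottery": 1,000,000 people each receiving 10 mSv (0.01 Sv)
# yields roughly 550 expected excess cancers spread across the population.
print(expected_excess_cancers(1_000_000, 0.01))
```

No individual can be identified as one of the unlucky few; the model only predicts the expected count across the whole exposed population.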

There have been questions as to whether the LNT model is appropriate for measuring cancer risk from low doses of radiation. That’s because when the radiation-induced cancer rate is low, and the sample size is small, there is more statistical uncertainty in the measurement. This allows more wiggle room in putting forward alternative dose-response models such as Calabrese’s, which have little scientific backing but that promise financial benefits for regulated industries.

Overall, the general feeling in the radiation protection community is that, until new research proves otherwise, the LNT model is the prudent one for setting protective limits, precisely because the effects of low doses are so poorly understood.

Also, not being able to determine the effect of a low dose of radiation is a problem of measurement, not of the underlying linear no-threshold model. As doses of radiation decrease, fewer cases of radiation-induced cancers occur, making it more difficult to identify those specific cases.

This is especially true given that cancer is already a common occurrence, making it nearly impossible to disentangle radiation exposure from many other potential cancer risk factors. This is where the analogy with Takata air bags fails, because it is not possible to prove that a specific cancer death is due to ionizing radiation, but this does not make it any less real or significant.
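The statistical point above can be made concrete with a rough detectability sketch. Everything here is an assumption chosen for illustration: a baseline lifetime cancer incidence of about 40%, an LNT excess from a 10 mSv dose using a nominal 5.5%-per-sievert slope, and a crude two-standard-error criterion for whether a study of n people could see the excess at all.

```python
# Rough sketch of why low-dose effects are hard to detect (assumption-laden
# illustration, not a real epidemiological power calculation).
import math

baseline = 0.40            # assumed lifetime cancer incidence (~40%)
excess = 0.01 * 0.055      # LNT excess risk from 10 mSv at ~5.5%/Sv

for n in (1_000, 100_000, 10_000_000):
    # standard error of an observed incidence rate in a cohort of size n
    noise = math.sqrt(baseline * (1 - baseline) / n)
    print(f"n={n:>10,}: excess={excess:.5f}, noise={noise:.5f}, "
          f"detectable={excess > 2 * noise}")
```

On these assumptions the excess only rises above the statistical noise for cohorts in the tens of millions, which is why small low-dose studies leave so much “wiggle room” for alternative models.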

Who profits if radiation guidelines change

The EPA issues guidance and sets regulations to “limit discharges of radioactive material affecting members of the public” associated with the nuclear energy industry. The EPA defines what radiation levels are acceptable for a protective cleanup of radioactive contamination at Superfund sites. It also provides guidance on the levels of radiation exposure that would trigger a mass evacuation. It is not surprising that certain stakeholders would welcome modifications in EPA assessment of low-dose radiation exposure given the high costs involved in preventing or cleaning up sites and in compensating victims of such exposure.

Recently, the National Council on Radiation Protection and Measurements (NCRP) – scientists who provide guidance and recommendations on radiation protection under a mandate from Congress – supported the LNT model. NCRP analyzed 29 epidemiological studies and found that the data was “broadly supportive” of the LNT model and that “no alternative dose-response relationship appears more pragmatic or prudent for radiation protection purposes.”

In fact, the National Academies’ Nuclear and Radiation Studies Board, the International Council on Radiation Protection, and other international bodies and regulators all use the LNT model for guidance and radiation protection.

From my perspective, as someone who has worked with radioactive sources, the EPA should be cognizant of the warning by the late Harvard sociologist Daniel Yankelovich that just because an effect can’t be easily quantified does not mean it is not important or does not exist.


October 20, 2018 Posted by | 2 WORLD, radiation, Reference

The nuclear industry’s deceptive narrative about Fukushima earthquake in March 2011

A “Station Blackout” is a serious condition.

“It will be many years before the Japanese people know exactly what happened at Fukushima Daiichi on 11 March 2011. One of the key mysteries was the role, if any, the magnitude 9 earthquake played in damaging the plant’s reactor cooling systems. Until lethal levels of radiation inside the reactors fall and workers can carry out comprehensive investigations, the truth about the tremor’s impact will remain a subject of conjecture and contention”

Mr. Takamatsu states with expert authority that the pipes of the cooling system were not designed for the 50-second vibration of the magnitude 9 quake. Barry Brook, kangaroo expert, disagrees and tells the world the quake caused no damage at Fukushima. Yet Mr. Brook must surely know the earthquake caused grid blackout, for reactors are all shut down by earthquakes. A solar plant would have kept generating until the last panel shattered. No one would have been evacuated from such a solar plant.

I submit that Prof. Barry Brook’s description of the effects of the earthquake upon Fukushima Daiichi on 11 March 2011 is totally ignorant of the facts as presented by many qualified experts and flies in the face of the findings of the independent commission set up by the Japanese Parliament (Diet). It is confirmed that expert investigators consider aspects of TEPCO’s explanations regarding the quake to be “irrational”.

Thus any narrative based upon the nuclear industry view, in line with TEPCO’s, may fairly be said to be “irrational”. For the industry view is that there is no possibility of quake damage to any structure or substructure, such as coolant pipes and valves.

Earthquake Damage At Fukushima – is Industry’s Narrative Truthful or Certain? Nuclear History, 16 Oct 18 I am again going to contrast the statements made by Barry Brook in regard to the events and outcomes at Fukushima Daiichi in 2011 with the facts as presented by Mark Willacy. These facts are published in Willacy’s book, “Fukushima – Japan’s tsunami and the inside story of the nuclear meltdowns”, Willacy, M., Pan Macmillan, copyright 2013, Mark Willacy.

However, I will also include information related to the events which were first published and discussed in 2011. ………..

The earthquake generated the tsunami. What else did the earthquake cause?

In this blog I have included posts which give the IAEA considerations for the electrical grids which are connected to nuclear power plants. The IAEA states that the level of engineering and resilience built into such grids may be a significant additional cost for any nation considering nuclear power generation.

It comes as no surprise, then, that the electrical grid connected to the Fukushima Daiichi NPP failed, for two reasons. 1. The earthquake caused all the nuclear reactors connected to the same grid to rapidly shut down; thus the earthquake caused a blackout through the cessation of electrical generation. 2. The physical grid infrastructure – poles and wires – was damaged by the earthquake. At Fukushima this meant that more than one of the reactors was physically separated from the grid by the earthquake.

It can therefore be seen that the earthquake meant A. Fukushima Daiichi could not generate nuclear electricity, as the quake had shut the reactors down. B. The Fukushima Daiichi Nuclear Power Plant was in Station Blackout for one reason: earthquake damage to nuclear infrastructure – the electrical grid.

October 16, 2018 Posted by | Fukushima continuing, Reference, safety, secrets,lies and civil liberties, spinbuster

What the IPCC Report 2018 says about nuclear power

Nuclear energy can increase the risks of proliferation (SDG 16), have negative environmental effects (e.g., for water use, SDG 6), and have mixed effects for human health when replacing fossil fuels (SDGs 7 and 3) (see Table 5.2)   (Ch 5, p 23)
Nuclear power increases its share in most 1.5°C pathways by 2050, but in some pathways both the absolute capacity and share of power from nuclear generators declines (Table 2.15). There are large differences in nuclear power between models and across pathways (Kim et al., 2014; Rogelj et al., 2018). One of the reasons for this variation is that the future deployment of nuclear can be constrained by societal preferences assumed in narratives underlying the pathways (O’Neill et al., 2017; van Vuuren et al., 2017b). Some 1.5°C pathways no longer see a role for nuclear fission by the end of the century, while others project over 200 EJ yr–1 of nuclear power in 2100 (Figure 2.15).   CH 2

Chapter 5 – Table 5.3    In spite of the industry’s overall safety track record, a non-negligible risk for accidents in nuclear power plants and waste treatment facilities remains. The long-term storage of nuclear waste is a politically fraught subject, with no large-scale long-term storage operational worldwide. Negative impacts from upstream uranium mining and milling are comparable to those of coal, hence replacing fossil fuel combustion by nuclear power would be neutral in that aspect. Increased occurrence of childhood leukaemia in populations living within 5 km of nuclear power plants was identified by some studies, even though a direct causal relation to ionizing radiation could not be established and other studies could not confirm any correlation (low evidence/agreement on this issue).

October 11, 2018 Posted by | 2 WORLD, climate change, Reference

Vitrified nuclear waste: glass corrodes and melts long before the radioactive trash is inert

What causes nuclear waste glass to dissolve? Phys Org, University of Houston, October 10th, 2018

Immobilizing nuclear waste in glass logs—a process known as vitrification—is currently used in the United States to safeguard waste from sites associated with defense activities. Some other countries also use the process to capture waste from nuclear power plants.

Researchers know, however, that the glass can begin to dissolve after a long period of time, and the durability of these glass logs remains an active area of research.

Researchers from the University of Houston, the Department of Energy’s Pacific Northwest National Laboratory and the University of Pittsburgh are working on one of the most pressing issues – what causes the glass to begin to deteriorate relatively quickly at some point, potentially releasing radioactive waste at levels exceeding regulatory thresholds? ………

“We have long observed from laboratory studies that zeolite formation in glass corrosion tests resulted in an increase in the glass corrosion rate,” said Neeway, a researcher at PNNL. ………
Zeolite P, the zeolite that forms from the glass, is affected by temperature—Rimer said researchers synthesize it in the lab at 100 °C—but they don’t yet know how crystallization proceeds at lower temperatures and they don’t have methods to deter its formation. But controlling temperatures in the geologic formations designated as nuclear waste repositories is not necessarily practical, thus researchers are looking for other factors that might affect crystal growth, including components of the glass.

October 11, 2018 Posted by | Reference, safety, wastes

“Transparency”- the Trump administration’s dirty trick to strangle access to reputable science on nuclear radiation  

Yes, radiation is bad for you. The EPA’s ‘transparency rule’ would be even worse.  The Trump administration wants to strangle access to reputable science. By Audra J. Wolfe, 8 Oct 18   Audra J. Wolfe is a Philadelphia-based writer, editor, and historian. She is the author of Freedom’s Laboratory: The Cold War Struggle for the Soul of Science.

Last Tuesday, a headline from the Associated Press sparked outrage in the ordinarily quiet world of science policy. The Environmental Protection Agency, the story suggested, was considering relaxing guidelines for low-dose ionizing radiation, on the theory that “a bit of radiation may be good for you.” Within hours, the AP had issued a correction. As it turned out, the EPA was not, after all, endorsing hormesis, the theory that small doses of toxic chemicals might help the body, much like sunlight triggers the production of vitamin D.

Instead, the EPA was doing something much scarier: It was holding hearings on the “Transparency Rule,” which would restrict the agency to using studies that make a complete set of their underlying data and models publicly available. The rule is similar to an “Open Science” order issued by the Interior Department last month, and incorporates language from the HONEST Act, a bill that passed in the House in 2017 but later stalled in the Senate. The HONEST Act originally required that scientific studies provide enough data that an independent party could replicate the experiment — which is simply not realistic for large-scale longitudinal studies.

Although these rules cite the need to base regulatory policy on the “best available science,” make no mistake: They aim to strangle access to reputable studies.

The Transparency Rule continues the Trump administration’s pattern of anti-science policies. The White House’s Office of Science and Technology Policy is a ghost town, with most of the major positions, including the director’s post, vacant since January 2017. Agencies and departments across the board, including the State Department and the Agriculture Department, are dropping their science advisers and bleeding scientific staff. It’s getting harder and harder for federal rulemakers to access expertise.

Understanding what’s wrong with “transparency,” at least as defined by these policies, requires a closer look at how scientists work. Let’s say you’re trying to understand the health effects of a one-time, accidental release of a toxic chemical. This incident might be epidemiologists’ only chance to investigate how this particular chemical interacts with both the air and the humans who breathe it, at varying doses, over a period of time. No matter how careful your approach, your study would fall short of the replicability standard.

You wouldn’t have baseline health information for the specific people who happened to be in the area. You might not have information on which residents had air filtration systems installed in their homes, or which residents were working outside when the incident took place. Your early results would, by definition, reflect only short-term health outcomes, rather than long-term effects. And you couldn’t replicate the study (with better controls) without endangering the health of thousands of people. In such cases, scientists have to extrapolate from existing, sometimes imperfect, data to protect the public.

Epidemiologists have community standards, including peer review, to evaluate these kinds of studies. A careful, peer-reviewed study of this hypothetical incident might well represent the “best available science” on this particular chemical. Regulators might rely on this study to establish the permissible levels of this chemical in the air we breathe. But now, let’s also say that this study took place 30 years ago. The leading scientists involved are dead, and no one kept their files. The raw data are, effectively, lost. Should scientists at the EPA be blocked from using the study?

Despite what made last week’s headlines, the EPA’s Oct. 3 hearing went beyond radiation. In fact, its lead witness, University of Massachusetts toxicologist Edward Calabrese, barely mentioned his theory of radiation hormesis. Instead, his testimony argued that the EPA should no longer rely on linear no-threshold (LNT) models for any number of hazards, including toxic chemicals and soil pollutants. In toxicology, LNT models assume that the biological effects of a given substance are directly connected to the amount of the exposure, with no minimum dose required. Radiation protection standards are based on LNT models; so are basic regulations involving ozone, particulate pollution, and chemical exposure.

The original studies asserting a LNT model for low-dose ionizing radiation were conducted in the 1950s. Like our hypothetical epidemiologist investigating a toxic chemical release, the geneticists who tried to understand the biological effects of atomic radiation were working with imperfect data, much of which is no longer available. The concept of a “comprehensive data management policy” simply did not exist in 1955. These particular studies were primarily based on survivors of the atomic bombings of Hiroshima and Nagasaki. The scientists also extrapolated from high-dose exposure data in fruit flies and mice and from unethical high-dose experiments conducted on humans.

These studies are imperfect, but focusing on their limitations misses the broader scandal. These studies took place during the heyday of atmospheric nuclear weapons testing, an era when both the United States and the Soviet Union were pumping the atmosphere full of radionuclides. Some of the areas near the testing zones received so much radiation that they are still uninhabitable today. The tests coated the entire planet with a scrim of radiation. The Atomic Energy Commission, the agency in charge of the United States’ nuclear weapons program, didn’t even attempt to investigate the potential health effects of this constant, low-dose exposure to ionizing radiation on the world’s population. Studies of low-dose radiation were expensive, inconvenient, and politically risky, potentially jeopardizing the weapons testing program and therefore the United States’ ability to fight the Soviet Union. From the government’s perspective, it was better not to know.

This week, a sensational headline distracted us from a broader crisis. Without government support for research of environmental hazards, the public’s health is left to either the whims of industry researchers, who have a strong incentive to play down their dangers, or to public advocacy groups, which are too easily smeared with charges of anti-industry bias. The “transparency” movement supposedly resolves this crisis of authority by giving the public access to the underlying data on which science is based, but it ignores the power dynamics that determine which research questions get asked, and why and how they’re answered.

In the past, Americans looked to their federal science agencies and science advisers to resolve these sorts of disputes. But a few weeks ago, the EPA announced that it, too, would be eliminating its Office of the Science Adviser. With the science offices empty, who will decide?

There is one bright spot in all of this: On Sept. 28, bipartisan legislation authorized the Energy Department to restart its low-dose radiation research program. But what about the other pollutants that the EPA supposedly regulates? Who will produce the kinds of science deemed acceptable under the “transparency” rule?

“Transparency” has become another way to cultivate institutional ignorance. Americans deserve better from the agencies that are supposed to protect them. In the case of environmental hazards, what you don’t know can hurt you.

October 9, 2018 Posted by | radiation, Reference, secrets,lies and civil liberties, USA

Genetic changes in children of soldiers who were exposed to ionising radiation

October 8, 2018 Posted by | Germany, radiation, Reference

Tritium was identified as the primary culprit in damaging fetuses and mothers’ rapidly dividing cells.

October 6, 2018 Posted by | Germany, radiation, Reference

Thorium Molten Salt Nuclear reactor (MSR) No Better Than Uranium Process

The safety issue is also not resolved, as stated above: pressurized water leaking from the steam generator into the hot, radioactive molten salt will explosively turn to steam and cause incredible damage.  The chances are great that the radioactive molten salt would be discharged out of the reactor system and create more than havoc.  Finally, controlling the reaction and power output, finding materials that last safely for 3 or 4 decades, and consuming vast quantities of cooling water are all serious problems.  

The greatest problem, though, is likely the scale-up by a factor of 500 to 1, from the tiny project at ORNL to a full-scale commercial plant with 3500 MWth output.   Perhaps these technical problems can be overcome, but why would anyone bother to try, knowing in advance that the MSR plant will be uneconomic due to huge construction costs and operating costs, plus will explode and rain radioactive molten salt when (not if) the steam generator tubes leak.

The Truth About Nuclear Power – Part 28, Sowells Law Blog, 14 July 2014: Thorium MSR No Better Than Uranium Process

Preface   This article, number 28 in the series, discusses nuclear power via a thorium molten-salt reactor (MSR) process.   (Note, this is also sometimes referred to as LFTR, for Liquid Fluoride Thorium Reactor)   The thorium MSR is frequently trotted out by nuclear power advocates, whenever the numerous drawbacks to uranium fission reactors are mentioned.   To this point in the TANP series, uranium fission, via PWR or BWR, has been the focus.  Some critics of TANP have already stated that thorium solves all of those problems and therefore should be vigorously pursued.  Some of the critics have stated that Sowell obviously has never heard of thorium reactors.   Quite the contrary, I am familiar with the process and have serious reservations about the numerous problems with thorium MSR.

It is interesting, though, that nuclear advocates must bring up the MSR process.  If the uranium fission process was any good at all, there would be no need for research and development of any other type of process, such as MSR and fusion.

October 5, 2018 Posted by | 2 WORLD, Reference, technology, thorium

Another nuclear film advertisement – “The New Fire”

Film review:  ‘The New Fire’ and the old Gen IV rhetoric  Author: Jim Green ‒ Nuclear Monitor editor, NM866.4751, October 2018   The New Fire is a pro-nuclear propaganda film directed and produced by musician and film-maker David Schumacher. It’s similar in some respects to the 2013 film Pandora’s Promise.1,2 The New Fire premiere was held in October 2017 and it can be streamed online from 18 October 2018.

Promotional material claims that the film lacked “a supportive grant” (and celebrity endorsements and the backing of a major NGO) but the end-credits list numerous financial contributors: Berk Foundation, Isdell Foundation, Steven & Michele Kirsch Foundation, Rachel Pritzker, Roland Pritzker, Ray Rothrock, and Eric Uhrhane.

The film includes interviews with around 30 people (an overwhelming majority of them male) interspersed with footage of interviewees walking into buildings, and interviewees smiling. The musical underlay is a tedious drone ‒ a disappointment given Schumacher’s musical background.

A highlight is hearing Eric Meyer ‒ an opera singer turned pro-nuclear activist ‒ bursting into song at various locations around the COP21 climate conference in Paris in December 2015, while he and his colleagues handed out free copies of the pro-nuclear book Climate Gamble.

Interviewees are mostly aging but the film’s main message is that young entrepreneurs may save the planet and its inhabitants with their Generation IV reactor projects. The film’s website states: “David Schumacher’s film focuses on how the generation facing the most severe impact of climate change is fighting back with ingenuity and hope. The New Fire tells a provocative and startlingly positive story about a planet in crisis and the young heroes who are trying to save it.”3

Schumacher writes (in the press kit): “These brilliant young people – some of the most gifted engineers of their generation, who in all likelihood could have cashed in for a fortune by doing something else – believe deeply that nuclear power could play a key role in saving the planet. And they are acting on that conviction. They did the research. They raised the money. They used cutting edge computer technology to perfect their designs. They are the new face of nuclear power, and to me, the newest and most unlikely climate heroes.”

These climate heroes are contrasted with anti-nuclear environmentalists. One interviewee says that “people of our generation are the first ones that have the opportunity to look at nuclear power without all the emotional baggage that previous generations have felt.” Another argues that anti-nuclear environmentalists are “very good, decent, smart people” but the “organizational DNA … that they have inherited is strongly anti-nuclear.” Another argues that environmental organizations “have been using nuclear power as a whipping boy for decades to raise funds”. Another interviewee attributes opposition to nuclear power to an “irrational fear of the unknown” (which surely poses a problem for the exotic Generation IV concepts promoted in the film) and another says that “once people sort of understand what’s going on with nuclear, they are much more open to it”.

The film trots out the usual anti-renewables tropes and falsehoods: 100% renewables is “just a fantasy”, renewables can contribute up to 20% of power supply and the remainder must be baseload: fossil fuels or nuclear power.

In rural Senegal, solar power has brought many benefits but places like Senegalese capital Dakar, with a population of one million, need electricity whether the sun is shining or not. A Senegalese man interviewed in the film states: “Many places in Africa definitely need a low cost, reliable, carbon neutral power plant that provides electricity 24/7. Nuclear offers one of the best options we have to do that kind of baseload.” The film doesn’t explain how a 1,000 megawatt nuclear plant would fit into Senegal’s electricity grid, which has a total installed capacity of 633 MW.4 The ‘microreactors’ featured in The New Fire might help … if they existed.
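The grid-size objection above can be sketched as back-of-the-envelope arithmetic, using the figures quoted in the article. The 10% rule of thumb below is an assumption for illustration: grid operators commonly limit any single generating unit to a modest fraction of total capacity so the system can ride through the sudden loss of its largest unit.

```python
# Back-of-the-envelope check of the grid-size argument, using the figures
# quoted in the article (633 MW installed capacity in Senegal, 1,000 MW for
# a typical large reactor). The 10% single-unit ceiling is an illustrative
# rule of thumb, not a statute.

GRID_CAPACITY_MW = 633      # Senegal's installed capacity (per the article)
REACTOR_MW = 1_000          # a typical large light-water reactor
MAX_UNIT_FRACTION = 0.10    # rule-of-thumb ceiling for any single unit

largest_acceptable_unit = GRID_CAPACITY_MW * MAX_UNIT_FRACTION
print(f"Largest sensible single unit: ~{largest_acceptable_unit:.0f} MW")
print(f"Reactor vs whole grid: {REACTOR_MW / GRID_CAPACITY_MW:.1f}x")
```

On these numbers a single reactor would be roughly 1.6 times the entire grid, and more than an order of magnitude above a sensible largest-unit size, which is the point the film leaves unaddressed.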

Accidents such as those at Fukushima and Chernobyl get in the news because they are “so unusual” according to interviewee Ken Caldeira. And they get in the news, he might have added, because of the estimated death tolls (in the thousands for Fukushima5, ranging to tens of thousands for Chernobyl6), the costs (around US$700 billion for Chernobyl7, and US$192 billion (and counting) for Fukushima8), the evacuation of 160,000 people after the Fukushima disaster and the permanent relocation of over 350,000 people after the Chernobyl disaster.9

“Most people understand that it’s impossible for a nuclear power plant to literally explode in the sense of an atomic explosion”, an interviewee states. And most people understand that chemical and steam explosions at Chernobyl and Fukushima spread radionuclides over vast distances. The interviewee wants to change the name of nuclear power plants to avoid any conflation between nuclear power and weapons. Evidently he didn’t get the memo that the potential to use nuclear power plants (and related facilities) to produce weapons is fast becoming one of the industry’s key marketing points.

Conspicuously absent from the film’s list of interviewees is pro-nuclear lobbyist Michael Shellenberger. We’ve taken Shellenberger to task for his litany of falsehoods on nuclear and energy issues10 and his bizarre conversion into an advocate of worldwide nuclear weapons proliferation.11 But a recent article by Shellenberger on Generation IV nuclear technology is informative and insightful ‒ and directly at odds with the propaganda in The New Fire.12

So, let’s compare the Generation IV commentary in The New Fire with that in Shellenberger’s recent article.

Transatomic Power’s molten salt reactor concept

The film spends most of its time promoting Generation IV reactor projects including Transatomic Power’s molten salt reactor (MSR) concept. [Ed. note: recently failed and abandoned.]

Scott Nolan from venture capital firm Founders Fund says that Transatomic satisfies his four concerns about nuclear power: safety, waste, cost, proliferation. And he’s right ‒ Transatomic’s MSRs are faultless on all four counts, because they don’t exist. It’s doubtful whether they would satisfy any of the four criteria if they did actually exist.

Shellenberger quotes Admiral Hyman Rickover, who played a leading role in the development of nuclear-powered and armed submarines and aircraft carriers in the US: “Any plant you haven’t built yet is always more efficient than the one you have built. This is obvious. They are all efficient when you haven’t done anything on them, in the talking stage. Then they are all efficient, they are all cheap. They are all easy to build, and none have any problems.”

Shellenberger goes on to say:12 “The radical innovation fantasy rests upon design essentialism and reactor reductionism. We conflate the 2-D design with a 3-D design which we conflate with actual building plans which we conflate with a test reactor which we conflate with a full-sized power plant.

 “These unconscious conflations blind us to the many, inevitable, and sometimes catastrophic “unknowns” that only become apparent through the building and operating of a real world plant. They can be small, like the need for a midget welder, or massive, like the manufacturing failures of the AP1000.

“Some of the biggest unknowns have to do with radically altering the existing nuclear workforce, supply chain, and regulations. Such wholesale transformations of the actually existing nuclear industry are, literally and figuratively, outside the frame of alternative designs.

“Everyone has a plan until they get punched in the face,” a wise man once said. The debacles with the AP1000 and EPR are just the latest episodes of nuclear reactor designers getting punched in the face by reality.”

 Shellenberger comments on MSR technology:12

New designs often solve one problem while creating new ones. For example, a test reactor at Oak Ridge National Laboratory used chemical salts with uranium fuel dissolved within, instead of water surrounding solid uranium fuel. “The distinctive advantage of such a reactor was that it avoided the expensive process of fabricating fuel elements, moderator, control rods, and other high precision core components,” noted Hewlett and Holl.

 “In the eyes of many nuclear scientists and engineers these advantages made the homogeneous reactor potentially the most promising of all types under study, but once again the experiment did not reveal how the tricky problems of handling a highly radioactive and corrosive fluid were to be resolved.”

In The New Fire, Mark Massie from Transatomic promotes a “simpler approach that gives you safety through physics, and there’s no way to break physics”. True, you can’t break physics, but the highly radioactive and corrosive fluids in MSRs could break and corrode pipes and other machinery.

Leslie Dewan from Transatomic trots out the silliest advantage attributed to MSRs: that they are meltdown-proof. Of course they are meltdown-proof ‒ and not just in the sense that they don’t exist. The fuel is liquid, and you can’t melt a liquid. But MSR liquid fuel is susceptible to dispersion in the event of steam explosions, chemical explosions or fires, perhaps more so than solid fuels.

Michael Short from MIT says in the film that over the next 2‒3 years they should have preliminary answers as to whether the materials in Transatomic MSRs are going to survive the problems of corrosion and radiation resistance. In other words, they are working on the problems ‒ but there’s no guarantee of progress let alone success.

Dewan claims that Transatomic took an earlier MSR design from Oak Ridge and “we were able to make it 20 times as power dense, much more compact, orders of magnitude cheaper, and so we are commercializing our design for a new type of reactor that can consume existing stockpiles of nuclear waste.”

Likewise, Jessica Lovering from the Breakthrough Institute says: “Waste is a concern for a lot of people. For a lot of people it’s their first concern about nuclear power. But what’s really amazing about it is that most of what we call nuclear waste could actually be used again for fuel. And if you use it again for fuel, you don’t have to store it for tens of thousands of years. With these advanced reactors you can close the fuel cycle, you can start using up spent fuel, recycling it, turning it into new fuel over and over again.”

But in fact, prototype MSRs and fast neutron reactors produce troublesome waste streams (even more so than conventional light-water reactors) and they don’t obviate the need for deep geological repositories. A recent article in the Bulletin of the Atomic Scientists ‒ co-authored by a former chair of the US Nuclear Regulatory Commission ‒ states that “molten salt reactors and sodium-cooled fast reactors – due to the unusual chemical compositions of their fuels – will actually exacerbate spent fuel storage and disposal issues.”13 It also raises proliferation concerns about ‘integral fast reactor’ and MSR technology:

“Pyroprocessing and fluoride volatility-reductive extraction systems optimized for spent fuel treatment can – through minor changes to the chemical conditions – also extract plutonium (or uranium 233 bred from thorium).”

Near the end of the film, it states: “Transatomic encountered challenges with its original design, and is now moving forward with an updated reactor that uses uranium fuel.” Transatomic’s claim that its ‘Waste-Annihilating Molten-Salt Reactor’ could “generate up to 75 times more electricity per ton of mined uranium than a light-water reactor” was severely downgraded to “more than twice” after calculation errors were discovered. And the company now says that a reactor based on the current design would not use waste as fuel and thus would “not reduce existing stockpiles of spent nuclear fuel”.

So much for all the waste-to-fuel rhetoric scattered throughout The New Fire.

Michael Short from MIT claims MSRs will cost a “couple of billion dollars” and Dewan claims they will be “orders of magnitude cheaper” than the Oak Ridge experimental MSR. In their imaginations, perhaps. Shellenberger notes that “in the popular media and among policymakers, there has remained a widespread faith that what will make nuclear power cheaper is not greater experience but rather greater novelty. How else to explain the excitement for reactor designs invented by teenagers in their garages and famous software developers [Bill Gates / TerraPower] with zero experience whatsoever building or operating a nuclear plant?”12

Shellenberger continues:12

“Rather than address the public’s fears, nuclear industry leaders, scientists, and engineers have for decades repeatedly retreated to their comfort zone: reactor design innovation. Designers say the problem isn’t that innovation has been too radical, but that it hasn’t been radical enough. If only the coolant were different, the reactors smaller, and the building methods less conventional, they insist, nuclear plants would be easier and cheaper to build.

“Unfortunately, the historical record is clear: the more radical the design, the higher the cost. This is true not only with the dominant water-cooled designs but also with the more exotic designs ‒ and particularly sodium-cooled ones.”

Oklo’s sodium-cooled fast neutron microreactor

The New Fire promotes Oklo’s sodium-cooled fast neutron microreactor concept, and TerraPower’s sodium-cooled fast neutron ‘traveling wave’ reactor (TerraPower is also exploring a molten chloride fast reactor concept).

Oklo co-founder Jacob DeWitte says: “There’s this huge, awesome opportunity in off-grid markets, where they need power and they are relying on diesel generators … We were talking to some of these communities and we realized they use diesel because it’s the most energy dense fuel they know of. And I was like, man, nuclear power’s two million times as energy dense … And they were like, ‘Wait, are you serious, can you build a reactor that would be at that size?’ And I said, ‘Sure’.”
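To be fair, DeWitte’s “two million times” figure is roughly the right ballpark for the fuel itself, as a back-of-envelope check shows ‒ the real question is everything it takes to turn raw energy density into a licensed, working reactor. A minimal sketch, assuming the standard textbook values of ~200 MeV released per U-235 fission and ~45.6 MJ/kg for diesel:

```python
# Rough sanity check of the "two million times as energy dense" claim,
# comparing complete fission of uranium-235 with burning diesel fuel.
# Assumed inputs: ~200 MeV per U-235 fission; diesel ~45.6 MJ/kg.

AVOGADRO = 6.022e23          # atoms per mole
MEV_TO_J = 1.602e-13         # joules per MeV
U235_MOLAR_MASS = 235.0      # grams per mole

energy_per_fission = 200 * MEV_TO_J                   # ~3.2e-11 J per fission
atoms_per_kg = 1000 / U235_MOLAR_MASS * AVOGADRO      # ~2.56e24 atoms per kg
uranium_j_per_kg = energy_per_fission * atoms_per_kg  # ~8.2e13 J/kg

diesel_j_per_kg = 45.6e6                              # ~4.56e7 J/kg

ratio = uranium_j_per_kg / diesel_j_per_kg
print(f"U-235 fission: {uranium_j_per_kg:.2e} J/kg")
print(f"Diesel:        {diesel_j_per_kg:.2e} J/kg")
print(f"Ratio: ~{ratio:.2e}")                         # roughly 1.8 million
```

So the arithmetic holds for pure fissile material; it says nothing about whether a company can actually build, license and fuel such a machine.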

Which is all well and good apart from the claim that Oklo could build such a reactor: the company has a myriad of economic, technological and regulatory hurdles to overcome. The film claims that Oklo “has begun submission of its reactor’s license application to the [US] Nuclear Regulatory Commission” but according to the NRC, Oklo is a “pre-applicant” that has gone no further than to notify the NRC of its intention to “engage in regulatory interactions”.16

There’s lots of rhetoric in the film about small reactors that “you can roll … off the assembly line like Boeings”, factory-fabricated reactors that “can look a lot like Ikea furniture”, economies of scale once there is a mass market for small reactors, and mass-produced reactors leading to “a big transition to clean energy globally”. But first you would need to invest billions to set up the infrastructure to mass produce reactors ‒ and no-one has any intention of making that investment. And there’s no mass market for small reactors ‒ there is scarcely any market at all.17

TerraPower

TerraPower is one step ahead of Transatomic and Oklo ‒ it has some serious funding. But it’s still a long way off ‒ Nick Touran from TerraPower says in the film that tests will “take years” and the company is investing in a project with “really long horizons … [it] may take a very long time”.

TerraPower’s sodium-cooled fast neutron reactor remains a paper reactor. Shellenberger writes:12

“In 2008, The New Yorker profiled Nathan Myhrvold, a former Microsoft executive, on his plans to re-invent nuclear power with Bill Gates. Nuclear scientist Edward “Teller had this idea way back when that you could make a very safe, passive nuclear reactor,” Myhrvold explained. “No moving parts. Proliferation-resistant. Dead simple.”

“Gates and Myhrvold started a company, Terrapower, that will break ground next year in China on a test reactor. “TerraPower’s engineers,” wrote a reporter recently, will “find out if their design really works.”

“And yet the history of nuclear power suggests we should have more modest expectations. While a nuclear reactor “experiment often produced valuable clues,” Hewlett and Holl wrote, “it almost never revealed a clear pathway to success.” …

“For example, in 1951, a reactor in Idaho used sodium rather than water to cool the uranium ‒ like Terrapower’s design proposes to do. “The facility verified scientific principles,” Hewlett and Holl noted, but “did not address the host of extraordinary difficult engineering problems.” …

“Why do so many entrepreneurs, journalists, and policy analysts get the basic economics of nuclear power so terribly wrong? In part, everybody’s confusing nuclear reactor designs with real world nuclear plants. Consider how frequently advocates of novel nuclear designs use the future or even present tense to describe qualities and behaviors of reactors when they should be using future conditional tense.

“Terrapower’s reactor, an IEEE Spectrum reporter noted, “will be able to use depleted uranium … the heat will be absorbed by a looping stream of liquid sodium … Terrapower’s reactor stays cool”.

“Given that such “reactors” do not actually exist as real world machines, and only exist as computer-aided designs, it is misleading to claim that Terrapower’s reactor “will” be able to do anything. The appropriate verbs for that sentence are “might,” “may,” and “could.” …

“Myhrvold expressed great confidence that he had proven that Terrapower’s nuclear plant could run on nuclear waste at a low cost. How could he be so sure? He had modeled it. “Lowell and I had a month-long, no-holds-barred nuclear-physics battle. He didn’t believe waste would work. It turns out it does.” Myhrvold grinned. “He concedes it now.”

“Rickover was unsparing in his judgement of this kind of thinking. “I believe this confusion stems from a failure to distinguish between the academic and the practical,” he wrote. “The academic-reactor designer is a dilettante. He has not had to assume any real responsibility in connection with his projects. He is free to luxuriate in elegant ideas, the practical shortcomings of which can be relegated to the category of ‘mere technical details.’””

October 1, 2018 Posted by | 2 WORLD, Reference, spinbuster, technology | 5 Comments

On September 26, 1983, Stanislav Petrov saved the world

September 28, 2018 Posted by | depleted uranium, history, PERSONAL STORIES, politics international, Reference, Religion and ethics | Leave a comment

Debunking the claims about generation IV nuclear waste

Generation IV nuclear waste claims debunked, Nuclear Monitor 24 Sept 18   Lindsay Krall and Allison Macfarlane have written an important article in the Bulletin of the Atomic Scientists debunking claims that certain Generation IV reactor concepts promise major advantages with respect to nuclear waste management. Krall is a post-doctoral fellow at the George Washington University. Macfarlane is a professor at the same university, a former chair of the US Nuclear Regulatory Commission from July 2012 to December 2014, and a member of the Blue Ribbon Commission on America’s Nuclear Future from 2010 to 2012.

Krall and Macfarlane focus on molten salt reactors and sodium-cooled fast reactors, and draw on the experiences of the US Experimental Breeder Reactor II and the US Molten Salt Reactor Experiment.

The article abstract notes that Generation IV developers and advocates “are receiving substantial funding on the pretense that extraordinary waste management benefits can be reaped through adoption of these technologies” yet “molten salt reactors and sodium-cooled fast reactors – due to the unusual chemical compositions of their fuels – will actually exacerbate spent fuel storage and disposal issues.”

Here is the concluding section of the article: ……

September 28, 2018 Posted by | 2 WORLD, Reference, technology | Leave a comment

The relative hazards of nuclear fuel in reactor cores, spent fuel pools, and dry storage

September 28, 2018 Posted by | 2 WORLD, Reference, wastes | Leave a comment

Facing up to the reality of nuclear wastes: it requires longterm continuing stewardship

Gordon Edwards: Nuclear Waste Mismanagement


A conversation with Dr. Gordon Edwards: contemporary issues in the Canadian nuclear industry, and a look back at the achievements of the Canadian Coalition for Nuclear Responsibility (CCNR), Montreal, August 25, 2018. Nuclear waste management: an exercise in cynical thinking, 24 Sept 2018 “…….. Proliferation of thousands of non-naturally occurring radioactive isotopes

Our organization starts from a basic fact: these wastes did not exist seventy-five years ago. It is only in the last 70-some years that they have been produced, and there are now thousands of human-made radioactive materials, in addition to the couple of dozen radioactive materials that exist in nature. The difference is that the naturally occurring radioactive materials ‒ uranium, thorium, radium and so on ‒ are different chemical species from the non-radioactive materials, so you can separate them chemically.

In a nuclear power plant what you’re creating is hundreds and hundreds of radioactive varieties of otherwise non-radioactive materials. Non-radioactive iodine is now contaminated with radioactive iodine, non-radioactive cesium with radioactive cesium, non-radioactive strontium and so on. And the result is that once these things are blended together, the radioactive and the non-radioactive, you can’t separate them anymore. It is an impossible task to separate out the radioactive from the non-radioactive once you have created a radioactive variety of virtually every element in the periodic table.

15. Rolling stewardship

So we feel that for the foreseeable future ‒ and that means for however long it takes, 100 years, 200 years or more ‒ we should not fool ourselves into thinking we have a solution. We should adopt a policy of rolling stewardship, which means that we have to keep these things under constant surveillance and constant monitoring. They must be retrievable, they must be guarded, and they must also have a built-in social mechanism for ensuring that funding, knowledge, resources and tools are available to future generations, so that they can, in fact, know what these wastes are, monitor them, take corrective measures when things start going wrong, and improve the containment, so that this is not just a status quo.

This is not an idea of just leaving it where it is and ignoring it. On the contrary, it’s an active involvement, an active engagement to continually improve the storage of these materials because we know how to do this. We know how to store the materials in such a way that they do not get out into the environment, and we can do this for periods of decades or even centuries, depending on the circumstances.

We feel that this is the policy that we should be following, not that this is an acceptable long-term solution, either, but it is something that can be managed over an intergenerational period of time indefinitely. The point here is that rather than abandoning the waste, which is what the industry now wants to do…

And by the way, it’s not only industry that wants to abandon the waste. It’s also the regulatory agency because the regulatory agency wants to also cut its liability. They don’t want to have to look after or be responsible for these wastes beyond a certain point in time. So they have a conflict of interest. Institutionally, they have an interest in abandoning the waste and saying it’s not our problem anymore. Any problems that are caused are your problem, not ours. Unfortunately, the people who are more likely to suffer the consequences of major leakage or major failure of containment will not have the resources or the knowledge. So abandonment actually presupposes amnesia. It means that you’re saying that we’re just going to forget it, and that means that when these things do come back to the surface, if they do, and do contaminate surface waters and food paths and so on, nobody knows anymore. It’s a question of rediscovering what these materials are, how we contain them, and so on.

So we feel that rolling stewardship is a more responsible approach and that entails really admitting that we don’t have a solution, and admitting that we should stop producing the waste. One of the reasons why we continue to produce this waste is because we are continually being presented with a dangled carrot, with the idea that the solution is just around the corner, and that we’re working on the solution. As long as we’re working on the solution, how can you possibly object to us just continuing?

So we feel that that’s the fundamental trap of the nuclear power dilemma and that somehow we have to wake people up to this and make them realize that this is not leading us to a sustainable future. It’s leading us quite in the opposite direction……..

September 26, 2018 Posted by | 2 WORLD, Reference, wastes | 1 Comment

Don’t miss this conversation with Dr Gordon Edwards – about Canada’s nuclear wastes

A conversation with Dr. Gordon Edwards: contemporary issues in the Canadian nuclear industry, and a look back at the achievements of the Canadian Coalition for Nuclear Responsibility (CCNR), Montreal, August 25, 2018, DIANUKE.ORG, SEPTEMBER 24, 2018 

  1. Nuclear waste management: an exercise in cynical thinking.
  2. Private solutions for public problems.
  3. Early days: ignorance about nuclear waste.
  4. Belated realization of the problem.
  5. Barbaric plans for nuclear waste.
  6. In situ abandonment of nuclear facilities.
  7. Wrong people in charge, telling rather than consulting.
  8. The next big thing: unfeasible small modular reactors.
  9. The elusive “willing host community.”
  10. The great unknowable: long term care for nuclear waste. Who pays? Who cares?.
  11. A disturbed “undisturbed” geological formation is no longer undisturbed.
  12. Six hundred Lake Superiors needed to dilute nuclear waste to a safe level.
  13. No solution assumed.
  14. Proliferation of thousands of non-naturally occurring radioactive isotopes.
  15. Rolling stewardship.
  16. Opportunity costs of sticking with nuclear energy.
  17. Convenient disposal of a problem, no disposal of nuclear materials.
  18. What to expect from media and politicians.
  19. Victories.
  20. Cross-border activism for environmental protection.
  21. High, medium or low-level waste: similar ingredients in all of them.
  22. About the CCNR.
  23. Demystifying nuclear energy.
  24. Nuclear moratoria.
  25. Public hearings are a waste of time.
  26. Old nuclear plants are living on borrowed time.
  27. “I would do what I’m doing regardless whether it was effective or not.”
  28. Activism as scientific method: try it and see what happens.
  29. Being a conservative radical.
  30. The all-important nuclear weapons question.
  31. Propaganda battle over the film No act of God.
  32. The slowpoke journal: the short, lonely life of a district heating reactor.………….

September 26, 2018 Posted by | Reference | Leave a comment

Authorities deceive the public on radiation from Fukushima Daiichi

Dr Yamashita is only one among a host of politicians, bureaucrats, experts and advertising and media consultants who support the post-3.11 safety mantra of anshin (secure 安心), anzen (safe 安全), fukkō (recovery 復 興). Through public meetings, media channels, education manuals and workshops,54 local citizens in Fukushima Prefecture were inundated with optimistic and reassuring messages.
At the same time, to reduce ‘radiophobia’ and anxiety, the authorities have focused on the psychological impact of stress while trivialising and/or normalising the health risks from radiation exposures for the general public.
This approach is backed up by international nuclear-related agencies. As stipulated on 28 May 1959 in the ‘WHA12-40’ agreement, the WHO is mandated to report all data on health effects from radiation exposures to the IAEA, which controls publication.
Nevertheless, it is no longer possible to ignore a significant body of research, including 20 years of scientific studies compiled in Belarus and Ukraine that show serious depopulation, ongoing illnesses and state decline.

Informal Labour, Local Citizens and the Tokyo Electric Fukushima Daiichi Nuclear Crisis: Responses to Neoliberal Disaster Management, Adam Broinowski [extensive footnotes and references in original], September 2018, “……… Official Medicine: The (Il)logic of Radiation Dosimetry. On what basis have these policies on radiation from Fukushima Daiichi been made? Instead of containing contamination, the authorities have mounted a concerted campaign to convince the public that it is safe to live with radiation in areas that should be considered uninhabitable and unusable according to internationally accepted standards. To do so, they have concealed from public knowledge the material conditions of radiation contamination so as to facilitate the return of the evacuee population to ‘normalcy’, or life as it was before 3.11. This position has been further supported by the International Atomic Energy Agency (IAEA), which stated annual doses of up to 20 mSv/y are safe for the total population including women and children.43 The World Health Organisation (WHO) and United Nations Scientific Commission on the Effects of Atomic Radiation (UNSCEAR) also asserted that there were no ‘immediate’ radiation related illnesses or deaths (genpatsu kanren shi 原発 関連死) and declared the major health impact to be psychological.

While the central and prefectural governments have repeatedly reassured the public since the beginning of the disaster that there is no immediate health risk, in May 2011 access to official statistics for cancer-related illnesses (including leukaemia) in Fukushima and southern Miyagi prefectures was shut down. On 6 December 2013, the Special Secrets Protection Law (Tokutei Himitsu Hogo Hō 特定秘密保護法) aimed at restricting government employees and experts from giving journalists access to information deemed sensitive to national security was passed (effective December 2014). Passed at the same time was the Cancer Registration Law (Gan Tōroku Hō 癌登録法), which made it illegal to share medical data or information on radiation-related issues including evaluation of medical data obtained through screenings, and denied public access to certain medical records, with violations punishable with a 2 million yen fine or 5–10 years’ imprisonment. In January 2014, the IAEA, UNSCEAR and Fukushima Prefecture and Fukushima Medical University (FMU) signed a confidentiality agreement to control medical data on radiation. All medical personnel (hospitals) must submit data (mortality, morbidity, general illnesses from radiation exposures) to a central repository run by the FMU and IAEA.44 It is likely this data has been collected in the large Fukushima Centre for Environmental Creation, which opened in Minami-Sōma in late 2015 to communicate ‘accurate information on radiation to the public and dispel anxiety’. 
This official position contrasts with the results of the first round of the Fukushima Health Management Survey (October 2011 – April 2015) of 370,000 young people (under 18 at the time of the disaster) in Fukushima prefecture since 3.11, as mandated in the Children and Disaster Victims Support Act (June 2012).45 The survey report admitted that paediatric thyroid cancers were ‘several tens of times larger’ (suitei sareru yūbyōsū ni kurabete sūjūbai no ōdā de ōi 推定される有病数に比べて数十倍の オーダーで多い) than the amount estimated.46 By 30 September 2015, as part of the second-round screening (April 2014–March 2016) to be conducted once every two years until the age of 20 and once every five years after 20, there were 15 additional confirmed thyroid cancers, bringing the total to 152 malignant or suspected paediatric thyroid cancer cases, with 115 surgically confirmed and 37 awaiting surgical confirmation. Almost all have been papillary thyroid cancer, with only three cases of poorly differentiated thyroid cancer (these are no less dangerous). By June 2016, this had increased to 173 confirmed (131) or suspected (42) paediatric thyroid cancer cases.47

The National Cancer Research Center also estimated an increase of childhood thyroid cancer by 61 times, from the 2010 national average of 1–3 per million to 1 in 3,000 children. ……

September 22, 2018 Posted by | Fukushima continuing, radiation, Reference, secrets,lies and civil liberties, spinbuster | 4 Comments