Nuclear Energy Is Dirty, Unsafe And Uneconomic: Environmental Scientist https://newmatilda.com/2015/02/21/nuclear-energy-dirty-unsafe-and-uneconomic-environmental-scientist by Dr Mark Diesendorf, Associate Professor and Deputy Director of the Institute of Environmental Studies at the University of NSW……There seem to be three shaky legs upon which proponents attempt to stand their campaign to expand nuclear energy:
1. Nuclear energy has allegedly no or low greenhouse gas emissions.
2. New nuclear reactor technologies are allegedly safer than the present generation of reactors.
3. New and existing reactors are allegedly cheaper than other low-carbon technologies, notably renewable energy.
Let’s examine these claims.
1. Greenhouse gas emissions
Neither nuclear energy nor most renewable technologies emit carbon dioxide during operation. However, to do a meaningful comparison, we must compare the whole life-cycles from mining the raw materials to managing the wastes. In a peer-reviewed journal paper published in 2008, nuclear physicist and nuclear energy supporter Manfred Lenzen compared life-cycle emissions from nuclear, wind and natural gas power stations.
For nuclear energy based on mining high-grade uranium ore, he found average emissions of 60 grams of carbon dioxide per kilowatt-hour (g/kWh) of electricity generation, for wind 10–20 g/kWh and for gas 500–600 g/kWh. Now comes the part that most nuclear proponents try to ignore or misrepresent.
The world has only a few decades of high-grade uranium ore reserves left. As the ore-grade inevitably declines, the fossil fuel used to mine and mill uranium increases and so do the resulting greenhouse gas emissions.
Lenzen calculates the life-cycle greenhouse gas emissions when low-grade uranium ore is used to be 131 g/kWh. This is unacceptable in terms of climate science, especially taking into account that Lenzen’s analysis favoured nuclear energy by assuming that mountains of radioactive uranium mine waste are left to blow in the wind for thousands of years.
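Lenzen's figures can be put side by side with a rough back-of-envelope calculation. The sketch below is illustrative only: the 0.85 capacity factor and the function name are my own assumptions, not part of Lenzen's analysis; only the grams-per-kilowatt-hour intensities come from the article above.

```python
# Back-of-envelope comparison using the life-cycle intensities quoted above.
# The 0.85 capacity factor is an assumed illustrative value.
INTENSITY_G_PER_KWH = {
    "nuclear, high-grade ore": 60,
    "nuclear, low-grade ore": 131,
    "wind (midpoint)": 15,
    "gas (midpoint)": 550,
}

def annual_tonnes_co2(g_per_kwh, capacity_gw=1.0, capacity_factor=0.85):
    """Annual life-cycle emissions (tonnes CO2) for a plant of the given size."""
    kwh_per_year = capacity_gw * 1e6 * 8760 * capacity_factor  # GW -> kW, x hours/year
    return g_per_kwh * kwh_per_year / 1e6                      # grams -> tonnes

for source, g in INTENSITY_G_PER_KWH.items():
    print(f"{source}: {annual_tonnes_co2(g):,.0f} t CO2/yr")
```

On these numbers, moving from high-grade to low-grade ore roughly doubles nuclear's life-cycle emissions for the same output, which is the point of Lenzen's 131 g/kWh figure.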
2. New reactor technologies
Bjorn Lomborg Think Tank Funder Revealed As Billionaire Republican ‘Vulture Capitalist’ Paul Singer DESMOGBLOG, GRAHAM READFEARN, 9 FEB 15, A billionaire “vulture capitalist” and major backer of the US Republican Party is a major funder of the think tank of Danish climate science contrarian and fossil fuels advocate Bjørn Lomborg, DeSmogBlog has found.
New York-based hedge fund manager Paul Singer’s charitable foundation gave $200,000 to Lomborg’s Copenhagen Consensus Center (CCC) in 2013, latest US tax disclosures reveal.
The grant to Lomborg’s think tank is revealed in the tax form of the Paul E. Singer Foundation covering that foundation’s activities between December 2012 and November 2013.
Singer, described as a “passionate defender of the 1%”, has emerged as a major force in the Republican party in recent years and was a key backer and influencer during Mitt Romney’s failed tilt at the Presidency.
The $200,000 grant represented almost one third of the $621,057 in donations declared by the Copenhagen Consensus Center in 2013……..
Lomborg, a Danish political scientist, is often cited on lists of the world’s most influential people.
He writes extensively on climate change and energy issues with his columns appearing in many of the world’s biggest news outlets.
The CCC think tank produces reports that consistently argue that cutting greenhouse gas emissions and increasing the roll-out of current renewable energy technologies should be low priorities for policy makers.
Most recently, Lomborg wrote a column for the Wall Street Journal arguing climate change was not the urgent problem that many thought.
He wrote that “the narrative that the world’s climate is changing from bad to worse is unhelpful alarmism”.
Lomborg argues the poorest countries need fossil fuels to lift themselves out of poverty – a position that gained support from the world’s richest man, Bill Gates.
At a G20 side event in Brisbane last year, Lomborg appeared at an event sponsored by the world’s largest private coal company, Peabody Energy, where he again argued that the world’s poor needed fossil fuels.
The CCC’s keystone project is the Post 2015 Consensus that is trying to influence the formulation of the next set of global development goals being discussed by the United Nations. Those goals will replace the millennium development goals.
Lomborg’s CCC think tank was registered as a not-for-profit in the US in 2008 and has attracted almost $5 million in donations since then. In 2013, the CCC paid Lomborg, its founder and president, $200,484 for his work. The previous year Lomborg was paid $775,000……
The discovery of support from Paul Singer comes after a DeSmogBlog investigation last year found that CCC’s early funders included conservative think tanks with links to the network of organisations funded by the Koch brothers, who have pushed millions into organisations denying climate science and blocking action to cut fossil fuel emissions.
In the 2014 US political spending cycle, data presented by OpenSecrets shows Singer spent $9.4 million influencing Republicans – the biggest disclosed individual spender on the conservative side of US politics.
Singer, whose Elliott Management hedge fund manages about $25 billion in assets, has been branded a “vulture capitalist” due to investment strategies employed by his firm that target foreign economies in trouble……
As well as the generosity shown to Bjorn Lomborg’s think tank, Singer’s foundation gave $500,000 to the Manhattan Institute for Policy Research, where Singer is chairman of the board of trustees.
The Manhattan Institute is also known for downplaying the impacts of climate change while promoting fossil fuels.
In October 2014, Manhattan senior fellow Robert Bryce wrote a report Not Beyond Coal arguing that the future for the coal industry was bright and the fossil fuel was “essential” for addressing poverty in developing countries — a position identical to that pushed by Lomborg.
Bryce also attacks the wind industry, claiming it cannot cut emissions and describing wind turbines as “climate change scarecrows”. In testimony to the US Senate Environment and Public Works Committee in February 2014, Bryce said wind turbines were “slaughtering wildlife” ………http://www.desmogblog.com/2015/02/09/exclusive-bjorn-lomborg-think-tank-funder-revealed-billionaire-republican-vulture-capitalist-paul-singer
Climate Denial Crock of the Week (Climate Crocks), with Peter Sinclair: Thanks Dr. Evil! Fossil Fuel Propaganda Misfire Goes Viral
February 12, 2015 Every once in a while we can pull back the curtain and get a good look at the evil elves and Madison Avenue Orcs deployed by the fossil fuel barons. Look hard, climate deniers. This is the man pulling your strings.
Posted by a front group called the “Environmental Policy Alliance”, this corporate forged “viral” video popped up a couple days ago. Had to check and make sure this wasn’t a joke, but it’s real. …..
Big Green Radicals is a front group operated by the PR firm Berman & Co. Berman & Co. operates a network of dozens of front groups, attack-dog web sites, and alleged think tanks that work to counteract minimum wage campaigns, keep wages low for restaurant workers, and to block legislation on food safety, secondhand cigarette smoke, drunk driving, and more.
Big Green Radicals describes itself as “a project of the Environmental Policy Alliance (EPA), which exists to educate the public about the real agenda of well-funded environmental activist groups”, according to its website. “The EPA receives support from individuals, businesses, and foundations.”
Richard Berman is the type of corporate hit man that Aaron Eckhart played in “Thank You For Smoking” – amoral, vicious, and dishonest. PR guys like him usually don’t make the headlines, preferring to remain the man behind the curtain – but a few months ago he showed up in the New York Times, because recommendations he made in a presentation were so vile and offensive that even members of the oil industry audience were disgusted.
If the oil and gas industry wants to prevent its opponents from slowing its efforts to drill in more places, it must be prepared to employ tactics like digging up embarrassing tidbits about environmentalists and liberal celebrities, a veteran Washington political consultant told a room full of industry executives in a speech that was secretly recorded.
The blunt advice from the consultant, Richard Berman, the founder and chief executive of the Washington-based Berman & Company consulting firm, came as Mr. Berman solicited up to $3 million from oil and gas industry executives to finance an advertising and public relations campaign called Big Green Radicals.
The company executives, Mr. Berman said in his speech, must be willing to exploit emotions like fear, greed and anger and turn them against the environmental groups. And major corporations secretly financing such a campaign should not worry about offending the general public because “you can either win ugly or lose pretty,” he said.
“Think of this as an endless war,” Mr. Berman told the crowd at the June event in Colorado Springs, sponsored by the Western Energy Alliance, a group whose members include Devon Energy, Halliburton and Anadarko Petroleum, which specialize in extracting oil and gas through hydraulic fracturing, also known as fracking. “And you have to budget for it.”
What Mr. Berman did not know — and what could now complicate his task of marginalizing environmental groups that want to impose limits on fracking — is that one of the energy industry executives recorded his remarks and was offended by them.
“That you have to play dirty to win,” said the executive, who provided a copy of the recording and the meeting agenda to The New York Times under the condition that his identity not be revealed. “It just left a bad taste in my mouth.”
Speaking of bad taste, “60 Minutes” profiled Berman as an attack dog for the purveyors of poisonous junk food, and he was proud enough of that to post it on his own YouTube channel.
Berman was paid well by Philip Morris (PM)…. has worked for companies that privatize the profits and socialize the costs. He attacked fine scientists like Steve Schneider (Stanford) and Stan Glantz (UCSF)……….
The latest zombie climate myth to rise from the dead involves the oldest form of global warming denial. It’s a conspiracy theory that the Earth isn’t really warming; rather, fraudulent climate scientists are “fiddling” with the data to introduce a false warming trend…..
In reality climate scientists process the raw temperature data for very good reasons. Sometimes temperature monitoring station locations move. Sometimes the time of day at which they’re read changes. Sometimes changes are made to the instruments themselves. In each case, if adjustments aren’t made, then biases will be included in the data that don’t reflect actual changes in temperatures.
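A toy example may make the point about adjustments concrete. The sketch below is entirely illustrative; it is not any agency's actual homogenisation algorithm, and the function name and numbers are my own. It removes the artificial step that a mid-record instrument swap introduces, by comparing the station against a neighbouring reference series:

```python
def adjust_step_change(series, break_index, reference):
    """Remove the mean offset (relative to a reference series) that appears
    after an instrument change at break_index."""
    after = [s - r for s, r in zip(series[break_index:], reference[break_index:])]
    before = [s - r for s, r in zip(series[:break_index], reference[:break_index])]
    step = sum(after) / len(after) - sum(before) / len(before)
    return series[:break_index] + [s - step for s in series[break_index:]]

# A station whose new sensor reads 0.5 C high after a swap at index 3,
# compared against a flat neighbouring reference: the step is not real warming.
reference = [15.0] * 6
raw = [15.0, 15.0, 15.0, 15.5, 15.5, 15.5]
print(adjust_step_change(raw, 3, reference))  # -> [15.0, 15.0, 15.0, 15.0, 15.0, 15.0]
```

Without the adjustment, this station's raw record would show 0.5 °C of spurious "warming"; a swap in the other direction would hide real warming, which is why such adjustments cut both ways.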
Richard Muller at UC Berkeley was skeptical that climate scientists were doing all these adjustments correctly, so he assembled the Berkeley Earth Surface Temperature (BEST) team to check the data for themselves. The biggest initial financial contribution to the project came from the Koch brothers.
As Muller discusses in the video below [see original article] , his team confirmed that the Earth’s surface temperatures are warming. In fact, BEST finds that NASA, NOAA, and the UK Met Office have slightly underestimated the warming over the past 15 years…..
This particular conspiracy theory is an old one, but it’s easy to understand its origins. Certain groups have an ideological opposition to the government policies that would solve the global warming problem. If the problem doesn’t exist because scientists are fudging the data, then voilà, those distasteful policies aren’t necessary.
Global warming denial can usually be traced back to this sort of ideological bias. That’s why contrarian attempts at scientific arguments like Booker’s are so poor, contradictory, and transparently wrong. These myths are just a means to an end; that end being the opposition to climate policies. Any argument that seems to justify that climate opposition will suffice, no matter how flimsy.
Unfortunately, the problem we face is a real one. Scientists only make adjustments to the data where they’re scientifically justified. The accuracy of those adjustments has been confirmed over and over and over again. And the adjustments slightly reduce the long-term global warming trend. Moreover, even if you distrust it, “fiddling” with data doesn’t make ice melt or sea levels rise. Nature’s thermometers register global warming too…….
As a society we’ve usually been smart enough to acknowledge the dangers we face and take action to mitigate them, even with environmental threats. When people resort to conspiracy theories and slip into denial, it’s time to stop listening to them and instead look for serious voices who are trying to find palatable solutions to the problem. http://www.theguardian.com/environment/climate-consensus-97-per-cent/2015/feb/11/fiddling-with-global-warming-conspiracies-while-rome-burns
Climate change is a massive and unrelentingly growing threat to life on this planet. However, the threat from nuclear war and nuclear accidents is an equal one, and could even bring rapid climate change.
Some new converts to the idea of climate change are the proponents of the nuclear industry, who claim (quite incorrectly) to have the cure for climate change. In fact, to go down the “nuclear power cure” route is to give the fossil fuel industries more time, while we all wait for this spurious cure.
“Trying to solve climate change with nuclear is like trying to solve world hunger with caviar,” he said
Straight.com. Peter Dykstra, 9 Feb 15 “…In recent years, some major science and environmental players have come forward to endorse nuclear power. Former EPA administrator and Obama climate czar Carol Browner is one of the glitziest.
Browner signed up for the newest and shiniest effort to sell nuke plants, the year-old Nuclear Matters, founded by electric giant Exelon in 2014.
Nuclear Matters is run by public relations agency Sloane & Associates. Critics call it a nuclear front group, but Sloane prefers to bill it as “starting a national conversation on nuclear power,” and adds that other utilities, nuke builders and suppliers have joined Exelon as sponsors.
The group recruited several other bipartisan political heavyweights as paid spokespeople but none that are catnip for the environmental community, where opposition to nuclear power is the rule, not the exception.
So when Nuclear Matters hauled in Browner as a spokesperson of its Leadership Council last year, she was a big catch.
Browner said she typically devotes a few hours a week to Nuclear Matters and is compensated for her time, but neither she nor Nuclear Matters will discuss her fee.
My articles over the past three months have covered the failure of nuclear advocates to make much progress with gaining public acceptance over the past few years, with the prime need now to undertake a serious effort to gain better public understanding…
…….There remains one piece in the jigsaw and that is to abandon climate change as a prime argument for supporting a much higher use of nuclear power to satisfy rapidly-rising world power needs…….
We have seen no nuclear renaissance (instead, a notable number of reactor closures in some countries, combined with strong growth in China), so the story has not changed very much. The 2014 edition of the International Energy Agency’s World Energy Outlook shows nuclear playing a small but indispensable part in those scenarios maintaining greenhouse gas emissions at much lower and environmentally safer levels to 2030 and beyond. ……….
The International Atomic Energy Agency has also just released the 2014 edition of its publication Climate Change and Nuclear Power which addresses the perceived need for a lot more nuclear power for this reason, together with the range of issues which inevitably surround this transition.
The problem is that the hoped-for process is not working. Countries such as Germany and Switzerland that claim environmental credentials are moving strongly away from nuclear. Even with rapid nuclear growth in China, nuclear’s share in world electricity is declining. The industry is doing little more than hoping that politicians and financiers eventually see sense and back huge nuclear building programmes. On current trends, this is looking more and more unlikely. The high and rising nuclear share in climate-friendly scenarios is false hope, with little in the real outlook giving them any substance.
Far more likely is the situation posited in the World Nuclear Industry Status Report covered in September’s article (September 2014, ‘The world nuclear industry – is it in terminal decline?’). Although this report is produced by anti-nuclear activists, its picture of the current reactors gradually shutting down with numbers of new reactors failing to replace them has more than an element of truth given the recent trends………
….The nuclear industry giving credence to climate change from fossil fuels has simply led to a stronger renewables industry. Nuclear seems to be “too difficult” and gets sidelined – as it has within the entire process since the original Kyoto accords. And now renewables, often thought of as useful complements to nuclear, begin to threaten it in power markets, which are flooded with abundant renewable power whenever the wind blows and the sun shines.
Climate change is also an issue now seemingly irretrievably linked to some combination of higher taxes and prices, bigger and more intrusive government intervention, lower economic growth, and less disposable income. The nuclear sector doesn’t want to be associated with any of this. ……Nuclear should not be cosying up to anything that costs money. It should promote itself as inherently cheap energy, vital for economic growth…..http://www.neimagazine.com/opinion/opinionis-climate-change-the-worst-argument-for-nuclear-4493537/
AZ Senate Committee Says Nuclear Power Is a Renewable Energy Resource http://blogs.phoenixnewtimes.com/valleyfever/2015/02/senate_committee_declares_nuclear_power_a_renewable_energy_resource.php By Miriam Wasser, Tue., Feb. 3, 2015. Arizona is one step closer to officially declaring nuclear power a renewable-energy source. (Yes, you read that correctly.)
The Senate Committee on Water and Energy narrowly passed SB 1134, a bill that classifies “nuclear energy from sources fueled by uranium fuel rods that include 80 percent or more of recycled nuclear fuel and natural thorium reactor resources under development” to be a renewable-energy source.
Environmentalists are not happy, and frankly, no one who cares about linguistics should be either.
A renewable resource doesn’t get depleted with use: the sun keeps shining if we harvest solar power, the wind keeps blowing if we erect turbines, the earth keeps producing heat if we harness geothermal power. But nuclear?
Sandy Bahr, chapter director of the Arizona Sierra Club, said “the very nature of mining”–which must be done to get nuclear material–“is that you are depleting a resource.” Thus, nuclear energy cannot be called renewable.
As it stands now, the Arizona Administrative Code R14-2-1801 says nuclear and fossil fuels are not renewable resources. But Senator Steve Smith, a Republican from District 23, and the main sponsor of SB 1134, would like that to be changed.
He told the committee that by not recycling nuclear fuel rods like some European countries do, Arizona is missing out on a lot of potential energy. “Basically we just want to burn that energy twice,” he said, and should Arizona decide to incorporate that technology in the future, this bill would allow us to count that as a renewable energy source.
After a handful of citizens and energy industry officials spoke for or against the bill, it was in the hands of the committee. Senator Lynne Pancrazi said she considers nuclear an “alternative energy,” but “can’t agree that nuclear is renewable;” Senator David Bradley said he “[appreciates] the fact that technology is allowing us to use rods a few times, but that doesn’t make it a renewable;” and Senator Sylvia Allen said they could argue back and forth about the definitions of renewable and recyclable, but that it isn’t the point of the bill.
What is the point of the bill, then, Senator Allen?
In the end, SB 1134 passed by one vote. “Luckily, it has a ways to go,” Bahr told New Times.
The bill still needs to get through the rules committee, the Senate, and the House of Representatives, which rejected a similar bill last year. “We’ll see about this year, though” she said.
After the committee session, New Times caught up with Senator Smith and asked about his bill. What, for instance, is the difference between recyclable resources and renewable ones? He paused for a moment, and then smiled. When it comes to nuclear materials, he said, “we have so much that can be reused that it’s almost renewable!”
Is climate change the worst argument for nuclear? Nuclear Engineering International 21 January 2015 Jumping on the environmental bandwagon may not be the best choice for the nuclear industry….. By Steve Kidd
While it is true that some previously anti-nuclear activists and advocates have moved over to the nuclear side on account of their new conviction that nuclear is essential to curb climate change, these are very uncomfortable bedfellows.
They are likely to do as much damage to the nuclear case as good. The industry has hailed the recent “Pandora’s Promise” movie, but the five new nuclear disciples look rather like enemy turncoats in a war-time propaganda movie, trying to urge their former colleagues also to “see the light”. Why, after so many years of being “wrong”, should anyone have faith in the new (and apparently deeply-held) convictions of these people? Will they not change their minds again once the wind changes?
Why on earth would one cosy up to the very people who killed your market in the first place because their foolish advocacy led to much higher costs? Their general lack of soundness is invariably amplified by attaching themselves to next generation reactor technologies, thorium or whatever. …….
The other issue with those who belatedly come to endorse nuclear is that it becomes a “last resort” technology. Once everything else has been tried and found lacking, we simply have to use nuclear, or the world will risk coming to an end. Even though they still believe that nuclear has the same host of problems, they also now believe we need it badly. But this won’t work for one minute. As soon as anything goes wrong, the support of these people will melt away. Nuclear needs a strong positive endorsement from supporters who recognise that the arguments marshalled against it were always phony…..http://www.neimagazine.com/opinion/opinionis-climate-change-the-worst-argument-for-nuclear-4493537/
Nuclear power additions ‘need to quadruple’ to hit climate goals, IEA says, The Carbon Brief 31 Jan 2015, 14:50 Simon Evans “………Governments can choose whether to support new nuclear or not, the IEA says. They could finance guarantees, as well as reviewing electricity market arrangements. The UK has done both, through its electricity market reforms and fixed-price contracts for nuclear power.
The nuclear industry needs to show it can deliver projects on time and within budget so that these financing costs can be reduced, the IEA says. It says new nuclear plants should cost around £3.8 billion per gigawatt in Europe. The UK’s Hinkley C plant is expected to cost almost that, partly because of the costs of borrowing money to finance the scheme.
Existing nuclear plants will also need to stay open for longer as part of the 2050 roadmap, which depends on further research and investment. The IEA sees plants operating for up to 60 years or more. Nuclear operator EDF recently announced a ten-year life extension at its Dungeness B plant, and it hopes to agree similar extensions at its other UK plants.
Small modular nuclear reactors (SMRs) could play a “niche” role in future, the IEA says. It points out that just three prototype modular reactors are under construction, that none are yet operating and that the economics of SMRs “have yet to be proven”. Former environment secretary Owen Paterson gave SMRs a starring role in his vision for the UK’s energy future in a speech last year…….”
How Much Safer Would Thorium Based Nuclear Power Be? http://www.newsaddicted.com/2015/01/04/how-much-safer-would-thorium-based-nuclear-power-be/?tb January 4, 2015 | By News Junkie Uploaded by Alchemist-hp via Free Art License 1.3 (FAL 1.3)
According to Oliver Tickell, not much:
Numerous advantages for thorium as a nuclear fuel and for the LFTR design over conventional solid fuel reactors have been claimed. In this section we consider each of these claims in turn.
3.1 Abundance of thorium relative to uranium
Claim: Thorium is several times more abundant in the Earth’s crust than uranium.
Response: Thorium (232Th) is indeed more abundant than uranium, by a factor of three to four. But whereas 0.7% of uranium occurs as fissile 235U, none of the thorium is fissile. The world already possesses an estimated 1.2 million tonnes of depleted uranium (mainly 238U), like thorium a fertile but non-fissile material. So the greater abundance of thorium than uranium confers no advantage, other than a very marginal advantage in energy security to those countries in which it is abundant.
3.2 Relative utility of thorium and uranium as fuel
Claim: 100% of the thorium is usable as fuel, in contrast to the low (~0.7%) proportion of fissile 235U in natural uranium.
Response: Thorium must be subjected to neutron irradiation to be transformed into a fissile material suitable for nuclear fuel (uranium, 233U). The same applies to the 238U that makes up depleted uranium, which as already observed, is plentiful. In theory, 100% of either metal could be bred into nuclear fuel. However, uranium has a strong head start, as 0.7% of it is fissile (235U) in its naturally-occurring form.
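The head-start arithmetic in 3.1 and 3.2 can be made explicit. The following is my own minimal illustration, using only the percentages and stockpile figure quoted above:

```python
# Figures quoted in the passage: 0.7% of natural uranium is fissile 235U,
# none of thorium is fissile, and ~1.2 million tonnes of fertile depleted
# uranium (238U) already exist in stockpiles.
FISSILE_FRACTION = {"natural uranium": 0.007, "thorium": 0.0}
DEPLETED_U_STOCKPILE_T = 1.2e6  # tonnes of fertile 238U, analogous to thorium

def fissile_tonnes(metal, tonnes):
    """Tonnes of directly fissile material in a stock of the given metal."""
    return FISSILE_FRACTION[metal] * tonnes

print(fissile_tonnes("natural uranium", 1000))  # ~7 t of 235U per 1000 t mined
print(fissile_tonnes("thorium", 1000))          # 0.0 -- every gram must be bred
```

Since breeding is required for all of the thorium, and fertile 238U is already stockpiled by the megatonne, thorium’s three-to-fourfold abundance advantage buys very little, which is the response’s point.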
3.3 Nuclear weapons proliferation
Claim: thorium reactors do not produce plutonium, and so create little or no proliferation hazard.
Response: thorium reactors do not produce plutonium. But an LFTR could (by including 238U in the fuel) be adapted to produce plutonium of a high purity well above normal weapons-grade, presenting a major proliferation hazard. Beyond that, the main proliferation hazards arise from:
the need for fissile material (plutonium or uranium) to initiate the thorium fuel cycle, which could be diverted, and
the production of fissile uranium 233U.
Claim: the fissile uranium (233U) produced by thorium reactors is not “weaponisable” owing to the presence of highly radiotoxic 232U as a contaminant.
Response: 233U was successfully used in a 1955 bomb test in the Nevada Desert under the USA’s Operation Teapot and so is clearly weaponisable notwithstanding any 232U present. Moreover, the continuous pyro-processing / electro-refining technologies intrinsic to MSRs / LFTRs could generate streams of 233U very low in 232U at a purity well above weapons grade as currently defined.
3.4 Safety
Claim: LFTRs are intrinsically safe, because the reactor operates at low pressure and is incapable of melting down.
Response: the design of molten salt reactors does indeed guard against reactor meltdown and explosion. However, in an LFTR the main danger has been shifted from the reactor to the on-site continuous fuel reprocessing operation – a high temperature process involving highly hazardous, explosive and intensely radioactive materials. A further serious hazard lies in the potential failure of the materials used for reactor and fuel containment in a highly corrosive chemical environment, under intense neutron and other radiation.
3.5 State of technology
Claim: the technology is already proven.
Response: important elements of the LFTR technology were proven during the 1970s Molten Salt Breeder Reactor (MSBR) programme at Oak Ridge National Laboratory. However, this was a small research reactor rated at just 7MW and there are huge technical and engineering challenges in scaling up this experimental design to make a ‘production’ reactor. Specific challenges include:
developing materials that can both resist corrosion by liquid fluoride salts including diverse fission products, and withstand decades of intense neutron radiation;
scaling up fuel reprocessing techniques to deal safely and reliably with large volumes of highly radioactive material at very high temperature;
keeping radioactive releases from the reprocessing operation to an acceptably low level;
achieving a full understanding of the thorium fuel cycle.
3.6 Nuclear waste
Claim: LFTRs produce far less nuclear waste than conventional solid fuel reactors.
Response: LFTRs are theoretically capable of a high fuel burn-up rate, but while this may indeed reduce the volume of waste, the waste is more radioactive due to the higher concentration of radioactive fission products. The continuous fuel reprocessing that is characteristic of LFTRs will also produce hazardous chemical and radioactive waste streams, and releases to the environment will be unavoidable.
Claim: Liquid fluoride thorium reactors generate no high-level waste material.
Response: This claim, although made in the report from the House of Lords, has no basis in fact. High-level waste is an unavoidable product of nuclear fission. Spent fuel from any LFTR will be intensely radioactive and constitute high level waste. The reactor itself, at the end of its lifetime, will constitute high level waste.
Claim: the waste from LFTRs contains very few long-lived isotopes, in particular transuranic actinides such as plutonium.
Response: the thorium fuel cycle does indeed produce very low volumes of plutonium and other long-lived actinides so long as only thorium and 233U are used as fuel. However, the waste contains many radioactive fission products and will remain dangerous for many hundreds of years. A particular hazard is the production of 232U, with its highly radio-toxic decay chain.
Claim: LFTRs can ‘burn up’ high level waste from conventional nuclear reactors, and stockpiles of plutonium.
Response: if LFTRs are used to ‘burn up’ waste from conventional reactors, their fuel now comprises 238U, 235U, 239Pu, 240Pu and other actinides. Operated in this way, what is now a mixed-fuel molten salt reactor will breed plutonium (from 238U) and other long lived actinides, perpetuating the plutonium cycle.
3.7 Cost of electricity
Claim: the design of LFTRs tends towards low construction cost and very cheap electricity.
Response: while some elements of LFTR design may cut costs compared to conventional reactors, other elements will add cost, notably the continuous fuel reprocessing using high-temperature ‘pyro-processing’ technologies. Moreover, a costly experimental phase of ~20-40 years duration will be required before any ‘production’ LFTR reactors can be built.
It is very hard to predict the cost of the technology that finally emerges, but the economics of nuclear fuel reprocessing to date suggests that the nuclear fuel produced from breeder reactors is about 50 times more expensive than ‘virgin’ fuel. It therefore appears probable that any electricity produced from LFTRs will be expensive.
We must also consider the prospect that relatively novel or immature energy sources, such as photovoltaic electricity and photo-evolved hydrogen, will have become well established as low-cost technologies long before LFTRs are in the market.
Claim: Thorium and the LFTR offer a solution to current and medium-term energy supply deficits.
Response: The thorium fuel cycle is immature. Estimates from the UK’s National Nuclear Laboratory and the Chinese Academy of Sciences (see 4.2 below) suggest that 10-15 years of research will be needed before thorium fuels are ready to be deployed in existing reactor designs. Production LFTRs will not be deployable on any significant scale for 40-70 years.
I worry that even the environmental science on Fukushima and other radioactive contamination processes will be corrupted by capture.
Beta Spikes and Rising Radiation Levels http://majiasblog.blogspot.jp/2014/10/yesterday-and-perhaps-day-before.html
Yesterday and perhaps the day before Phoenix encountered a radioactive plume: I don’t know where it came from. It could have derived from Fukushima, Diablo Canyon nuclear plant, or Palo Verde nuclear power plant. In the end, I guess it doesn’t matter because the overarching point is that nuclear power plants are contaminating our environment with man-made radionuclides (and I do mean “man” made).
After seeing this uptick in beta count, I perused the other west coast sites. Many Radnet sites are no longer reporting beta data at all, while gamma data patterns look odd.
The EPA Radnet data over the last three years have not been reliable because of many problems with collection, inexplicable temporary outages, and permanently offline sites. I strongly suspect these problems are deliberate. The EPA Inspector General chastised the Radnet system, and Gina McCarthy, who was responsible for EPA's atmospheric radiation monitoring, for poor performance during the March 2011 Fukushima disaster; yet the problems cited in that report remain unaddressed, and McCarthy now heads the EPA. Poor performance was richly rewarded.
My guess is that there have been deliberate efforts made to halt and/or censor atmospheric radiation reporting at locations that show strong beta surges with incoming radiation plumes from Fukushima and other spewing nuclear power plants. Continue reading
The nuclear industry and its supporters have contrived a variety of narratives to justify and explain away nuclear catastrophes, writes John Downer. None of them actually hold water, yet they serve their purpose – to command political and media heights, and reassure public sentiment on ‘safety’. But if it’s so safe, why the low limits on nuclear liabilities?
Speaking at a press conference soon after the accident began, the UK government's former chief science advisor, Sir David King, reassured journalists that the natural disaster that precipitated the failure had been "an extremely unlikely event".
In doing so, he exemplified the many early accounts of Fukushima that emphasised the improbable nature of the earthquake and tsunami that precipitated it.
A range of professional bodies made analogous claims around this time, with journalists following their lead. This lamentation, by a consultant writing in the New American, is illustrative of the general tone:
” … the Fukushima ‘disaster’ will become the rallying cry against nuclear power. Few will remember that the plant stayed generally intact despite being hit by an earthquake with more than six times the energy the plant was designed to withstand, plus a tsunami estimated at 49 feet that swept away backup generators 33 feet above sea level.”
The explicit or implicit argument in all such accounts is that Fukushima's proximate causes are so rare as to be almost irrelevant to nuclear plants in the future. Nuclear power is safe, they suggest, except against the specific kind of natural disaster that struck Japan, which is both a specifically Japanese problem and one that is unlikely to re-occur anywhere, in any realistic timeframe.
An appealing but tenuous logic
The logic of this is tenuous on various levels. The ‘improbability’ of the natural disaster is disputable, for one, as there were good reasons to believe that neither the earthquake nor the tsunami should have been surprising. The area was well known to be seismically active after all, and the quake, when it came, was only the fourth largest of the last century.
The Japanese nuclear industry had even confronted its seismic under-preparedness four years earlier, on 16 July 2007, when an earthquake of unanticipated magnitude damaged the Kashiwazaki-Kariwa nuclear plant.
This had led several analysts to highlight Fukushima’s vulnerability to earthquakes, but officials had said much the same then as they now said in relation to Fukushima. The tsunami was not without precedent either.
Geologists had long known that a similar event had occurred in the same area in July 869. This was a long time ago, certainly, but the data indicated a thousand-year return cycle.
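A thousand-year return cycle sounds remote, but it has a concrete implication for a plant's design life. As an illustrative sketch (the Poisson model and the 40-year operating lifetime are assumptions of this example, not figures from the article), the chance of at least one such tsunami striking during a single reactor's life is far from negligible:

```python
import math

def occurrence_probability(return_period_years: float, window_years: float) -> float:
    """Probability of at least one event within a time window,
    modelling occurrences as a Poisson process with the given return period."""
    rate = 1.0 / return_period_years          # expected events per year
    return 1.0 - math.exp(-rate * window_years)

# Chance of at least one 869-style tsunami during a 40-year plant lifetime,
# given the thousand-year return cycle geologists had identified.
print(f"{occurrence_probability(1000, 40):.1%}")  # roughly 4%
```

On these assumptions the risk is around four percent per plant lifetime; hardly the kind of odds that justify dismissing the hazard as "extremely unlikely".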
Several reports, meanwhile, have suggested that the earthquake alone might have precipitated the meltdown, even without the tsunami – a view supported by a range of evidence, from worker testimony, to radiation alarms that sounded before the tsunami. Haruki Madarame, the head of Japan’s Nuclear Safety Commission, has criticised Fukushima’s operator, TEPCO, for denying that it could have anticipated the flood.
The claim that Japan is ‘uniquely vulnerable’ to such hazards is similarly disputable. In July 2011, for instance, the Wall Street Journal reported on private NRC emails showing that the industry and its regulators had evidence that many US reactors were at risk from earthquakes that had not been anticipated in their design.
It noted that the regulator had taken very little or no action to accommodate this new understanding. As if to illustrate their concern, on 23 August 2011, less than six months after Fukushima, North Anna nuclear plant in Mineral, Virginia, was rocked by an earthquake that exceeded its design-basis predictions.
Every accident is ‘unique’ – just like the next one
There is, moreover, a larger and more fundamental reason to doubt the ‘unique events or vulnerabilities’ narrative, which lies in recognising its implicit assertion that nuclear plants are safe against everything except the events that struck Japan.
It is important to understand that those who assert that nuclear power is safe because the 2011 earthquake and tsunami will not re-occur are, essentially, saying that although the industry failed to anticipate those events, it has anticipated all the others.
Yet even a moment's reflection reveals that this is highly unlikely. It supposes that experts can be sure they have comprehensively predicted all the challenges that a nuclear plant will face over its lifetime (or, in engineering parlance: that the 'design basis' of every nuclear plant is correct) – even though a significant number of technological disasters, including Fukushima, have resulted, at least in part, from conditions that engineers failed to even consider.
As Sagan points out: “things that have never happened before, happen all the time”. The terrorist attacks of 9/11 are perhaps the most iconic illustration of this dilemma but there are many others.
Perrow (2007) painstakingly explores a landscape of potential disaster scenarios that authorities do not formally recognise, but it is highly unlikely that he has considered them all.
More are hypothesised all the time. For instance, researchers have recently speculated about the effects of massive solar storms, which, in pre-nuclear times, caused electrical systems across North America and Europe to fail for weeks at a time.
Human failings that are unrepresentative and / or correctable
A second rationale that accounts of Fukushima invoke to establish that accidents will not re-occur focuses on the people who operated or regulated the plant, and the institutional culture in which they worked. Observers who opt to view the accident through this lens invariably construe it as the result of human failings – either error, malfeasance or both.
The majority of such narratives relate the failings they identify directly to Fukushima’s specific regulatory or operational context, thereby portraying it as a ‘Japanese’ rather than a ‘nuclear’ accident.
Many, for instance, stress distinctions between US and Japanese regulators; often pointing out that the Japanese nuclear regulator (NISA) was subordinate to the Ministry of Trade and Industry, and arguing that this created a conflict of interest between NISA’s responsibilities for safety and the Ministry’s responsibility to promote nuclear energy.
They point, for instance, to the fact that NISA had recently been criticised by the International Atomic Energy Agency (IAEA) for a lack of independence, in a report occasioned by earthquake damage at another plant. Or to evidence that NISA declined to implement new IAEA standards out of fear that they would undermine public trust in the nuclear industry.
Other accounts point to TEPCO, the operator of the plant, and find it to be distinctively "negligent". A common assertion in this vein, for instance, is that it concealed a series of regulatory breaches over the years, including data about cracks in critical circulation pipes that were implicated in the catastrophe.
There are two subtexts to these accounts. Firstly, that such an accident will not happen here (wherever ‘here’ may be) because ‘our’ regulators and operators ‘follow the rules’. And secondly, that these failings can be amended so that similar accidents will not re-occur, even in Japan.
Where accounts of the human failings around Fukushima do portray those failings as being characteristic of the industry beyond Japan, the majority still construe those failings as eradicable.
In March 2012, for instance, the Carnegie Endowment for International Peace issued a report that highlighted a series of organisational failings associated with Fukushima, not all of which they considered to be meaningfully Japanese.
Nevertheless, the report – entitled ‘Why Fukushima was preventable’ – argued that such failings could be resolved. “In the final analysis”, it concluded, “the Fukushima accident does not reveal a previously unknown fatal flaw associated with nuclear power.”
The same message echoes in the many post-Fukushima actions and pronouncements of nuclear authorities around the world promising managerial reviews and reforms, such as the IAEA’s hastily announced ‘five-point plan’ to strengthen reactor oversight.
Myths of exceptionality
As with the previous narratives about exogenous hazards, however, the logic of these 'human failure' arguments is also tenuous. Despite the editorial consternation that revelations about Japanese malfeasance and mistakes have inspired, for instance, there are good reasons to believe that neither was exceptional …
Plant design is unrepresentative and/or correctable
Parallel to narratives about Fukushima’s circumstances and operation, outlined above, are narratives that emphasise the plant itself.
These limit the relevance of the accident to the wider nuclear industry by arguing that the design of its reactor (a GE Mark-1) was unrepresentative of most other reactors, while simultaneously promising that any reactors that were similar enough to be dangerous could be rendered safe by 'correcting' their design.
Accounts in this vein frequently highlight the plant’s age, pointing out that reactor designs have changed over time, presumably becoming safer. A UK civil servant exemplified this narrative, and the strategic decision to foreground it, in an internal email (later printed in the Guardian ), in which he asserted that
“We [The Department of Business, Innovation and Skills] need to … show that events in Japan, whilst looking dramatic, are all part of the safety processes of this 1960’s reactor.”
Stressing the age of the reactor in this way became a mainstay of Fukushima discourse in the disaster’s immediate aftermath. Guardian columnist George Monbiot (2011b), for instance, described Fukushima as “a crappy old plant with inadequate safety features”.
He concluded that its failure should not speak to the integrity of later designs, like that of the neighboring plant, Fukushima ‘Daini’, which did not fail in the tsunami. “Using a plant built 40 years ago to argue against 21st-century power stations”, he wrote, “is like using the Hindenburg disaster to contend that modern air travel is unsafe.”
Other accounts highlighted the reactor’s design but focused on more generalisable failings, such as the “insufficient defense-in-depth provisions for tsunami hazards” (IAEA 2011a: 13), which could not be construed as indigenous only to the Mark-1 reactors or their generation.
The implication – we can and will fix all these problems
These failings could be corrected, however, or such was the implication. The American Nuclear Society set the tone, soon after the accident, when it reassured the world that: "the nuclear power industry will learn from this event, and redesign our facilities as needed to make them safer in the future."
Almost every official body with responsibility for nuclear power followed in their wake. The IAEA, for instance, orchestrated a series of rolling investigations, which eventually culminated in the announcement of its 'Action Plan on Nuclear Safety' and a succession of subsequent meetings where representatives of different technical groups could pool their analyses and make technical recommendations.
The groups invariably conclude that “many lessons remain to be learned” and recommend further study and future meetings. Again, however, there is ample cause for scepticism.
Firstly, there are many reasons to doubt that Fukushima’s specific design or generation made it exceptionally vulnerable. As noted above, for instance, many of the specific design failings identified after the disaster – such as the inadequate water protection around reserve power supplies – were broadly applicable across reactor designs.
And even if the reactor design or its generation were exceptional in some ways, that exceptionalism is decidedly limited. There are currently 32 Mark-1 reactors in operation around the world, and many others of a similar age and generation, especially in the US, where every reactor currently in operation was commissioned before the Three Mile Island accident in 1979.
Secondly, there is little reason to believe that most existing plants could be retrofitted to meet all Fukushima’s lessons. Significantly raising the seismic resilience of a nuclear plant, for instance, implies such extensive design changes that it might be more practical to decommission the entire structure and rebuild from scratch.
This perhaps explains why progress on the technical recommendations has been halting. It might be true, therefore, that different or more modern reactors are safer, but these are not the reactors we have.
In March 2012, the NRC did announce some new standards pertaining to power outages and fuel pools – issuing three 'immediately effective' orders requiring operators to implement some of the more urgent recommendations. The required modifications were relatively modest, however, and 'immediately' in this instance meant 'by December 31st 2016'.
Meanwhile, the approvals for four new reactors the NRC granted around this time contained no binding commitment to implement the wider lessons it derived from Fukushima. In each case, the increasingly marginalised NRC chairman, Gregory Jaczko, cast a lone dissenting vote. He was also the only committee member to object to the 2016 timeline.
Fukushima and the institutional invisibility of nuclear disaster, Ecologist, John Downer, 20th December 2014 "… Science? Or propaganda? Different sides in this contest of numbers routinely assume their rivals are actively attempting to mislead – a wide range of critics argue that most official accounts are authored by industry apologists who 'launder' nuclear catastrophes by dicing evidence of their human fallout into an anodyne mêlée of claims and counter-claims.
When John Gofman, a former University of California Berkeley Professor of Medical Physics, wrote that the Department of Energy was "conducting a Joseph Goebbels propaganda war" by advocating a conservative model of radiation damage, for instance, his charge was more remarkable for its candor than its substance.
And there is certainly some evidence for this. There can be little doubt that in the past the US government has intentionally clouded the science of radiation hazards to assuage public concerns. The 1995 US Advisory Committee on Human Radiation Experiments, for instance, concluded that Cold War radiation research was heavily sanitised for political ends.
A former AEC (NRC) commissioner testified in the early 1990s that: "One result of the regulators' professional identification with the owners and operators of the plants in the battles over nuclear energy was a tendency to try to control information to disadvantage the anti-nuclear side." It is perhaps more useful, however, to say that each side is discriminating about the realities to which it adheres.
In this realm there are no entirely objective facts, and with so many judgements it is easy to imagine how even small, almost invisible biases, might shape the findings of seemingly objective hazard calculations.
Indeed, many of the judgements that separate divergent nuclear hazard calculations are inherently political, with the result that there can be no such thing as an entirely neutral account of nuclear harm.
Researchers must decide whether a 'stillbirth' counts as a 'fatality', for instance. They must decide whether an assessment should emphasise deaths exclusively, or if it should encompass all the injuries, illnesses, deformities and disabilities that have been linked to radiation. They must decide whether a life 'shortened' constitutes a life 'lost'.
There are no correct answers to such questions. More data will not resolve them. Researchers simply have to make choices. The net effect is that the hazards of any nuclear disaster can only be glimpsed obliquely through a distorted lens.
So much ambiguity and judgement is buried in even the most rigorous calculations of Fukushima’s health impacts that no study can be definitive. All that remains are impressions and, for the critical observer, a vertiginous sense of possibility.
Estimating the costs – how many $100s of billions?
The only thing to be said for sure is that declarative assurances of Fukushima’s low death toll are misleading in their surety. Given the intense fact-figure crossfire around radiological mortality, it is unhelpful to view Fukushima purely through the lens of health.
In fact, the emphasis on mortality might itself be considered a way of minimising Fukushima, considering that there are other – far less ambiguous – lenses through which to view the disaster’s consequences.
Fukushima’s health effects are contested enough that they can be interpreted in ways that make the accident look tolerable, but it is much more challenging to make a case that it was tolerable in other terms. http://www.theecologist.org/News/news_analysis/2684383/fukushima_and_the_institutional_invisibility_of_nuclear_disaster.html