How Much Safer Would Thorium Based Nuclear Power Be? http://www.newsaddicted.com/2015/01/04/how-much-safer-would-thorium-based-nuclear-power-be/?tb January 4, 2015 | By News Junkie
According to Oliver Tickell, not much:
Numerous advantages for thorium as a nuclear fuel and for the LFTR design over conventional solid fuel reactors have been claimed. In this section we consider each of these claims in turn.
3.1 Abundance of thorium relative to uranium
Claim: Thorium is several times more abundant in the Earth’s crust than uranium.
Response: Thorium (232Th) is indeed more abundant than uranium, by a factor of three to four. But whereas 0.7% of uranium occurs as fissile 235U, none of the thorium is fissile. The world already possesses an estimated 1.2 million tonnes of depleted uranium (mainly 238U), like thorium a fertile but non-fissile material. So the greater abundance of thorium than uranium confers no advantage, other than a very marginal advantage in energy security to those countries in which it is abundant.
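The arithmetic behind this response can be sketched in a few lines. The crustal-abundance figures below are illustrative round numbers chosen to be consistent with the factor of three to four quoted above, not authoritative values:

```python
# Back-of-envelope check of the abundance argument above. The ppm figures
# are illustrative assumptions; only their ratio (~3-4x) matters here.
TH_CRUSTAL_PPM = 9.6       # assumed crustal thorium abundance, ppm
U_CRUSTAL_PPM = 2.7        # assumed crustal uranium abundance, ppm
U235_FRACTION = 0.007      # 0.7% of natural uranium is fissile 235U
TH_FISSILE_FRACTION = 0.0  # no naturally occurring thorium isotope is fissile

abundance_ratio = TH_CRUSTAL_PPM / U_CRUSTAL_PPM
fissile_u_ppm = U_CRUSTAL_PPM * U235_FRACTION
fissile_th_ppm = TH_CRUSTAL_PPM * TH_FISSILE_FRACTION

print(f"Th is {abundance_ratio:.1f}x more abundant than U")
print(f"ready-made fissile material: {fissile_u_ppm:.3f} ppm from U, "
      f"{fissile_th_ppm:.3f} ppm from Th")
```

Whatever precise abundances are assumed, the last line is the point of the response: the naturally fissile component comes entirely from uranium, so thorium's greater total abundance does not translate into more ready-to-use fuel.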
3.2 Relative utility of thorium and uranium as fuel
Claim: 100% of the thorium is usable as fuel, in contrast to the low (~0.7%) proportion of fissile 235U in natural uranium.
Response: Thorium must be subjected to neutron irradiation to be transformed into a fissile material suitable for nuclear fuel (uranium, 233U). The same applies to the 238U that makes up depleted uranium, which as already observed, is plentiful. In theory, 100% of either metal could be bred into nuclear fuel. However, uranium has a strong head start, as 0.7% of it is fissile (235U) in its naturally-occurring form.
3.3 Nuclear weapons proliferation
Claim: thorium reactors do not produce plutonium, and so create little or no proliferation hazard.
Response: thorium reactors do not produce plutonium. But an LFTR could (by including 238U in the fuel) be adapted to produce plutonium of a high purity well above normal weapons-grade, presenting a major proliferation hazard. Beyond that, the main proliferation hazards arise from:
the need for fissile material (plutonium or uranium) to initiate the thorium fuel cycle, which could be diverted, and
the production of fissile uranium (233U).
Claim: the fissile uranium (233U) produced by thorium reactors is not “weaponisable” owing to the presence of highly radiotoxic 232U as a contaminant.
Response: 233U was successfully used in a 1955 bomb test in the Nevada Desert under the USA’s Operation Teapot, and so is clearly weaponisable notwithstanding any 232U present. Moreover, the continuous pyro-processing / electro-refining technologies intrinsic to MSRs / LFTRs could generate streams of 233U very low in 232U, at a purity well above weapons grade as currently defined.
3.4 Safety
Claim: LFTRs are intrinsically safe, because the reactor operates at low pressure and is incapable of melting down.
Response: the design of molten salt reactors does indeed guard against reactor meltdown and explosion. However, in an LFTR the main danger has been shifted from the reactor to the on-site continuous fuel reprocessing operation – a high temperature process involving highly hazardous, explosive and intensely radioactive materials. A further serious hazard lies in the potential failure of the materials used for reactor and fuel containment in a highly corrosive chemical environment, under intense neutron and other radiation.
3.5 State of technology
Claim: the technology is already proven.
Response: important elements of the LFTR technology were proven during the 1970s Molten Salt Breeder Reactor (MSBR) programme at Oak Ridge National Laboratory. However, the reactor tested was a small research unit rated at just 7 MW, and there are huge technical and engineering challenges in scaling up this experimental design to make a ‘production’ reactor. Specific challenges include:
developing materials that can both resist corrosion by liquid fluoride salts including diverse fission products, and withstand decades of intense neutron radiation;
scaling up fuel reprocessing techniques to deal safely and reliably with large volumes of highly radioactive material at very high temperature;
keeping radioactive releases from the reprocessing operation to an acceptably low level;
achieving a full understanding of the thorium fuel cycle.
3.6 Nuclear waste
Claim: LFTRs produce far less nuclear waste than conventional solid fuel reactors.
Response: LFTRs are theoretically capable of a high fuel burn-up rate, but while this may indeed reduce the volume of waste, the waste is more intensely radioactive due to the higher concentration of radioactive fission products. The continuous fuel reprocessing that is characteristic of LFTRs will also produce hazardous chemical and radioactive waste streams, and releases to the environment will be unavoidable.
Claim: Liquid fluoride thorium reactors generate no high-level waste material.
Response: This claim, although made in the report from the House of Lords, has no basis in fact. High-level waste is an unavoidable product of nuclear fission. Spent fuel from any LFTR will be intensely radioactive and constitute high level waste. The reactor itself, at the end of its lifetime, will constitute high level waste.
Claim: the waste from LFTRs contains very few long-lived isotopes, in particular transuranic actinides such as plutonium.
Response: the thorium fuel cycle does indeed produce very low volumes of plutonium and other long-lived actinides so long as only thorium and 233U are used as fuel. However, the waste contains many radioactive fission products and will remain dangerous for many hundreds of years. A particular hazard is the production of 232U, with its highly radio-toxic decay chain.
Claim: LFTRs can ‘burn up’ high level waste from conventional nuclear reactors, and stockpiles of plutonium.
Response: if LFTRs are used to ‘burn up’ waste from conventional reactors, their fuel now comprises 238U, 235U, 239Pu, 240Pu and other actinides. Operated in this way, what is now a mixed-fuel molten salt reactor will breed plutonium (from 238U) and other long lived actinides, perpetuating the plutonium cycle.
3.7 Cost of electricity
Claim: the design of LFTRs tends towards low construction cost and very cheap electricity.
Response: while some elements of LFTR design may cut costs compared to conventional reactors, other elements will add cost, notably the continuous fuel reprocessing using high-temperature ‘pyro-processing’ technologies. Moreover, a costly experimental phase of ~20-40 years duration will be required before any ‘production’ LFTR reactors can be built.
It is very hard to predict the cost of the technology that finally emerges, but the economics of nuclear fuel reprocessing to date suggests that the nuclear fuel produced from breeder reactors is about 50 times more expensive than ‘virgin’ fuel. It therefore appears probable that any electricity produced from LFTRs will be expensive.
We must also consider the prospect that relatively novel or immature energy sources, such as photovoltaic electricity and photo-evolved hydrogen, will have become well established as low-cost technologies long before LFTRs are in the market.
Claim: Thorium and the LFTR offer a solution to current and medium-term energy supply deficits.
Response: The thorium fuel cycle is immature. Estimates from the UK’s National Nuclear Laboratory and the Chinese Academy of Sciences (see 4.2 below) suggest that 10-15 years of research will be needed before thorium fuels are ready to be deployed in existing reactor designs. Production LFTRs will not be deployable on any significant scale for 40-70 years.
I worry that even the environmental science on Fukushima and other radioactive contamination processes will be corrupted by capture.
Beta Spikes and Rising Radiation Levels http://majiasblog.blogspot.jp/2014/10/yesterday-and-perhaps-day-before.html
Yesterday and perhaps the day before Phoenix encountered a radioactive plume: I don’t know where it came from. It could have derived from Fukushima, Diablo Canyon nuclear plant, or Palo Verde nuclear power plant. In the end, I guess it doesn’t matter because the overarching point is that nuclear power plants are contaminating our environment with man-made radionuclides (and I do mean “man” made).
After seeing this uptick in beta count, I perused the other west coast sites. Many Radnet sites are no longer reporting beta data at all, while gamma data patterns look odd.
The EPA Radnet data over the last three years have not been reliable because of many problems with collection, inexplicable temporary outages, and permanently offline sites. I strongly suspect these problems are deliberate. The EPA Inspector General chastised the Radnet system, and Gina McCarthy, who was responsible for EPA’s atmospheric radiation monitoring, for poor performance during the March 2011 Fukushima disaster, yet the problems cited in that report remain unaddressed and now Gina is heading the EPA. Poor performance was richly rewarded.
My guess is that there have been deliberate efforts made to halt and/or censor atmospheric radiation reporting at locations that show strong beta surges with incoming radiation plumes from Fukushima and other spewing nuclear power plants. Continue reading
The nuclear industry and its supporters have contrived a variety of narratives to justify and explain away nuclear catastrophes, writes John Downer. None of them actually hold water, yet they serve their purpose – to command political and media heights, and reassure public sentiment on ‘safety’. But if it’s so safe, why the low limits on nuclear liabilities?
Speaking at a press conference soon after the accident began, the UK government’s former chief science advisor, Sir David King, reassured journalists that the natural disaster that precipitated the failure had been “an extremely unlikely event”.
In doing so, he exemplified the many early accounts of Fukushima that emphasised the improbable nature of the earthquake and tsunami that precipitated it.
A range of professional bodies made analogous claims around this time, with journalists following their lead. This lamentation, by a consultant writing in the New American, is illustrative of the general tone:
” … the Fukushima ‘disaster’ will become the rallying cry against nuclear power. Few will remember that the plant stayed generally intact despite being hit by an earthquake with more than six times the energy the plant was designed to withstand, plus a tsunami estimated at 49 feet that swept away backup generators 33 feet above sea level.”
The explicit or implicit argument in all such accounts is that Fukushima’s proximate causes are so rare as to be almost irrelevant to nuclear plants in the future. Nuclear power is safe, they suggest, except against the specific kind of natural disaster that struck Japan, which is both a specifically Japanese problem and one that is unlikely to re-occur, anywhere, in any realistic timeframe.
An appealing but tenuous logic
The logic of this is tenuous on various levels. The ‘improbability’ of the natural disaster is disputable, for one, as there were good reasons to believe that neither the earthquake nor the tsunami should have been surprising. The area was well known to be seismically active after all, and the quake, when it came, was only the fourth largest of the last century.
The Japanese nuclear industry had even confronted its seismic under-preparedness four years earlier, on 16 July 2007, when an earthquake of unanticipated magnitude damaged the Kashiwazaki-Kariwa nuclear plant.
This had led several analysts to highlight Fukushima’s vulnerability to earthquakes, but officials had said much the same then as they now said in relation to Fukushima. The tsunami was not without precedent either.
Geologists had long known that a similar event had occurred in the same area in July 869. This was a long time ago, certainly, but the data indicated a thousand-year return cycle.
Several reports, meanwhile, have suggested that the earthquake alone might have precipitated the meltdown, even without the tsunami – a view supported by a range of evidence, from worker testimony, to radiation alarms that sounded before the tsunami. Haruki Madarame, the head of Japan’s Nuclear Safety Commission, has criticised Fukushima’s operator, TEPCO, for denying that it could have anticipated the flood.
The claim that Japan is ‘uniquely vulnerable’ to such hazards is similarly disputable. In July 2011, for instance, the Wall Street Journal reported on private NRC emails showing that the industry and its regulators had evidence that many US reactors were at risk from earthquakes that had not been anticipated in their design.
It noted that the regulator had taken very little or no action to accommodate this new understanding. As if to illustrate their concern, on 23 August 2011, less than six months after Fukushima, North Anna nuclear plant in Mineral, Virginia, was rocked by an earthquake that exceeded its design-basis predictions.
Every accident is ‘unique’ – just like the next one
There is, moreover, a larger and more fundamental reason to doubt the ‘unique events or vulnerabilities’ narrative, which lies in recognising its implicit assertion that nuclear plants are safe against everything except the events that struck Japan.
It is important to understand that those who assert that nuclear power is safe because the 2011 earthquake and tsunami will not re-occur are, essentially, saying that although the industry failed to anticipate those events, it has anticipated all the others.
Yet even a moment’s reflection reveals that this is highly unlikely. It supposes that experts can be sure they have comprehensively predicted all the challenges that a nuclear plant will face in its lifetime (or, in engineering parlance: that the ‘design basis’ of every nuclear plant is correct) – even though a significant number of technological disasters, including Fukushima, have resulted, at least in part, from conditions that engineers failed even to consider.
As Sagan points out: “things that have never happened before, happen all the time”. The terrorist attacks of 9/11 are perhaps the most iconic illustration of this dilemma but there are many others.
Perrow (2007) painstakingly explores a landscape of potential disaster scenarios that authorities do not formally recognise, but it is highly unlikely that he has considered them all.
More are hypothesised all the time. For instance, researchers have recently speculated about the effects of massive solar storms, which, in pre-nuclear times, have caused electrical systems over North America and Europe to fail for weeks at a time.
Human failings that are unrepresentative and / or correctable
A second rationale that accounts of Fukushima invoke to establish that accidents will not re-occur focuses on the people who operated or regulated the plant, and the institutional culture in which they worked. Observers who opt to view the accident through this lens invariably construe it as the result of human failings – either error, malfeasance or both.
The majority of such narratives relate the failings they identify directly to Fukushima’s specific regulatory or operational context, thereby portraying it as a ‘Japanese’ rather than a ‘nuclear’ accident.
Many, for instance, stress distinctions between US and Japanese regulators; often pointing out that the Japanese nuclear regulator (NISA) was subordinate to the Ministry of Trade and Industry, and arguing that this created a conflict of interest between NISA’s responsibilities for safety and the Ministry’s responsibility to promote nuclear energy.
They point, for instance, to the fact that NISA had recently been criticised by the International Atomic Energy Agency (IAEA) for a lack of independence, in a report occasioned by earthquake damage at another plant. Or to evidence that NISA declined to implement new IAEA standards out of fear that they would undermine public trust in the nuclear industry.
Other accounts point to TEPCO, the operator of the plant, and find it to be distinctively “negligent”. A common assertion in this vein, for instance, is that it concealed a series of regulatory breaches over the years, including data about cracks in critical circulation pipes that were implicated in the catastrophe.
There are two subtexts to these accounts. Firstly, that such an accident will not happen here (wherever ‘here’ may be) because ‘our’ regulators and operators ‘follow the rules’. And secondly, that these failings can be amended so that similar accidents will not re-occur, even in Japan.
Where accounts of the human failings around Fukushima do portray those failings as being characteristic of the industry beyond Japan, the majority still construe those failings as eradicable.
In March 2012, for instance, the Carnegie Endowment for International Peace issued a report that highlighted a series of organisational failings associated with Fukushima, not all of which they considered to be meaningfully Japanese.
Nevertheless, the report – entitled ‘Why Fukushima was preventable’ – argued that such failings could be resolved. “In the final analysis”, it concluded, “the Fukushima accident does not reveal a previously unknown fatal flaw associated with nuclear power.”
The same message echoes in the many post-Fukushima actions and pronouncements of nuclear authorities around the world promising managerial reviews and reforms, such as the IAEA’s hastily announced ‘five-point plan’ to strengthen reactor oversight.
Myths of exceptionality
As with the previous narratives about exogenous hazards, however, the logic of these ‘human failure’ arguments is also tenuous. Despite the editorial consternation that revelations about Japanese malfeasance and mistakes have inspired, for instance, there are good reasons to believe that neither was exceptional…
Plant design is unrepresentative and/or correctable
Parallel to narratives about Fukushima’s circumstances and operation, outlined above, are narratives that emphasise the plant itself.
These limit the relevance of the accident to the wider nuclear industry by arguing that the design of its reactor (a GE Mark-1) was unrepresentative of most other reactors, while simultaneously promising that any reactors that were similar enough to be dangerous could be rendered safe by ‘correcting’ their design.
Accounts in this vein frequently highlight the plant’s age, pointing out that reactor designs have changed over time, presumably becoming safer. A UK civil servant exemplified this narrative, and the strategic decision to foreground it, in an internal email (later printed in the Guardian), in which he asserted that
“We [The Department of Business, Innovation and Skills] need to … show that events in Japan, whilst looking dramatic, are all part of the safety processes of this 1960’s reactor.”
Stressing the age of the reactor in this way became a mainstay of Fukushima discourse in the disaster’s immediate aftermath. Guardian columnist George Monbiot (2011b), for instance, described Fukushima as “a crappy old plant with inadequate safety features”.
He concluded that its failure should not speak to the integrity of later designs, like that of the neighboring plant, Fukushima ‘Daini’, which did not fail in the tsunami. “Using a plant built 40 years ago to argue against 21st-century power stations”, he wrote, “is like using the Hindenburg disaster to contend that modern air travel is unsafe.”
Other accounts highlighted the reactor’s design but focused on more generalisable failings, such as the “insufficient defense-in-depth provisions for tsunami hazards” (IAEA 2011a: 13), which could not be construed as indigenous only to the Mark-1 reactors or their generation.
The implication – we can and will fix all these problems
These failings could be corrected, however, or such was the implication. The American Nuclear Society set the tone, soon after the accident, when it reassured the world that: “the nuclear power industry will learn from this event, and redesign our facilities as needed to make them safer in the future.”
Almost every official body with responsibility for nuclear power followed in their wake. The IAEA, for instance, orchestrated a series of rolling investigations, which eventually culminated in the announcement of its ‘Action Plan on Nuclear Safety’ and a succession of subsequent meetings where representatives of different technical groups could pool their analyses and make technical recommendations.
The groups invariably conclude that “many lessons remain to be learned” and recommend further study and future meetings. Again, however, there is ample cause for scepticism.
Firstly, there are many reasons to doubt that Fukushima’s specific design or generation made it exceptionally vulnerable. As noted above, for instance, many of the specific design failings identified after the disaster – such as the inadequate water protection around reserve power supplies – were broadly applicable across reactor designs.
And even if the reactor design or its generation were exceptional in some ways, that exceptionalism is decidedly limited. There are currently 32 Mark-1 reactors in operation around the world, and many others of a similar age and generation, especially in the US, where every reactor currently in operation was commissioned before the Three Mile Island accident in 1979.
Secondly, there is little reason to believe that most existing plants could be retrofitted to meet all Fukushima’s lessons. Significantly raising the seismic resilience of a nuclear plant, for instance, implies such extensive design changes that it might be more practical to decommission the entire structure and rebuild from scratch.
This perhaps explains why progress has been halting on the technical recommendations. It might be true that different, or more modern reactors are safer, therefore, but these are not the reactors we have.
In March 2012, the NRC did announce some new standards pertaining to power outages and fuel pools – issuing three ‘immediately effective’ orders requiring operators to implement some of the more urgent recommendations. The required modifications were relatively modest, however, and ‘immediately’ in this instance meant ‘by December 31st 2016’.
Meanwhile, the approvals for four new reactors the NRC granted around this time contained no binding commitment to implement the wider lessons it derived from Fukushima. In each case, the increasingly marginalised NRC chairman, Gregory Jaczko, cast a lone dissenting vote. He was also the only commission member to object to the 2016 timeline.
Fukushima and the institutional invisibility of nuclear disaster, Ecologist, John Downer, 20th December 2014:
“……..Science? Or propaganda? Different sides in this contest of numbers routinely assume their rivals are actively attempting to mislead – a wide range of critics argue that most official accounts are authored by industry apologists who ‘launder’ nuclear catastrophes by dicing evidence of their human fallout into an anodyne melée of claims and counter claims.
When John Gofman, a former University of California Berkeley Professor of Medical Physics, wrote that the Department of Energy was “conducting a Josef Goebels propaganda war” by advocating a conservative model of radiation damage, for instance, his charge was more remarkable for its candor than its substance.
And there is certainly some evidence for this. There can be little doubt that in the past the US government has intentionally clouded the science of radiation hazards to assuage public concerns. The 1995 US Advisory Committee on Human Radiation Experiments, for instance, concluded that Cold War radiation research was heavily sanitised for political ends.
A former AEC (NRC) commissioner testified in the early 1990s that: “One result of the regulators’ professional identification with the owners and operators of the plants in the battles over nuclear energy was a tendency to try to control information to disadvantage the anti-nuclear side.” It is perhaps more useful, however, to say that each side is discriminating about the realities to which it adheres.
In this realm there are no entirely objective facts, and with so many judgements it is easy to imagine how even small, almost invisible biases, might shape the findings of seemingly objective hazard calculations.
Indeed, many of the judgements that separate divergent nuclear hazard calculations are inherently political, with the result that there can be no such thing as an entirely neutral account of nuclear harm.
Researchers must decide whether a ‘stillbirth’ counts as a ‘fatality’, for instance. They must decide whether an assessment should emphasise deaths exclusively, or if it should encompass all the injuries, illnesses, deformities and disabilities that have been linked to radiation. They must decide whether a life ‘shortened’ constitutes a life ‘lost’.
There are no correct answers to such questions. More data will not resolve them. Researchers simply have to make choices. The net effect is that the hazards of any nuclear disaster can only be glimpsed obliquely through a distorted lens.
So much ambiguity and judgement is buried in even the most rigorous calculations of Fukushima’s health impacts that no study can be definitive. All that remains are impressions and, for the critical observer, a vertiginous sense of possibility.
Estimating the costs – how many $100s of billions?
The only thing to be said for sure is that declarative assurances of Fukushima’s low death toll are misleading in their surety. Given the intense fact-figure crossfire around radiological mortality, it is unhelpful to view Fukushima purely through the lens of health.
In fact, the emphasis on mortality might itself be considered a way of minimising Fukushima, considering that there are other – far less ambiguous – lenses through which to view the disaster’s consequences.
Fukushima’s health effects are contested enough that they can be interpreted in ways that make the accident look tolerable, but it is much more challenging to make a case that it was tolerable in other terms. http://www.theecologist.org/News/news_analysis/2684383/fukushima_and_the_institutional_invisibility_of_nuclear_disaster.html
Scrutiny on the misleading spin about the health effects of Fukushima nuclear disaster being “tolerable”
The second basic narrative through which accounts of Fukushima have kept the accident from undermining the wider nuclear industry rests on the claim that its effects were tolerable – that even though the costs of nuclear accidents might look high, when amortised over time they are acceptable relative to the alternatives.
The ‘accidents are tolerable’ argument is invariably framed in relation to the health effects of nuclear accidents. Continue reading
The co-signatories “support the broad conclusions drawn in the article ‘Key role for nuclear energy in global biodiversity conservation’, published in Conservation Biology.” The open letter states: “Brook and Bradshaw argue that the full gamut of electricity-generation sources − including nuclear power − must be deployed to replace the burning of fossil fuels, if we are to have any chance of mitigating severe climate change.”
So, here’s my open letter in response to the open letter initiated by Brook and Bradshaw:
– – –
Dear conservation scientists, Continue reading
Oxford Professor in Japan: Well so what if Fukushima had triple meltdown? People enjoy effects of radioactive contamination; Sunshine is much more dangerous; Effect of radiation same as oxygen — Former WHO Official: “The man is dangerous… He’s a crank” (VIDEOS)
Wade Allison, Emeritus Professor of Physics at Oxford University, Foreign Correspondents Club of Japan, Dec 3, 2014 (emphasis added):
- 7:30 — Nuclear protestors have no good arguments for saying that nuclear is dangerous, this is demonstrated by what happened at Fukushima.
- 19:30 – The scientific question is, ‘Why is radiation so safe?‘ Because it is very powerful and so that’s very surprising… That’s the job biology does… Any life form that did not look after the effects of radiation and oxygen, which does the same kind of thing, would fail.
- 27:30 — On holiday… we should take [children] around a nuclear power station.
- 31:00 — What can we do to explain… to people and shove under their noses?
- 39:15 – That excellent film Pandora’s Promise, anybody who hasn’t seen that should.
- 45:00 — [Bury the used nuclear fuel] anywhere, anywhere… Fission products [have a] half-life [of] 30 years or so… it quickly becomes the same activity as the stuff that you dig out of the ground. You need a mine or a hole in the ground which is going to contain stuff for 500 years — but it doesn’t have to be perfect. Here in Japan, people go to Onsen, and enjoy the effects of radioactive contamination of groundwater… everybody’s very happy to do that. That’s what they do on holiday.
- 47:30 — Triple meltdown? Where did you get those words from? Hollywood? What do you mean by a triple meltdown? So what? I’m telling you — so nothing, very much…Triple meltdown, well so what?… It wasn’t a tragedy.
- 52:45 — The sunshine… that’s much more dangerous… than nuclear radiation.
- 1:01:45 — We need people’s confidence. We need to talk to children in school.
- 1:04:45 — The idea that special precautions have to be taken just doesn’t wash; nuclear is not especially dangerous. It’s not as dangerous as fire.
Allison at the Institute of Physics: New safety levels for human radiation exposure are suggested… 100 mSv in total in any month; 5,000 mSv as a total of whole-of-life exposure.
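The decay arithmetic implicit in the quoted remarks above (a “30 years or so” half-life and a 500-year containment period) can be sketched as follows. This is a minimal illustration, assuming a single caesium-137-like fission product; the long-lived fission products and actinides it omits are exactly the materials the quoted claim leaves out:

```python
# Decay sketch for a single fission product with a ~30-year half-life
# (e.g. Cs-137 or Sr-90). Long-lived fission products and actinides decay
# far more slowly and are ignored here -- the simplification on which the
# quoted "500 years" claim rests.
HALF_LIFE_YEARS = 30.0

def remaining_fraction(years: float, half_life: float = HALF_LIFE_YEARS) -> float:
    """Fraction of the initial activity remaining after `years` of decay."""
    return 0.5 ** (years / half_life)

# After 500 years, roughly 16-17 half-lives have elapsed, leaving about
# one part in 100,000 of this component's initial activity.
print(remaining_fraction(500))
```

The sketch shows only that the 30-year component does fall off steeply over 500 years; it says nothing about the isotopes with half-lives of thousands to millions of years, which is where the critics quoted elsewhere in this piece take issue with the claim.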
Keith Baverstock, head of the World Health Organization’s Radiation Protection Program for Europe (1991-2003), Foreign Correspondents Club of Japan, Nov 20, 2014:
- 20:00 — Question: My name is Hiroyuki Fujita, [inaudible] Shimbun editor/writer… According to [Dr. Wade Allison], so called low dose radiation, 100 mSv or less, not so bad for human health… all the scientific knowledge is rooted on the experience of the fruit fly.
- 21:00 – Wade Allison, do you know what his scientific expertise is? Physics… not public health, not medicine, not biology… I did a review of [his] book… I said his book is highly entertaining… it is fiction… We don’t have to rely on fruit flies to know what the effects of radiation are. We know what they are on human health. We have a lot of epidemiological information — which he ignores. I think the man is dangerous, I think you are putting yourself in a dangerous position if you believe him… He’s a crank.
Paranoid US millionaires can buy Survival Condos to survive nuclear war in style, Telegraph UK, By Rosa Prince, New York 11 Nov 2014 Former weapons silo in Kansas converted into luxury underground apartments complete with pool, bar, dog park and cinema Paranoid millionaires in the United States are being offered the opportunity to buy a flat in a luxury underground apartment building in which to see out a nuclear apocalypse in style.
Called “Survival Condos,” the apartments in a former weapons silo at a secret location near Concordia in rural Kansas are on the market for between $1.5 million (£950,000) and $4.5 million (£2.8 million), and come complete with access to a swimming pool, bar, movie theatre and “hydroponic” vegetable garden.
Dog lovers have a place to walk their pets, while a “holding cell” can be used as a prison to house any residents who turn unruly during the long days and nights underground. There is a shooting range and rock climbing wall.
In the event of nuclear war or other disaster, the complex is designed to keep up to 70 people alive for five years, providing food, water and even entertainment.
The complex’s online marketing brochure describes itself as “Survival Bunker Security / Full Luxury Resort Living.” http://www.telegraph.co.uk/news/worldnews/northamerica/usa/11223714/Paranoid-US-millionaires-can-buy-Survival-Condos-to-survive-nuclear-war-in-style.html
Fossil fuel industry’s dirty tricks campaign exposed, Independent Australia, DeSmog Blog, 4 November 2014
A leaked tape from an oil and gas industry conference shows how Big Carbon uses dirty tricks to undermine science, vilify its critics and discredit journalists who criticise the use of fossil fuels, writes Sharon Kelly via DeSmogBlog.
Leave it to Washington’s top attack-dog lobbyist Richard Berman to verify what many always suspected: that the oil and gas industry uses dirty tricks to undermine science, vilify its critics and discredit journalists who cast doubt on the prudence of fossil fuels.
In a speech at an industry conference in June, surreptitiously recorded by an energy executive, Rick Berman ‒ the foremost go-to guy for Republican smear campaigns ‒ gave unusually candid advice to a meeting of drilling companies.
“Think of this as an endless war,” he told executives in a speech, which was leaked to the New York Times by an attendee at the conference who was offended by Berman’s remarks. “And you have to budget for it.”
He said the industry needs to dig up embarrassing tidbits about environmentalists and liberal celebrities, exploit the public’s short attention span for scientific debate, and play on people’s emotions.
Professor Grimes came outside to speak to us, saying “I’ve never had a demonstration outside where I’ve been speaking before.” He should come to the NW more often! There were only a few of us, but given his cheerleading for the nastiest, most vicious industry on earth, there should have been tens of thousands demonstrating outside.
Patrick Moore, fossil fuel and nuclear mercenary, tours Australia, courtesy of front group Galileo Movement
Moore’s trip to Australia has been financed through the climate science denial organisation the Galileo Movement.
Moore is almost always described as a co-founder of Greenpeace, despite Greenpeace itself stating that he was not a co-founder.
An archive of Moore’s CV shows his work for corporations and organisations in logging, pulp and paper and mining. He has also been an advocate for the nuclear energy industry.
Climate Science Denialist Patrick Moore Tours Australia After Comparing Students to the Taliban http://www.desmogblog.com/2014/10/23/climate-science-denialist-patrick-moore-tours-australia-after-comparing-students-taliban#disqus_thread
Canadian climate science denialist Patrick Moore is at the beginning of a tour around Australia, speaking to audiences across the country.
But here’s a warning.
If you do find yourself in the audience and don’t want to be compared to the “Taliban” then don’t even think about walking out in protest.
Less than two weeks before flying to Australia, Moore spoke on the campus of Amherst College in Massachusetts.
When members of the college’s environmental group decided they had heard enough and walked out, Moore said they had a “Taliban mindset”.
When he was later asked to apologise, a report in the Amherst College student newspaper says Moore instead chose to double-down on his remark.
“Fifty people walk out, and I say that’s a pretty Taliban thing to do,” Moore is reported to have said, comparing the behavior of the young students to that of the fundamentalist regime that massacred thousands and brutally repressed women.
Who is Patrick Moore?
Moore has no scientific credibility on climate change and has never published a scientific paper on the issue.
Yet Moore claims there is “no scientific proof” that humans are causing global warming and that “throwing bones on the ground” would have a better predictive ability than most climate models.
His opinion on the science runs against all the major national science academies in the world and about 97 per cent of all the peer reviewed studies on climate change carried out since the early 1990s.
As one PR pro put it, “I can’t remember the last time I read an op-ed piece argued so ineptly that it thoroughly demolished its own premise.”
This pro went on to point out the actual underlying themes of Conca’s piece:
1. Nobody cares about preserving nuclear power. You only hear from people opposing it.
2. Nobody is speaking out about preserving nuclear power. The only ones on the Hill who speak out oppose it.
3. There is no base of voters you can win over by being in favor of nuclear power.
Ironically, for most politicians politics is about power–not the kind that comes out of a wall socket, but the real stuff: who has it and how to get more of it. This piece is intended to make the case for nuclear power needing to have more political power, but, in doing so, exposes it as utterly powerless.
Back on April 1, I wrote about the founding of Nuclear Matters, “Creation of such a group is itself a sign of the industry’s desperation–who knew a technology that is so self-evidently advantageous (at least in the minds of the industry itself, if for no one else) would need a new organization not to promote industry growth but to try to postpone its inevitable stumble into oblivion?”
That desperation has now devolved into a new level of pathos, where an organization with a very fat wallet, backed by a utility worth billions and supported by an industry collectively worth hundreds of billions, now describes itself as powerless and grasping for someone to hold out a branch of support.
Nuclear Matters doesn’t matter. And it’s not for lack of effort on Exelon’s part, nor that of the organization’s many other industry supporters. It’s because their fundamental argument is that ratepayers should pay far more for their electricity than they need to simply because some nuclear utilities bet the wrong way on the future–and refuse even now to prepare for the inevitable shutdown of reactors–and because nuclear has a myriad of advantages that only nuclear utilities seem able to perceive.
The issue isn’t that these aging, uneconomic reactors are needed to keep the lights on and the beer cold. They’re not. In fact, the problem for nuclear is that the alternatives are both cheaper and cleaner. Nuclear Matters doesn’t matter because its fundamental argument simply makes no sense. http://safeenergy.org/2014/10/14/why-nuclear-matters-doesnt-matter/
Why Nuclear Matters doesn’t matter, Greenworld, by Michael Mariotte, 14 Oct 14 Regular readers of GreenWorld know that we have dropped a lot of digital ink writing about Nuclear Matters, the astroturf group launched by Exelon early this year to try to make the case to save the utility’s aging and uneconomic nuclear fleet.
Exelon and the PR firm Sloane and Company that runs the public end of Nuclear Matters have assembled a seemingly potent team of paid-for spokespeople to make the utility’s case: former Senators like Evan Bayh and Judd Gregg; former DOE secretary James Abraham; and the big catch, former EPA Administrator, Obama climate czar, and current League of Conservation Voters board chair Carol Browner.
These and others in Nuclear Matters’ assembled team of backers have been writing (or, more likely, allowing their names to be used as having written) op-eds in publications across the country, appearing at Nuclear Matters-organized (i.e. Sloane and Company) events such as one in New York City the week of the People’s Climate March, and otherwise spreading the news that nuclear power is so important that it shouldn’t matter how costly to ratepayers or how old and unsafe a reactor is — it should keep operating, apparently, in perpetuity.
Maybe it’s just that the message isn’t exactly compelling. Or perhaps former politicians don’t carry the kind of clout Exelon needs. After all, making the case that millions of people should pay higher electricity rates than they otherwise would need to because, well, nuclear!, can’t be an easy sell to current politicians who have to answer to voters.
But the cat is out of the bag. In a remarkable column in which he tries to argue that Nuclear Matters should matter, Forbes’ incessant nuclear industry apologist James Conca inadvertently makes the case that it doesn’t matter.