nuclear-news

The News That Matters about the Nuclear Industry Fukushima Chernobyl Mayak Three Mile Island Atomic Testing Radiation Isotope

Countering the nuclear lobby’s deceptive spin about ionising radiation

The video below is several years old. Children in Ukraine and Belarus are still suffering from cancers and other serious health effects of the Chernobyl nuclear disaster. The ABC's "Foreign Correspondent" recently covered their plight, which remains terrible, but the video of that coverage seems to be unavailable.

Extract from The nuclear industry’s updated songsheet remains outdated, Pearls and Irritations, By Mark Diesendorf, 22 Oct 21

”…………. Another misleading pro-nuclear statement revived following the Fukushima Daiichi disaster in 2011 is that no excess cancer incidence has been observed around Fukushima, implying that no cancers will be induced. The logical error is to assume that the absence of evidence implies no impact.

For a start, it is still too early for most types of cancer, which have latent periods of 20–60 years, to appear around Fukushima. The only cancers likely to appear within a decade after exposure are thyroid cancer and leukemia. A large increase in thyroid cancers has been observed in the region, but their cause is debated by some on the grounds that the increase could be the result of better screening. Leukemia is an uncommon disease and so even a large percentage increase would be impossible to verify statistically with high confidence. (See UNSCEAR 2020b)

Fortunately for the citizens of Tokyo, the wind was mostly blowing offshore during the meltdowns of three Fukushima reactors, sending about 80 per cent of the emitted radioactive material out over the Pacific. Soon after the disaster an exclusion zone was established around the power station and more than 100,000 people evacuated. For these reasons, Fukushima tells us very little about radiation-induced cancers. 

Most of the evidence that low-level radiation is carcinogenic comes from detailed studies of the survivors of Hiroshima and Nagasaki, medical professionals who worked with radiation, uranium miners, children living near nuclear power stations, and children who were exposed in utero in the bad old days when pregnant women were routinely x-rayed. This is the basis of the linear-no-threshold model, the scientific understanding that the number of cancers induced by ionising radiation is proportional to the dose received and that there’s no threshold. Therefore, even natural background radiation, to which we are all exposed, and medical x-rays contribute very small fractions of cancer prevalence…………https://johnmenadue.com/the-nuclear-industrys-updated-songsheet-remains-outdated/
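The linear no-threshold model described above can be expressed as a one-line proportionality. The sketch below is purely illustrative: the risk coefficient is an assumed round number of the same order as commonly cited nominal values (a few per cent excess risk per sievert), not a figure taken from the article.

```python
# Minimal sketch of the linear no-threshold (LNT) model described above:
# excess cancer risk is taken to be proportional to dose, with no threshold.

RISK_PER_SV = 0.05  # assumed illustrative coefficient (~5% excess risk per sievert), not from the article

def excess_risk(dose_msv: float) -> float:
    """Excess lifetime cancer risk implied by LNT for a dose given in millisieverts."""
    return RISK_PER_SV * dose_msv / 1000.0

# Under LNT even small doses, such as annual natural background, carry a small non-zero risk.
for dose_msv in (2.4, 10.0, 100.0, 1000.0):
    print(f"{dose_msv:7.1f} mSv -> excess risk ≈ {excess_risk(dose_msv):.3%}")
```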

October 23, 2021 Posted by | 2 WORLD, radiation, Reference, spinbuster

Radioactive contamination from the partially-burned former Santa Susana nuclear research facility

Radioactive microparticles related to the Woolsey Fire in Simi Valley, CA, Science Direct, by Marco Kaltofen (Worcester Polytechnic Institute, Dept. of Physics), Maggie Gundersen and Arnie Gundersen (Fairewinds Energy Education), 8 October 2021. 

Highlights

Wildfire in radiologically contaminated zones is a global concern; contaminated areas around Chernobyl, Fukushima, Los Alamos, and the Nevada Nuclear Test Site have all experienced wildfires.

Three hundred sixty samples of soil, dust and ash were collected in the immediate aftermath of the Los Angeles (CA, USA) Woolsey fire in 2018.

Radioactive contamination from the partially-burned former Santa Susana nuclear research facility was found in the fire zone.

A limited number of widely scattered locations had evidence of radioactive microparticles originating at the research facility.

X-ray data showed that ashes from the fire could spread site contaminants to distant, but widely spaced, locations.

Abstract

In November 2018, the Woolsey Fire burned north of Los Angeles, CA, USA, potentially remobilizing radioactive contaminants at the former Santa Susana Field Laboratory, a shuttered nuclear research facility contaminated by chemical and radiochemical releases. Wildfire in radiologically contaminated zones is a global concern; contaminated areas around Chernobyl, Fukushima, Los Alamos, and the Nevada Nuclear Test Site have all experienced wildfires. Three weeks after the Woolsey Fire was controlled, sampling of dusts, ashes, and surface soils (n = 360) began; samples were analyzed by alpha- and beta-radiation counting. Samples were collected out to a 16 km radius from the perimeter of the laboratory. Controls and samples with activities 1σ greater than background were also examined by alpha and/or gamma spectroscopy or Scanning Electron Microscopy with Energy Dispersive X-ray analysis. Of the 360 samples collected, 97% showed activities at or close to site-specific background levels. However, offsite samples collected in publicly-accessible areas nearest to the SSFL site perimeter had the highest activities of the alpha-emitting radionuclides radium, thorium, and uranium, indicating site-related radioactive material has escaped the confines of the laboratory. 

In two geographically-separated locations, one as far away as 15 km, radioactive microparticles containing percent-concentrations of thorium were detected in ashes and dusts that were likely related to deposition from the Woolsey fire. These offsite radioactive microparticles were colocated with alpha and beta activity maxima. Data did not support a finding of widespread deposition of radioactive particles. However, two radioactive deposition hotspots and significant offsite contamination were detected near the site perimeter……………………………
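The screening step described in the abstract, in which samples whose gross activity exceeds the site-specific background by more than one standard deviation are set aside for spectroscopy or SEM/EDS, can be sketched as follows. The background statistics and sample readings below are invented placeholders, not data from the paper.

```python
# Hedged sketch of the 1-sigma screening rule described in the abstract.
# All numbers are hypothetical placeholders, not values from the study.

background_mean_cpm = 35.0    # hypothetical site-specific background, counts per minute
background_sigma_cpm = 6.0    # hypothetical standard deviation of the background

samples_cpm = {"soil_A": 33.2, "ash_B": 48.9, "dust_C": 36.5, "ash_D": 120.4}

threshold = background_mean_cpm + background_sigma_cpm
flagged = [name for name, cpm in samples_cpm.items() if cpm > threshold]

print(f"screening threshold: {threshold:.1f} cpm")
print("flagged for alpha/gamma spectroscopy or SEM/EDS:", flagged)
```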

4. Conclusions

A significant majority of samples (97% of 360 samples) collected in the study zone registered radioactivity levels that matched existing area background levels. Nevertheless, some ashes and dusts collected from the Woolsey Fire zone in the fire’s immediate aftermath contained high activities of radioactive isotopes associated with the Santa Susana Field Laboratory (SSFL). The data show that Woolsey Fire ash did, in fact, spread SSFL-related radioactive microparticles, and the impacts were confined to areas closest to SSFL and at least three other scattered locations in the greater Simi Valley area. Alpha and beta counting, high-resolution alpha and gamma spectroscopy, and X-ray microanalysis using SEM/EDS confirmed the presence of radioactive microparticles in the Woolsey Fire-related ashes and dusts.

Most of the fire-impacted samples found near the SSFL site’s perimeter were on lands accessible to the public. There were, however, scattered localized areas of increased radioactivity due to the presence of radioactive microparticles in ash and recently-settled dusts collected just after the Woolsey fire. These radioactive outliers were found in Thousand Oaks, CA, and Simi Valley, CA, about 15 and 5 km distant from SSFL, respectively. The Thousand Oaks samples had alpha count rates up to 19 times background, and X-ray spectroscopy (SEM) identified alpha-emitting thorium as the source of this excess radioactivity. Excessive alpha radiation in small particles is of particular interest because of the relatively high risk of inhalation-related long-term biological damage from internal alpha emitters compared to external radiation.

The nuclides identified as the sources of excess radioactivity in impacted samples were predominately isotopes of radium, uranium, and thorium. These have naturally-occurring sources, but these isotopes are also contaminants of concern at SSFL and were detected at generally increasing activities as the distance from SSFL decreased. In addition, the number of radioactive microparticles per gram of particulate matter also increased strongly with decreasing distance from SSFL. These data demonstrate that fire and/or other processes have spread SSFL contamination beyond the facility boundary………..

……https://www.sciencedirect.com/science/article/pii/S0265931X21002277?dgcid=coauthor

October 18, 2021 Posted by | environment, radiation, Reference, USA

TerraPower’s Natrium nuclear reactor will be an economic lemon

This host of factors makes it reasonably certain that the Natrium will not be economically competitive.

In other words, even if it has no technical problems, it will be an economic lemon.


Ramana, Makhijani: Look before you leap on nuclear   
https://trib.com/opinion/columns/ramana-makhijani-look-before-you-leap-on-nuclear/article_4508639b-d7e6-50df-b305-07c929de40ed.html, Oct 16, 2021  

The Cowboy State is weighing plans to host a multi-billion dollar “demonstration” nuclear power plant — TerraPower’s Natrium reactor. The long history of similar nuclear reactors, dating back to 1951, indicates that Wyoming is likely to be left with a nuclear lemon on its hands.

The Natrium reactor design, which uses molten sodium as a coolant (water is used in most existing commercial nuclear reactors), is likely to be problematic. Sodium reacts violently with water and burns if exposed to air, a serious vulnerability. A sodium fire within a few months of the reactor starting to generate power led to Japan’s Monju demonstration reactor being shut down.

At 1,200 megawatts, the French Superphénix was the largest sodium-cooled reactor, designed to demonstrate commercial feasibility. Plagued by operational problems, including a major sodium leak, it was shut down in 1998 after 14 years, having operated at an average capacity of under 7 percent compared to the 80 to 90 percent required for commercial operation. Other sodium-cooled reactors have also experienced leaks, which are very difficult to prevent because of chemical interactions between sodium and the stainless steel used in various reactor components. Finally, sodium, being opaque, makes reactor maintenance and repairs notoriously difficult.
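The capacity figures quoted for Superphénix amount to a statement about its capacity factor: the energy actually delivered divided by what the plant would have produced running continuously at full power. A rough sketch of that arithmetic, using only the numbers quoted above:

```python
# Capacity factor = energy delivered / energy at continuous full power.
# Figures from the paragraph above: 1,200 MW, ~14 years of operation,
# under 7 percent average capacity versus the 80-90 percent needed commercially.

rated_mw = 1200
years = 14
hours = years * 8760  # operating period in hours (leap days ignored)

def energy_twh(capacity_factor: float) -> float:
    """Electricity delivered over the period, in terawatt-hours."""
    return rated_mw * hours * capacity_factor / 1e6

print(f"at  7% capacity factor: ~{energy_twh(0.07):.0f} TWh over {years} years")
print(f"at 85% capacity factor: ~{energy_twh(0.85):.0f} TWh over {years} years")
```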

Sodium-cooled reactors can experience rapid and hard-to-control power surges. Under severe conditions, a runaway chain reaction can even result in an explosion. Such a runaway reaction was the central cause of the 1986 Chernobyl reactor explosion, though that was a reactor of a different design. Following Chernobyl, Germany’s Kalkar sodium-cooled reactor, about the same size as the proposed Natrium, was abandoned without ever being commissioned, though it was complete.

All these technical and safety challenges naturally drive up the costs of sodium-cooled reactors, making them significantly more expensive than conventional nuclear reactors. More than $100 billion, in today’s dollars, has been spent worldwide in the attempt to commercialize essentially this design and associated technologies, to no avail.

The Natrium design, being even more expensive than present-day reactors, will therefore be more expensive than practically every other form of electricity generation. The Wall Street firm Lazard estimates that electricity from new nuclear plants costs several times more than electricity from utility-scale solar and wind power plants, and that gap has been widening.

To this bleak picture, TerraPower has added another economically problematic feature: molten salt storage to allow its electric output to vary. TerraPower hopes this feature will help it integrate better into an electricity grid that has more variable electricity sources, notably wind and solar.

Molten salt storage would be novel in a nuclear reactor, but it is used in concentrating solar power projects, where it can cost an additional $2,000 per kilowatt of capacity. At that rate, it could add a billion dollars to the Natrium project.
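The billion-dollar estimate follows from simple arithmetic on the quoted $2,000 per kilowatt; the roughly 500 MW of storage-boosted output assumed below is a reading of the Natrium concept, not a figure given in this article.

```python
# Rough arithmetic behind "could add a billion dollars".
storage_cost_per_kw = 2_000     # USD per kW, the concentrating-solar figure quoted above
assumed_storage_kw = 500_000    # ~500 MW of boosted output; an assumption, not from the article

extra_cost_usd = storage_cost_per_kw * assumed_storage_kw
print(f"added storage cost ≈ ${extra_cost_usd / 1e9:.1f} billion")  # ≈ $1.0 billion
```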

This host of factors makes it reasonably certain that the Natrium will not be economically competitive. In other words, even if it has no technical problems, it will be an economic lemon.

To top it all off, the proposed Wyoming TerraPower demonstration project depends on government funds. Last year, the Department of Energy awarded TerraPower $80 million in initial taxpayer funding; this may increase to $1.6 billion over seven years, “subject to the availability of future appropriations” and TerraPower coming up with matching funds.

Despite government support, private capital has recently abandoned a more traditional project, the mPower small modular reactor, resulting in its termination in 2017. And it was Congress that refused to appropriate more money for the sodium-cooled reactor proposed for Clinch River, Tennessee when its costs skyrocketed, thereby ending the project in 1983.

A much harder look at the facts is in order, lest Wyoming add to the total of many cancelled nuclear projects and abandoned construction sites. Of course, the Natrium lemon might be made into lemonade by converting it to an amusement park if it is never switched on, like the Kalkar reactor, now refashioned into Wunderland Kalkar, an amusement park in Germany, near the border with the Netherlands. For energy, the state might look to its natural heritage – its wind power potential is greater than the combined generation of all 94 operating U.S. nuclear reactors, which are, on average, about three times the size of Natrium.

M. V. Ramana is Professor and Simons Chair in Disarmament, Global and Human Security and the Director of the Liu Institute for Global Issues at the School of Public Policy and Global Affairs, University of British Columbia. Dr. Ramana holds a Ph.D. in Physics from Boston University.

Arjun Makhijani, President of the Institute for Energy and Environmental Research, holds a Ph.D. in engineering (nuclear fusion) from the University of California at Berkeley.

October 18, 2021 Posted by | Reference, Small Modular Nuclear Reactors, spinbuster, USA

‘Profiteers of Armageddon’: Report Reveals Who Benefits From US ‘Nuclear Modernization’ Plan

While “a handful of prime contractors” are the initial recipients and main beneficiaries of public money spent on bombers, missiles, and submarines, “the funds trickle down to subcontractors” that often include other prominent companies. The report names firms such as Bechtel, General Dynamics, Honeywell, Lockheed Martin, Northrop Grumman, and Raytheon.

Hartung directs attention to the millions of dollars in political activities by key contractors, writing that “while not all of this spending is devoted to lobbying on nuclear weapons programs, these expenditures are indicative of the political clout they can bring to bear on Congress as needed to sustain and expand the budgets for their nuclear weapons-related programs.”

They also spent $57.9 million on lobbying last year, employing 380 lobbyists, over two-thirds of whom “passed through the ‘revolving door’ from top positions in Congress, the Pentagon, and the Department of Energy to work for nuclear weapons contractors as executives or board members.”

“And it should be noted that the revolving door swings both ways,” the report adds, noting that “three of the past five secretaries of defense worked as lobbyists or board members of major nuclear weapons contractors before taking up their positions in the Pentagon: James Mattis (General Dynamics); Mark Esper (Raytheon); and Lloyd Austin (Raytheon).”

‘Profiteers of Armageddon’: Report Reveals Who Benefits From US ‘Nuclear Modernization’ Plan. While taking aim at special interest lobbying and corporate profits that impede “sensible” policy, the author argues the “only way to be truly safe from nuclear weapons is to eliminate them altogether.”      https://www.commondreams.org/news/2021/10/12/profiteers-armageddon-report-reveals-who-benefits-us-nuclear-modernization-plan

JESSICA CORBETT  A short list of contractors that pour large sums of money into campaign contributions, lobbying, and industry-friendly think tanks benefits from the U.S. government’s ongoing, decadeslong “nuclear modernization” plan worth up to $2 trillion, according to a report out Tuesday.

The issue brief—entitled Profiteers of Armageddon: Producers of the next generation of nuclear weapons—was authored by William Hartung, director of the Arms and Security Program at the Center for International Policy, who also outlined his report in Inkstick.

Hartung details how the U.S. departments of Defense (DOD) and Energy (DOE) are ramping up a plan to build the next generation of nuclear-armed bombers, missiles, and submarines as well as warheads, and the beneficiaries are major contractors along with operators of the National Nuclear Security Administration’s (NNSA) nuclear weapons complex.

The brief notes the U.S. nuclear weapons budget has climbed in recent years to over $43 billion in the Biden administration’s proposed budget for fiscal year 2022, and warns that “this figure will grow dramatically,” pointing to a Congressional Budget Office (CBO) estimate that parts of the Pentagon’s plan “will cost tens of billions each over the next decade, including $145 billion for ballistic missile submarines, $82 billion for the new Intercontinental Ballistic Missile (ICBM), and $53 billion for the new nuclear-armed bomber.”

“And the costs will not end there,” the report continues, noting that “the estimated lifetime cost of building and operating the new ICBM is $264 billion.”

While “a handful of prime contractors” are the initial recipients and main beneficiaries of public money spent on bombers, missiles, and submarines, “the funds trickle down to subcontractors” that often include other prominent companies. The report names firms such as Bechtel, General Dynamics, Honeywell, Lockheed Martin, Northrop Grumman, and Raytheon.

Hartung directs attention to the millions of dollars in political activities by key contractors, writing that “while not all of this spending is devoted to lobbying on nuclear weapons programs, these expenditures are indicative of the political clout they can bring to bear on Congress as needed to sustain and expand the budgets for their nuclear weapons-related programs.”

From 2012 to 2020, campaign contributions from contractors mentioned in the brief topped $119 million, more than a quarter of which was in the 2020 cycle alone. They also spent $57.9 million on lobbying last year, employing 380 lobbyists, over two-thirds of whom “passed through the ‘revolving door’ from top positions in Congress, the Pentagon, and the Department of Energy to work for nuclear weapons contractors as executives or board members.”

“And it should be noted that the revolving door swings both ways,” the report adds, noting that “three of the past five secretaries of defense worked as lobbyists or board members of major nuclear weapons contractors before taking up their positions in the Pentagon: James Mattis (General Dynamics); Mark Esper (Raytheon); and Lloyd Austin (Raytheon).”

The brief also pushes back against “routinely exaggerated” claims about job creation that both companies and lawmakers use to promote nuclear weapons programs, and points out that contractors pump millions into supporting think tanks that opine on relevant policy.

Continued lobbying for the modernization plan “ignores the fact that building a new generation of nuclear weapons at this time will make the world a more dangerous place and increase the risk of nuclear war while fueling the new arms race,” Hartung argues. “It’s long past time that we stopped allowing special interest lobbying and corporate profits stand in the way of a more sensible nuclear policy.”

While asserting that “the only way to be truly safe from nuclear weapons is to eliminate them altogether,” in line with a global treaty that states with such weapons continue to oppose, Hartung also highlights that “the organization Global Zero has outlined an alternative nuclear posture that would eliminate ICBMs, reduce the numbers of bombers and ballistic missile submarines, and implement a policy of no first use of nuclear weapons as part of a ‘deterrence-only’ strategy that would reduce the danger of a nuclear conflict.”

Global Zero CEO Derek Johnson welcomed Hartung’s brief in a tweet Tuesday.

Earlier this year, Sen. Elizabeth Warren (D-Mass.) and Rep. Adam Smith (D-Calif.) led the reintroduction of legislation (S.1219/H.R. 2603) to establish that “it is the policy of the United States to not use nuclear weapons first,” but the bill has not advanced in Congress, despite pressure from progressive lawmakers and campaigners.

Peace Action of Wisconsin’s Pamela Richard said in August that while activists encourage the passage of Warren and Smith’s bill as well as a related one (S. 1148/H.R. 669) from Sen. Ed Markey (D-Mass.) and Rep. Ted Lieu (D-Calif.), “our long-term goal is total nuclear disarmament.” 

October 14, 2021 Posted by | business and costs, politics, Reference

Smoke from nuclear war would devastate ozone layer, alter climate

Smoke from nuclear war would devastate ozone layer, alter climate. Atmospheric impacts of global nuclear war would be more severe than previously thought. https://news.ucar.edu/132813/smoke-nuclear-war-would-devastate-ozone-layer-alter-climate

OCT 13, 2021 – BY DAVID HOSANSKY. The massive columns of smoke generated by a nuclear war would alter the world’s climate for years and devastate the ozone layer, endangering both human health and food supplies, new research shows.

The international study paints an even grimmer picture of a global nuclear war’s aftermath than previous analyses. The research team used newly developed computer climate modeling techniques to learn more about the effects of a hypothetical nuclear exchange, including complex chemistry interactions in the stratosphere that influence the amounts of ultraviolet (UV) radiation that reach the planet’s surface.

Since the ozone layer protects Earth’s surface from harmful UV radiation, such impacts would be devastating to humans and the environment. High levels of UV radiation have been linked to certain types of skin cancer, cataracts, and immunological disorders. The ozone layer also protects terrestrial and aquatic ecosystems, as well as agriculture.

“Although we suspected that ozone would be destroyed after nuclear war and that would result in enhanced ultraviolet light at the Earth’s surface, if there was too much smoke, it would block out the ultraviolet light,” said study co-author Alan Robock, a professor of climate science at Rutgers University. “Now, for the first time, we have calculated how this would work and quantified how it would depend on the amount of smoke.”


October 14, 2021 Posted by | 2 WORLD, climate change, Reference

Chris Busby on the truth about black rain, radiation and cancer

the major cause of cancer in the low and medium dose groups (0-100mSv) in the Hiroshima lifespan study was not the immediate radiation from the detonation, the external gamma radiation and neutrons, but was in fact exposure to Uranium 234 particles from the bomb itself which rained out over the city in the black rain. Torrential black rain fell over the city and surrounding areas from 30 minutes to several hours after the atomic explosion.

Hiroshima Black Rain and the Test Veterans, https://www.labrats.international/post/hiroshima-black-rain-and-the-test-veterans, Chris Busby, 13th Sept 2021
The absolute key study of the effects of radiation on cancer risk is the Lifespan Study (LSS) of the survivors of the Hiroshima bomb. It provides the evidence used by the Secretary of State for Defence (the MoD) to refuse pensions in all the UK Test Vet cases. Groups were assembled in 1952, some seven years after the bomb, and divided into high-, medium- and low-dose groups on the basis of their distance from Ground Zero, with a no-dose group consisting of those who were outside the city and came in later. That no-dose group was dropped in 1973 because using it as a control gave too many cancers. This study continues today, and the risks of different cancers after exposure are obtained from the excess risk of any type of cancer in each dose group. The risk factors for cancer which are currently the basis for all laws relating to exposure are based on this study. You have to get a dose of about 1,000 mSv to get a 40% excess risk of cancer on the basis of the LSS results. Naturally, since no test veteran got anywhere near this dose, all the pension applications (and appeals) are refused.
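To see why the LSS-based model forecloses the veterans' claims, scale the figure quoted above (roughly a 40 per cent excess cancer risk at about 1,000 mSv) linearly down to the few-millisievert doses attributed to test veterans. The sketch below simply applies that proportionality; it is an illustration of the argument, not a reanalysis.

```python
# Linear scaling of the Lifespan Study figure quoted above:
# ~40% excess cancer risk at ~1,000 mSv, applied proportionally to lower doses.

EXCESS_RISK_AT_1000_MSV = 0.40  # figure quoted in the text

def implied_excess_risk(dose_msv: float) -> float:
    """Excess cancer risk implied by linear scaling of the LSS figure."""
    return EXCESS_RISK_AT_1000_MSV * dose_msv / 1000.0

for dose_msv in (5, 20, 100, 1000):  # 5 mSv is the order of dose attributed to many test veterans
    print(f"{dose_msv:5d} mSv -> implied excess risk ≈ {implied_excess_risk(dose_msv):.2%}")
```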

But on Sept 9th a scientific report I wrote was published in the peer-reviewed journal Cancer Investigation. My paper, The Hiroshima A-Bomb black rain and the lifespan study—a resolution of the Enigma, shows that the LSS was dishonestly manipulated and that its results are totally unsafe. It spells the end of the radiation risk model and the beginning of justice for the test veterans. How?

What it shows, is that the major cause of cancer in the low and medium dose groups (0-100mSv) in the Hiroshima lifespan study was not the immediate radiation from the detonation, the external gamma radiation and neutrons, but was in fact exposure to Uranium 234 particles from the bomb itself which rained out over the city in the black rain. Torrential black rain fell over the city and surrounding areas from 30 minutes to several hours after the atomic explosion. Doses from the inhalation and ingestion of the Uranium particles in the black rain were very low. Since the Christmas Island vets were also exposed to rainout after the bombs, they are in the same category of victims as the Hiroshima low dose LSS victims (<5mSv). The Japanese government lost a court case in July on this issue, one which it will not appeal. Those living in the black rain areas who developed cancer will get compensation and attention in the same way as those who received an external dose from the detonation, even though the black rain victims’ dose was zero. The separation of external radiation from internal in terms of risk also shows that the types of cancers believed in the model to result from radiation must also be reassessed.

Of course, the MoD knew all this. It is the biggest secret of all, since it supports everything nuclear: bombs, energy, naval propulsion, Depleted Uranium, winnable nuclear war and raises the issue of enormous amounts of compensation. It had to be kept out. In 2013, during the run-up to the big test veteran appeal in the Royal Courts of Justice, I obtained from the late Major Alan Batchelor in Australia an official British document which was submitted to the Australian National Commission test vet hearings. It listed the quantity of Uranium isotopes in the Enriched Uranium used by the British in their bombs. I also had obtained a copy (when I was advising Rosenblatts in 2009 in the Foskett case) of a memo from 1953 on the dangers of Uranium 234 at the test sites. But these documents were suddenly made subject to the Official Secrets Act.

In 2013 after Rosenblatts had pulled out, Hogan Lovells removed all my 4 years of evidence and reports, 12 documents, and also removed me from the case without consulting any of the veterans they represented. In 2014 Judge Charles in the Upper Tier ruled that I could not act as an expert witness (I was biased) and anything I had written or argued previously had to be ignored. I neatly reverted from expert to representative and argued in 2016 before Judge Blake in the RCJ that the exposure of interest at Christmas Island was to Uranium from the material of the bomb. We flew in Professor Shoji Sawada all the way from Japan to make the same point. But Blake either ignored him or pretended to. In Blake’s final judgement he wrote:

14. . .it is submitted that prolonged exposure to radiation by inhalation or ingestion of radioactive particles deposited on the land or in the sea off CI is a real possibility. . .

15. In the appeals relating to Messrs Battersby and Smith Dr Busby, on their behalf, advances a more radical submission that the guidance issued by the International Commission on Radiological Protection in the UK and EU is flawed and underestimates the risk to health from internal exposure to radiation, and in particular radiation from Uranium.

What the new paper shows is that we were exactly right and Blake’s judgement exactly wrong; he listened to the experts brought in by the MoD, who did not address our experts or their evidence (or, they say, were told by MoD lawyer Adam Heppinstall not to); the aim was to keep the evidence out. The Scots Upper Tier has now reversed the Charles decision on my expertise, Judge DJ May QC calling it “Unlawful”. The British Tribunals, however, ignore the Scottish UT decision and persist in keeping my evidence out.

The Lifespan Study was dishonestly manipulated to provide support for the continued radioactive contamination of the environment by atmospheric bomb testing. The evidence is that this stitch-up has resulted in the biggest public health scandal in human history. The internal radiation exposures of children born in the peak fallout period, 1959-63, caused genetic damage, infant deaths and the cancer epidemic which began in 1980. The effect is also seen in their children and grandchildren, as new data clearly show. My study of the BNTVA also found a 10-fold congenital malformation rate in the children and 9-fold in the grandchildren. The Black Rain paper proves that the risk model that permitted this is wildly wrong. For those who are interested, read the paper: it is easy to understand. Then get angry and do something.

Meanwhile, I do what I can: I have two test vet cases ongoing: Trevor Butler and Christopher Donne, and also a Nuclear submarine sailor in Scotland who died from lymphoma. Here I am up against twisty Adam Heppinstall once more. He has begun, in true style, by removing all our evidence from the Bundle.

October 7, 2021 Posted by | radiation, Reference, weapons and war

Independent scientists speak the truth about ionising radiation.

How monolithic institutions decide what is safe for the rest of us, Beyond Nuclear, By Christine Fassert and Tatiana Kasperski, 12 Sept 21,

”………………..The condemnation of this [ Fukushima area radiation] threshold came first of all from within: the special adviser on radiation protection of the Prime Minister’s Office, Professor Toshiso Kosako, resigned in tears on April 30, 2011:

“I cannot accept such a threshold, being applied to babies, children, and elementary school students, not only from an academic point of view, but also because of my humanistic values,” he said.

Many critiques

At the international level, the decision to raise the threshold was also criticized by the two successive UN Special Rapporteurs, Anand Grover and Baskut Tuncak. Moreover, the two experts question the very foundations of radiation protection, which rely on the ALARA principle: As Low as Reasonably Achievable.

This “reasonably” indicates that criteria other than health are taken into account, which Grover criticizes, referring to the “right to health”. Indeed, the rapporteur points out that “the ICRP recommendations are based on the principle of optimization and justification, according to which all government actions should maximize the benefits over the detriments. Such a risk-benefit analysis is not in line with the framework of the right to health, because it gives priority to collective interests over individual rights”.

Tuncak echoes Grover’s criticism in his October 2018 report, stating that “the Japanese government’s decision to increase what is considered the acceptable level of radiation exposure by a factor of 20 is deeply troubling.”

Better protecting individuals

Similar arguments were also used by Belarusian and Ukrainian scientists who, in the late 1980s, opposed the lifetime dose limit of 35 rem (350 mSv) over a maximum period of 70 years from the time of the accident — a limit that Soviet experts in Moscow, with the support of ICRP representatives, including the head of the French Central Service for Protection against Ionizing Radiation, Pierre Pellerin, were trying to impose as the basis for all post-accident response measures. 

The Belarusian and Ukrainian researchers considered the 35 rem criterion to be unacceptable not only from a scientific point of view but also, and above all, from an ethical point of view.

They pointed out that under the conditions of scientific uncertainty about the effects of ionizing radiation, it was dangerous to underestimate the risks that radioactivity represented for the inhabitants of the affected territories, and they considered that the country’s authorities had a moral obligation to devote all the necessary means to greater protection of the inhabitants of the affected regions, especially the most vulnerable individuals.
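For scale, the disputed lifetime criterion works out to a modest annual average; the comparison with natural background in the sketch below uses the commonly cited 2-3 mSv per year range, which is general knowledge rather than a figure from this article.

```python
# The disputed Soviet-era criterion: 35 rem lifetime dose over at most 70 years.
lifetime_limit_rem = 35
lifetime_limit_msv = lifetime_limit_rem * 10   # 1 rem = 10 mSv, so 350 mSv
annual_average_msv = lifetime_limit_msv / 70   # spread over the 70-year period

print(f"lifetime limit: {lifetime_limit_msv} mSv")
print(f"annual average: {annual_average_msv:.0f} mSv/year "
      f"(typical natural background is roughly 2-3 mSv/year)")
```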

The danger of low doses

The protagonists of the optimization of radiation protection in the post-accident context insist on the absence of studies proving significant health effects below these thresholds.

For a long time, the arguments for and against these thresholds have been discussed in the public arena and by social scientists in terms of scientific and medical “controversies” — pitting scientists connected to the nuclear sphere, who have long denied the harmfulness of low doses, against scientists outside this sphere who consider that the risks were underestimated.

The question of the level of danger of low doses of radioactivity is one of the best known examples of such controversies, which regularly resurface despite the development of scientific knowledge about these risks.

This debate did not arise at the time of the Fukushima accident, but has been going on for a long time and is part of the “motives” also found in the debates about Chernobyl as well as other nuclear accidents such as Kyshtym, in Russia, in 1957………………… https://beyondnuclearinternational.org/2021/09/12/vested-interests/

October 5, 2021 Posted by | 2 WORLD, radiation, Reference, spinbuster

The sunken nuclear submarines: Russia’s ‘slow-motion Chernobyl’ at sea 

One of them is the K-27, once known as the “golden fish” because of its high cost. The 360ft-long (118m) attack submarine (a submarine designed to hunt other submarines) had been plagued by problems with its experimental liquid-metal-cooled reactors since its 1962 launch; one of the reactors ruptured six years later and exposed nine sailors to fatal doses of radiation. In 1981 and 1982, the navy filled the reactor with asphalt and scuttled it east of Novaya Zemlya island in a mere 108ft (33m) of water. A tugboat had to ram the bow after a hole blown in the ballast tanks only sank the aft end.

The K-27 was sunk after some safety measures were installed that should keep the wreck safe until 2032. But another incident is more alarming. The K-159, a 350ft (107m) November-class attack submarine, was in service from 1963 to 1989. The K-159 sank with no warning, sending 800kg (1,760lb) of spent uranium fuel to the seafloor beneath busy fishing and shipping lanes just north of Murmansk. Thomas Nilsen, editor of The Barents Observer online newspaper, describes the submarines as a “Chernobyl in slow motion on the seabed”.

While the vast size of the oceans quickly dilutes radiation, even very small levels can become concentrated in animals at the top of the food chain through “bioaccumulation” – and then be ingested by humans. But economic consequences for the Barents Sea fishing industry, which provides the vast majority of cod and haddock at British fish and chip shops, “may perhaps be worse than the environmental consequences”, says Hilde Elise Heldal, a scientist at Norway’s Institute of Marine Research.
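Bioaccumulation works multiplicatively: a tiny concentration in seawater is amplified at each step up the food chain. The concentration factors in the sketch below are invented placeholders chosen only to show the mechanism, not measured values for the Barents Sea.

```python
# Illustrative sketch of bioaccumulation up a marine food chain.
# Concentration factors are hypothetical placeholders, not measurements.

water_activity_bq_per_l = 0.001   # hypothetical radionuclide activity in seawater

concentration_factors = {         # (Bq/kg in organism) / (Bq/L in water), hypothetical
    "plankton": 100,
    "small fish": 1_000,
    "cod": 10_000,
}

for organism, cf in concentration_factors.items():
    print(f"{organism:10s}: ~{water_activity_bq_per_l * cf:6.1f} Bq/kg")
```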

An accident while raising the submarine, on the other hand, could suddenly jar the reactor, potentially mixing fuel elements and starting an uncontrolled chain reaction and explosion. That could boost radiation levels in fish to 1,000 times normal or, if it occurred on the surface, irradiate terrestrial animals and humans, another Norwegian study found.

Russia’s ‘slow-motion Chernobyl’ at sea  https://www.bbc.com/future/article/20200901-the-radioactive-risk-of-sunken-nuclear-soviet-submarines, By Alec Luhn, 2nd September 2020

Beneath some of the world’s busiest fisheries, radioactive submarines from the Soviet era lie disintegrating on the seafloor. Decades later, Russia is preparing to retrieve them.

By tradition, Russians always bring an odd number of flowers to a living person and an even number to a grave or memorial. But every other day, 83-year-old Raisa Lappa places three roses or gladiolas by the plaque to her son Sergei in their hometown Rubtsovsk, as if he hadn’t gone down with his submarine during an ill-fated towing operation in the Arctic Ocean in 2003.


October 2, 2021 Posted by | oceans, PERSONAL STORIES, Reference, Russia, wastes, weapons and war

Plutonium: How Nuclear Power’s Dream Fuel Became a Nightmare. 

The history of nuclear power’s imagined future: Plutonium’s journey from asset to waste, Bulletin of the Atomic Scientists, By William Walker, September 7, 2021 

Bill Gates is deluded in believing that the plutonium-fuelled, sodium-cooled, “Versatile Power Reactor” in which his company Terrapower is involved, has a commercial future.[18] His support is also unwelcome insofar as it helps to perpetuate the myth that plutonium is a valuable fuel, posing acceptable risks to public safety and international security. Reprocessing is a waste-producing, not an asset-creating, technology. It adds cost rather than value. It merits no future when seen in this way.

‘ ………..Plutonium’s history and its legacies are the subject of a recent book by Frank von Hippel, Masafumi Takubo and Jungmin Kang: Plutonium: How Nuclear Power’s Dream Fuel Became a Nightmare.[1] It is an impressive study of technological struggle and ultimate failure, and of plutonium’s journey from regard as a vital energy asset to an eternally troublesome waste.

Toward heaven or hell?  The conflict over plutonium’s future…………..

From creation of a future to preservation of the present

Construction of the British and French reprocessing plants at Sellafield and Cap de la Hague proceeded throughout the 1980s.[6] Their primary justification—preparing for the introduction of fast breeder reactors—had lost all credibility by the time of their completion. The German, British and French breeder programs had been cut back, soon to be abandoned, and in 1988 Germany cancelled plans to build its own bulk reprocessing plant at Wackersdorf. Although Japan’s confidence in its fast breeder reactor program also waned, it was kept alive to avoid disrupting construction of the reprocessing plant at Rokkasho-mura.

Faced by the plutonium economy’s demise, reprocessing was re-purposed by its supporters to provide the industry and its governmental backers with reason not to do the obvious—abandon ship. Creating an essential future was replaced by a rationale designed to preserve and activate the newly established reprocessing infrastructures. ……. plutonium’s energy value could be realised through its replacement of fissile uranium in “mixed-oxide fuels” for use in existing thermal reactors………

Thirty years after the Euro-Japanese reprocessing/recycling system’s launch, the experiment can only be judged a failure. The reasons are set out in persuasive detail in von Hippel, Takubo and Kang’s book. It is a system undergoing irreversible contraction after a long struggle, involving heavy expenditure and many troubles. Germany and the UK have already exited, the UK shutting its THORP reprocessing plant in 2018 and delaying its Magnox reprocessing plant’s closure only because of the coronavirus pandemic.[9] Instead, its Nuclear Decommissioning Authority has been given the costly (more than $138 billion) and long-lasting (more than 100 years) task of returning Sellafield and Dounreay to “green-field sites.”

Japan’s engagement with reprocessing and plutonium recycling was already deeply troubled before the Fukushima accident closed reactors: The Rokkasho-mura reprocessing plant was operating only fitfully, MOX recycling was not happening, and plutonium separated from Japanese spent fuels in France and the UK was marooned there, probably indefinitely, by inability to manage its return in MOX fuel (cutting a very long story short).[10] The declared intention to soldier on with bulk reprocessing seems increasingly bizarre and is surely unsustainable. ……….

France’s national utility EDF, saddled with enormous debts, is striving to reduce its exposure to reprocessing.  It is symptomatic that no spent fuel discharged from EDF-owned and -operated reactors in the UK, including those under construction at Hinkley Point, will be reprocessed………

The move away from reprocessing is being accompanied by a transition towards dry-cask storage of spent fuels. It entails their removal from water pools at reactors after a few years’ cooling and their insertion in large concrete or stainless steel containers, ………

 Reprocessing continues in India and Russia, if fitfully, where fast reactor programmes are still being funded. Japan’s commitment remains. ………

There is particular concern about China’s engagement with reprocessing and its dual civil and military purposes…………

……………. Separated plutonium is a waste

The authors remind readers of the persistent dangers that reprocessing poses to public safety and international security: the risks of accident and exposure to radiation, the proliferation of weapons, the possibility of diversion into nuclear terrorism, and the undesirable complication of radioactive waste disposal. “In our view, it is time to ban the separation of plutonium for any purpose” (their italics) is their concluding sentence. This may be the case, but the US and other governments are unlikely to respond to their call. They have so much else to contend with—climate change, pandemics, economic distress, arms racing on a long list—leaving a ban on plutonium separation low in their priorities. They are also all too aware of past failures to institute such bans,  whether in commercial or military domains, from the Carter Policy in the 1970s to the stalled Fissile Material Cutoff Treaty in the 1990s and subsequently.

Another conclusion cries out to be drawn from this book. Plutonium’s separation and usage for energy purposes was an experiment that can now decisively be pronounced a failure.  Experience has shown that separated civil plutonium is a waste. The book’s first of many figures, reproduced below,  is the most telling. Up to the mid-1980s, the global stock of separated plutonium was predominately military and held in warheads, peaking at around 200 tons. It now exceeds 500 tons. The increase is due to the ballooning of civil stocks as plutonium’s separation has outstripped consumption. The global stock of separated plutonium now includes material extracted from the post-Cold War dismantlement of Russian and US nuclear warheads that is also effectively a waste.[16]

Civil plutonium is therefore not an asset, it is not “surplus to requirement;” it is a waste.  This is the message that needs to be proclaimed and acknowledged, especially by governments, utilities, and industries desiring that nuclear power have a solid future and make a contribution to the avoidance of global warming. For reasons set out in von Hippel’s recent article in the Bulletin, Bill Gates is deluded in believing that the plutonium-fuelled, sodium-cooled, “Versatile Power Reactor” in which his company Terrapower is involved, has a commercial future.[18] His support is also unwelcome insofar as it helps to perpetuate the myth that plutonium is a valuable fuel, posing acceptable risks to public safety and international security. Reprocessing is a waste-producing, not an asset-creating, technology. It adds cost rather than value. It merits no future when seen in this way.

Even if all civil reprocessing ceased tomorrow, the experiment would have bequeathed the onerous task of guarding and disposing of over 300 tons of plutonium waste, and considerably more when US and Russia’s military excess is added in. Proposals come and go.  Burn it in specially designed reactors? Blend it with other radioactive wastes? Bury it underground after some form of immobilization? Send it into space? All options are costly and hard to implement. Lacking ready solutions, most plutonium waste will probably remain in store above ground for decades to come, risking neglect. How to render this dangerous waste eternally safe and secure is now the question.     Extensive References . https://thebulletin.org/premium/2021-09/the-history-of-nuclear-powers-imagined-future-plutoniums-journey-from-asset-to-waste/?utm_source=Newsletter&utm_medium=Email&utm_campaign=ThursdayNewsletter09272021&utm_content=NuclearRisk_HistoryOfNuclearPowersImagined_09102021


September 28, 2021 Posted by | - plutonium, 2 WORLD, Reference

The Record-Breaking Failures and Costs of Nuclear Power

Let’s look at the track record as a whole. According to Wikipedia’s article, List of cancelled nuclear reactors in the United States: “Of the 253 nuclear power reactors originally ordered in the United States from 1953 to 2008, 48 percent were cancelled, 11 percent were prematurely shut down, 14 percent experienced at least a one-year-or-more outage, and 27 percent are operating without having a year-plus outage. Thus, only about one fourth of those ordered, or about half of those completed, are still operating and have proved relatively reliable.”

Wikipedia’s stunning list on the same page details 157 reactors that were either canceled before or during construction.

The Record-Breaking Failures of Nuclear Power, https://www.counterpunch.org/2021/09/24/the-record-breaking-failures-of-nuclear-power/ BY LINDA GUNTER, SEPTEMBER 24, 2021. The Tennessee Valley Authority could likely rightfully claim a place in the Guinness Book of World Records, but it’s not an achievement for which the federally-owned electric utility corporation would welcome notoriety.

After taking a whopping 42 years to build and finally bring on line its Watts Bar Unit 2 nuclear power reactor in Tennessee, TVA just broke its own record for longest nuclear plant construction time. However, this time, the company failed to deliver a completed nuclear plant.

Watts Bar 2 achieved criticality in May 2016, then promptly came off line due to a transformer fire three months later. It finally achieved full operational status on October 19, 2016, making it  the first United States reactor to enter commercial operation since 1996.

Now, almost five years later, TVA has announced it has abandoned its unfinished two-reactor Bellefonte nuclear plant in Alabama, a breathtaking 47 years after construction began.

TVA was apparently happy to get out of the nuclear construction business, because, as the Chattanooga Times Free Press reported, the company “did not see the need for such a large and expensive capacity generation source.” No kidding!

Ironically, this is precisely the argument used to advance renewables, in an energy environment that cannot and will no longer support inflexible, large, thermo-electric generators that are completely impractical under the coming smart grids as well as climate change-induced conditions.

Accordingly, TVA was more than happy to accept overtures from a purchaser for Bellefonte — the Haney real estate company— whose director, Frank Haney, gained his own notoriety by lavishing $1 million on former President Trump and courting Trump’s lawyer, Michael Cohen, possibly, suggested media reports, to curry regulatory favors for his new nuclear toy.

But when TVA announced last month that it had withdrawn its construction permit for Bellefonte, Haney got his down payment back — to the tune of $22.9 million plus interest. TVA had itself spent at least $5.8 billion on Bellefonte over the 47 years, which included long stoppages, before finally pulling the plug.

This kind of colossal waste of time and money on failed nuclear power projects is, of course, the more typical story than the myths spun in the press about the need for “low carbon” nuclear energy, a misleading representation used to argue for nuclear power’s inclusion in climate change mitigation.

In reality, the story of nuclear power development in the US over the last 50 years is beyond pitiful and would not pass muster under any “normal” business plan. How the nuclear industry gets away with it remains baffling.

As Beyond Nuclear’s Paul Gunter told the Chattanooga Times Free Press, “Bellefonte is just the most recent failure for this industry,” noting that “of the 30 reactors the industry planned to build 15 years ago with the so-called nuclear renaissance, only two are still being built.” (Those two, at Plant Vogtle in Georgia, are years behind schedule with a budget that has more than doubled to $27 billion.)

As Gunter noted in the same article, “TVA has had major problems meeting projected costs and timetables for new nuclear plants, as the entire industry has had over the past 50 years. The inability to meet any budgets for these plants is what has repeatedly been the demise of nuclear energy.

“Nuclear energy is the most expensive way ever conceived to boil water and Bellefonte just shows once again how unreliable this technology really is in terms of projecting what it will cost and how long it will take to build these power plants,” Gunter told the newspaper.

That was certainly true for Westinghouse Electric Company and SCANA, still embroiled in the ever-unraveling scandal around the failure to complete two new reactors at the V.C. Summer nuclear power plant in South Carolina. As executives of the bankrupt Westinghouse and of SCANA, the utility that retained it, continue to face criminal charges, Westinghouse has already had to shell out $2.168 billion in settlement payments related to the Summer debacle.

In August, news reports said Westinghouse would also be required to reimburse low-income ratepayers to the tune of $21.25 million. That’s because the new reactors got funded in part through electricity rates, even though they never delivered a single watt of electricity. The cost of the project itself eventually ballooned to more than $9 billion before collapsing.

Let’s look at the track record as a whole. According to Wikipedia’s article, List of cancelled nuclear reactors in the United States: “Of the 253 nuclear power reactors originally ordered in the United States from 1953 to 2008, 48 percent were cancelled, 11 percent were prematurely shut down, 14 percent experienced at least a one-year-or-more outage, and 27 percent are operating without having a year-plus outage. Thus, only about one fourth of those ordered, or about half of those completed, are still operating and have proved relatively reliable.”
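The quoted percentages translate into approximate reactor counts as follows; the rounding is mine, the shares are as quoted from Wikipedia.

```python
# Converting the quoted percentages of 253 ordered reactors into rough counts.
ordered = 253
shares = {
    "cancelled": 0.48,
    "prematurely shut down": 0.11,
    "had a year-plus outage": 0.14,
    "operating without a year-plus outage": 0.27,
}

counts = {status: round(ordered * share) for status, share in shares.items()}
completed = ordered - counts["cancelled"]

for status, n in counts.items():
    print(f"{status:38s}: ~{n}")
print(f"{'completed (ordered minus cancelled)':38s}: ~{completed}")
# ~68 reliable units: about a quarter of those ordered and about half of those completed.
```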

Wikipedia’s stunning list on the same page details 157 reactors that were either canceled before or during construction.

The massive costs, of course, send most corporations running scared, the Haney family notwithstanding. Even when meaty subsidies have been dangled — as they were for the Calvert Cliffs 3 EPR project in Maryland — utility companies balk and bail. In the case of Calvert Cliffs, Constellation Energy was the US partner with the French government utility EDF. But even when offered a $7.5 billion loan guarantee by the Obama administration, Constellation viewed those terms as “too expensive and burdensome” and quit.

This left EDF, a foreign company, as sole owner, a violation of the Atomic Energy Act. The project duly collapsed, one of many referred to earlier by Paul Gunter as the fantasy of a nuclear renaissance that first sputtered, then went out.

President Obama, of course, was no friend to the anti-nuclear movement. So eager was he to boost new nuclear construction in the US that he called for the inclusion of $55 billion for nuclear loan guarantees in his $3.8 trillion 2011 budget. In his State of the Union address that year, Obama talked of “building a new generation of safe, clean nuclear power plants in this country.” Kool-Aid thoroughly drunk, then.

All of this should send an obvious message to the deaf ears of Ben Cardin (D-MD), Sheldon Whitehouse (D-RI) and Cory Booker (D-NJ), the leading pro-nuclear evangelists in the U.S. Senate. Cardin’s power production credit bill actually has the gall to describe nuclear power as “zero-emission”, a lie that even Cardin’s own staffer was forced to concede in a recent meeting attended by Paul Gunter who called him out on it.

Not that any of this will stop the bill going forward and almost certainly passing. Like the three not-so-wise monkeys, those Senators and their colleagues will acknowledge no negatives about nuclear power, even as the industry’s appalling litany of financial fiascoes and failures stares them in the face. They will forge right ahead, thus dooming to its own failure the very progress on climate change they claim to champion.

September 25, 2021 Posted by | 2 WORLD, business and costs, Reference

Interaction of Nuclear Waste With the Environment More Complicated Than Previously Thought

Interaction of Nuclear Waste With the Environment May Be More Complicated Than Previously Thought, https://www.technologynetworks.com/applied-sciences/news/interaction-of-nuclear-waste-with-the-environment-may-be-more-complicated-than-previously-thought-353879, September 22, 2021. Original story from the Lawrence Livermore National Laboratory.

Lawrence Livermore National Laboratory (LLNL) scientists and collaborators have proposed a new mechanism by which nuclear waste could spread in the environment.

The new findings, which involve researchers at Penn State and Harvard Medical School, have implications for nuclear waste management and environmental chemistry. The research is published in the Journal of the American Chemical Society.

“This study relates to the fate of nuclear materials in nature, and we stumbled upon a previously unknown mechanism by which certain radioactive elements could spread in the environment,” said LLNL scientist and lead author Gauthier Deblonde. “We show that there are molecules in nature that were not considered before, notably proteins like ‘lanmodulin’ that could have a strong impact on radioelements that are problematic for nuclear waste management, such as americium, curium, etc.”

Past and present nuclear activities (e.g., for energy, research or weapon tests) have increased the urgency to understand the behavior of radioactive materials in the environment. Nuclear wastes containing actinides (e.g. plutonium, americium, curium and neptunium) are particularly problematic, as they remain radioactive and toxic for thousands of years.

However, very little is known about the chemical form of these elements in the environment, forcing scientists and engineers to use models to predict their long-term behavior and migration patterns. Thus far, these models have only considered interactions with small natural compounds, mineral phases and colloids, and the impact of more complex compounds like proteins has been largely ignored. The new study demonstrates that a type of protein that is abundant in nature vastly outcompetes molecules that scientists previously considered as the most problematic in terms of actinide migration in the environment.

 “The recent discovery that some bacteria specifically use rare-earth elements has opened new areas of biochemistry with important technological applications and potential implications for actinide geochemistry, because of chemical similarities between the rare-earths and actinides,” said Joseph Cotruvo Jr., Penn State assistant professor and co-corresponding author on the paper.

The protein called lanmodulin is a small and abundant protein in many rare-earth-utilizing bacteria. It was discovered by the Penn State members of the team in 2018. While the Penn State and LLNL team has studied in detail how this remarkable protein works and how it can be applied to extract rare-earths, the protein’s relevance to radioactive contaminants in the environment was previously unexplored.
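What "vastly outcompetes" means can be made concrete with a simple two-ligand equilibrium: the share of a metal bound to each ligand scales with the product of its binding constant and its free concentration. The constants and concentrations below are hypothetical placeholders chosen only to illustrate how a scarce but very strong binder can dominate; they are not values from the study.

```python
# Toy two-ligand competition for a metal ion (e.g. an actinide).
# Fraction bound to ligand i is proportional to K_i * [L_i]. All numbers are hypothetical.

ligands = {
    # name: (binding constant K in 1/M, free concentration in M) -- invented placeholders
    "small natural chelator":  (1e8,  1e-6),
    "lanmodulin-like protein": (1e12, 1e-9),
}

weights = {name: K * conc for name, (K, conc) in ligands.items()}
total = sum(weights.values())

for name, w in weights.items():
    print(f"{name:25s}: ~{w / total:.1%} of the bound metal")
# Even at a 1,000-fold lower concentration, a 10,000-fold stronger binder holds most of the metal.
```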


“Our results suggest that lanmodulin, and similar compounds, play a more important role in the chemistry of actinides in the environment than we could have imagined,” said LLNL scientist Annie Kersting. “Our study also points to the important role that selective biological molecules can play in the differential migration patterns of synthetic radioisotopes in the environment.”

“The study also shows for the first time that lanmodulin prefers the actinide elements over any other metals, including the rare-earth elements, an interesting property than could be used for novel separation processes,” said LLNL scientist Mavrik Zavarin.

Rare-earth element biochemistry is a very recent field that Penn State and LLNL have helped to pioneer, and the new work is the first to explore how the environmental chemistry of actinides may be linked to nature’s use of rare-earth elements. Lanmodulin’s higher affinity for actinides might even mean that rare-earth-utilizing organisms that are ubiquitous in nature may preferentially incorporate certain actinides into their biochemistry, according to Deblonde.

Reference
Deblonde GJ-P, Mattocks JA, Wang H, et al. Characterization of Americium and Curium Complexes with the Protein Lanmodulin: A Potential Macromolecular Mechanism for Actinide Mobility in the Environment. J Am Chem Soc. Published online September 20, 2021. doi:10.1021/jacs.1c07103

September 23, 2021 Posted by | 2 WORLD, radiation, Reference | Leave a comment

Nuclear power: Why molten salt reactors are problematic and Canada investing in them is a waste

China’s molten salt nuclear reactor

Nuclear power: Why molten salt reactors are problematic and Canada investing in them is a waste https://theconversation.com/nuclear-power-why-molten-salt-reactors-are-problematic-and-canada-investing-in-them-is-a-waste-167019, MV Ramana, Simons Chair in Disarmament, Global and Human Security at the Liu Institute for Global Issues, University of British Columbia, September 15, 2021

One of the beneficiaries of the run-up to a potential federal election has been the nuclear energy industry, specifically companies that are touting new nuclear reactor designs called small modular reactors. The two largest financial handouts have gone to companies developing a specific class of these reactors, called molten salt reactors (MSRs).

First, in October 2020, Canada’s minister of innovation, science and industry announced a $20-million grant to Ontario-based Terrestrial Energy and its integral molten salt reactor (IMSR) design. In March 2021, New Brunswick-based Moltex received $50.5 million from the Strategic Innovation Fund and Atlantic Canada Opportunities Agency.

As a physicist who has analyzed different nuclear reactor designs, including small modular reactors, I believe that molten salt reactors are unlikely to be successfully deployed anytime soon. MSRs face difficult technical problems, and cannot be counted on to produce electricity consistently.

How they work

Molten salt reactors use melted chemicals like lithium fluoride or magnesium chloride to remove the heat produced within the reactor. In many MSRs, the fuel is also dissolved in a molten salt.

These designs are very different from traditional reactor designs — currently, the Canada Deuterium Uranium (CANDU) design dominates Canada’s nuclear energy landscape. CANDU reactors use heavy water (water containing deuterium, the heavier isotope of hydrogen) to transport heat and to slow down, or “moderate”, the neutrons produced during fission, and they use natural uranium fabricated into solid pellets as fuel. Slower neutrons are more effective at triggering fission reactions than highly energetic, or fast, neutrons.

Terrestrial’s IMSR is fuelled by enriched uranium, which contains a higher concentration of the lighter isotope uranium-235 than the natural uranium used in CANDU reactors. In the IMSR, this enriched uranium is dissolved in a fluoride salt. The IMSR also uses graphite, instead of the heavy water used in CANDU reactors, to moderate neutrons.

Moltex’s Stable Salt Reactor (SSR), on the other hand, uses a mixture of uranium and plutonium and other elements, dissolved in a chloride salt and placed inside a solid assembly, as fuel. It does not use any material to slow down neutrons.

Because of the different kinds of fuel used, these MSR designs need special facilities — not present in Canada currently — to fabricate their fuel. The enriched uranium for the IMSR must be produced using centrifuges, while the Moltex design proposes to use a special chemical process called pyroprocessing to produce the plutonium required to fuel it. Pyroprocessing is extremely costly and unreliable.
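
For a sense of the scale of the enrichment step, the sketch below works through the standard separative-work (SWU) arithmetic. The 5 per cent product assay and 0.25 per cent tails assay are assumed, illustrative figures, not specifications from Terrestrial or Moltex.

```python
# Rough sketch of the standard separative-work (SWU) calculation for
# uranium enrichment. The 5% product assay and 0.25% tails assay are
# assumed illustrative values, not either vendor's fuel specification.

import math

def value(x: float) -> float:
    """Value function used in separative-work calculations."""
    return (2 * x - 1) * math.log(x / (1 - x))

def feed_and_swu_per_kg_product(xp: float, xf: float = 0.00711, xw: float = 0.0025):
    """Natural-uranium feed (kg) and SWU needed per kg of product at assay xp."""
    feed = (xp - xw) / (xf - xw)   # kg of natural uranium per kg of product
    tails = feed - 1.0             # kg of depleted tails per kg of product
    swu = value(xp) + tails * value(xw) - feed * value(xf)
    return feed, swu

feed, swu = feed_and_swu_per_kg_product(0.05)
print(f"~{feed:.1f} kg natural uranium and ~{swu:.1f} SWU per kg of 5% fuel")
```

With those assumptions, each kilogram of 5 per cent fuel requires roughly ten kilograms of natural uranium feed and about eight units of centrifuge separative work.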

Both processes are intimately linked to the potential to make fissile materials used in nuclear weapons. Earlier this year, nine non-proliferation experts from the United States wrote to Prime Minister Justin Trudeau expressing serious concerns “about the technology Moltex proposes to use.”

Difficult questions

Experience with MSRs has not been very encouraging either. All current designs draw upon the only two MSRs ever built: the 1954 Aircraft Reactor Experiment that ran for just 100 hours and the Molten Salt Reactor Experiment that operated intermittently from 1965 to 1969. Over those four years, the latter reactor’s operations were interrupted 225 times; of these, only 58 were planned. The remainder were due to various unanticipated technical problems. In other words, the reactor had to be shut down unexpectedly in roughly four out of every five weeks of operation — not what one would expect of a reliable power plant.
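
A quick back-of-envelope check of that shutdown rate (treating 1965–1969 as roughly four calendar years, which overstates the actual at-power time and therefore, if anything, understates the rate):

```python
# Rough check of the MSRE interruption rate quoted above: 225 total
# interruptions, of which 58 were planned, over about four calendar years.
weeks = 4 * 52
unplanned = 225 - 58
print(f"{unplanned / weeks:.2f} unplanned interruptions per week")  # about 0.8
```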

Even the U.S. Atomic Energy Commission that had funded the U.S. MSR program for nearly two decades raised difficult questions about the technology in a devastating 1972 report. Many of the problems identified continue to be technical challenges confronting MSR designs.

Another basic problem with MSRs is that the materials used to manufacture the various reactor components will be exposed to hot salts that are chemically corrosive, while being bombarded by radioactive particles. So far, there is no material that can perform satisfactorily in such an environment. A 2018 review from the Idaho National Laboratory could only recommend that “a systematic development program be initiated” to develop new alloys that might work better. There is, of course, no guarantee that the program will be successful.

These problems and others have been identified by various research laboratories, ranging from France’s Institut de radioprotection et de sûreté nucléaire (IRSN) to the Nuclear Innovation and Research Office in the United Kingdom. Their conclusion: molten salt reactors are still far from proven.

As the IRSN put it in 2015: “numerous technological challenges remain to be overcome before the construction of an MSR can be considered,” going as far as saying that it does not envision construction of such reactors “during the first half of this century.”

Should an MSR be built, it will also saddle society with the challenge of dealing with the radioactive waste it will produce. This is especially difficult for MSRs because the waste is in chemical forms that are “not known to occur in nature” and it is unclear “which, if any, disposal environment could accommodate this high-level waste.” The Union of Concerned Scientists has also detailed the safety and security risks associated with MSR designs.

Problematic solutions

The Liberal government’s argument for investing in molten salt reactors is that nuclear power is necessary to mitigate climate change. There are good reasons to doubt this claim. But even if one were to ignore those reasons, the problems with MSRs laid out here show that they cannot be deployed for decades.

The climate crisis is far more urgent. Investing in technologies that are proven to be problematic is no way to deal with this emergency.

September 16, 2021 Posted by | Canada, Reference, technology | Leave a comment

USA developing space-based electromagnetic warfare

This is just the beginning.

How DOD is taking its Mission to Space https://www.thecipherbrief.com/column_article/how-dod-is-taking-its-mission-to-space?utm_source=Join+the+Community+Subscribers&utm_campaign=010f6454d2-EMAIL_CAMPAIGN_2021_09_14_12_18&utm_medium=email&utm_term=0_02cbee778d-010f6454d2-122765993&mc_cid=010f6454d2&mc_eid=b560fb1ddc. SEPTEMBER 14, 2021 | WALTER PINCUS  Pulitzer Prize Winning Journalist Walter Pincus is a contributing senior national security columnist at The Cipher Brief.  Pincus spent forty years at The Washington Post, writing on topics from nuclear weapons to politics.  He is the author of Blown to Hell: America’s Deadly Betrayal of the Marshall Islanders (releasing November 2021)

While others this past weekend have been looking back to 9/11, U.S. Space Command is looking forward to the next domain of warfare — in the heavens — to be directed from a Space Electromagnetic Operating Base somewhere in the United States.

Space Command’s Systems Command, Enterprise Corps and Special Programs Directorate, located at Los Angeles Air Force Base, Calif., are looking for potential contractors to run an ambitious, five-year program that will, by 2027, design, develop, deliver and operate a Space Electromagnetic Warfare facility whose primary purpose would be to jam or destroy enemy satellite and land-based communications in time of war.

It was all described in a request for information published September 1, asking possible contractors to outline their capabilities and their interest in taking on the job.

The U.S. may not have done well here on earth against the Taliban in Afghanistan, but Space Command is moving to stay ahead of its big-power competitors in using the electromagnetic spectrum as a weapon against potential adversary satellites in space.

As the Congressional Research Service (CRS) recently described it, “The electromagnetic spectrum is the range of wavelengths or frequencies of electromagnetic radiation. It includes radio waves, microwaves, visible light, X-rays, and gamma rays.”

The majority of military communications capabilities use radio waves and microwaves. Infrared and ultraviolet spectrums can disseminate large volumes of data, including video, over long distances – for example, intelligence collection and distribution. The military can also use lasers offensively, to dazzle satellite sensors, destroy drones, and for other purposes, according to the CRS.
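
For readers keeping the bands straight, the sketch below lists rounded, textbook frequency ranges for the parts of the spectrum mentioned in the CRS summary, together with the basic wavelength conversion. The band edges are approximations, not official DOD definitions.

```python
# Rounded, textbook frequency ranges for the spectrum bands mentioned
# above, plus the free-space conversion wavelength = c / frequency.
# Band edges are approximations, not official DOD definitions.

C = 299_792_458.0  # speed of light, in metres per second

bands_hz = {
    "radio":       (3e3,    3e8),
    "microwave":   (3e8,    3e11),
    "infrared":    (3e11,   4e14),
    "visible":     (4e14,   7.5e14),
    "ultraviolet": (7.5e14, 3e16),
    "x-ray":       (3e16,   3e19),
    "gamma ray":   (3e19,   3e22),
}

def wavelength_m(freq_hz: float) -> float:
    """Free-space wavelength for a given frequency."""
    return C / freq_hz

for band, (lo, hi) in bands_hz.items():
    print(f"{band:>11}: {lo:.0e}-{hi:.0e} Hz "
          f"(wavelengths ~{wavelength_m(hi):.1e} to {wavelength_m(lo):.1e} m)")
```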

Electronic warfare is not new – it was extensively used in World War II and its uses have been growing ever since.

CRS described it this way: “Missiles in general, and anti-air munitions in particular, use either infrared or radar for terminal guidance (i.e., guiding a missile once it has been launched) to targets. Electronic jammers are used to deny an adversary access to the spectrum. These jammers are primarily used in the radio and microwave frequencies (and sometimes paired together), preventing communications (both terrestrially and space-based) as well as radar coverage. Militaries have also begun using lasers to disable intelligence collection sensors, destroy small unmanned aerial systems (aka ‘drones’), and communicate with satellites.”

Back in 1997, I covered a House hearing when Dr. George Ullrich, then-Deputy Director of the Defense Special Weapons Agency, described the resumption of atmospheric nuclear testing in 1962 following a three-year testing moratorium. One test, called Starfish Prime, was a 1.4 megaton, high-altitude detonation. It took place over Johnston Island in the Pacific at an altitude of about 250 miles – the largest nuclear test ever conducted in outer space.

Ullrich testified that the EMP (electromagnetic pulse) effects of the Starfish explosion surprisingly knocked out telephone service and street lights in the Hawaiian Islands, some 800 miles east of the detonation. Years later, Ullrich wrote that another surprise had been that, months after the 1962 detonation, an AT&T satellite transmitting television signals from space died prematurely, followed by the early failure of other satellites.

Ullrich closed on a note more relevant to today. “High-altitude EMP does not distinguish between military and civilian systems. Unhardened infrastructure systems, such as commercial power grids, telecommunication networks, as we have discussed before, remain vulnerable to widespread outages and upsets due to high-altitude EMP. While DOD (Defense Department) hardens their assets it deems vital, no comparable civilian programs exist. Thus, the detonation of one or a few high-altitude nuclear weapons could result in serious problems for the entire U.S. civil and commercial infrastructure.”

There are also non-nuclear EMP weapons that produce pulses of energy creating a powerful electromagnetic field capable of short-circuiting a wide range of electronic equipment, particularly computers, satellites, radios, radar receivers and even civilian traffic lights.

Key to the proposed Space Electromagnetic Operating Base is L3Harris’ next-generation CCS (Counter Communications System) electronic warfare system, known as Meadowlands, which can reversibly deny adversaries’ satellite communications. In March 2020, Space Force declared initial operational capability of Meadowlands as “the first offensive weapon system in the United States Space Force.” The CCS is currently a road-mobile system; an additional $30 million was added to the program in this fiscal year (2021) to “design forward garrison systems…Accelerate development of new mission techniques to meet advancing threat and integrate techniques into the CCS program of record.”

Defense Daily reported last month that in May, Space Force put out a bid for production of an additional 26 Meadowlands systems with production to go on through fiscal 2025.

The first task listed for the proposed new Space Electromagnetic Operating Base is to provide a “Space EW (Electromagnetic Warfare) Common Operating Picture” that displays relevant space electromagnetic warfare information via the remote modular terminals (RMTs) of the Meadowlands program. Another task will be mission planning, including providing “executable tactical instructions, planning weapon-target pairings, & enabling automated control of multiple SEW assets by a single operator.”

The proposal called for the Space EW common picture to depict the current adversary’s Space Order of Battle (SOB), the current state of space electromagnetic warfare tasking, and real-time status of operations.  The information displayed will come from “real time intelligence, C2, and operational units.  The information from intelligence will include SOB and Battle Damage Assessment (BDA). Command and control (C2) will provide its information to SEWOL [Space Electromagnetic Warfare Operating Location] via secure communications. Operational units will provide systems status, electromagnetic support (ES) reporting, Electromagnetic Attack (EA) strike assessment, and remote assets situational awareness (SA).”

The eventual contractor “will integrate the Meadowlands and RMT Remote Operations capability into the facility’s eventual architecture,” according to the proposal. The architecture of the proposed space warfare operating base “will be scalable and flexible to allow incorporation of future SEW [space electromagnetic warfare] systems. Future SEW systems could have substantially different interfaces from the RMT and Meadowlands systems without a baseline interface, and the development of the COI [common operating interface] will help streamline integration of future systems,” according to the proposal.

While Space Command is focused on an initial location in the continental U.S., the proposal said, “It will then expand to include multiple geographically dispersed operating locations…[which] will be able to control a scalable number of assets. In addition, they can be used interchangeably and/or collaboratively to provide high resiliency and operational flexibility.”

Three weeks ago, on August 24, Army Gen. James Dickinson, U.S. Space Command commander, declared that the nation’s 11th combatant command had achieved initial operational capability (IOC). “We are a very different command today at IOC than we were at stand-up in 2019 — having matured and grown into a war fighting force, prepared to address threats from competition to conflict in space, while also protecting and defending our interests in this vast and complex domain.”

This is just the beginning.

September 16, 2021 Posted by | Reference, space travel, USA, weapons and war | Leave a comment

The Cold War near disasters at RAF Lakenheath could have left Suffolk as a nuclear wasteland

Boeing B-47B rocket-assisted take off on April 15, 1954. (U.S. Air Force photo)

The Cold War near disasters at RAF Lakenheath could have left Suffolk as a nuclear wasteland https://www.suffolknews.co.uk/mildenhall/go-anywhere-just-get-away-from-here-how-suffolk-almost-9215663/ By Dan Barker – dan.barker@iliffepublishing.co.uk, 13 September 2021.  During the height of the Cold War, nuclear bombs were dotted across the country, ready to wipe the USSR off the face of the map at a moment’s notice: but, on two separate occasions, Suffolk almost became a victim of the very weapons that were meant to protect it.

July 27, 1956 was like any other summer’s day. Across the country attention was glued to the Ashes fourth test at Old Trafford, and four American airmen were in a B-47 bomber, on a routine training mission from RAF Lakenheath.  But, as they were practising touch-and-go landings, their bomber careered out of control and went off the runway.

It ploughed into an igloo containing three Mark-6 nuclear weapons, tearing the building apart. The plane then exploded, killing all four men on board, and showered the world-ending weapons with burning aviation fuel.

“Most of A/C [Aircraft] wreckage pivoted on igloo and came to rest with A/C nose just beyond igloo bank which kept main fuel fire outside smashed igloo. Preliminary exam by bomb disposal officers says a miracle that one Mark Six with exposed detonators sheared didn’t go. Firefighters extinguished fire around Mark Sixes fast.” – Telegram from RAF Lakenheath to Washington DC

Fortunately, the bombs’ fissile cores had been removed from all three weapons for storage, but the high explosives needed to trigger the deadly nuclear reaction were still in place.

With 8,000 pounds of high explosives combined with depleted uranium-238, they were a nuclear ticking time bomb as firefighters fought to put out the blaze.

Had they exploded the radioactive uranium would have been scattered over a wide area, and, depending on the wind, tens of thousands of people would have been at risk from the toxic dust across Suffolk.

Knowing the enormity of the situation base fire chief Master Sgt L. H. Dunn ordered his crew to ignore the burning wreckage of the bomber, and the airman inside, and douse the flames engulfing the nuclear storage building.

At the time the accident was shrouded in secrecy, but decades later one senior US officer made clear just how narrowly Suffolk had avoided a nuclear disaster. “It is possible that part of Eastern England would have become a desert,” the retired officer told the Omaha World-Herald in Nebraska, which revealed the potentially catastrophic incident in November 1979.

Another said that “disaster was averted by tremendous heroism, good fortune and the will of God”.

A top secret telegram sent to Washington DC from the base, which has since been revealed, told of the near miss. “Most of A/C [Aircraft] wreckage pivoted on igloo and came to rest with A/C nose just beyond igloo bank which kept main fuel fire outside smashed igloo.

Suffolk was lucky this time, but the incident caused great alarm in the British government, which decided it would try to block US authorities from ordering base evacuations, for fear of causing mass panic in the country.

But what would happen if word got out that its most important ally had, almost, accidentally, made a huge part of the United Kingdom a nuclear wasteland?

Simple: for decades its policy, if the press ever caught wind of the near miss, was simply to deny it. Only after the news broke in the American press in 1979 was it acknowledged that something had happened.

On November 5 that year, the US Air Force and the Ministry of Defence would admit only that the B-47 had crashed.

In fact it took until 1996, some four decades after the near disaster, for the British state to accept the true scale of the accident in public.

But that near miss wasn’t the only one.

For on January 16, 1961, an F-100 Super Sabre loaded with a Mark 28 hydrogen bomb caught fire after the pilot jettisoned his fuel tanks when he switched his engines on.

As the tanks hit the concrete runway, the fuel ignited and engulfed the nuclear weapon – a 70-kiloton device – leaving it “scorched and blistered”.

Suffolk was saved again by the brave work of base firefighters who brought the blaze under control before the bomb’s high explosive detonated or its arming components activated.

Terrifyingly, it was later discovered by American engineers that a flaw in the wiring of Mark 28 hydrogen bombs could allow prolonged heat to circumvent the safety mechanisms and trigger a nuclear explosion.

Had it gone off, thousands of people would have been dead within seconds, and thousands more would have been injured. As with the first incident, beyond the immediate blast, radioactive debris could have fallen on towns as far away as Ipswich and Lowestoft, given the right wind direction, spreading toxic dust across Suffolk.

Since Clement Attlee ordered the scientists to investigate the creation of a nuclear bomb in August 1945, the British state has known that being a nuclear power comes with risk as well as reward.

It also knew it paid to be part of a nuclear alliance, NATO, and with it came American nuclear bombs and the risk they brought.


Beyond the maths of working out how large the explosion would have been, it is impossible to know the true implications.
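
One piece of that maths is a standard rule of thumb: for a nuclear detonation, the distance at which a given level of blast overpressure occurs grows roughly with the cube root of the yield. The sketch below applies that scaling to an assumed, purely illustrative 1-kiloton reference radius; it is not a damage model for either weapon.

```python
# Cube-root scaling sketch: the radius for a given blast overpressure
# grows roughly as yield**(1/3). The 0.5 km reference radius at 1 kt is
# an assumed, illustrative number, not an official damage criterion.

def scaled_radius_km(yield_kt: float, radius_km_at_1kt: float = 0.5) -> float:
    """Scale a reference damage radius from 1 kt up to the given yield."""
    return radius_km_at_1kt * yield_kt ** (1.0 / 3.0)

for y_kt in (20, 70, 500):
    print(f"{y_kt:>4} kt: ~{scaled_radius_km(y_kt):.1f} km for the same overpressure")
```

On that crude basis, the 70-kiloton Mark 28 would have produced a given level of blast damage out to roughly four times the distance of a 1-kiloton explosion, and a 500-kiloton warhead to roughly eight times.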

RAF Lakenheath was listed as a probable target for Soviet attack, according to now-released Cold War-era documents, and intelligence agencies and war planners expected two 500-kiloton missiles to hit the site if the West came under attack.

Disaster creates uncertainty. In the minutes and hours after a blast, nobody would have known it was an accident; decision-makers would simply have been dragged into a nuclear bunker and told of a large explosion at an airbase in Suffolk.

Where would that have left a British prime minister, an American president, and the rest of NATO, thinking they have come under attack?

In July 1956, and again in January 1961, those firefighters didn’t just save Suffolk … they might have saved the world.

September 14, 2021 Posted by | history, incidents, Reference, UK | Leave a comment

Nuclear ballistic missile submarine meltdown, 1961

K-19 Russian ballistic missile submarine

August 24, Quora: “Has a nuclear submarine ever had a meltdown?” https://www.quora.com/ Answer by Laurence Schmidt, who worked at Air Liquide America (1975–2010).

In the early Cold War era, many Russian nuclear submarines suffered catastrophic engineering plant failures. These failures were caused by the Soviets’ rush to equal the US Navy’s nuclear ballistic missile submarine program: the boats were poorly designed and constructed, lacked safety system redundancy, and had haphazardly trained crews. But the crews of these boats were heroic in risking their lives to save them in stark life-and-death emergencies at sea.

One example is the case of the K-19, the first Russian nuclear-powered ballistic missile submarine, nicknamed the “Hiroshima” boat because of her numerous incidents.

On July 4, 1961, while at sea, one of its two nuclear reactors scrammed. The primary cooling system had failed, flooding the reactor space with radioactive water, and there was no backup system to cool the reactor core. As the fuel rods overheated, the engineering staff tried a desperate plan to improvise a cooling system by tying into the sub’s drinking water supply. But it required several men to enter the highly radioactive reactor compartment and weld new piping to pumps and valves. The first jury-rigged attempt failed, with eight crewmen horribly burnt by the high temperatures and exposed to lethal doses of radiation; they all soon died. After further attempts, the jury-rigged system finally worked, but other crew members who had been too close to the reactor compartment would also soon die. The crew was evacuated to a nearby submarine, and the K-19 was towed back to base for repair. In total, 22 of the crew of 139 died of radiation sickness.
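
The underlying danger was decay heat: even after a reactor scrams and the chain reaction stops, the fission products keep generating heat that must be removed. Below is a minimal sketch using the textbook Way–Wigner approximation; the assumed 30 days of prior full-power operation is illustrative, not K-19’s actual operating history.

```python
# Minimal sketch of why a scrammed core still needs cooling: the
# Way-Wigner approximation for fission-product decay heat. The assumed
# 30 days of prior full-power operation is illustrative only.

def decay_heat_fraction(t_after_s: float, t_operating_s: float) -> float:
    """Decay heat as a fraction of prior full power (Way-Wigner)."""
    return 0.0622 * (t_after_s ** -0.2 - (t_after_s + t_operating_s) ** -0.2)

t_op = 30 * 24 * 3600.0  # assumed 30 days at full power before the scram
for t in (10.0, 600.0, 3600.0, 24 * 3600.0):
    print(f"{t:>8.0f} s after scram: ~{decay_heat_fraction(t, t_op):.1%} of full power")
```

Even hours after shutdown the core still produces a fraction of a per cent of full power, which, with no coolant flow, is enough to overheat the fuel.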

A section of the radiation-contaminated hull was replaced, and a new power reactor unit was installed. The two original reactors, including their fuel rods, were dumped in the Kara Sea in 1965 – a favorite dumping ground for Russian navy nuclear waste, from damaged nuclear reactors to whole ships.

Did the K-19 reactor melt down? I would say yes.

September 14, 2021 Posted by | incidents, Reference, Russia | 1 Comment