nuclear-news

The News That Matters about the Nuclear Industry Fukushima Chernobyl Mayak Three Mile Island Atomic Testing Radiation Isotope

A robot’s attempt to get a sample of the melted nuclear fuel at Japan’s damaged reactor is suspended

An attempt to use an extendable robot to remove a fragment of melted fuel from a wrecked reactor at Japan’s tsunami-hit nuclear plant has been suspended due to a technical issue

ABC News, by Mari Yamaguchi, Associated Press, August 22, 2024

TOKYO — An attempt to use an extendable robot to remove a fragment of melted fuel from a wrecked reactor at Japan’s tsunami-hit Fukushima Daiichi nuclear power plant was suspended Thursday due to a technical issue.

The collection of a tiny sample of the debris inside the Unit 2 reactor’s primary containment vessel would start the fuel debris removal phase, the most challenging part of the decadeslong decommissioning of the plant where three reactors were destroyed in the March 11, 2011, magnitude 9.0 earthquake and tsunami disaster.

The work was stopped when workers noticed that five 1.5-meter (5-foot) pipes used to maneuver the robot were placed in the wrong order and could not be corrected within the time limit for their radiation exposure, the plant operator Tokyo Electric Power Company Holdings said.

The pipes were to be used to push the robot inside and pull it back out when it finished. Once inside the vessel, the robot is operated remotely from a safer location.

The robot can extend up to about 22 meters (72 feet) to reach its target area to collect a fragment from the surface of the melted fuel mound using a device equipped with tongs that hang from the tip of the robot.

The mission to obtain the fragment and return with it is to last two weeks. TEPCO said a new start date is undecided…

The government and TEPCO are sticking to a 30-40-year cleanup target set soon after the meltdown, despite criticism it is unrealistic. No specific plans for the full removal of the melted fuel debris or its storage have been decided.  https://abcnews.go.com/Technology/wireStory/robots-attempt-sample-melted-nuclear-fuel-japans-damaged-113049701

August 22, 2024 Posted by | Fukushima continuing, technology | Leave a comment

Nuclear power on the prairies is a green smokescreen.

By M. V. Ramana & Quinn Goranson, August 19th 2024, Canada’s National Observer https://www.nationalobserver.com/2024/08/19/opinion/nuclear-power-prairies-green-smokescree

On April 2, Alberta Premier Danielle Smith declared on X (formerly Twitter) “we are encouraged and optimistic about the role small modular reactors (SMRs) can play” in the province’s plans to “achieve carbon neutrality by 2050.” 

SMRs, for those who haven’t heard this buzzword, are theoretical nuclear reactor designs that aim to produce smaller amounts of electricity compared to the current reactor fleet in Canada. The dream of using small reactors to produce nuclear power dates back to the 1950s — and so has their record of failing commercially. 

That optimism about SMRs will cost taxpayers at least $600,000, which will fund the company X-energy’s research “into the possibility of integrating small modular reactors (SMRs) into Alberta’s electric grid.” This is on top of the $7 million offered by Alberta’s government in September 2023 to oil and gas producer Cenovus Energy to study how SMRs could be used in the oil sands.

Last August, Saskatchewan’s Crown Investments Corporation provided $479,000 to prepare local companies to take part in developing SMRs. Alberta and Saskatchewan also have a Memorandum of Understanding to “advance the development of nuclear power generation in support of both provinces’ need for affordable, reliable and sustainable electricity grids by 2050”. 

What is odd about Alberta and Saskatchewan’s talk about carbon neutrality and sustainability is that, after Nunavut, these two provinces are the most reliant on fossil fuels for their electricity: as of 2022, Alberta derived 81 per cent of its power from these sources; Saskatchewan was at 79 per cent. In both provinces, emissions have increased more than 50 per cent above 1990 levels.

It would appear neither province is particularly interested in addressing climate change, but that is not surprising given their commitment to the fossil fuel industry. Globally, that industry has long obstructed the transition to low-carbon energy sources so as to continue profiting from its polluting activities.

Canadian companies have played their part too. Cenovus Energy, the beneficiary of the $7 million from Alberta, is among the four largest Canadian oil and gas companies that “demonstrate negative climate policy engagement,” and advocate for provincial government investment in offshore oil and gas development. It is also a part of the Pathways Alliance that academic scholars charge with greenwashing, in part because of its plans to use a problematic technology, carbon capture and storage, to achieve “net-zero emissions from oilsands operations by 2050.” 

Carbon capture and storage is just one of the unproven technologies that the fossil fuel industry and its supporters use as part of their “climate pledges and green advertising.” Nuclear energy is another — especially when it involves new designs such as SMRs that have never been deployed in North America, or have failed commercially. 

X-energy, the company that is to receive the $600,000, is using a technology that has been tried in Germany and the United States without success. The last high-temperature, gas-cooled reactor built in the United States was shut down within a decade, having produced, on average, only 15 per cent of what it could theoretically produce.

Even if one were to ignore these past failures, building nuclear reactors is slow and usually delayed. In Finland, construction of the Olkiluoto-3 reactor started in 2005, but it was first connected to the grid in 2022, a thirteen-year delay from the anticipated 2009. 

Construction of Argentina’s CAREM small modular reactor started in February 2014 but it is not expected to start operating till at least the “end of 2027,” and most likely later. Both Finland and Argentina have established nuclear industries. Neither Alberta nor Saskatchewan possesses any legislative capacity to regulate a nuclear industry.

Floating the idea of adding futuristic SMR technology into the energy mix is one way to publicly appear to be committed to climate action, without doing anything tangible. Even if SMRs were to be deployed to supply energy in the tar sands, that does not address downstream emissions from burning the extracted fossil fuels. 

Relying on new nuclear for emission reductions prevents phasing out fossil fuels at the pace demanded by the scientific consensus in favour of rapid and immediate decarbonization. An obstructionist focus on unproven technologies will not help.

Quinn Goranson is a recent graduate from the University of British Columbia’s School of Public Policy and Global Affairs with a specialization in environment, resources and energy. Goranson has experience working in research for multiple renewable energy organizations, including the CEDAR project, in environmental policy in the public sector, and as an environmental policy consultant internationally.

M.V. Ramana is the Simons Chair in Disarmament, Global and Human Security and Professor at the School of Public Policy and Global Affairs, at the University of British Columbia in Vancouver. He is the author of The Power of Promise: Examining Nuclear Energy in India (Penguin Books, 2012) and “Nuclear is not the Solution: The Folly of Atomic Power in the Age of Climate Change” (Verso Books, 2024). 

August 21, 2024 Posted by | Canada, Small Modular Nuclear Reactors, spinbuster | Leave a comment

The Risk of Bringing AI Discussions Into High-Level Nuclear Dialogues

Overly generalized discussions on the emerging technology may be unproductive or even undermine consensus to reduce nuclear risks at a time when such consensus is desperately needed.

by Lindsay Rand, August 19, 2024,  https://carnegieendowment.org/posts/2024/08/ai-nuclear-dialogue-risks-npt?lang=en

Last month, nuclear policymakers and experts convened in Geneva to prepare for a major conference to review the implementation of the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). At the meeting, calls for greater focus on the implications of artificial intelligence (AI) for nuclear policy pervaded diverse discussions. This echoes many recent pushes from within the nuclear policy community to consider emerging technologies in nuclear security–focused dialogues. However, diplomats should hesitate before trying to tackle the AI-nuclear convergence. Doing so in official, multilateral nuclear security dialogues risks being unproductive or even undermining consensus to reduce nuclear risks at a time when such consensus is desperately needed.

The level of interest in AI at the preparatory committee meeting isn’t surprising, given how much attention is being paid to the implications of AI for nuclear security and international security more broadly. Concerns range from increased speed of engagement, which could reduce human decisionmaking time, to automated target detection that could increase apprehension over second-strike survivability, or even increase propensity for escalation. In the United States, the State Department’s International Security Advisory Board recently published a report that examines AI’s potential impacts on arms control, nonproliferation, and verification, highlighting the lack of consensus around definitions and regulations to govern Lethal Autonomous Weapons Systems (LAWS). Internationally, there have also been calls for the five nuclear weapon states (P5) to discuss AI in nuclear command and control at the P5 Process, a forum where the P5 discuss how to make progress toward meeting their obligations under the NPT. Observers have called for the P5 to issue a joint statement on the importance of preserving human responsibility in nuclear decisionmaking processes.

However, injecting AI into nuclear policy discussions at the diplomatic level presents potential pitfalls. The P5 process and NPT forums, such as preparatory committee meetings and the NPT Review Conference, are already fraught with challenges. Introducing the complexities of AI may divert attention from other critical nuclear policy issues, or even become linked to outstanding areas of disagreement in a way that further entrenches diplomatic roadblocks.

Before introducing discussions about AI into official nuclear security dialogues, policymakers should address the following questions:

  1. In which forums could discussions about AI be productive?
  2. What specific topics could realistically foster more productive dialogue?
  3. Who should facilitate and participate in these discussions?

Forum Selection…

Topic Selection…

Participants…

August 21, 2024 Posted by | technology | Leave a comment

Tech Companies Are Racing to Harness Nuclear Power

Oil Price, By Felicity Bradstock – Aug 18, 2024

  • Tech companies are investing heavily in nuclear energy to power their AI operations.
  • Regulatory challenges and utility opposition are hindering the development of new nuclear projects.

With the demand for power increasing rapidly, tech companies are looking for innovative solutions to meet the demand created by artificial intelligence (AI) and other new technologies. In addition to solar and wind power, several tech companies are investing in nuclear energy projects to power operations. The clear shift in the public perception of nuclear power has once again put the abundant clean [!] energy source on the table as an option, with the U.S. nuclear energy capacity expected to rise significantly over the coming decades. …

Tech companies have invested heavily in wind and solar energy to power their data centers and are now looking for alternative clean power supplies. In 2021, Sam Altman, the CEO of OpenAI, invested $375 million in the nuclear fusion startup Helion Energy. Last year, Microsoft signed a deal to purchase power from Helion beginning in 2028. Altman also chairs the nuclear fission company Oklo, which is planning to build a massive network of small-scale nuclear reactors in rural southeastern Idaho to provide power to data centers as electricity demand grows. It is also planning to build two commercial plants in southern Ohio.

However, getting some of these nuclear projects off the ground is no easy feat. Oklo has found it difficult to get the backing of nuclear regulators. In 2022, the Nuclear Regulatory Commission (NRC), which oversees commercial nuclear power plants, rejected the firm’s application for the design of its Idaho “Aurora” project for not providing enough safety information. …

In addition to the red tape from regulators, many utilities are opposing new nuclear projects due to their anticipated impact on the grid. Some data centers require 1 GW or more of power, which is around the total capacity of a typical U.S. nuclear reactor. PJM Interconnection, the biggest grid operator in the U.S., recently warned that the balance between power supply and demand is tightening as the development of new generation falls behind demand. However, some tech companies are proposing to connect data centers directly to nuclear plants, also known as co-location, to reduce the burden on the grid.

However, several U.S. utilities oppose co-location plans… more: https://oilprice.com/Alternative-Energy/Nuclear-Power/Tech-Companies-Are-Racing-to-Harness-Nuclear-Power.html

August 20, 2024 Posted by | ENERGY, technology | Leave a comment

Amazon Vies for Nuclear-Powered Data Center 

The deal has become a flash point over energy fairness

IEEE Spectrum, Andrew Moseman, 12 Aug 2024

When Amazon Web Services paid US $650 million in March for another data center to add to its armada, the tech giant thought it was buying a steady supply of nuclear energy to power it, too. The Susquehanna Steam Electric Station outside of Berwick, Pennsylvania, which generates 2.5 gigawatts of nuclear power, sits adjacent to the humming data center and had been directly powering it since the center opened in 2023.

After striking the deal, Amazon wanted to change the terms of its original agreement to buy 180 megawatts of additional power directly from the nuclear plant. Susquehanna agreed to sell it. But third parties weren’t happy about that, and their deal has become bogged down in a regulatory battle that will likely set a precedent for data centers, cryptocurrency mining operations, and other computing facilities with voracious appetites for clean electricity.

Putting a data center right next to a power plant so that it can draw electricity from it directly, rather than from the grid, is becoming more common as data centers seek out cheap, steady, carbon-free power. Proposals for co-locating data centers next to nuclear power have popped up in New Jersey, Texas, Ohio, and elsewhere. Sweden is considering using small modular reactors to power future data centers.

However, co-location raises questions about equity and energy security, because directly connected data centers can avoid paying fees that would otherwise help maintain grids. They also hog hundreds of megawatts that could be going elsewhere.

“They’re effectively going behind the meter and taking that capacity off of the grid that would otherwise serve all customers,” says Tony Clark, a senior advisor at the law firm Wilkinson Barker Knauer and a former commissioner at the Federal Energy Regulatory Commission (FERC), who has testified to a U.S. House subcommittee on the subject.

Amazon’s nuclear power deal meets hurdles

The dust-up over the Amazon-Susquehanna agreement started in June, after Amazon subsidiary Amazon Web Services filed a notice to change its interconnection service agreement (ISA) in order to buy more nuclear power from Susquehanna’s parent company, Talen Energy. Amazon wanted to increase the amount of behind-the-meter power it buys from the plant from 300 MW to 480 MW. Shortly after it requested the change, utility giants Exelon and American Electric Power (AEP) filed a protest against the agreement and asked FERC to hold a hearing on the matter…

Costs of data centers seeking nuclear energy

Yet such arrangements could have major consequences for other energy customers, Clark argues. For one, directing all the energy from a nuclear plant to a data center is, fundamentally, no different than retiring that plant and taking it offline. “It’s just a huge chunk of capacity leaving the system,” he says, resulting in higher prices and less energy supply for everyone else.

Another issue is the “behind-the-meter” aspect of these kinds of deals. A data center could just connect to the grid and draw from the same supply as everyone else, Clark says. But by connecting directly to the power plant, the center’s owner avoids paying the administrative fees that are used to maintain the grid and grow its infrastructure. Those costs could then get passed on to businesses and residents who have to buy power from the grid. “There’s just a whole list of charges that get assessed through the network service that if you don’t connect through the network, you don’t have to pay,” Clark says. “And those charges are the part of the bill that will go up” for everyone else.

Even the “carbon-free” public relations talking points that come with co-location may be suspect in some cases. In Washington State, where Schneider works, new data centers are being planted next to the region’s abundant hydropower stations, and they’re using so much of that energy that parts of the state are considering adding more fossil fuel capacity to make ends meet. This results in a “zero-emissions shell game,” Clark wrote in a white paper on the subject.

These early cases are likely only the beginning. A report published in May by the Electric Power Research Institute predicts energy demand from data centers will double by 2030, a leap driven by the fact that AI queries need ten times more energy than traditional internet searches. The International Energy Agency puts the timeline for doubling sooner, in 2026. Data centers, AI, and the cryptocurrency sector consumed an estimated 460 terawatt-hours (TWh) in 2022, and could reach more than 1,000 TWh in 2026, the agency predicts.
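As a quick sanity check on those figures, growing from 460 TWh in 2022 to more than 1,000 TWh in 2026 implies an annual growth rate of roughly 21 percent. The sketch below is illustrative arithmetic only, using the numbers quoted above:

```python
# Illustrative arithmetic only, using the IEA figures quoted above:
# 460 TWh (2022) rising past 1,000 TWh (2026).

def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

rate = implied_cagr(start=460, end=1000, years=2026 - 2022)
print(f"Implied annual growth: {rate:.1%}")  # about 21.4% per year
```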

Data centers face energy supply challenges

New data centers can be built in a matter of months, but it takes years to build utility-scale power projects, says Poorvi Patel, manager of strategic insights at the Electric Power Research Institute and a contributor to the report. The potential for unsustainable growth in electricity needs has put grid operators on alert, and in some cases has sent them sounding the alarm. EirGrid, a state-owned transmission operator in Ireland, last week warned of a “mass exodus” of data centers in Ireland if it can’t connect new sources of energy. … more: https://spectrum.ieee.org/amazon-data-center-nuclear-power

August 19, 2024 Posted by | ENERGY, technology | Leave a comment

The Great Global Computer Outage Is a Warning We Ignore at Our Peril

Is there a limit in the natural order of things to the amount of technological complexity that’s sustainable?

by Tom Valovic, 2 August 24  https://www.counterpunch.org/author/tom-valovic/

July 18, 2024, will go down in history books as an event that shook up the world in a unique way. It gave the mass of humanity a pointed wake-up call about the inherent fragility of the technological systems we’ve created and the societal complexities they’ve engendered. Critical services at hospitals, airports, banks, and government facilities around the world were all suddenly unavailable. We can only imagine what it must have been like to be undergoing treatment in an emergency room at the time with a serious or life-threatening illness.

So, what are we to make of this event and how can we rationally get our collective arms around its meaning and significance? As a journalist who specializes in writing about the impacts of technology on politics and culture, I would like to share a few initial thoughts.

For some of us who have worked in the tech field for many years, such an event was entirely predictable. This is simply because of three factors: 1) the inherent fragility of computer code, 2) the always-present possibility of human error, and 3) the fact that when you build interconnected systems, a vulnerability in one part of the system can easily spread like a contagion to other parts. We see this kind of vulnerability in play daily in terms of a constant outpouring of news stories about hacking, identity theft, and security breaches involving all sorts of companies and institutions. However, none of these isolated events had sufficient scale to engender greater public awareness and alarm until The Great Global Computer Outage of July 18.

Inherent Fragility is Always Present

As impressive as our new digital technologies are, our technocrats and policymakers often seem to lose sight of an important reality. These now massively deployed systems are also quite fragile in the larger scheme of things. Computers and the communications systems that support them—so-called virtual systems—can concentrate huge amounts of informational power and control by wielding it like an Archimedean lever to manage the physical world. A cynic could probably argue that we’re now building our civilizational infrastructures on a foundation of sand.

At the recently held Aspen Security Forum, Anne Neuberger—a senior White House cybersecurity expert—noted, “We need to really think about our digital resilience not just in the systems we run but in the globally connected security systems, the risks of consolidation, how we deal with that consolidation and how we ensure that if an incident does occur it can be contained and we can recover quickly.” With all due respect, Ms. Neuberger was simply stating the obvious and not digging deep enough.

The problem runs much deeper. Our government and those of other advanced Western nations are now running on two separate but equal tracks: technology and governance. The technology track is being overseen by Big Tech entities with little accountability or oversight concerning the normative functions of government. In other words, they’re more or less given a free hand to operate according to the dictates of the free market economy.

Further, consider this thought experiment: Given AI’s now critical role in shaping key aspects of our lives and given its very real and fully acknowledged downsides and risks, why was it not even being discussed in the presidential debate? The answer is simple: These issues are often being left to unelected technocrats or corporate power brokers to contend with. But here’s the catch: Most technocrats don’t have the policy expertise needed to guide critical decision-making at a societal level while, at the same time, our politicians (and yes, sadly, most of our presidential candidates) don’t have the necessary technology expertise.

Scope, Scale, and Wisdom

Shifting to a more holistic perspective, humanity’s ability to continue to build these kinds of systems runs into the limitations of our conceptual ability to embrace their vastness and complexity. So, the question becomes: Is there a limit in the natural order of things to the amount of technological complexity that’s sustainable? If so, it seems reasonable to assume that this limit is determined by the ability of human intelligence to encompass and manage that complexity.

To put it more simply: At what point in pushing the envelope of technology advancement do we get in over our heads and to what degree is a kind of Promethean hubris involved?

As someone who has written extensively about the dangers of AI, I would argue that we’re now at a tipping point where it’s worth asking whether we can even control what we’ve created and whether the “harmful side effects” of seemingly constant chaos are now militating against quality of life. Further, we can only speculate whether the CrowdStrike event was somehow associated with some still poorly understood or unrecognized form of AI hacking or error. The bottom line is: If we cannot control the effects of our own technological inventions, then in what sense can those creations be said to serve human interests and needs in this already overly complex global environment?

Finally, the advent of under-the-radar hyper-technologies such as nanotechnology and genetic engineering also needs to be considered in this context. These are also technologies that can only be understood in the conceptual realm and not in any concrete and more immediate way because (I would argue) their primary and secondary effects on society, culture, and politics can no longer be successfully envisioned. Decisively moving into these realms, therefore, is like ad hoc experimentation with nature itself. But as many environmentalists have pointed out, “Nature bats last.” Runaway technological advancement is now being fueled by corporate imperatives and a “growth at any cost” mentality that offers little time for reflection. New and seemingly exciting prospects for advanced hyper-technology may dazzle us, but if in the process they also blind us, how can we guide the progress of technology with wisdom?

Tom Valovic is a journalist and the author of Digital Mythologies (Rutgers University Press), a series of essays that explored emerging social and political issues raised by the advent of the Internet. He has served as a consultant to the former Congressional Office of Technology Assessment. Tom has written about the effects of technology on society for a variety of publications including Columbia University’s Media Studies Journal, the Boston Globe, and the San Francisco Examiner, among others.

August 5, 2024 Posted by | technology | Leave a comment

Is the dream of nuclear fusion dead? Why the international experimental reactor is in ‘big trouble’

The 35-nation Iter project has a groundbreaking aim to create clean and limitless energy but it is turning into the ‘most delayed and cost-inflated science project in history’

Guardian, Robin McKie Science Editor, 4 Aug 24

It was a project that promised the sun. Researchers would use the world’s most advanced technology to design a machine that could generate atomic fusion, the process that drives the stars – and so create a source of cheap, non-polluting power.

That was initially the aim of the International Thermonuclear Experimental Reactor (Iter) which 35 countries – including European states, China, Russia and the US – agreed to build at Saint-Paul-lez-Durance in southern France at a starting cost of $6bn. Work began in 2010, with a commitment that there would be energy-producing reactions by 2020.

Then reality set in. Cost overruns, Covid, corrosion of key parts, last-minute redesigns and confrontations with nuclear safety officials triggered delays that mean Iter is not going to be ready for another decade, it has just been announced. Worse, energy-producing fusion reactions will not be generated until 2039, while Iter’s budget – which has already soared to $20bn – will increase by a further $5bn.

Other estimates suggest the final price tag could rise well above this figure and make Iter “the most delayed and most cost-inflated science project in history”, the journal Scientific American has warned. For its part, the journal Science has stated simply that Iter is now in “big trouble”, while Nature has noted that the project has been “plagued by a string of hold-ups, cost overruns and management issues”.

Dozens of private companies now promise to create fusion reactors on a shorter timescale, scientists warn. These include Tokamak Energy in Oxford and Commonwealth Fusion Systems in the US.

“The trouble is that Iter has been going on for such a long time, and suffered so many delays, that the rest of the world has moved on,” said fusion expert Robbie Scott of the UK Science and Technology Facilities Council. “A host of new technologies have emerged since Iter was planned. That has left the project with real problems.”

A question mark now hangs over one of the world’s most ambitious technological projects in its global bid to harness the process that drives the stars. It involves the nuclei of two light atoms being forced to combine to form a single heavier nucleus, while releasing massive amounts of energy. This is nuclear fusion, and it only occurs at colossally high temperatures.

To create such heat, a doughnut-shaped reactor, called a tokamak, will use magnetic fields to contain a plasma of hydrogen nuclei that will then be bombarded by particle beams and microwaves. When temperatures reach millions of degrees Celsius, the mix of two hydrogen isotopes – deuterium and tritium – will fuse to form helium, neutrons and a great deal of excess energy.
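For reference, the deuterium-tritium reaction described above is conventionally written as follows; the energy split between the helium nucleus and the neutron (17.6 MeV in total per fusion) is the standard textbook figure, not a number taken from the article:

```latex
% Deuterium-tritium fusion: the fuel mix Iter is designed to burn.
{}^{2}_{1}\mathrm{H} + {}^{3}_{1}\mathrm{H} \;\longrightarrow\;
{}^{4}_{2}\mathrm{He}\,(3.5~\mathrm{MeV}) + {}^{1}_{0}n\,(14.1~\mathrm{MeV})
```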

Containing plasma at such high temperatures is exceptionally difficult. “It was originally planned to line the tokamak reactor with protective beryllium but that turned out to be very tricky. It is toxic and eventually it was decided to replace it with tungsten,” said David Armstrong, professor of materials science and engineering at Oxford University. “That was a major design change taken very late in the day.”

Then huge sections of the tokamak made in Korea were found not to fit together properly, while fears that there could be leaks of radioactive materials led the French nuclear regulators to call a halt to the plant’s construction. More delays were announced as problems piled up…

For its part, Iter denies that it is “in big trouble” and rejects the idea that it is a record-breaking science project for cost overruns and delays. Just look at the International Space Station or for that matter the UK’s HS2 rail link, said a spokesman.

Others point out that fusion power’s limited carbon emissions would boost the battle against climate change. “However, fusion will arrive too late to help us cut carbon emissions in the short term,” said Aneeqa Khan, a research fellow in nuclear fusion at the University of Manchester. “Only if fusion power plants produce significant amounts of electricity later in the century will they help keep our carbon emissions down – and that will become crucial in the fight against climate change.”  https://www.theguardian.com/technology/article/2024/aug/03/is-the-dream-of-nuclear-fusion-dead-why-the-international-experimental-reactor-is-in-big-trouble

August 5, 2024 Posted by | EUROPE, technology | Leave a comment

Can nuclear waste be recycled? Would that solve the nuclear waste problem?

Radioactive Wastes from Nuclear Reactors, Questions and Answers, Gordon Edwards 28 July 24.

Well, you know, the very first reactors did not produce electricity. They were built for the express purpose of creating plutonium for atomic bombs. Plutonium is a uranium derivative. It is one of the hundreds of radioactive byproducts created inside every uranium-fuelled reactor. Plutonium is the stuff from which nuclear weapons are made. Every large nuclear warhead in the world’s arsenals uses plutonium as a trigger.

But plutonium can also be used as a nuclear fuel. The first electricity-producing power reactor, which started up in 1951 in Idaho, was called EBR-1 — it actually suffered a partial meltdown. EBR stands for “Experimental Breeder Reactor” and it was cooled, not with water, but with hot liquid sodium metal.

By the way, another sodium-cooled electricity-producing reactor was built right here in California, and it also had a partial meltdown. The dream of the nuclear industry was, and still is, to use plutonium as the fuel of the future, replacing uranium. A breeder reactor is one that can “burn” plutonium fuel and simultaneously produce even more plutonium than it uses. Breeder reactors are usually sodium-cooled.

In fact, sodium-cooled reactors have failed commercially all over the world, in the US, France, Britain, Germany, and Japan, but the breeder reactor is still the holy grail of the nuclear industry, so watch out.

To use plutonium, you have to extract it from the fiercely radioactive used nuclear fuel. This technology of plutonium extraction is called reprocessing. It must be carried out robotically because of the deadly penetrating radiation from the used fuel.

Most reprocessing involves dissolving used nuclear fuel in boiling nitric acid and chemically separating the plutonium from the rest of the radioactive garbage. This creates huge volumes of dangerous liquid wastes that can spontaneously explode (as in Russia in 1957) or corrode and leak into the ground (as has happened in the USA). A single gallon of this liquid high-level waste is enough to ruin an entire city’s water supply.

In 1977, US President Jimmy Carter banned reprocessing in the USA because of fears of proliferation of nuclear weapons at home and abroad. Three years earlier, in 1974, India tested its first atomic bomb using plutonium from a Canadian research reactor given to India as a gift.

The problem with using plutonium as a fuel is that it is then equally available for making bombs. Any well-equipped group of criminals or terrorists can make its own atomic bombs with a sufficient quantity of plutonium – and it only takes about 8 kilograms to do so. Even the crudest design of a nuclear explosive device is enough to devastate the core of any city.

Plutonium is extremely toxic when inhaled. A few milligrams is enough to kill any human within weeks through massive fibrosis of the lungs.

A few micrograms – a thousand times less – can cause fatal lung cancer with almost 100% certainty. So even small quantities of plutonium can be used by terrorists in a so-called “dirty bomb”. That’s a radioactive dispersal device using conventional explosives. Just a few grams of fine plutonium dust could threaten the lives of thousands if released into the ventilation system of a large office building.

So beware of those who talk about “recycling” used nuclear fuel. What they are really talking about is reprocessing – plutonium extraction – which opens a Pandora’s box of possibilities. The liquid waste and other leftovers are even more environmentally threatening, more costly, and more intractable than the solid waste. Perpetual isolation is still required.

www.ccnr.org/Radioactive_Q&A_2024.pdf

July 29, 2024 Posted by | - plutonium, reprocessing | Leave a comment

Humans should teach AI how to avoid nuclear war—while they still can

By Cameron Vega and Eliana Johns | July 22, 2024, https://thebulletin.org/2024/07/humans-should-teach-ai-how-to-avoid-nuclear-war-while-they-still-can/

When considering the potentially catastrophic impacts of military applications of Artificial Intelligence (AI), a few deadly scenarios come to mind: autonomous killer robots, AI-assisted chemical or biological weapons development, and the 1983 movie WarGames.

The 1983 movie WarGames features a self-aware AI-enabled supercomputer that simulates a Soviet nuclear launch and convinces US nuclear forces to prepare for a retaliatory strike. The crisis is only partly averted because the main (human) characters persuade US forces to wait for the Soviet strike to hit before retaliating; it turns out that the strike was intentionally falsified by the fully autonomous AI program. The computer then attempts to launch a nuclear strike on the Soviets without human approval until it is hastily taught about the concept of mutually assured destruction, after which the program ultimately determines that nuclear war is a no-win scenario: “Winner: none.”

US officials have stated that an AI system would never be given US nuclear launch codes or the ability to take control over US nuclear forces. However, AI-enabled technology will likely become increasingly integrated into nuclear targeting and command and control systems to support decision-making in the United States and other nuclear-armed countries. Because US policymakers and nuclear planners may use AI models in conducting analyses and anticipating scenarios that may ultimately influence the president’s decision to use nuclear weapons, the assumptions under which these AI-enabled systems operate require closer scrutiny.

Pathways for AI integration. The US Defense Department and Energy Department already employ machine learning and AI models to make calculation processes more efficient, including for analyzing and sorting satellite imagery from reconnaissance satellites and improving nuclear warhead design and maintenance processes. The military is increasingly forward-leaning on AI-enabled systems. For instance, it initiated a program in 2023 called Stormbreaker that strives to create an AI-enabled system called “Joint Operational Planning Toolkit” that will incorporate “advanced data optimization capabilities, machine learning, and artificial intelligence to support planning, war gaming, mission analysis, and execution of all-domain, operational level course of action development.” While AI-enabled technology presents many benefits for security, it also brings significant risks and vulnerabilities.

One concern is that the systemic use of AI-enabled technology and an acceptance of AI-supported analysis could become a crutch for nuclear planners, eroding human skills and critical thinking over time. This is particularly relevant when considering applications for artificial intelligence in systems and processes such as wargames that influence analysis and decision-making. For example, NATO is already testing and preparing to launch an AI system designed to assist with operational military command and control and decision-making by combining an AI wargaming tool and machine learning algorithms. Even though it is still unclear how this system will impact decision-making led by the United States, the United Kingdom, and NATO’s Nuclear Planning Group concerning US nuclear weapons stationed in Europe, this type of AI-powered analytical tool would need to consider escalation factors inherent to nuclear weapons and could be used to inform targeting and force structure analysis or to justify politically motivated strategies.

The role given to AI technology in nuclear strategy, threat prediction, and force planning can reveal more about how nuclear-armed countries view nuclear weapons and nuclear use. Any AI model is programmed under certain assumptions and trained on selected data sets. This is also true of AI-enabled wargames and decision-support systems tasked with recommending courses of action for nuclear employment in any given scenario. Based on these assumptions and data sets alone, the AI system would have to assist human decision-makers and nuclear targeters in estimating whether the benefits of nuclear employment outweigh the cost and whether a nuclear war is winnable.

Do the benefits of nuclear use outweigh the costs? Baked into the law of armed conflict is a fundamental tension between any particular military action’s gains and costs. Though fiercely debated by historians, the common understanding of the US decision to drop two atomic bombs on Japan in 1945 demonstrates this tension: an expedited victory in East Asia in exchange for hundreds of thousands of Japanese casualties.

Understanding how an AI algorithm might weigh the benefits and costs of escalation depends on how it integrates the country’s nuclear policy and strategy. Several factors contribute to a country’s nuclear doctrine and targeting strategy, ranging from fear of the consequences of breaking the tradition of non-use of nuclear weapons, to concern about radioactive contamination of a coveted territory, to sheer deterrence because of possible nuclear retaliation by an adversary. While strategy itself is derived from political priorities, military capabilities, and perceived adversarial threats, nuclear targeting incorporates these factors as well as many others, including the physical vulnerability of targets, overfly routes, and accuracy of delivery vehicles—all aspects to further consider when making decisions about force posture and nuclear use.

In the case of the United States, much remains classified about its nuclear decision-making and cost analysis. It is understood that, under guidance from the president, US nuclear war plans target the offensive nuclear capabilities of certain adversaries (both nuclear and non-nuclear armed) as well as the infrastructure, military resources, and political leadership critical to post-attack recovery. But while longstanding US policy has maintained to “not purposely threaten civilian populations or objects” and “not intentionally target civilian populations or targets in violation of [the law of armed conflict],” the United States has previously acknowledged that “substantial damage to residential structures and populations may nevertheless result from targeting that meets the above objectives.” This is in addition to the fact that the United States is the only country to have used its nuclear weapons against civilians in war.

There is limited public information with which to infer how an AI-enabled system would be trained to consider the costs of nuclear detonation. Certainly, any plans for nuclear employment are determined by a combination of mathematical targeting calculations and subjective analysis of social, economic, and military costs and benefits. An AI-enabled system could improve some of these analyses in weighing certain military costs and benefits, but it could also be used to justify existing structures and policies or further ingrain biases and risk acceptance into the system. These factors, along with the speed of operation and innate challenges in distinguishing between data sets and origins, could also increase the risks of escalation—either deliberate or inadvertent.

Is a nuclear war “winnable”? Whether a nuclear war is winnable depends on what “winning” means. Policymakers and planners may define winning as merely the benefits of nuclear use outweighing the cost when all is said and done. When balancing costs and benefits, the benefits need only be one “point” higher for an AI-enabled system to deem the scenario a “win.”

In this case, “winning” may be defined in terms of national interest without consideration of other threats. A pyrrhic victory could jeopardize national survival immediately following nuclear use and still be considered a win by the AI algorithm. Once a nuclear weapon has been used, it could either incentivize an AI system to not recommend nuclear use or, on the contrary, recommend the use of nuclear weapons on a broader scale to eliminate remaining threats or to preempt further nuclear strikes.
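A minimal sketch, purely hypothetical and not modeled on any real planning system, of the naive decision rule described above: a scorer that compares only point estimates will label even a pyrrhic outcome a “win.”

```python
# Hypothetical toy example of the naive "one point higher" decision
# rule discussed above -- not any real system's logic.

def naive_verdict(estimated_benefit: float, estimated_cost: float) -> str:
    """Declare a 'win' whenever benefits exceed costs by any margin."""
    return "win" if estimated_benefit > estimated_cost else "no win"

# A marginal, pyrrhic outcome still counts as a "win" under this rule:
print(naive_verdict(estimated_benefit=101.0, estimated_cost=100.0))  # win
```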

“Winning” a nuclear war could also be defined in much broader terms. The effects of nuclear weapons go beyond the immediate destruction within their blast radius; there would be significant societal implications from such a traumatic experience, including potential mass migration and economic catastrophe, in addition to dramatic climatic damage that could result in mass global starvation. Depending on how damage is calculated and how much weight is placed on long-term effects, an AI system may determine that a nuclear war itself is “unwinnable” or even “unbearable.”

Uncovering biases and assumptions. The question of costs and benefits is relatively uncontroversial in that all decision-making involves weighing the pros and cons of any military option. However, it is still unknown how an AI system will weigh these costs and benefits, especially given the difficulty of comprehensively modeling all the effects of nuclear weapon detonations. At the same time, the question of winning a nuclear war has long been a thorn in the side of nuclear strategists and scholars. All five nuclear-weapon states confirmed in 2022 that “a nuclear war cannot be won and must never be fought.” For them, planning to win a nuclear war would be considered inane and, therefore, would not require any AI assistance. However, deterrence messaging and discussion of AI applications for nuclear planning and decision-making illuminate the belief that the United States must be prepared to fight—and win—a nuclear war.

July 26, 2024 Posted by | technology, weapons and war | Leave a comment

How close are we to chaos? It turns out, just one blue screen of death

Keeping cash as a backup is a smart idea in the event of a payment systems outage.

David Swan, Technology editor, 22 July 24,  https://www.theage.com.au/technology/how-close-are-we-to-chaos-it-turns-out-just-one-blue-screen-of-death-20240720-p5jv6t.html

In some places, Friday’s mass tech outage resembled the beginning of an apocalyptic zombie movie. Supermarket checkouts were felled across the country and shoppers were turned away, airports became shelters for stranded passengers, and live TV and radio presenters were left scrambling to fill airtime. The iconic Windows “blue screen of death” hit millions of devices globally and rendered them effectively useless.

The ABC’s national youth station Triple J issued a call-out for anyone who could come to their Sydney studio to DJ in person. One woman was reportedly unable to open her smart fridge to access her food.

All because of a failure at CrowdStrike, a company that most of us – not least those who were worst hit – had never heard of before.

It’s thought to be the worst tech outage in history and Australia was at its epicentre: the crisis began here, and spread to Europe and the US as the day progressed. Surgeries were cancelled in Austria, Japanese airlines cancelled flights and Indian banks were knocked offline. It was a horrifying demonstration of how interconnected global technology is, and how quickly things can fall apart.

At its peak, it reminded us of some of the most stressful periods of the pandemic, when shoppers fought each other for rolls of toilet paper and argued about whether they needed to wear masks.

Many of us lived through the Y2K panic. We avoided the worst outcomes but it was an early harbinger of how vulnerable our technology is to bugs and faults, and showed the work required to keep everything up and running. The CrowdStrike meltdown felt closer to what’s really at risk when things go wrong.

As a technology reporter, I’ve spent years hearing warnings from industry executives about the danger of cyberattacks or mass outages. These warnings have become real.

The cause of this outage was not anything malicious. It was relatively innocuous: CrowdStrike has blamed a faulty update to its security software, which caused millions of Windows machines to crash and enter a recovery boot loop.

Of course Australians are no strangers to mass outages, even as they become more common and more severe.

The Optus network outage that froze train networks and disrupted hospital services just over six months ago was eerily similar to the events on Friday, not least because it was also caused by what was supposed to be a routine software upgrade.

The resignation of chief executive Kelly Bayer Rosmarin did little to prevent another Optus outage a month later. If anything, Friday’s CrowdStrike outage highlights how many opportunities there are for one failure to cripple millions of devices and grind the global economy to a halt. So many of the devices that underpin our economy have hundreds of different ways that they can be knocked offline, whether through a cyberattack or human error, as was likely the case with CrowdStrike.

The incident would likely have been even worse were it a cyberattack. Experts have long warned about the vulnerability of critical infrastructure – including water supplies and electricity – to malicious hackers. Everything is now connected to the internet and is therefore at risk.

And yet the potential damage of such attacks is only growing. We are now more reliant than ever on a concentrated number of software firms, and we have repeatedly seen their products come up short when we need them to just work.

In the US, the chair of the Federal Trade Commission, Lina Khan, put it succinctly.

“All too often these days, a single glitch results in a system-wide outage, affecting industries from healthcare and airlines to banks and auto-dealers,” Khan said on Saturday.

“Millions of people and businesses pay the price.”

Khan is right. The technology we rely on is increasingly fragile, and is increasingly in the hands of just a few companies. The world’s tech giants like Microsoft and Apple now effectively run our daily lives and businesses, and an update containing a small human error can knock it all over, from Australia to India.

The heat is now on CrowdStrike, as well as the broader technology sector on which we rely so heavily, and some initial lessons are clear. Airlines have backup systems to help keep some flights operational in the case of a technological malfunction. As everyday citizens, it’s an unfortunate reality that we need to think similarly.

Keeping cash as a backup is a smart idea in the event of a payment systems outage, as is having spare battery packs for your devices. Many smart modems these days, like those from Telstra and Optus, offer 4G or 5G internet if their main connection goes down. We need more redundancies built into the technology we use, and more alternatives in case the technology stops working altogether.

For IT executives at supermarkets, banks and hospitals, the outage makes it clear that “business as usual” will no longer cut it, and customers rightly should expect adequate backups to be in place. A sense of complacency permeated our IT operations rooms and our company boardrooms before the Optus outage, and it has lingered since. No longer.

The “blue screen of death”, accompanied by a frowny face, was an apt metaphor for the current state of play when it comes to our overreliance on technology. Our technology companies – and us consumers, too – need to do things differently if we’re to avoid another catastrophic global IT outage. There’s too much at stake not to.

July 23, 2024 Posted by | technology | Leave a comment

High hopes and security fears for next-gen nuclear reactors

Fuel for advanced reactors is raising nuclear proliferation concerns.

The Verge, By Justine Calma, a senior science reporter covering energy and the environment, Jul 20, 2024

Next-generation nuclear reactors are heating up a debate over whether their fuel could be used to make bombs, jeopardizing efforts to prevent the proliferation of nuclear weapons. 

Uranium in the fuel could theoretically be used to develop a nuclear weapon. Older reactors use such low concentrations that they don’t really pose a weapons proliferation threat. But advanced reactors would use higher concentrations, making them a potential target of terrorist groups or other countries wanting to take the fuel to develop their own nuclear weapons, some experts warn.

They argue that the US hasn’t prepared enough to hedge against that worst-case scenario and are calling on Congress and the Department of Energy to assess potential security risks with advanced reactor fuel.

Other experts and industry groups still think it’s unfeasible for such a worst-case scenario to materialize. But the issue is starting to come to a head as nuclear reactors become a more attractive energy source, garnering a rare show of bipartisan support in Congress.

… Earlier this month, President Joe Biden signed bipartisan legislation into law meant to speed the development of next-generation nuclear reactors in the US by streamlining approval processes.

… The US Nuclear Regulatory Commission (NRC) certified an advanced small modular reactor design for the first time last year. And we’re likely still years away from seeing commercial plants in action. But if the US ever wants to get there, it’ll also have to build up a supply chain for the fuel those advanced reactors would consume. The Inflation Reduction Act includes $700 million to develop that domestic fuel supply.

Today’s reactors generally run on fuel made with a uranium isotope called U-235. Naturally occurring uranium has quite low concentrations of U-235; it has to be “enriched” — usually up to a 5 percent concentration of U-235 for a traditional reactor. Smaller advanced reactors would run on more energy-dense fuel enriched to between 5 and 20 percent U-235, called HALEU (short for high-assay low-enriched uranium).

That higher concentration is what has some experts worried. “If the weapons usability of HALEU is borne out, then even a single reactor would pose serious security concerns,” says a policy analysis published in the journal Science last month, penned by a group of nuclear proliferation experts and engineers (including an author credited as one of the architects of the first hydrogen bomb).

Fuel with a concentration of at least 20 percent is considered highly enriched uranium, which could potentially be used to develop nuclear weapons. With HALEU designs reaching 19.75 percent U-235, the authors argue, it’s time for the US to think hard about how safe the next generation of nuclear reactors would be from malicious intent.
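To make those thresholds concrete, here is a toy classifier for the enrichment bands the article names. It is an illustrative simplification: the roughly 0.7 percent figure for natural uranium is a standard reference value rather than a number from the article, and real regulatory definitions carry more nuance.

```python
# Toy classifier for the U-235 enrichment bands discussed above.
# Simplified for illustration; regulatory definitions are more nuanced.

def enrichment_band(u235_percent: float) -> str:
    """Map a U-235 concentration (percent) to the band named in the article."""
    if u235_percent < 0.8:
        return "natural uranium (~0.7% U-235)"
    if u235_percent < 5.0:
        return "conventional reactor fuel (enriched up to ~5%)"
    if u235_percent < 20.0:
        return "HALEU (5-20%)"
    return "highly enriched uranium (20% and above)"

print(enrichment_band(19.75))  # HALEU -- just under the 20% HEU line
```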

“We need to make sure that we don’t get in front of ourselves here and make sure that all the security and safety provisions are in place first before we go off and start sending [HALEU] all around the country,” says R. Scott Kemp, associate professor of nuclear science and engineering and director of the MIT Laboratory for Nuclear Security and Policy.

That 20 percent threshold goes back to the 1970s, and bad actors ostensibly have more information and computational tools at their disposal to develop weapons, Kemp and his coauthors write in the paper. It might even be possible to craft a bomb with HALEU well under the 20 percent threshold, the paper contends…

Aside from asking Congress for an updated security assessment of HALEU, the paper suggests setting a lower enrichment limit for uranium based on new research or ramping up security measures for HALEU to more closely match those for weapons-usable fuels. 

…………………………“Unless there’s a really good reason to switch to fuels that pose greater risks of nuclear proliferation, then it’s irresponsible to pursue those,” says Edwin Lyman, director of nuclear power safety at the Union of Concerned Scientists and another author of the paper. Lyman has also raised concerns about the radioactive waste from nuclear reactors over the years. “There is no good reason.”  https://www.theverge.com/24201610/next-generation-nuclear-energy-reactors-security-weapons-proliferation-risk

July 23, 2024 Posted by | safety, technology, USA | Leave a comment

Massive IT outage spotlights major vulnerabilities in the global information ecosystem

the world may finally be realizing that modern information-based society rests on a very fragile foundation.

Richard Forno, Principal Lecturer in Computer Science and Electrical Engineering, University of Maryland, Baltimore County: July 20, 2024  https://theconversation.com/massive-it-outage-spotlights-major-vulnerabilities-in-the-global-information-ecosystem-235155

The global information technology outage on July 19, 2024, which paralyzed organizations ranging from airlines to hospitals and even disrupted the delivery of uniforms for the Olympic Games, represents a growing concern for cybersecurity professionals, businesses and governments.

The outage is emblematic of the way organizational networks, cloud computing services and the internet are interdependent, and the vulnerabilities this creates. In this case, a faulty automatic update to the widely used Falcon cybersecurity software from CrowdStrike caused PCs running Microsoft’s Windows operating system to crash. Unfortunately, many servers and PCs need to be fixed manually, and many of the affected organizations have thousands of them spread around the world.

For Microsoft, the problem was made worse because the company released an update to its Azure cloud computing platform at roughly the same time as the CrowdStrike update. Microsoft, CrowdStrike and other companies like Amazon have issued technical work-arounds for customers willing to take matters into their own hands. But for the vast majority of global users, especially companies, this isn’t going to be a quick fix.
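For illustration, the manual remediation that circulated in press reports amounted to booting into a recovery environment and deleting the faulty channel file from the CrowdStrike driver directory. Below is a rough sketch of that step, assuming the widely reported file pattern; it is not official vendor guidance, and on an affected machine it would have to run from Safe Mode or recovery media.

```python
from pathlib import Path

def remove_faulty_channel_files(windows_root: str = r"C:\Windows") -> list[Path]:
    """Delete channel files matching the pattern press reports attributed to
    the faulty update (C-00000291*.sys) and return the removed paths."""
    driver_dir = Path(windows_root) / "System32" / "drivers" / "CrowdStrike"
    removed: list[Path] = []
    for channel_file in driver_dir.glob("C-00000291*.sys"):
        channel_file.unlink()  # irreversible -- hence the need for caution
        removed.append(channel_file)
    return removed
```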

Modern technology incidents, whether cyberattacks or technical problems, continue to paralyze the world in new and interesting ways. Massive incidents like the CrowdStrike update fault not only create chaos in the business world but disrupt global society itself. The economic losses resulting from such incidents – lost productivity, recovery, disruption to business and individual activities – are likely to be extremely high.

As a former cybersecurity professional and current security researcher, I believe that the world may finally be realizing that modern information-based society rests on a very fragile foundation.

The bigger picture

Interestingly, on June 11, 2024, a post on CrowdStrike’s own blog seemed to predict this very situation – the global computing ecosystem compromised by one vendor’s faulty technology – though the company presumably didn’t expect its own product to be the cause.

Software supply chains have long been a serious cybersecurity concern and potential single point of failure. Companies like CrowdStrike, Microsoft, Apple and others have direct, trusted access into organizations’ and individuals’ computers. As a result, people have to trust that the companies are not only secure themselves, but that the products and updates they push out are well-tested and robust before they’re applied to customers’ systems. The SolarWinds incident disclosed in 2020, which involved hacking the software supply chain, may well be considered a preview of today’s CrowdStrike incident.

CrowdStrike CEO George Kurtz said “this is not a security incident or cyberattack” and that “the issue has been identified, isolated and a fix has been deployed.” While perhaps true from CrowdStrike’s perspective – it was not hacked – that doesn’t mean the effects of this incident won’t create security problems for customers. It’s quite possible that in the short term, organizations will disable some of their internet security devices to try to get ahead of the problem, but in doing so they may open themselves up to criminals penetrating their networks.

It’s also likely that people will be targeted by various scams preying on user panic or ignorance regarding the issue. Overwhelmed users might either take offers of faux assistance that lead to identity theft, or throw away money on bogus solutions to this problem.

What to do

Organizations and users will need to wait until a fix is available or try to recover on their own if they have the technical ability. After that, I believe there are several things to do and consider as the world recovers from this incident.

Companies will need to ensure that the products and services they use are trustworthy. This means doing due diligence on the vendors of such products for security and resilience. Large organizations typically test any product upgrades and updates before allowing them to be released to their internal users, but for some routine products like security tools, that may not happen.
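One way organizations put that testing into practice is a staged (“canary”) rollout: push an update to a small ring of machines first and promote it only if they stay healthy. Here is a minimal sketch, with hypothetical function and parameter names throughout; ring fractions must be non-decreasing.

```python
import random

def staged_rollout(machines, apply_update, health_check,
                   ring_fractions=(0.01, 0.10, 1.0)) -> bool:
    """Apply an update in progressively larger rings (1%, 10%, then all),
    halting at the first ring where any machine fails its health check."""
    remaining = list(machines)
    random.shuffle(remaining)  # rings should be a representative sample
    updated = 0
    for fraction in ring_fractions:
        target = max(1, int(len(machines) * fraction))
        ring, remaining = remaining[:target - updated], remaining[target - updated:]
        for machine in ring:
            apply_update(machine)
        if not all(health_check(machine) for machine in ring):
            return False  # stop: most of the fleet is still untouched
        updated += len(ring)
    return True
```

The point of the pattern is that a bad update fails on the canary ring, not on the whole fleet at once.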

Governments and companies alike will need to emphasize resilience in designing networks and systems. This means taking steps to avoid creating single points of failure in infrastructure, software and workflows that an adversary could target or a disaster could make worse. It also means knowing whether any of the products organizations depend on are themselves dependent on certain other products or infrastructures to function.
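To make “avoid single points of failure” concrete, one can model services and the infrastructure they depend on as an undirected graph and look for articulation points: nodes whose failure disconnects everything else. Below is a minimal sketch of the classic depth-first-search algorithm; the example infrastructure map at the bottom is hypothetical.

```python
def articulation_points(graph: dict) -> set:
    """Return the nodes whose removal disconnects the graph (the classic
    DFS-based algorithm). graph maps each node to a set of neighbours."""
    disc, low, points, timer = {}, {}, set(), [0]

    def dfs(node, parent):
        disc[node] = low[node] = timer[0]
        timer[0] += 1
        children = 0
        for nb in graph[node]:
            if nb == parent:
                continue
            if nb in disc:  # back edge to an already-visited node
                low[node] = min(low[node], disc[nb])
            else:
                children += 1
                dfs(nb, node)
                low[node] = min(low[node], low[nb])
                if parent is not None and low[nb] >= disc[node]:
                    points.add(node)
        if parent is None and children > 1:
            points.add(node)

    for node in graph:
        if node not in disc:
            dfs(node, None)
    return points

# Hypothetical dependency map: every service routes through one auth server.
infra = {
    "auth": {"web", "billing", "reports"},
    "web": {"auth"},
    "billing": {"auth"},
    "reports": {"auth"},
}
print(articulation_points(infra))  # {'auth'} -- a single point of failure
```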

Organizations will need to renew their commitment to best practices in cybersecurity and general IT management. For example, having a robust backup system in place can make recovery from such incidents easier and minimize data loss. Ensuring appropriate policies, procedures, staffing and technical resources is essential.
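As a small illustration of one of those practices, here is a sketch of a verified backup: copy the file, then compare checksums, since a backup that is never verified may not be a backup at all. Paths and names are illustrative.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute a file's SHA-256 checksum in streaming fashion."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_file(src: str, backup_dir: str) -> Path:
    """Copy src into backup_dir and verify the copy against the original."""
    src_path = Path(src)
    dest = Path(backup_dir) / src_path.name
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src_path, dest)  # copy2 preserves timestamps/metadata
    if sha256_of(src_path) != sha256_of(dest):
        raise IOError(f"backup verification failed for {src}")
    return dest
```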

Problems in the software supply chain like this make it difficult to follow the standard IT recommendation to always keep your systems patched and current. Unfortunately, the costs of not keeping systems regularly updated now have to be weighed against the risks of a situation like this happening again.

July 22, 2024 Posted by | technology | Leave a comment

Small Modular Nuclear Reactors (SMRs) – Dirty, Dangerous Distractions from Real Climate Action.

Dale Dewar, 21 July 24

The current hype about Small Modular Nuclear Reactors (SMRs) is that they are safe, carbon neutral and emissions-free, have no effect upon the environment or human health, produce little or no waste, and are essential to address the threat of climate change. Nuclear industry executives claim that these ingenious things can be built and running within the next decade.

No nuclear power plant has ever been built on time or within budget. What of the other claims?

Safety. In order to make this claim, the nuclear industry overlooks the effects of radioactivity on both the environment and human health. Catastrophic accidents are ignored. Who speaks for the children? Over 60 research papers identify an increase in leukemia in children in the vicinity of nuclear power plants.

Carbon neutral. Do claims of carbon neutrality include mining, refining, trucking, enriching, fuel rod manufacture, site construction, decommissioning, and waste management? To be fair, these stages should be counted for other energy sources as well, but enrichment in particular is an unusually energy-intensive process.

Emissions-free. Nuclear power plants release radioactive gases as a regular part of their operations and sometimes by accident. Tritium is a particularly noxious emission because it can be incorporated into every cellular function and structure in biological organisms. It is likely the culprit in the increased incidence of leukemia in children. Other gases include krypton and radon. Minute amounts of cesium-137, strontium-90, iodine-131 and carbon-14 are also found in the released gas.

No effect upon the environment. Reactors that use water as their coolant return the water to rivers and lakes at a higher temperature, with a cascade of effects on fish populations, algae growth and mineral content. The proposed SMR for Saskatchewan is a Boiling Water Reactor, which will require water coolant.

No effect upon human health. A prominent scientific panel in the United States, which periodically reviews ionizing radiation and health, stated in 2005 that “the smallest dose has the potential to cause a small increase in risk” of cancer in humans.
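The panel’s statement reflects the linear no-threshold (LNT) model, under which excess cancer risk scales in direct proportion to dose, with no “safe” dose below which the risk vanishes. Here is a minimal sketch; the risk coefficient in the example is a placeholder, not a value from the article or the panel.

```python
def lnt_excess_risk(dose_sv: float, risk_per_sv: float) -> float:
    """Linear no-threshold model: excess risk is proportional to dose,
    so even the smallest dose carries a small but nonzero added risk."""
    if dose_sv < 0 or risk_per_sv < 0:
        raise ValueError("dose and risk coefficient must be non-negative")
    return dose_sv * risk_per_sv

# 1 millisievert with a placeholder coefficient: tiny, but never zero.
print(lnt_excess_risk(dose_sv=0.001, risk_per_sv=0.05))  # 5e-05
```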

Little or no waste. While the volume of waste may be small, it is not easily contained. Recycling, reprocessing and pyroprocessing are not simple processes, nor are they “clean”; locations where they have been carried out (e.g., Mayak in Russia and Hanford in the USA) remain extremely contaminated. Furthermore, the treatment removes only the plutonium, which is an extremely small proportion of the waste.

Essential to address climate catastrophe. Nothing could be further from the truth. Countries that have avoided the nuclear energy money pit have been able to address their carbon footprint with new and innovative ways to meet their energy needs. The belief that nuclear would provide “baseload” energy is a myth at best, because reactors cannot be powered up and down in the nimble fashion required.

Up and running within the decade. SMRs are a new technology (or an old, discarded technology being dusted off for new sales) and, based upon the record to date, even less likely to fulfill this promise.

Why are the Canadian and United States governments pouring federal tax dollars into the nuclear industry? We are already committed to over $50 million, and Premier Moe says the commitment will go to $5 billion! What is the attraction?

The “Nuclear Age” was ushered into being for production of atomic bombs. Nuclear power was an afterthought. With the USA and the UK “modernizing” their nuclear arsenals, we should not overlook the possibility that plutonium extraction is still the motivating factor. It is our tax money that’s funding this project. Is this what we want?

July 22, 2024 Posted by | Small Modular Nuclear Reactors | 1 Comment

Please, No Weapons and Wars in Space

Honoring the Spirit of Apollo 11,
BILL ASTORE, JUL 21, 2024 https://bracingviews.substack.com/p/please-no-weapons-and-wars-in-space

This weekend marks the 55th anniversary of humanity’s first trip to the moon, when Neil Armstrong and Buzz Aldrin got moon dust on their boots as Michael Collins waited in lunar orbit to pick them up. It all went remarkably well, if not perfectly smoothly, for Apollo 11.

Humans haven’t been back to the moon to cavort on it for more than fifty years. Apollo 17 was the last mission in December of 1972. Once America beat the Soviets to the moon and explored it a few times, the program lost its impetus as people grew nonchalant if not bored with the Apollo missions. What a shame!

Apollo 11’s crew left a plaque on the moon saying they went there in the name of peace and for all mankind. It’s a groovy sentiment, but tragically space has become yet another realm of war. Instead of occupying the moral high ground, the United States with its Space Force wants to dominate the military “high ground” of space. The dream of space as a realm for peace is increasingly a nightmare of information dominance and power projection.

A powerful trend is space exploitation by billionaires rather than space exploration funded and supported by the people. Privatization of space and its weaponization are proceeding together, even feeding off each other.

Of course, the military has always dreamed of weaponizing space. The new dream, apparently, is becoming super-rich by mining rare strategic minerals and the like, along with space tourism by the ultra-rich.

Again, the U.S. military sees space as its domain, working with a diverse range of countries, such as the UK, South Korea, and Sweden, among others, on new space ports, radar and launch sites, and related facilities. A key buzzword is “interoperability” between the U.S. and its junior partners in space, which, for you “Star Trek” fans, is akin to being assimilated by the Borg collective. (All of the Borg are “interoperable”; too bad they have no autonomy.)

We humans should not be exporting our violence and wars beyond our own planet. If you believe space should be reserved for peace, check out Space4Peace.org. It’s a global organization of people dedicated to the vision that space should remain free of weapons and wars. The group is kind enough to list me as one of its “advisers.”

Mark your calendars for the next “Keep Space for Peace” week from October 5-12. Together, let’s reject star wars and instead embrace peaceful star treks.

July 22, 2024 Posted by | space travel | Leave a comment

Nuclear Free Local Authorities challenge UK government on NewCleo’s application for “justification” of its small nuclear “fast” reactor

In April 2024, the Nuclear Industry Association applied for ‘justification’ on behalf of NewCleo and its lead-cooled LFR-AS-200 fast reactor to the Department for Environment, Food and Rural Affairs (DEFRA). The department will support the future Secretary of State in their role as the authority responsible for the ‘justification’ decision.

This was the first application for ‘justification’ for a so-called advanced modular design in the UK. In its media release, NewCleo oozed confidence that its reactor design will meet with approval.

‘Justification’ is a regulatory process which requires a Government decision before any new class or type of practice involving ionising radiation can be introduced in the UK. A justification decision is one of the required steps for the operation of a new nuclear technology in the UK, but it is not a permit or licence that allows a specific project to go ahead. Instead, it is a generic decision based on a high-level evaluation of the potential benefits and detriments of the proposed new nuclear practice, as a precursor to future regulatory processes.

The design failed the government’s readiness test to be entered into the Generic Design Assessment (GDA). Even if justification is forthcoming, with the design not selected for the GDA, it would surely have to undergo an equally rigorous, but more uncertain, process.

Furthermore, the reactor operates using MOX (mixed uranium and plutonium reprocessed fuel). Although the press has previously reported NewCleo’s plan ‘to take advantage of the UK’s massive stockpile of waste at Sellafield, where it wanted to invest £2bn in a waste reprocessing factory and advanced modular reactors that would have created around 500 jobs’, the government’s recently published Civil Nuclear Roadmap makes clear that this material will not be forthcoming: ‘We are providing clarity to vendors by committing not to support the use of plutonium stored at Sellafield by Advanced Nuclear Technologies whilst high hazard reduction activities are prioritised at Sellafield’.

The other puzzle is X-Energy, which was given £3.4m by the government but, seemingly like NewCleo, has been turned down for consideration for a GDA. X-Energy has previously announced plans to deploy its reactor design at a site in Hartlepool.

In response to the news, NFLA Scotland Advisor Pete Roche and Emeritus Professor of Energy Policy at the University of Greenwich Stephen Thomas put together a question set exploring and challenging the justification and Generic Design processes. This was sent by the NFLAs to Nuclear Minister Andrew Bowie. On receiving the Minister’s response, a second letter with supplementary questions was drafted and sent, and this has just been replied to by a senior Department of Energy Security and Net Zero (DESNZ) official. The correspondence is reproduced in this briefing for your information.

NFLA 16th July 2024

https://www.nuclearpolicy.info/wp/wp-content/uploads/2024/07/A416-NB302-Correspondence-with-DESNZ-and-EA-over-nuclear-design-justification-Jun-2024.pdf

July 19, 2024 Posted by | Small Modular Nuclear Reactors, UK | Leave a comment