Last Energy nabs $40M to realize vision of super-small nuclear reactors

These investors are joining the wave in public and private financing of nuclear energy that has swelled to $14 billion so far this year — double last year’s total, according to Axios. Investment in new fission technologies, such as microreactors, has increased tenfold from 2023.
The startup wants to mass-manufacture 20MW nuclear reactors that can be built and shipped within 24 months. It’s looking to get its first reactor online in Europe.
By Eric Wesoff, 29 August 2024
A startup looking to build really small nuclear reactors just announced a big new funding round.
Last Energy, a Washington, D.C.–based next-generation nuclear company, announced that it closed a $40 million Series B funding round, a move that will add more financial and human capital to the reinvigorated nuclear sector.
The startup aims to eventually deploy thousands of its modular microreactors, though to date it has not brought any online. The first reactor might appear in Europe as soon as 2026, assuming Last Energy manages to meet its extremely aggressive construction, financial, and regulatory timelines — not a common occurrence in the nuclear industry. Venture capital heavyweight Gigafund led the round, which closed early this year but was revealed only today. The startup has raised a total of $64 million since its 2019 founding.
Last Energy is part of a cohort of companies betting that small, replicable, and mass-produced reactors will overcome the economic challenges associated with building emissions-free baseload nuclear power — and restore the moribund U.S. nuclear industry to its former glory. But the microreactor dream has yet to be realized; few of these small modular reactors (SMRs) have been built worldwide. None have been completed in the U.S., though one design from long-in-the-tooth startup NuScale Power has gotten regulatory approval.
The 20-megawatt size of Last Energy’s microreactor stands in stark contrast to that of a conventional nuclear reactor like the recently commissioned Vogtle units in Georgia, which each generate about 1,100 megawatts. A Last Energy microreactor, the size of about 75 shipping containers, might power a small factory, while a Vogtle unit can power a city.
Instead of the cathedral-style stick-built construction of modern large reactors, SMRs and microreactors are meant to be manufactured at scale in factories, transported to the site, and assembled on location. Rather than develop an advanced reactor design with exotic fuels — an approach taken by other SMR hopefuls, including the Bill Gates–backed TerraPower — Last Energy chose to scale down the well-established light-water reactor technology that powers America’s 94 existing nuclear reactors.
“We came to the conclusion that using the existing, off-the-shelf technology was the way to scale,” CEO Bret Kugelmass said in a 2022 interview with Canary Media. “We don’t innovate at all when it comes to the nuclear process or components — we do systems integration and business-model innovation.”
The startup claims that its microreactor is designed to be fabricated, transported, and built within 24 months, and is the right size to serve industrial clients. Under its business model, Last Energy aims to build, own, and operate its power plant at the customer’s site, avoiding the yearslong wait times to plug a new generation project into the power grid.
Like an independent power producer, Last Energy doesn’t sell power plants; instead, it sells electricity to customers through long-term power-purchase contracts.
“Data centers and heavy industry are trying to grapple with a very complex set of energy challenges, and Last Energy has seen them realize that micro-nuclear is the only capable solution,” said Kugelmass, who claims in today’s press release that the startup has inked commercial agreements for 80 units — with 39 of those units destined to serve power-hungry data center customers.
Last Energy isn’t the only microreactor company attracting venture funding. There are several other examples from this month alone: Aalo Atomics raised $27 million from 50Y, Valor Equity Partners, Harpoon Ventures, Crosscut, SNR, Alumni Ventures, Preston Werner, Earth Venture, Garage Capital, Wayfinder, Jeff Dean, and Nucleation Capital to scale up an 85-kilowatt design from the U.S. Department of Energy’s MARVEL program. Meanwhile, Deep Fission, a startup aiming to bury arrays of microreactors 1 mile underground, just raised $4 million in a round led by 8VC, a venture firm founded by Joe Lonsdale.
These investors are joining the wave in public and private financing of nuclear energy that has swelled to $14 billion so far this year — double last year’s total, according to Axios. Investment in new fission technologies, such as microreactors, has increased tenfold from 2023.
Investors happen to be backing startups in a heavily subsidized market. Tens of billions of dollars from the Bipartisan Infrastructure Law, the U.S. DOE’s Loan Programs Office, and the Inflation Reduction Act support the development of a non-Russian supply of enriched uranium; the IRA also introduced a ridiculously generous $15-per-megawatt-hour production tax credit, meant to keep today’s existing nuclear fleet competitive with gas and renewables, as well as a similarly charitable investment tax credit to incentivize new plant construction.
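For a sense of scale, here is a rough, back-of-the-envelope sketch of what a $15-per-megawatt-hour credit could be worth to a single large reactor. The 1,100 MW size echoes the Vogtle figure cited above; the 90 percent capacity factor is an assumed typical value, not a number from the article.

```python
# Rough, illustrative estimate of the value of a $15/MWh production tax credit
# for one large reactor. The 90% capacity factor is an assumption for this sketch.

capacity_mw = 1_100        # roughly one Vogtle-class unit
capacity_factor = 0.90     # assumed availability, not from the article
hours_per_year = 8_760
ptc_per_mwh = 15           # dollars per megawatt-hour

annual_mwh = capacity_mw * capacity_factor * hours_per_year
annual_credit = annual_mwh * ptc_per_mwh

print(f"Annual generation: {annual_mwh:,.0f} MWh")    # ~8.7 million MWh
print(f"Annual tax credit: ${annual_credit:,.0f}")    # ~$130 million
```

On those assumptions, the credit works out to roughly $130 million a year for a single large plant, which helps explain why the article calls it generous.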
The flood of funding comes as nuclear power enjoys the most public support it has had in years. Public opinion has turned favorable, with a majority of Americans supporting atomic energy and its record of safety and performance. And nuclear energy is one of the few topics that Democratic and Republican politicians have been able to agree on in recent memory.
Still, despite the rising financial, political, and public support, the U.S. nuclear industry remains frozen, plagued by a legacy of cost and timeline overruns for conventional reactors and regulatory challenges around new designs. It’s unclear when the country will get another nuclear reactor online — as of last year, the leading contender was an SMR project from NuScale, but that fell apart due to cost. In all likelihood, the next reactor to plug into the grid will be the mothballed Palisades nuclear plant in Michigan, which won government support for an unprecedented effort to recommission the plant by the end of next year.
For its part, Last Energy is not banking on the U.S. to lead the charge; it’s targeting industrial customers in Poland, Romania, and the U.K. for its initial sites, in the hopes that it will find a more favorable regulatory and financial environment.
Ryan McEntush of investment firm a16z suggests in an essay that “the success of nuclear power is much more about project management, financing, and policy than it is cutting-edge engineering or safety.”
That’s Last Energy’s philosophy too — and it’s going to need more money and more years to prove it’s the right one.
Recent Events Prove Western Nations Are Highly Vulnerable To Cyber Calamity
Alt-Market.US, August 27, 2024

COMMENT. The original of this article contains a conspiratorial view of Covid-19 and its causes.
I can’t really agree with that opinion on Covid.
BUT – the dangers of cyber calamity seem all too real to me, and this article sets them out well.
As most people are aware, this month there was a sweeping internet outage across the US which led to a failure in roughly 8.5 million Microsoft Windows devices. Disruptions included banks, airline networks, emergency call centers, online retailers and numerous corporate networks. The outage is estimated to have caused at least $5.4 billion in profit losses and it only lasted about a day.
The alleged cause of the breakdown was Crowdstrike, a cyber-security company that uses large scale data updates to Microsoft Windows networks to counter cyber threats. Instead, the company uploaded bugged code and caused a cascading outage. Mac and Linux machines were not affected.
The scale of the shutdown was immense – over 25% of Fortune 500 companies were frozen. Travel essentially stopped. Business transactions for many companies ceased. Some banks, including Bank of America, Capital One, Chase, TD Bank and Wells Fargo, could not function, and customers could not access their accounts.
The event reminded me of the panic surrounding the Y2K scare 25 years ago. Of course, that was all nonsense; US systems were definitely not digitized to an extent great enough to cause a disaster should there be an internet crash or a software crash. But today things are very different. Nearly every sector of the American (and European) economy and many utilities are directly dependent on a functioning internet.
The fear that prevailed during Y2K was unrealistic in 1999. Now, it makes perfect sense.
………………First and foremost, there is the potential for random error like the Crowdstrike incident. Then there’s the potential for a foreign attack on US and European digital infrastructure. Then, there’s the potential for a false flag event BLAMED on random error or a foreign government in order to foment war or economic collapse.
……………………..In June of 2021 there was an internet outage that led to large swaths of the web going completely dark, including a number of mainstream news sites, Amazon, eBay, Twitch, Reddit, etc. A host of government websites also went down. All this happened when content delivery network (CDN) company Fastly experienced a “bug.” Although Amazon had its website back online within 20 minutes, the brief outage cost the company over $5.5 million in sales.
A content delivery network is a geographically distributed network of proxy servers and their data centers. They make up what is known as the “backbone” of the internet. Only a handful of these companies support the vast majority of internet activity. All it would take is for a few to go down, and the internet goes down, taking our economy with it.
The recent Crowdstrike situation is perhaps the worst web disruption of all time, and that was just a bug in a software update. Imagine if someone wanted to deliberately damage internet functions for an extended period of time. The results would be catastrophic.
With supply chains completely dependent on “just-in-time” freight deliveries and those deliveries dependent on efficient digital communications and payments between retailers and manufacturers, a web-down scenario for more than a few days would cause an immediate loss of consumer goods. Stores would empty within hours should the public realize that new shipments might not arrive for a long time.
Keep in mind, I’m not even accounting for payment processing between customers and retailers. If that shuts down, then ALL sales shut down. Then, whatever food you have left in your pantry or in storage is what you will have to live on until the problem is fixed. If it is ever fixed…
Network attacks are difficult to independently trace, which means anyone can initiate them and anyone can be blamed afterwards. With the increasing tensions between western and eastern nations the chances of an attack are high. And corrupt government officials could also trigger an internet crisis and blame it on foreign enemies – Either to convince the public to go to war, or to convince the public to accept greater authoritarianism.
…………….Figuring out who triggered the breakdown would be nearly impossible. We could suspect, but proving who did it is another matter. In the meantime, western officials controlled by globalist interests could lock down internet traffic and eliminate alternative media platforms they don’t like, giving the public access to corporate news sources only.
What are the most practical solutions to this? As always we can store necessities to protect our families and friends. To protect data, I recommend shutting OFF Windows Updates to prevent something like a Crowdstrike error from affecting your devices. You can also set up a Linux-based device with all your important data storage secured.
You can purchase an exterior hard drive and clone your computer data, then throw it in a closet or a waterproof case. Then there is the option of building a completely offline device (a computer that has never and will never connect to the internet).
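As a minimal sketch of the cloned-backup idea described above: copy a folder to an external drive and write a checksum manifest so the copy can be verified later. The paths are placeholders, and this assumes an already-mounted external drive and Python 3.8 or newer.

```python
# Minimal offline-backup sketch: copy a directory tree to an external drive and
# record SHA-256 checksums so the clone can be verified later.
# The paths below are placeholders -- adjust them for your own machine.

import hashlib
import shutil
from pathlib import Path

SOURCE = Path.home() / "Documents"            # data you want to protect
DEST = Path("/mnt/backup_drive/Documents")    # external (offline) drive

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# 1. Copy the directory tree (dirs_exist_ok=True lets you rerun the backup).
shutil.copytree(SOURCE, DEST, dirs_exist_ok=True)

# 2. Write a manifest of checksums alongside the backup.
with (DEST / "manifest.sha256").open("w") as manifest:
    for file in sorted(SOURCE.rglob("*")):
        if file.is_file():
            manifest.write(f"{sha256_of(file)}  {file.relative_to(SOURCE)}\n")

print("Backup complete; re-hash the copied files against manifest.sha256 to verify.")
```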
These options protect you and your valuable files, but there’s not much that can be done to prevent a national scale cyber attack and the damage that one could cause. Organizing for inevitable chaos and violence is all you can do.
With a cyber-event there is the distinct danger of communications disruptions – No cell phones, no email, no social media, nothing. So, having knowledge in ham radio and radio communications is a must. I’m a general class ham and I’m still finding there’s more to learn, but a basic knowledge of radios, frequency bands and repeaters will help you to at least listen in on chatter and get important information outside of controlled news networks.
The people who used to claim it’s “doom mongering” to examine the threat of cyber attacks have been proven utterly wrong this past month. We just witnessed one of the worst internet implosions of all time and more are on the way. Prepare accordingly and remember that technological dependency is a double-edged sword. Use your tech wisely and don’t let it run your life. https://alt-market.us/recent-events-prove-western-nations-are-highly-vulnerable-to-cyber-calamity/
The UK nuclear fusion start-up helping the US develop stealth submarines

Tokamak Energy’s collaboration with Darpa among ways UK company is seeking to monetise its magnet breakthroughs. UK start-up Tokamak Energy is supporting a US Defense Advanced Research Projects Agency programme to make silent submarines.
FT 30th Aug 2024
https://www.ft.com/content/570267a4-657e-4c6b-805d-7b29a637e546
Molten salt reactors were trouble in the 1960s—and they remain trouble today

Many molten salt reactor developers and proponents seem to have decided that the Molten Salt Reactor Experiment experience was so successful that all that remains is for it to be scaled up and deployed across the world. But is this really the case? A careful look suggests otherwise.
A few years after the Molten Salt Reactor Experiment was shut down, the Atomic Energy Commission terminated the entire molten salt reactor program, although it continued to fund the molten salt breeder reactor program until the end of fiscal year 1976.
Bulletin, By M.V. Ramana | June 20, 2022
Molten salt nuclear reactors are all the rage among some nuclear power enthusiasts. They promise designs that will soon lower emissions from shipping, be cheaper to run and consume nuclear waste, and be transportable in shipping containers. The Canadian government has provided two companies, Terrestrial Energy and Moltex, with tens of millions of dollars in funding. Indonesia’s Ministry of Defense has sponsored a study of thorium-based molten salt reactors. The International Atomic Energy Agency organized a webinar calling molten salt reactors “A game changer in the nuclear industry.” Unsurprisingly, China has plans to build one.
Unlike other nuclear reactor designs that can claim multiple roots, the technology underlying molten salt reactors has a fairly clear origin: the Oak Ridge National Laboratory in Tennessee. All molten salt reactors are based, in one way or another, on the Molten Salt Reactor Experiment that operated at Oak Ridge from 1965 to 1969. That experimental reactor, in turn, was based on another experimental reactor, the Aircraft Reactor Experiment, that had operated a decade earlier at the same facility.
Among developers, the Molten Salt Reactor Experiment has a legendary status. For example, in 2015, an official from Terrapower, the nuclear venture funded in part by Bill Gates, noted that his company was “excited to celebrate and build upon” the experiment by designing a molten chloride fast reactor. His accompanying slide show reinforced the message with pictures of the Molten Salt Reactor Experiment assembly, the red hot heat exchanger, and Alvin Weinberg, the leader of Oak Ridge at that time, noting that the experiment had operated for 6,000 hours. Also in 2015, Terrestrial Energy’s David LeBlanc made “a kind of pilgrimage to Oak Ridge” to celebrate the 50th anniversary of the Molten Salt Reactor Experiment becoming critical.
Many molten salt reactor developers and proponents seem to have decided that the Molten Salt Reactor Experiment experience was so successful that all that remains is for it to be scaled up and deployed across the world. But is this really the case? A careful look suggests otherwise.
Molten salt reactors’ early history. Molten salt reactors go back to the US Air Force’s failed effort to build a nuclear-powered, long-range bomber aircraft. The Air Force spent more than $1 billion (over $7 billion in today’s dollars) between 1946 and 1961 on its Aircraft Nuclear Propulsion program. President John F. Kennedy, seeing how little had been achieved, told Congress on March 28, 1961 that the possibility of success in the foreseeable future was “still very remote” and recommended terminating the program.
As part of this effort, the Air Force made Oak Ridge National Laboratory responsible for building the Aircraft Reactor Experiment as part of its effort to fly a bomber on nuclear power. The 2.5 megawatt reactor operated for a mere nine days in November 1954. Some Oak Ridge officials considered running the reactor longer, but others grew concerned about overheating of one of the reactor components. That concern was legitimate; five days later, this component failed and “released radioactive gas into the reactor compartment.” But Oak Ridge National Lab officials were undeterred. For them, the experience showed “the feasibility of molten-salt fuel” and they “persuaded the Atomic Energy Commission to fund a study of molten-salt power reactors.” In 1958, the commission did just that, and thus began the Molten Salt Reactor Experiment.
To understand the interest in molten salt reactors, start by adopting a 1950s mindset. At the time, nuclear power was expected to expand rapidly, and some energy planners were worried that there would be insufficient uranium to fuel all the reactors to be built over upcoming decades. Alvin Weinberg, the head of Oak Ridge, expressed this eloquently when he prophesized that humanity would need to “burn the rocks” in what are called breeder reactors in order to live a “passably abundant life.” While the dominant types of reactors around the world (light water reactors and heavy water reactors) use only a small fraction of the uranium and thorium found in the Earth’s crust, breeder reactors can exploit a much larger fraction of these minerals.
The concern among nuclear power advocates about running out of uranium was also at the heart of another major nuclear development during this period: the liquid metal (sodium) cooled fast breeder reactor. These reactors were an effort to tap the energy present in the uranium-238 isotope that is not used in standard light and heavy water reactors by converting it into plutonium. Glenn Seaborg, who discovered the element and rose to become Chairman of the US Atomic Energy Commission from 1961 to 1971, predicted in 1970 that, by the year 2000, plutonium “can be expected to be a predominant energy source in our lives.” By contrast, the molten salt reactors were mostly intended as a pathway to use thorium, which was more plentiful than uranium, by converting it into uranium-233.
In retrospect, these expectations proved mistaken in three ways. First, energy demand has risen much more slowly, both in the United States and globally, than predicted. For example, in 1959, Weinberg assumed that the global population would stabilize at 7 billion and that it would need at least 1.9 billion billion BTU per year. In comparison, in 2020, the world used a little over a quarter of this level of energy for nearly 8 billion people.
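As a back-of-the-envelope check on that comparison: 1.9 billion billion BTU converts to roughly 2,000 exajoules per year, while actual world primary energy use in 2020 was on the order of 560 exajoules, an approximate outside figure that is not from the article. The ratio is indeed a little over a quarter.

```python
# Rough check of the Weinberg comparison above. The ~560 EJ figure for 2020
# world primary energy use is an approximate outside estimate, not from the article.

BTU_TO_JOULES = 1_055.06
weinberg_btu_per_year = 1.9e18
weinberg_ej_per_year = weinberg_btu_per_year * BTU_TO_JOULES / 1e18  # exajoules

world_2020_ej = 560  # assumed approximate global primary energy use in 2020

print(f"Weinberg's assumed need: {weinberg_ej_per_year:,.0f} EJ/year")        # ~2,000 EJ
print(f"2020 use as a share:     {world_2020_ej / weinberg_ej_per_year:.0%}") # ~28%
```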
Second, nuclear energy proved much more expensive than envisioned in the heady “too cheap to meter” era. As nuclear power’s poor economics became apparent, reactor construction declined dramatically and has never achieved anywhere near the levels seen in the 1970s and 1980s………………………………………………………
Third, uranium proved to be more ubiquitous than anticipated, and global uranium resource estimates have continuously increased. ………………………………………..
…..the Molten Salt Reactor Experiment, Oak Ridge’s proposal for the next step in the molten salt reactor research process, was designed and constructed. As one of the Oak Ridge team leaders described it, “Design of the [Molten Salt Reactor Experiment] started in the summer of 1960, and construction started 18 months later, at the beginning of 1962. The reactor went critical in June 1965.”
In 1965, when the reactor started operating, it was fueled by a mixture of 150 kilograms of depleted uranium and 90 kilograms of weapons-grade, highly-enriched uranium (93 percent of uranium-235). After March 1968, the fuel was changed to one involving another weapons-usable material, uranium-233, which was derived from thorium. After this switch, the Molten Salt Reactor Experiment went critical in October 1968 and reached full power in January 1969. But at the end of that year, the experiment shut down. No more molten salt reactors have been built since.
The Molten Salt Reactor Experiment operation. Proponents of molten salt reactors have claimed for decades that the Molten Salt Reactor Experiment operated successfully. Indeed, they started making this claim even when it had barely started operating. In May 1966, for example, Paul Haubenreich, Oak Ridge National Laboratory associate director, cockily announced that the experiment “will live up to the name which we think goes with the initials M.S.R.E.—Mighty Smooth Running Experiment.” This, after listing many problems, including a basic one that was never resolved.
That basic problem was the reactor’s power level. The Molten Salt Reactor Experiment was designed to produce 10 megawatts (MW) of heat. The power level is given only in terms of heat production because its designers did not even try to generate electricity from the power produced in the reactor. Instead, the experiment just dissipated the heat produced to the surrounding air.
But this design power level was never reached. As Haubenreich described while pronouncing that the experiment was running “mighty smooth,” the operators “ran into some difficulties” and could only operate “at powers up to 5 MW.” …………………It turned out that the designers of the reactor had “miscalculated the heat transfer characteristics” of the system used for dissipating the heat produced into the atmosphere, and the reactor could not operate at its intended power level.
……………the reactor operated for just 13,172 hours over those four years, or only around 40 percent of the time……………….
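Those two numbers are consistent; a quick check, treating “four years” as four calendar years of wall-clock time (an assumption for the arithmetic):

```python
# Quick check of the availability figure quoted above.
operating_hours = 13_172
hours_in_four_years = 4 * 8_760   # 35,040 hours of wall-clock time

availability = operating_hours / hours_in_four_years
print(f"MSRE availability: {availability:.0%}")   # ~38%, i.e. "around 40 percent"
```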
During its operational lifetime, the Molten Salt Reactor Experiment was shut down 225 times. Of these 225 interruptions, only 58 were planned………………….
One persistent problem was with the electrical system, which experienced “eleven important failures.”…………………………………….. unexpected failures and shutdowns ended only in December 1969, when the Molten Salt Reactor Experiment was shut down.
The patchy experience of the experiment was by no means unique. Many other reactor designs have been plagued by unreliable operations and frequent shutdowns, that in many cases only became worse when scaled up. Consider, for example, sodium cooled fast breeder reactors. France, the country most reliant on nuclear power, tried to commercialize this technology after operating pilot-scale and demonstration reactors. This “commercial” version was the Superphénix, which started operating in 1986, experienced a series of accidents, and was shut down in 1997. During this period, it generated less than 8 percent of the electrical energy of what it would have generated running at full power round-the-clock. In the United States, the first and only commercial sodium cooled breeder reactor, Fermi-1, suffered a disastrous meltdown in 1966 as a result of a series of failures that had been dismissed as not credible by reactor engineers. Likewise, high-temperature, gas-cooled reactors have historically performed poorly.
The Molten Salt Reactor Experiment aftermath. For Oak Ridge officials and other molten salt reactor proponents, these problems with the Molten Salt Reactor Experiment were not worthy of significant concern. They moved forward with plans to build a larger molten salt breeder reactor. (Remember that the ultimate goal was to use thorium to breed nuclear fuel.) But the experiment did identify major hurdles in the path of building reliable molten salt reactors.
Here’s a key concern: Materials used to manufacture molten-salt-reactor components must maintain their integrity in highly radioactive and corrosive environments at elevated temperatures. The corrosion is a result of the reactor’s nature, which involves the use of a fuel consisting of uranium mixed with the hot salts for which the reactor is named. As anyone living near a seashore knows, chemically corrosive salt water eats most metallic objects.
To deal with this problem, Oak Ridge developed a new alloy known as INOR-8 or Hastelloy-N in the late 1950s. While Hastelloy-N did not get significantly corroded—at least during the four years of intermittent operations—it had two significant problems. First, the material had trouble managing stresses. It became brittle, for example. Second, the material developed cracks on surfaces exposed to the fuel salt. Both of these could lead to the component failing.
These problems remain relevant. Even today, no material can perform satisfactorily in the high-radiation, high-temperature, and corrosive environment inside a molten salt reactor. In 2018, scientists at the Idaho National Laboratory conducted an extensive review of different materials and, in the end, could only recommend that “a systematic development program be initiated.” In other words, fifty years after the molten salt reactor was shut down, technical experts still have questions about materials development for a new molten salt reactor design.
A few years after the Molten Salt Reactor Experiment was shut down, the Atomic Energy Commission terminated the entire molten salt reactor program, although it continued to fund the molten salt breeder reactor program until the end of fiscal year 1976…………………………………………………………………………………………………………………………
The Atomic Energy Commission, for its part, justified its decision in a devastating report that listed a number of problems with the large molten salt reactor that Oak Ridge scientists had conceptualized. The list included problems with materials, some of which were described earlier; the challenge of controlling the radioactive tritium gas that is produced in molten salt reactors; the many large components, such as steam generators, that would have to be developed from scratch (as researchers had no experience with such components for a molten salt reactor); the difficulties associated with molten-salt-reactor maintenance because radioactive fission products would be dispersed throughout the reactor; some safety disadvantages (though these are balanced by pointing out some of the safety advantages); and problems with graphite, which is used in molten-salt-reactor designs to slow down neutrons, because it swells when subjected to the nuclear reactor’s high radiation doses.
Other institutions too questioned the idea. A 1975 Office of Technology Assessment report listed the pros and cons of maintaining support for the molten salt breeder reactor program. An important set of arguments listed there proved prescient: “the [molten salt breeder reactor] may never work, its economics would be doubtful even if it did, and the chances of needing it are small.” As a result, in the years after the Molten Salt Reactor Experiment was shut down, many arguments were advanced to abandon the molten salt route, including not throwing good money after bad.
The Molten Salt Reactor Experiment’s long, difficult tail.………………………………………………………. The distribution of the numbers of papers indicates the challenge of dealing with the waste resulting from a small molten salt reactor.
Dealing with radioactive salt wastes involves at least two separate concerns. The first, ongoing problem is that managing the radioactive salts that contain the uranium isotopes and the fission products is difficult. In the 1990s, researchers discovered that uranium had migrated and settled in other parts of the facility, leading to the possibility of an accidental criticality.
The second challenge is that of securely storing the uranium-233 from the Molten Salt Reactor Experiment. Although the uranium-233 used in the Molten Salt Reactor Experiment is but a small part of the larger US stockpile of the substance, it occurs in chemical forms that are difficult to manage. Further, uranium-233 is usable in nuclear weapons, and any loss of this material might lead to security concerns.
In all, the costs incurred so far have run into the hundreds of millions of dollars—dozens of times the cost of constructing the reactor itself. …………………………………………………………………………………
Molten salt reactors are a bad idea. The Molten Salt Reactor Experiment’s history is riddled with extensive problems, both during its operational lifetime and the half century thereafter. These problems were not accidental but a result of the many material challenges faced by the reactor itself………………………………………………………………..
Should molten salt reactors ever be constructed, they are unlikely to operate reliably. And if they are deployed, they would likely result in various safety and security risks. And they would produce several different waste streams, all of which would require extensive processing and would face disposal related challenges. Investing in molten salt reactors is not worth the cost or the effort.
This article has benefited from research support from Maggie Chong, a materials engineering student at the University of British Columbia.
https://thebulletin.org/2022/06/molten-salt-reactors-were-trouble-in-the-1960s-and-they-remain-trouble-today/
A robot’s attempt to get a sample of the melted nuclear fuel at Japan’s damaged reactor is suspended
An attempt to use an extendable robot to remove a fragment of melted fuel from a wrecked reactor at Japan’s tsunami-hit nuclear plant has been suspended due to a technical issue
abc news, By MARI YAMAGUCHI Associated Press, August 22, 2024,
TOKYO — An attempt to use an extendable robot to remove a fragment of melted fuel from a wrecked reactor at Japan’s tsunami-hit Fukushima Daiichi nuclear power plant was suspended Thursday due to a technical issue.
The collection of a tiny sample of the debris inside the Unit 2 reactor’s primary containment vessel would start the fuel debris removal phase, the most challenging part of the decadeslong decommissioning of the plant where three reactors were destroyed in the March 11, 2011, magnitude 9.0 earthquake and tsunami disaster.
The work was stopped when workers noticed that five 1.5-meter (5-foot) pipes used to maneuver the robot were placed in the wrong order and could not be corrected within the time limit for their radiation exposure, the plant operator Tokyo Electric Power Company Holdings said.
The pipes were to be used to push the robot inside and pull it back out when it finished. Once inside the vessel, the robot is operated remotely from a safer location.
The robot can extend up to about 22 meters (72 feet) to reach its target area to collect a fragment from the surface of the melted fuel mound using a device equipped with tongs that hang from the tip of the robot.
The mission to obtain the fragment and return with it is to last two weeks. TEPCO said a new start date is undecided…………………………………………………………….
The government and TEPCO are sticking to a 30-40-year cleanup target set soon after the meltdown, despite criticism it is unrealistic. No specific plans for the full removal of the melted fuel debris or its storage have been decided. https://abcnews.go.com/Technology/wireStory/robots-attempt-sample-melted-nuclear-fuel-japans-damaged-113049701
Nuclear power on the prairies is a green smokescreen.

By M. V. Ramana & Quinn Goranson, August 19th 2024, Canada’s National Observer https://www.nationalobserver.com/2024/08/19/opinion/nuclear-power-prairies-green-smokescree
On April 2, Alberta Premier Danielle Smith declared on X (formerly Twitter) “we are encouraged and optimistic about the role small modular reactors (SMRs) can play” in the province’s plans to “achieve carbon neutrality by 2050.”
SMRs, for those who haven’t heard this buzzword, are theoretical nuclear reactor designs that aim to produce smaller amounts of electricity compared to the current reactor fleet in Canada. The dream of using small reactors to produce nuclear power dates back to the 1950s — and so has their record of failing commercially.
That optimism about SMRs will be costing taxpayers at least $600,000, which will fund the company X-Energy’s research “into the possibility of integrating small modular reactors (SMRs) into Alberta’s electric grid.” This is on top of the $7 million offered by Alberta’s government in September 2023 to oil and gas producer Cenovus Energy to study how SMRs could be used in the oil sands.
Last August, Saskatchewan’s Crown Investments Corporation provided $479,000 to prepare local companies to take part in developing SMRs. Alberta and Saskatchewan also have a Memorandum of Understanding to “advance the development of nuclear power generation in support of both provinces’ need for affordable, reliable and sustainable electricity grids by 2050”.
What is odd about Alberta and Saskatchewan’s talk about carbon neutrality and sustainability is that, after Nunavut, these two provinces are the most reliant on fossil fuels for their electricity: as of 2022, Alberta derived 81 per cent of its power from these sources, while Saskatchewan was at 79 per cent. In both provinces, emissions have increased more than 50 per cent above 1990 levels.
It would appear neither province is particularly interested in addressing climate change, but that is not surprising given their commitment to the fossil fuel industry. Globally, that industry has long obstructed transitioning to low-carbon energy sources, so as to continue profiting from their polluting activities.
Canadian companies have played their part too. Cenovus Energy, the beneficiary of the $7 million from Alberta, is among the four largest Canadian oil and gas companies that “demonstrate negative climate policy engagement,” and advocate for provincial government investment in offshore oil and gas development. It is also a part of the Pathways Alliance that academic scholars charge with greenwashing, in part because of its plans to use a problematic technology, carbon capture and storage, to achieve “net-zero emissions from oilsands operations by 2050.”
Carbon capture and storage is just one of the unproven technologies that the fossil fuel industry and its supporters use as part of their “climate pledges and green advertising.” Nuclear energy is another — especially when it involves new designs such as SMRs that have never been deployed in North America, or have failed commercially.
X-energy, the company that is to receive $600,000, is using a technology that has been tried out in Germany and the United States with no success. The last high-temperature, gas-cooled reactor built in the United States was shut down within a decade, producing, on average, only 15 per cent of what it could theoretically produce.
Even if one were to ignore these past failures, building nuclear reactors is slow and usually delayed. In Finland, construction of the Olkiluoto-3 reactor started in 2005, but it was first connected to the grid in 2022, a thirteen-year delay from the anticipated 2009.
Construction of Argentina’s CAREM small modular reactor started in February 2014 but it is not expected to start operating till at least the “end of 2027,” and most likely later. Both Finland and Argentina have established nuclear industries. Neither Alberta nor Saskatchewan possesses any legislative capacity to regulate a nuclear industry.
What Alberta and Saskatchewan are indulging in through all these announcements and funding for small modular nuclear reactors is an obstructionist tactic to slow down the transition away from fossil fuels. Discussing nuclear technology shifts attention from present and projected GHG emissions, while enabling a ramp-up of fossil fuel reliance in the medium-term and delaying climate action into the long-term.
Floating the idea of adding futuristic SMR technology into the energy mix is one way to publicly appear to be committed to climate action, without doing anything tangible. Even if SMRs were to be deployed to supply energy in the tar sands, that does not address downstream emissions from burning the extracted fossil fuels.
Relying on new nuclear for emission reductions prevents phasing out fossil fuels at the pace demanded by the scientific consensus in favour of rapid and immediate decarbonization. An obstructionist focus on unproven technologies will not help.
Quinn Goranson is a recent graduate from the University of British Columbia’s School of Public Policy and Global Affairs with a specialization in environment, resources and energy. Goranson has experience working in research for multiple renewable energy organizations, including the CEDAR project, in environmental policy in the public sector, and as an environmental policy consultant internationally.
M.V. Ramana is the Simons Chair in Disarmament, Global and Human Security and Professor at the School of Public Policy and Global Affairs, at the University of British Columbia in Vancouver. He is the author of The Power of Promise: Examining Nuclear Energy in India (Penguin Books, 2012) and “Nuclear is not the Solution: The Folly of Atomic Power in the Age of Climate Change” (Verso Books, 2024).
The Risk of Bringing AI Discussions Into High-Level Nuclear Dialogues
Overly generalized discussions on the emerging technology may be unproductive or even undermine consensus to reduce nuclear risks at a time when such consensus is desperately needed.
by Lindsay Rand, August 19, 2024, https://carnegieendowment.org/posts/2024/08/ai-nuclear-dialogue-risks-npt?lang=en
Last month, nuclear policymakers and experts convened in Geneva to prepare for a major conference to review the implementation of the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). At the meeting, calls for greater focus on the implications of artificial intelligence (AI) for nuclear policy pervaded diverse discussions. This echoes many recent pushes from within the nuclear policy community to consider emerging technologies in nuclear security–focused dialogues. However, diplomats should hesitate before trying to tackle the AI-nuclear convergence. Doing so in official, multilateral nuclear security dialogues risks being unproductive or even undermining consensus to reduce nuclear risks at a time when such consensus is desperately needed.
The level of interest in AI at the preparatory committee meeting isn’t surprising, given how much attention is being paid to the implications of AI for nuclear security and international security more broadly. Concerns range from increased speed of engagement, which could reduce human decisionmaking time, to automated target detection that could increase apprehension over second-strike survivability, or even increase propensity for escalation. In the United States, the State Department’s International Security Advisory Board recently published a report that examines AI’s potential impacts on arms control, nonproliferation, and verification, highlighting the lack of consensus around definitions and regulations to govern Lethal Autonomous Weapons Systems (LAWS). Internationally, there have also been calls for the five nuclear weapon states (P5) to discuss AI in nuclear command and control at the P5 Process, a forum where the P5 discuss how to make progress toward meeting their obligations under the NPT. Observers have called for the P5 to issue a joint statement on the importance of preserving human responsibility in nuclear decisionmaking processes.
However, injecting AI into nuclear policy discussions at the diplomatic level presents potential pitfalls. The P5 process and NPT forums, such as preparatory committee meetings and the NPT Review Conference, are already fraught with challenges. Introducing the complexities of AI may divert attention from other critical nuclear policy issues, or even become linked to outstanding areas of disagreement in a way that further entrenches diplomatic roadblocks.
Before introducing discussions about AI into official nuclear security dialogues, policymakers should address the following questions:
- In which forums could discussions about AI be productive?
- What specific topics could realistically foster more productive dialogue?
- Who should facilitate and participate in these discussions?
Forum Selection………………………………………………………………
Topic Selection………………………………..
Participants………………………………
Tech Companies Are Racing to Harness Nuclear Power

Oil Price, By Felicity Bradstock – Aug 18, 2024
- Tech companies are investing heavily in nuclear energy to power their AI operations.
- Regulatory challenges and utility opposition are hindering the development of new nuclear projects.
With the demand for power increasing rapidly, tech companies are looking for innovative solutions to meet the demand created by artificial intelligence (AI) and other new technologies. In addition to solar and wind power, several tech companies are investing in nuclear energy projects to power operations. The clear shift in the public perception of nuclear power has once again put the abundant clean [!] energy source on the table as an option, with the U.S. nuclear energy capacity expected to rise significantly over the coming decades. ……………………………
Tech companies have invested heavily in wind and solar energy to power their data centers and are now looking for alternative clean power supplies. In 2021, Sam Altman, the CEO of OpenAI, invested $375 million in the nuclear fusion startup Helion Energy. Last year, Microsoft signed a deal to purchase power from Helion beginning in 2028. Altman also chairs the nuclear fission company Oklo. Oklo is planning to build a massive network of small-scale nuclear reactors in rural southeastern Idaho to provide power to data centers as the electricity demand grows. It is also planning to build two commercial plants in southern Ohio.
However, getting some of these nuclear projects off the ground is no easy feat. Oklo has found it difficult to get the backing of nuclear regulators. In 2022, the Nuclear Regulatory Commission (NRC), which oversees commercial nuclear power plants, rejected the firm’s application for the design of its Idaho “Aurora” project for not providing enough safety information. …………………………………………
In addition to the red tape from regulators, many utilities are opposing new nuclear projects due to their anticipated impact on the grid. Some data centers require 1 GW or more of power, which is around the total capacity of a typical U.S. nuclear reactor. PJM Interconnection, the biggest grid operator in the U.S., recently warned that the balance between power supply and demand is tightening as the development of new generation falls behind demand growth. However, some tech companies are proposing to connect data centers directly to nuclear plants, also known as co-location, to reduce the burden on the grid.
However, several U.S. utilities oppose co-location plans……………………………………………………… more https://oilprice.com/Alternative-Energy/Nuclear-Power/Tech-Companies-Are-Racing-to-Harness-Nuclear-Power.html
Amazon Vies for Nuclear-Powered Data Center

The deal has become a flash point over energy fairness
IEEE Spectrum, Andrew Moseman, 12 Aug 2024
When Amazon Web Services paid US $650 million in March for another data center to add to its armada, the tech giant thought it was buying a steady supply of nuclear energy to power it, too. The Susquehanna Steam Electric Station outside of Berwick, Pennsylvania, which generates 2.5 gigawatts of nuclear power, sits adjacent to the humming data center and had been directly powering it since the center opened in 2023.
After striking the deal, Amazon wanted to change the terms of its original agreement to buy 180 megawatts of additional power directly from the nuclear plant. Susquehanna agreed to sell it. But third parties weren’t happy about that, and their deal has become bogged down in a regulatory battle that will likely set a precedent for data centers, cryptocurrency mining operations, and other computing facilities with voracious appetites for clean electricity.
Putting a data center right next to a power plant so that it can draw electricity from it directly, rather than from the grid, is becoming more common as data centers seek out cheap, steady, carbon-free power. Proposals for co-locating data centers next to nuclear power have popped up in New Jersey, Texas, Ohio, and elsewhere. Sweden is considering using small modular reactors to power future data centers.
However, co-location raises questions about equity and energy security, because directly-connected data centers can avoid paying fees that would otherwise help maintain grids. They also hog hundreds of megawatts that could be going elsewhere.
“They’re effectively going behind the meter and taking that capacity off of the grid that would otherwise serve all customers,” says Tony Clark, a senior advisor at the law firm Wilkinson Barker Knauer and a former commissioner at the Federal Energy Regulatory Commission (FERC), who has testified to a U.S. House subcommittee on the subject.
Amazon’s nuclear power deal meets hurdles
The dust-up over the Amazon-Susquehanna agreement started in June, after Amazon subsidiary Amazon Web Services filed a notice to change its interconnection service agreement (ISA) in order to buy more nuclear power from Susquehanna’s parent company, Talen Energy. Amazon wanted to increase the amount of behind-the-meter power it buys from the plant from 300 MW to 480 MW. Shortly after it requested the change, utility giants Exelon and American Electric Power (AEP), filed a protest against the agreement and asked FERC to hold a hearing on the matter…………………………………………………………………………………………………………….
Costs of data centers seeking nuclear energy
Yet such arrangements could have major consequences for other energy customers, Clark argues. For one, directing all the energy from a nuclear plant to a data center is, fundamentally, no different than retiring that plant and taking it offline. “It’s just a huge chunk of capacity leaving the system,” he says, resulting in higher prices and less energy supply for everyone else.
Another issue is the “behind-the-meter” aspect of these kinds of deals. A data center could just connect to the grid and draw from the same supply as everyone else, Clark says. But by connecting directly to the power plant, the center’s owner avoids paying the administrative fees that are used to maintain the grid and grow its infrastructure. Those costs could then get passed on to businesses and residents who have to buy power from the grid. “There’s just a whole list of charges that get assessed through the network service that if you don’t connect through the network, you don’t have to pay,” Clark says. “And those charges are the part of the bill that will go up” for everyone else.
Even the “carbon-free” public relations talking points that come with co-location may be suspect in some cases. In Washington State, where Schneider works, new data centers are being planted next to the region’s abundant hydropower stations, and they’re using so much of that energy that parts of the state are considering adding more fossil fuel capacity to make ends meet. This results in a “zero-emissions shell game,” Clark wrote in a white paper on the subject.
These early cases are likely only the beginning. A report posted in May from the Electric Power Research Institute predicts energy demand from data centers will double by 2030, a leap driven by the fact that AI queries need ten times more energy than traditional internet searches. The International Energy Agency puts the timeline for doubling sooner: 2026. Data centers, AI, and the cryptocurrency sector consumed an estimated 460 terawatt-hours (TWh) in 2022 and could reach more than 1,000 TWh in 2026, the agency predicts.
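Taken at face value, the IEA figures quoted above imply a compound growth rate of roughly 21 percent per year for that sector’s electricity use, as this quick calculation shows:

```python
# Implied compound annual growth rate from the IEA figures cited above:
# ~460 TWh in 2022 growing to more than 1,000 TWh in 2026.

twh_2022 = 460
twh_2026 = 1_000
years = 2026 - 2022

cagr = (twh_2026 / twh_2022) ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")   # ~21% per year
```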
Data centers face energy supply challenges
New data centers can be built in a matter of months, but it takes years to build utility-scale power projects, says Poorvi Patel, manager of strategic insights at Electric Power Research Institute and contributor to the report. The potential for unsustainable growth in electricity needs has put grid operators on alert, and in some cases has sent them sounding the alarm. Eirgrid, a state-owned transmission operator in Ireland, last week warned of a “mass exodus” of data centers in Ireland if it can’t connect new sources of energy. ……………………………………………………………………………………..more https://spectrum.ieee.org/amazon-data-center-nuclear-power—
The Great Global Computer Outage Is a Warning We Ignore at Our Peril

Is there a limit in the natural order of things to the amount of technological complexity that’s sustainable?
by Tom Valovic, 2 August 24 https://www.counterpunch.org/author/tom-valovic/
July 18, 2024, will go down in history books as an event that shook up the world in a unique way. It gave the mass of humanity a pointed wake-up call about the inherent fragility of the technological systems we’ve created and the societal complexities they’ve engendered. Critical services at hospitals, airports, banks, and government facilities around the world were all suddenly unavailable. We can only imagine what it must have been like to be undergoing treatment in an emergency room at the time with a serious or life-threatening illness.
So, what are we to make of this event and how can we rationally get our collective arms around its meaning and significance? As a journalist who specializes in writing about the impacts of technology on politics and culture, I would like to share a few initial thoughts.
For some of us who have worked in the tech field for many years, such an event was entirely predictable. This is simply because of three factors: 1) the inherent fragility of computer code, 2) the always-present possibility of human error, and 3) the fact that when you build interconnected systems, a vulnerability in one part of the system can easily spread like a contagion to other parts. We see this kind of vulnerability in play daily in terms of a constant outpouring of news stories about hacking, identity theft, and security breaches involving all sorts of companies and institutions. However, none of these isolated events had sufficient scale to engender greater public awareness and alarm until The Great Global Computer Outage of July 18.
Inherent Fragility is Always Present
As impressive as our new digital technologies are, our technocrats and policymakers often seem to lose sight of an important reality. These now massively deployed systems are also quite fragile in the larger scheme of things. Computers and the communications systems that support them—so called virtual systems—can concentrate huge amounts of informational power and control by wielding it like an Archimedean lever to manage the physical world. A cynic could probably argue that we’re now building our civilizational infrastructures on a foundation of sand.
At the recently held Aspen Security Forum, Anne Neuberger—a senior White House cybersecurity expert—noted, “We need to really think about our digital resilience not just in the systems we run but in the globally connected security systems, the risks of consolidation, how we deal with that consolidation and how we ensure that if an incident does occur it can be contained and we can recover quickly.” With all due respect, Ms. Neuberger was simply stating the obvious and not digging deep enough.
The problem runs much deeper. Our government, like those of other advanced Western nations, is now running on two separate but equal tracks: technology and governance. The technology track is being overseen by Big Tech entities with little accountability or oversight concerning the normative functions of government. In other words, they’re more or less given a free hand to operate according to the dictates of the free market economy.
Further, consider this thought experiment: Given AI’s now critical role in shaping key aspects of our lives and given its very real and fully acknowledged downsides and risks, why was it not even being discussed in the presidential debate? The answer is simple: These issues are often being left to unelected technocrats or corporate power brokers to contend with. But here’s the catch: Most technocrats don’t have the policy expertise needed to guide critical decision-making at a societal level while, at the same time, our politicians (and yes, sadly, most of our presidential candidates) don’t have the necessary technology expertise.
Scope, Scale, and Wisdom
Shifting to a more holistic perspective, humanity’s ability to continue to build these kinds of systems runs into the limitations of our conceptual ability to embrace their vastness and complexity. So, the question becomes: Is there a limit in the natural order of things to the amount of technological complexity that’s sustainable? If so, it seems reasonable to assume that this limit is determined by the ability of human intelligence to encompass and manage that complexity.
To put it more simply: At what point in pushing the envelope of technology advancement do we get in over our heads and to what degree is a kind of Promethean hubris involved?
As someone who has written extensively about the dangers of AI, I would argue that we’re now at a tipping point where it’s worth asking whether we can even control what we’ve created and whether the “harmful side effects” of seemingly constant chaos are now militating against our quality of life. Further, we can only speculate as to whether the CrowdStrike event was somehow associated with some sort of still poorly understood or recognized AI hacking or error. The bottom line is: if we cannot control the effects of our own technological inventions, then in what sense can those creations be said to serve human interests and needs in this already overly complex global environment?
Finally, the advent of under-the-radar hyper-technologies such as nanotechnology and genetic engineering also needs to be considered in this context. These are technologies that can be grasped only in the conceptual realm, not in any concrete and immediate way, because (I would argue) their primary and secondary effects on society, culture, and politics can no longer be successfully envisioned. Moving decisively into these realms is therefore ad hoc experimentation with nature itself. But as many environmentalists have pointed out, "Nature bats last." Runaway technological advancement is now being fueled by corporate imperatives and a "growth at any cost" mentality that leaves little time for reflection. New and seemingly exciting prospects for advanced hyper-technology may dazzle us, but if in the process they also blind us, how can we guide the progress of technology with wisdom?
Tom Valovic is a journalist and the author of Digital Mythologies (Rutgers University Press), a series of essays that explored emerging social and political issues raised by the advent of the Internet. He has served as a consultant to the former Congressional Office of Technology Assessment. Tom has written about the effects of technology on society for a variety of publications including Columbia University’s Media Studies Journal, the Boston Globe, and the San Francisco Examiner, among others.
Is the dream of nuclear fusion dead? Why the international experimental reactor is in ‘big trouble’

The 35-nation Iter project has the groundbreaking aim of creating clean and limitless energy, but it is turning into the ‘most delayed and cost-inflated science project in history’
Guardian, Robin McKie Science Editor, 4 Aug 24
It was a project that promised the sun. Researchers would use the world’s most advanced technology to design a machine that could generate atomic fusion, the process that drives the stars – and so create a source of cheap, non-polluting power.
That was initially the aim of the International Thermonuclear Experimental Reactor (Iter), which 35 countries – including European states, China, Russia and the US – agreed to build at Saint-Paul-lez-Durance in southern France at a starting cost of $6bn. Work began in 2010, with a commitment that there would be energy-producing reactions by 2020.
Then reality set in. Cost overruns, Covid, corrosion of key parts, last-minute redesigns and confrontations with nuclear safety officials triggered delays that mean Iter is not going to be ready for another decade, it has just been announced. Worse, energy-producing fusion reactions will not be generated until 2039, while Iter’s budget – which has already soared to $20bn – will increase by a further $5bn.
Other estimates suggest the final price tag could rise well above this figure and make Iter “the most delayed and most cost-inflated science project in history”, the journal Scientific American has warned. For its part, the journal Science has stated simply that Iter is now in “big trouble”, while Nature has noted that the project has been “plagued by a string of hold-ups, cost overruns and management issues”.
Dozens of private companies now threaten to create fusion reactors on a shorter timescale, warn scientists. These include Tokamak Energy in Oxford and Commonwealth Fusion Systems in the US.
“The trouble is that Iter has been going on for such a long time, and suffered so many delays, that the rest of the world has moved on,” said fusion expert Robbie Scott of the UK Science and Technology Facilities Council. “A host of new technologies have emerged since Iter was planned. That has left the project with real problems.”
A question mark now hangs over one of the world’s most ambitious technological projects and its global bid to harness the process that drives the stars. That process involves forcing the nuclei of two light atoms to combine into a single heavier nucleus, releasing massive amounts of energy. This is nuclear fusion, and it occurs only at colossally high temperatures.
To create such heat, a doughnut-shaped reactor, called a tokamak, will use magnetic fields to contain a plasma of hydrogen nuclei that will then be bombarded by particle beams and microwaves. When temperatures reach millions of degrees Celsius, the mix of two hydrogen isotopes – deuterium and tritium – will fuse to form helium, neutrons and a great deal of excess energy.
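As a point of reference, the reaction described above is the standard deuterium-tritium fusion step; the energy split shown below is the commonly cited textbook value rather than a figure taken from the article.

```latex
% Deuterium-tritium fusion: the reaction a tokamak such as Iter is designed to sustain.
% The roughly 17.6 MeV released is shared between the helium nucleus and the neutron.
\[
{}^{2}_{1}\mathrm{D} \;+\; {}^{3}_{1}\mathrm{T} \;\longrightarrow\;
{}^{4}_{2}\mathrm{He}\;(3.5\ \mathrm{MeV}) \;+\; {}^{1}_{0}\mathrm{n}\;(14.1\ \mathrm{MeV})
\]
```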
Containing plasma at such high temperatures is exceptionally difficult. “It was originally planned to line the tokamak reactor with protective beryllium but that turned out to be very tricky. It is toxic and eventually it was decided to replace it with tungsten,” said David Armstrong, professor of materials science and engineering at Oxford University. “That was a major design change taken very late in the day.”
Then huge sections of the tokamak made in Korea were found not to fit together properly, while threats that there could be leaks of radioactive materials led the French nuclear regulators to call a halt to the plant’s construction. More construction delays were announced as problems piled up …
For its part, Iter denies that it is “in big trouble” and rejects the idea that it is a record-breaking science project for cost overruns and delays. Just look at the International Space Station or for that matter the UK’s HS2 rail link, said a spokesman.
Others point out that fusion power’s limited carbon emissions would boost the battle against climate change. “However, fusion will arrive too late to help us cut carbon emissions in the short term,” said Aneeqa Khan, a research fellow in nuclear fusion at the University of Manchester. “Only if fusion power plants produce significant amounts of electricity later in the century will they help keep our carbon emissions down – and that will become crucial in the fight against climate change.” https://www.theguardian.com/technology/article/2024/aug/03/is-the-dream-of-nuclear-fusion-dead-why-the-international-experimental-reactor-is-in-big-trouble
Can nuclear waste be recycled? Would that solve the nuclear waste problem?

Radioactive Wastes from Nuclear Reactors, Questions and Answers, Gordon Edwards 28 July 24.
Well, you know, the very first reactors did not produce electricity. They were built for the express purpose of creating plutonium for atomic bombs. Plutonium is a uranium derivative. It is one of the hundreds of radioactive byproducts created inside every uranium-fuelled reactor. Plutonium is the stuff from which nuclear weapons are made. Every large nuclear warhead in the world’s arsenals uses plutonium as a trigger.
But plutonium can also be used as a nuclear fuel. The first electricity-producing reactor, which started up in 1951 in Idaho, was called EBR-1 — and it actually suffered a partial meltdown. EBR stands for “Experimental Breeder Reactor”, and it was cooled not with water but with hot liquid sodium metal.
By the way, another sodium-cooled, electricity-producing reactor was built right here in California, and it also had a partial meltdown. The dream of the nuclear industry was, and still is, to use plutonium as the fuel of the future, replacing uranium. A breeder reactor is one that can “burn” plutonium fuel and simultaneously produce even more plutonium than it uses. Breeder reactors are usually sodium-cooled.
In fact, sodium-cooled reactors have failed commercially all over the world, in the US, France, Britain, Germany, and Japan, but the breeder reactor remains the holy grail of the nuclear industry, so watch out.
To use plutonium, you have to extract it from the fiercely radioactive used nuclear fuel. This technology of plutonium extraction is called reprocessing. It must be carried out robotically because of the deadly penetrating radiation from the used fuel.
Most reprocessing involves dissolving used nuclear fuel in boiling nitric acid and chemically separating the plutonium from the rest of the radioactive garbage. This creates huge volumes of dangerous liquid wastes that can spontaneously explode (as in Russia in 1957) or corrode and leak into the ground (as has happened in the USA). A single gallon of this liquid high-level waste is enough to ruin an entire city’s water supply.
In 1977, US President Jimmy Carter banned reprocessing in the USA because of fears of proliferation of nuclear weapons at home and abroad. Three years earlier, in 1974, India tested its first atomic bomb using plutonium from a Canadian research reactor given to India as a gift.
The problem with using plutonium as a fuel is that it is then equally available for making bombs. Any well-equipped group of criminals or terrorists can make its own atomic bombs with a sufficient quantity of plutonium – and it only takes about 8 kilograms to do so. Even the crudest design of a nuclear explosive device is enough to devastate the core of any city.
Plutonium is extremely toxic when inhaled. A few milligrams is enough to kill any human within weeks through massive fibrosis of the lungs.
A few micrograms – a thousand times less – can cause fatal lung cancer with almost 100% certainty. So even small quantities of plutonium can be used by terrorists in a so-called “dirty bomb”. That’s a radioactive dispersal device using conventional explosives. Just a few grams of fine plutonium dust could threaten the lives of thousands if released into the ventilation system of a large office building.
So beware of those who talk about “recycling” used nuclear fuel. What they are really talking about is reprocessing – plutonium extraction – which opens a Pandora’s box of possibilities. The liquid waste and other leftovers are even more environmentally threatening, more costly, and more intractable than the solid waste. Perpetual isolation is still required.
Humans should teach AI how to avoid nuclear war—while they still can

By Cameron Vega, Eliana Johns | July 22, 2024, https://thebulletin.org/2024/07/humans-should-teach-ai-how-to-avoid-nuclear-war-while-they-still-can/
When considering the potentially catastrophic impacts of military applications of Artificial Intelligence (AI), a few deadly scenarios come to mind: autonomous killer robots, AI-assisted chemical or biological weapons development, and the 1983 movie WarGames.
The 1983 movie WarGames features a self-aware, AI-enabled supercomputer that simulates a Soviet nuclear launch and convinces US nuclear forces to prepare for a retaliatory strike. The crisis is only partly averted when the main (human) characters persuade US forces to wait for the Soviet strike to hit before retaliating; it turns out that the strike was intentionally falsified by the fully autonomous AI program. The computer then attempts to launch a nuclear strike on the Soviets without human approval, until it is hastily taught the concept of mutually assured destruction, after which the program ultimately determines that nuclear war is a no-win scenario: “Winner: none.”
US officials have stated that an AI system would never be given US nuclear launch codes or the ability to take control over US nuclear forces. However, AI-enabled technology will likely become increasingly integrated into nuclear targeting and command and control systems to support decision-making in the United States and other nuclear-armed countries. Because US policymakers and nuclear planners may use AI models in conducting analyses and anticipating scenarios that may ultimately influence the president’s decision to use nuclear weapons, the assumptions under which these AI-enabled systems operate require closer scrutiny.
Pathways for AI integration. The US Defense Department and Energy Department already employ machine learning and AI models to make calculation processes more efficient, including for analyzing and sorting satellite imagery from reconnaissance satellites and improving nuclear warhead design and maintenance processes. The military is increasingly forward-leaning on AI-enabled systems. For instance, it initiated a program in 2023 called Stormbreaker that strives to create an AI-enabled system called “Joint Operational Planning Toolkit” that will incorporate “advanced data optimization capabilities, machine learning, and artificial intelligence to support planning, war gaming, mission analysis, and execution of all-domain, operational level course of action development.” While AI-enabled technology presents many benefits for security, it also brings significant risks and vulnerabilities.
One concern is that the systemic use of AI-enabled technology and an acceptance of AI-supported analysis could become a crutch for nuclear planners, eroding human skills and critical thinking over time. This is particularly relevant when considering applications for artificial intelligence in systems and processes such as wargames that influence analysis and decision-making. For example, NATO is already testing and preparing to launch an AI system designed to assist with operational military command and control and decision-making by combining an AI wargaming tool and machine learning algorithms. Even though it is still unclear how this system will impact decision-making led by the United States, the United Kingdom, and NATO’s Nuclear Planning Group concerning US nuclear weapons stationed in Europe, this type of AI-powered analytical tool would need to consider escalation factors inherent to nuclear weapons and could be used to inform targeting and force structure analysis or to justify politically motivated strategies.
The role given to AI technology in nuclear strategy, threat prediction, and force planning can reveal more about how nuclear-armed countries view nuclear weapons and nuclear use. Any AI model is programmed under certain assumptions and trained on selected data sets. This is also true of AI-enabled wargames and decision-support systems tasked with recommending courses of action for nuclear employment in any given scenario. Based on these assumptions and data sets alone, the AI system would have to assist human decision-makers and nuclear targeters in estimating whether the benefits of nuclear employment outweigh the cost and whether a nuclear war is winnable.
Do the benefits of nuclear use outweigh the costs? Baked into the law of armed conflict is a fundamental tension between any particular military action’s gains and costs. Though fiercely debated by historians, the common understanding of the US decision to drop two atomic bombs on Japan in 1945 demonstrates this tension: an expedited victory in East Asia in exchange for hundreds of thousands of Japanese casualties.
Understanding how an AI algorithm might weigh the benefits and costs of escalation depends on how it integrates the country’s nuclear policy and strategy. Several factors contribute to a country’s nuclear doctrine and targeting strategy, ranging from fear of the consequences of breaking the tradition of non-use of nuclear weapons, to concern about radioactive contamination of a coveted territory, to sheer deterrence in the face of possible nuclear retaliation by an adversary. While strategy itself is derived from political priorities, military capabilities, and perceived adversarial threats, nuclear targeting incorporates these factors as well as many others, including the physical vulnerability of targets, overfly routes, and the accuracy of delivery vehicles—all aspects to further consider when making decisions about force posture and nuclear use.
In the case of the United States, much remains classified about its nuclear decision-making and cost analysis. It is understood that, under guidance from the president, US nuclear war plans target the offensive nuclear capabilities of certain adversaries (both nuclear and non-nuclear armed) as well as the infrastructure, military resources, and political leadership critical to post-attack recovery. But while longstanding US policy has maintained to “not purposely threaten civilian populations or objects” and “not intentionally target civilian populations or targets in violation of [the law of armed conflict],” the United States has previously acknowledged that “substantial damage to residential structures and populations may nevertheless result from targeting that meets the above objectives.” This is in addition to the fact that the United States is the only country to have used its nuclear weapons against civilians in war.
There is limited public information with which to infer how an AI-enabled system would be trained to consider the costs of nuclear detonation. Certainly, any plans for nuclear employment are determined by a combination of mathematical targeting calculations and subjective analysis of social, economic, and military costs and benefits. An AI-enabled system could improve some of these analyses in weighing certain military costs and benefits, but it could also be used to justify existing structures and policies or further ingrain biases and risk acceptance into the system. These factors, along with the speed of operation and innate challenges in distinguishing between data sets and origins, could also increase the risks of escalation—either deliberate or inadvertent.
Is a nuclear war “winnable”? Whether a nuclear war is winnable depends on what “winning” means. Policymakers and planners may define winning as merely the benefits of nuclear use outweighing the cost when all is said and done. When balancing costs and benefits, the benefits need only be one “point” higher for an AI-enabled system to deem the scenario a “win.”
In this case, “winning” may be defined in terms of national interest without consideration of other threats. A pyrrhic victory could jeopardize national survival immediately following nuclear use and still be considered a win by the AI algorithm. Once a nuclear weapon has been used, it could either incentivize an AI system to not recommend nuclear use or, on the contrary, recommend the use of nuclear weapons on a broader scale to eliminate remaining threats or to preempt further nuclear strikes.
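To make the authors’ point concrete, the sketch below is a deliberately naive, purely hypothetical scoring rule (not any real or proposed system): if the estimated benefit exceeds the estimated cost by any margin at all, the outcome is labelled a “win”, exactly the one-point-higher logic described above.

```python
# Hypothetical illustration only: a bare benefits-versus-costs rule of the kind
# the authors warn about. Nothing here reflects a real decision-support system.

def naive_verdict(benefit_score: float, cost_score: float) -> str:
    """Label an outcome a 'win' whenever benefits edge out costs, however slightly."""
    return "win" if benefit_score > cost_score else "no win"

# A pyrrhic outcome: enormous costs, with an estimated "benefit" only marginally larger.
print(naive_verdict(benefit_score=100.1, cost_score=100.0))  # prints: win

# Long-term effects left out of cost_score (mass migration, economic collapse,
# climatic damage) never enter the comparison at all, which is exactly the concern.
```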
“Winning” a nuclear war could also be defined in much broader terms. The effects of nuclear weapons go beyond the immediate destruction within their blast radius; there would be significant societal implications from such a traumatic experience, including potential mass migration and economic catastrophe, in addition to dramatic climatic damage that could result in mass global starvation. Depending on how damage is calculated and how much weight is placed on long-term effects, an AI system may determine that a nuclear war itself is “unwinnable” or even “unbearable.”
Uncovering biases and assumptions. The question of costs and benefits is relatively uncontroversial in that all decision-making involves weighing the pros and cons of any military option. However, it is still unknown how an AI system will weigh these costs and benefits, especially given the difficulty of comprehensively modeling all the effects of nuclear weapon detonations. At the same time, the question of winning a nuclear war has long been a thorn in the side of nuclear strategists and scholars. All five nuclear-weapon states confirmed in 2022 that “a nuclear war cannot be won and must never be fought.” For them, planning to win a nuclear war would be considered inane and, therefore, would not require any AI assistance. However, deterrence messaging and discussion of AI applications for nuclear planning and decision-making illuminate the belief that the United States must be prepared to fight—and win—a nuclear war.
How close are we to chaos? It turns out, just one blue screen of death

Keeping cash as a backup is a smart idea in the event of a payment systems outage.
David Swan, Technology editor, 22 July 24, https://www.theage.com.au/technology/how-close-are-we-to-chaos-it-turns-out-just-one-blue-screen-of-death-20240720-p5jv6t.html
In some places, Friday’s mass tech outage resembled the beginning of an apocalyptic zombie movie. Supermarket checkouts were felled across the country and shoppers were turned away, airports became shelters for stranded passengers, and live TV and radio presenters were left scrambling to fill airtime. The iconic Windows “blue screen of death” hit millions of devices globally and rendered them effectively useless.
The ABC’s national youth station Triple J issued a call-out for anyone who could come to their Sydney studio to DJ in person. One woman was reportedly unable to open her smart fridge to access her food.
All because of a failure at CrowdStrike, a company that most of us – not least those who were worst hit – had never heard of before.
It’s thought to be the worst tech outage in history and Australia was at its epicentre: the crisis began here, and spread to Europe and the US as the day progressed. Surgeries were cancelled in Austria, Japanese airlines cancelled flights and Indian banks were knocked offline. It was a horrifying demonstration of how interconnected global technology is, and how quickly things can fall apart.
At its peak, it reminded us of some of the most stressful periods of the pandemic, when shoppers fought each other for rolls of toilet paper and argued about whether they needed to wear masks.
Many of us lived through the Y2K panic. We avoided the worst outcomes but it was an early harbinger of how vulnerable our technology is to bugs and faults, and showed the work required to keep everything up and running. The CrowdStrike meltdown felt closer to what’s really at risk when things go wrong.
As a technology reporter, I’ve heard warnings from industry executives for years about the danger of cyberattacks and mass outages. On Friday, those warnings became real.
The cause of this outage was not anything malicious. It was relatively innocuous: CrowdStrike has blamed a faulty update to its security software, which caused millions of Windows machines to crash and enter a recovery boot loop.
Of course Australians are no strangers to mass outages, even as they become more common and more severe.
The Optus network outage that froze train networks and disrupted hospital services just over six months ago was eerily similar to the events on Friday, not least because it was also caused by what was supposed to be a routine software upgrade.
The resignation of chief executive Kelly Bayer Rosmarin did little to prevent another Optus outage a month later. If anything, Friday’s CrowdStrike outage highlights how many opportunities there are for one failure to cripple millions of devices and grind the global economy to a halt. So many of the devices that underpin our economy have hundreds of different ways that they can be knocked offline, whether through a cyberattack or human error, as was likely the case with CrowdStrike.
The incident would likely have been even worse had it been a cyberattack. Experts have long warned about the vulnerability of critical infrastructure – including water supplies and electricity – to malicious hackers. Everything is now connected to the internet and is therefore at risk.
And yet the potential damage of such attacks is only growing. We are now more reliant than ever on a concentrated number of software firms, and we have repeatedly seen their products come up short when we need them to just work.
In the US, the chair of the Federal Trade Commission, Lina Khan, put it succinctly.
“All too often these days, a single glitch results in a system-wide outage, affecting industries from healthcare and airlines to banks and auto-dealers,” Khan said on Saturday.
“Millions of people and businesses pay the price.”
Khan is right. The technology we rely on is increasingly fragile, and is increasingly in the hands of just a few companies. The world’s tech giants like Microsoft and Apple now effectively run our daily lives and businesses, and an update containing a small human error can knock it all over, from Australia to India.
The heat is now on CrowdStrike, as well as the broader technology sector on which we rely so heavily, and some initial lessons are clear. Airlines have backup systems to help keep some flights operational in the case of a technological malfunction. As everyday citizens, we unfortunately need to think similarly.
Keeping cash as a backup is a smart idea in the event of a payment systems outage, as is having spare battery packs for your devices. Many smart modems these days, like those from Telstra and Optus, offer 4G or 5G internet if their main connection goes down. We need more redundancies built in to the technology we use, and more alternatives in case the technology stops working altogether.
For IT executives at supermarkets, banks and hospitals, the outage makes it clear that “business as usual” will no longer cut it, and customers should rightly expect adequate backups to be in place. Before the Optus outage, a sense of complacency had permeated our IT operations rooms and company boardrooms, and it lingers still. That can no longer be the case.
The “blue screen of death”, accompanied by a frowny face, was an apt metaphor for the current state of play when it comes to our overreliance on technology. Our technology companies – and us consumers, too – need to do things differently if we’re to avoid another catastrophic global IT outage. There’s too much at stake not to.
High hopes and security fears for next-gen nuclear reactors

Fuel for advanced reactors is raising nuclear proliferation concerns.
The Verge, By Justine Calma, Jul 20, 2024
Next-generation nuclear reactors are heating up a debate over whether their fuel could be used to make bombs, jeopardizing efforts to prevent the proliferation of nuclear weapons.
Uranium in the fuel could theoretically be used to develop a nuclear weapon. Older reactors use such low concentrations that they don’t really pose a weapons proliferation threat. But advanced reactors would use higher concentrations, making them a potential target of terrorist groups or other countries wanting to take the fuel to develop their own nuclear weapons, some experts warn.
They argue that the US hasn’t prepared enough to hedge against that worst-case scenario and are calling on Congress and the Department of Energy to assess potential security risks with advanced reactor fuel.
Other experts and industry groups still think it’s unfeasible for such a worst-case scenario to materialize. But the issue is starting to come to a head as nuclear reactors become a more attractive energy source, garnering a rare show of bipartisan support in Congress.
… Earlier this month, President Joe Biden signed bipartisan legislation into law meant to speed the development of next-generation nuclear reactors in the US by streamlining approval processes.
… The US Nuclear Regulatory Commission (NRC) certified an advanced small modular reactor design for the first time last year. And we’re likely still years away from seeing commercial plants in action. But if the US ever wants to get there, it’ll also have to build up a supply chain for the fuel those advanced reactors would consume. The Inflation Reduction Act includes $700 million to develop that domestic fuel supply.
Today’s reactors generally run on fuel made with a uranium isotope called U-235. Naturally occurring uranium has quite low concentrations of U-235; it has to be “enriched” — usually up to a 5 percent concentration of U-235 for a traditional reactor. Smaller advanced reactors would run on more energy-dense fuel enriched to between 5 and 20 percent U-235, called HALEU (short for high-assay low-enriched uranium).
That higher concentration is what has some experts worried. “If the weapons usability of HALEU is borne out, then even a single reactor would pose serious security concerns,” says a policy analysis penned by a group of nuclear proliferation experts and engineers published in the journal Science last month (including an author credited with being one of the architects of the first hydrogen bomb).
Fuel with a concentration of at least 20 percent is considered highly enriched uranium, which could potentially be used to develop nuclear weapons. With HALEU designs reaching 19.75 percent U-235, the authors argue, it’s time for the US to think hard about how safe the next generation of nuclear reactors would be from malicious intent.
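As a rough aid to the numbers above, here is a minimal, hypothetical sketch that buckets fuel by U-235 concentration using the thresholds quoted in the article (about 5 percent for conventional fuel, 5 to 20 percent for HALEU, 20 percent and above for highly enriched uranium); the 0.7 percent figure for natural uranium is a commonly cited value, not taken from the article.

```python
# Minimal sketch: classify reactor fuel by U-235 enrichment using the thresholds
# quoted in the article. Illustrative only; not a regulatory definition.

def enrichment_category(u235_percent: float) -> str:
    if u235_percent >= 20.0:
        return "HEU (highly enriched uranium, 20% and above)"
    if u235_percent > 5.0:
        return "HALEU (high-assay low-enriched uranium, 5-20%)"
    return "LEU (conventional reactor fuel, up to about 5%)"

for level in (0.7, 5.0, 19.75, 20.0):
    print(f"{level:>5}% U-235 -> {enrichment_category(level)}")

# 19.75%, the top of the HALEU range cited in the article, sits just below the
# 20% line that has historically marked weapons-usable material.
```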
“We need to make sure that we don’t get in front of ourselves here and make sure that all the security and safety provisions are in place first before we go off and start sending [HALEU] all around the country,” says R. Scott Kemp, associate professor of nuclear science and engineering and director of the MIT Laboratory for Nuclear Security and Policy.
That 20 percent threshold goes back to the 1970s, and bad actors ostensibly have more information and computational tools at their disposal to develop weapons, Kemp and his coauthors write in the paper. It might even be possible to craft a bomb with HALEU well under the 20 percent threshold, the paper contends …
Aside from asking Congress for an updated security assessment of HALEU, the paper suggests setting a lower enrichment limit for uranium based on new research or ramping up security measures for HALEU to more closely match those for weapons-usable fuels.
… “Unless there’s a really good reason to switch to fuels that pose greater risks of nuclear proliferation, then it’s irresponsible to pursue those,” says Edwin Lyman, director of nuclear power safety at the Union of Concerned Scientists and another author of the paper. Lyman has also raised concerns about the radioactive waste from nuclear reactors over the years. “There is no good reason.” https://www.theverge.com/24201610/next-generation-nuclear-energy-reactors-security-weapons-proliferation-risk