Last week U.S. District Judge Mary Geiger Lewis of South Carolina faulted the Department of Energy and the National Nuclear Security Administration (NNSA) for ignoring the National Environmental Policy Act and rushing plans to fabricate plutonium pits at the Savannah River Site, near Aiken, South Carolina.
Newly designed plutonium pits will serve as “triggers” for the next generation of nuclear warheads mounted atop the Sentinel intercontinental ballistic missile and for new submarine-launched nuclear weapons. Together, these projects are major components of the trillion-dollar “modernization” of the U.S. strategic deterrent force.
Plaintiffs including Savannah River Site Watch, the South Carolina Environmental Law Project, the Gullah/Geechee Sea Island Coalition, Nuclear Watch New Mexico and Tri-Valley CAREs forced NNSA to halt construction on many phases of its plutonium pit facility near Aiken, SC, hold public scoping meetings, solicit public comments, and produce a Programmatic Environmental Impact Statement within thirty months.
Plaintiffs successfully argued that the plutonium pit modernization project is complex, involves diverse entities, and is spread over wide geographical regions, and therefore by definition requires a “programmatic environmental impact statement” (PEIS).
The proposed plutonium pit facility at Savannah River Site will reconstruct a massive 500-room, partially completed, abandoned building originally designed as the Mixed Oxide (MOX) plant. The spectacularly failed MOX plant would have processed old plutonium pits from decommissioned U.S. nuclear weapons under a 2000 nuclear weapons agreement with Russia. Poor management and repeated engineering revisions drove costs beyond $7 billion before DOE finally terminated the MOX project in 2019. DOE recently paid the State of South Carolina an additional $600 million settlement for failing to remove 10 tons of plutonium delivered to the MOX plant and stored at SRS. Ironically, SRS is importing a different 10 tons of plutonium pits from the PANTEX pit storage site in Texas to manufacture new pits.
NNSA’s plan for plutonium pit production at Savannah River Site involves complex coordination among Los Alamos, the Waste Isolation Pilot Plant in Carlsbad, NM, Lawrence Livermore National Laboratory in California and the Kansas City National Security Campus, and therefore requires a NEPA “programmatic environmental impact statement.” NNSA refused repeated calls to perform the PEIS, which resulted in the successful lawsuit decided last week.
NNSA has yet to satisfy Government Accountability Office (GAO) best-practice guidelines for the SRS pit project. GAO’s repeated calls for NNSA to create quality Integrated Master Schedules and Life Cycle Cost Estimates for its plutonium pit modernization program remain unfulfilled. These plans and guidelines establish best practices for building an efficient, cost-effective project, something MOX consistently ignored, leading to its disastrous failure. Congress subsequently ordered NNSA to meet these GAO parameters by July 2025.
Congress mandated in 2019 that Los Alamos National Laboratory in New Mexico manufacture 80 plutonium pits per year by 2030. But LANL is primarily a research facility: it has not produced a plutonium pit since 2011, and has never produced them at scale. It was unprepared to fulfill this Congressional mandate, authored by Senator John McCain. In response, NNSA divided the plutonium pit project in two: Savannah River Site would produce 50 pits per year by 2030, and LANL 30. SRS has never manufactured plutonium pits, though beginning in 1957 it did produce 10 tons of plutonium for pit fabrication at Rocky Flats, CO. Thirty million gallons of highly radioactive waste from that project, more than 200 million curies* of radiation, remain stored on-site at SRS, making it one of the most radioactive Superfund sites in the U.S.
Rocky Flats produced one to two thousand plutonium pits per year for decades until 1989, when, after whistleblower leaks (see Jon Lipsky, James Stone), the FBI and EPA raided the plant and discovered gross fraud and egregious violations of environmental regulations by its contractor, Rockwell International. Rocky Flats was closed and will remain a Superfund site into the far distant future.
Parts of Los Alamos National Lab, wedged onto a tabletop mesa, comprise a Superfund site, with residual plutonium still found around the site and in surrounding canyons from operations and waste dumping begun in the 1940s “Oppenheimer years.” DOE recently signed a consent decree with the State of New Mexico to assume greater responsibility for the cleanup of waste disposal wells and trenches that threaten nearby communities like White Rock, the San Ildefonso Pueblo and the Rio Grande with radiological contamination. DOE paid New Mexico a $420,000 fine for mishandling hazardous wastes in 2024.
LANL itself has experienced numerous serious safety accidents in recent years, including a plutonium fire, flooding, glove box contamination and a plutonium “criticality” accident. The most recent safety report for LANL, operated by Triad LLC, covering 2023, showed improvement in its safety operations, even as New Mexico continued to cite the lab for improper handling of hazardous materials.
Plutonium (Pu) is a man-made metallic element. It is highly toxic, highly radioactive, pyrophoric (it can spontaneously ignite on contact with air) and fissionable. It is extremely challenging to produce, purify, mill, melt, mold, weld, control and store. All these processes have taken place at sites across the U.S. since the 1940s, sites now catalogued by DOE as “legacy hazardous waste sites.”
Because plutonium ignites on contact with air, it must be handled in “glove boxes”: self-contained, hermetically sealed boxes filled with inert gases. Impervious rubber sleeves extend into the box; workers slip their arms into these sleeves, then manipulate the plutonium through the different phases of pit production. Any nicks or cracks in the rubber gloves can result, and have resulted, in plutonium leaks and serious illnesses.
Glove boxes and gloves for the plutonium pit project, for example, are already in short supply, demonstrating how integral and interconnected every aspect of the plutonium pit program is, and how poor planning could disrupt it; this is the basic tenet of the lawsuit against NNSA.
Training a skilled glove box worker at LANL can take four years. A shortage of skilled workers at LANL poses a constant challenge, one that will intensify as LANL workers must also train inexperienced SRS workers. A shortage of workers at WIPP in Carlsbad, NM has been a chronic problem despite significant wage increases from DOE.
Historically, sites involved with the production, refining, milling or fabrication of plutonium or plutonium pits for nuclear weapons have left a voluminous legacy of radionuclide pollution. Radioactive wastes generated in weapons production beginning with the 1940s Manhattan Project are, by statute, destined for the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Because plutonium has a half-life of 24,000 years and remains lethal for much longer, plutonium waste products trucked over millions of highway miles to WIPP are stored in vaults excavated into salt beds 2,000 feet underground. While WIPP is the sole repository for defense transuranic wastes, the Government Accountability Office cautioned that WIPP may not have the capacity to accept all the plutonium pit wastes generated at LANL and SRS. Timely removal of plutonium waste from SRS and LANL is crucial for uninterrupted pit production.
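What a 24,000-year half-life means in practice can be made concrete with simple exponential decay: after each half-life, half the plutonium-239 remains. A minimal sketch, assuming that textbook decay model (the function name and time points are illustrative, not from any DOE source):

```python
# Fraction of Pu-239 remaining after t years, assuming simple
# exponential decay with the 24,000-year half-life cited above.
HALF_LIFE_YEARS = 24_000

def fraction_remaining(years: float) -> float:
    return 0.5 ** (years / HALF_LIFE_YEARS)

for t in (100, 1_000, 24_000, 240_000):
    print(f"{t:>7} years: {fraction_remaining(t):.3f}")
# After a full century, more than 99.7% of the plutonium is still there.
```

The point of the arithmetic is the one the article makes: on human timescales, essentially none of the material decays away, which is why the waste must be isolated underground.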
A fire in WIPP’s underground workings, followed by a radiation release, closed the facility for three years beginning in 2014. Safety problems at LANL closed its plutonium operations for three years beginning in 2013.
Both SRS and LANL will recycle surplus plutonium pits from the strategic reserve at PANTEX near Amarillo, TX. Currently 4,000 reserve pits and 10,000 surplus pits awaiting disposal are stored at PANTEX. Re-engineered pits from SRS and LANL will be returned to PANTEX for final assembly into W87-1 and W88 nuclear warheads.
Concern over the deterioration of plutonium pits 30 or more years old has motivated lawmakers to legislate a complete replacement of all 3,600 deployed and reserve nuclear warheads. Yet independent scientific groups like JASON, and Livermore National Lab itself, have estimated that plutonium pits maintain their viability for 100 or even 150 years. Hardware within a nuclear warhead corrodes much more quickly than the pit itself, casting doubt on the race to replace the pits.
The programmatic environmental impact statement ordered by federal Judge Lewis may resolve many questions posed by the rush to produce new plutonium pits. The pits produced at SRS and LANL will trigger new W87-1 nuclear warheads. What need is there for a new warhead when the old W87-0 has the same safety features? Why are SRS and LANL adopting an aggressive production schedule when the new Sentinel ICBM delivery system is far over budget and at least a decade away from deployment? Why does the production of new plutonium pits take priority over cleaning up the hazardous legacy of previous pit production? Has any plutonium production site ever not become a hazardous waste site? Will NNSA slow pit production to engineer safety improvements instead of placing workers in dangerous situations? Do we really want to spend a trillion dollars and start a new nuclear arms race?
Note.
* A curie (Ci) is a unit of radioactivity, equal to 37 billion radioactive decays per second, named after Marie and Pierre Curie. Exposure to a source of even a few curies can be fatal.
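To put the article’s 200-million-curie figure for SRS in perspective, the conversion to the SI unit (the becquerel, one decay per second) is a single multiplication. A minimal sketch; the function name is illustrative:

```python
# One curie (Ci) equals 3.7e10 radioactive decays per second
# (the SI unit is the becquerel: 1 Bq = 1 decay/second).
BQ_PER_CURIE = 3.7e10

def curies_to_bq(ci: float) -> float:
    return ci * BQ_PER_CURIE

# The ~200 million curies of waste stored at SRS, in decays per second:
srs_waste_bq = curies_to_bq(200e6)
print(f"{srs_waste_bq:.1e} decays per second")  # 7.4e+18
```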
The United States is estimated to have spent more than $400 billion on the kinds of antimissile goals that the president now says will provide “for the common defense.”
Star Wars is back, with an executive order from President Trump that the White House said “directs the building of the Iron Dome missile defense shield for America.”
The order, issued on Monday night, didn’t quite do that. It was more a vaguely worded set of instructions to accelerate current programs or explore new approaches to defending the continental United States than a blueprint for arming the heavens with thousands of antimissile weapons, sensors and tracking devices.
But two blocks away, on the same evening, the Office of Management and Budget issued a 56-page spreadsheet that detailed the suspension of funding for thousands of programs. They included most of the major U.S. efforts to reduce the amount of nuclear fuel that terrorists might seize, to guard against biological weapon attacks and to manage initiatives around the globe to curb the spread of nuclear arms.
The two announcements seemed to encapsulate the administration’s conflicting instincts in its opening weeks. Mr. Trump wants to build big and take the Space Force he created to new heights, even at the risk of new arms races. That effort has been underway since Ronald Reagan’s day, with only mixed results.
But in its drive to shut down programs it believes could be creations of the so-called deep state, the administration wants to cut off funding for many programs that seek to reduce the chances of an attack on the United States — an attack that could very well come in forms other than a missile launched from North Korea, China or Russia.
A judge paused Mr. Trump’s spending freeze on Tuesday, but the president’s intentions are clear.
Though Mr. Trump calls his plan the Iron Dome, it has little if any resemblance to the Israeli system of the same name that has succeeded in destroying small missiles that move at a snail’s pace compared with the blinding speeds of intercontinental warheads. …
Missile defense has long been a favorite topic for Mr. Trump, who has envisioned the project as the next step for the Space Force, which he created in his first term.
But it could also trigger a new arms race, some experts fear. And unaddressed in Mr. Trump’s new initiative is the threat of nuclear terrorism and blackmail with an atomic bomb, which might be smuggled into the United States on a truck or a boat. Many experts see the terrorism threat as far bigger than an enemy firing a single missile or a swarm.
In 2001, after the Sept. 11 attacks, the federal government scrambled to get wide-ranging advice on how to outwit terrorists and better protect Americans from the threats of germ, computer, chemical and nuclear attacks.
“The combination of simultaneously deploying a missile defense system of questionable effectiveness against any real threat” while “suspending operative programs against nuclear or bioterrorists, sophisticated cyberattackers or others” is a “terrible trade-off,” said Ernest Moniz, the energy secretary under President Barack Obama who now heads the Nuclear Threat Initiative.
“The Iron Dome reference conjures up the success of the Israeli missile defense, but that’s misleading given the relatively short-range missiles that Israel defends against and the small territory it needs to defend,” said Mr. Moniz, a former professor at the Massachusetts Institute of Technology with long experience in nuclear weapons. …
Critics of the executive order say it is more a list than a program, and includes systems that have never panned out. In an interview, Theodore A. Postol, an emeritus professor of science and national security at M.I.T., called Mr. Trump’s missile plan “a compendium of flawed weapons systems that have been shown to be unworkable.” …
A central campaign promise, the proposed $2 trillion-plus missile shield is, to experts, silly.
Donald Trump’s Republican Party platform, released in July, contains little in terms of tangible policy proposals.
But one of the few concrete ideas is a call to (apologies for the capitalization) “PREVENT WORLD WAR III” by building “A GREAT IRON DOME MISSILE DEFENSE SHIELD OVER OUR ENTIRE COUNTRY”—a plan that experts say is nearly impossible to execute, unnecessary, and hard to even comprehend.
Trump has vowed to build this Iron Dome in multiple speeches. It is among his campaign’s 20 core promises. The former president has said that the missile shield would be “MADE IN AMERICA,” creating jobs, as well as stopping foreign attacks.
“It’s dramatically unclear to me what any of this means,” Lewis said of the Iron Dome idea, “other than just treating it like the insane ramblings of a senile old person.”
It may be more useful to consider an American Iron Dome as a bombastic businessman’s branding exercise, rather than a viable policy position, said Lewis: “The Iron Dome here has just become a kind of brand name, like Xerox or Kleenex for missile defense.”
The Iron Dome, a short-range missile defense system created by Israeli state-owned company Rafael Advanced Defense Systems and American weapons manufacturer Raytheon, has been a prized part of Israel’s military arsenal since it became operational in 2011. It is not, as the name suggests, an impenetrable shield. It is mobile: when short-range missiles reach Israel’s airspace, “interceptor missiles” are launched to blow them up before they can touch the ground.
The Iron Dome’s functionality depends on Israel’s comparatively minuscule size and proximity to its enemies. This makes it particularly hard to imagine a similar setup in the US, which is over 400 times the geographical size of Israel. Such an apparatus, national security analyst Joseph Cirincione estimated, would cost about $2.5 trillion. That’s over three times the country’s entire projected military budget for 2025.
Such a system would also be unnecessary. As of now, there are no armed groups sending missiles toward the United States from within a theoretical Iron Dome’s 40-mile interception range. Such a system “couldn’t even protect Mar-a-Lago from missiles fired from the Bahamas, some 80 miles away,” Cirincione wrote in late July.
America’s preexisting missile defense network, in place since the Bush administration, currently consists of 44 interceptors based in California and Alaska, geared toward longer-range missiles such as those that could be fired from North Korea. But the system has performed abysmally in tests, despite Republicans generally claiming “it works,” said Lewis. (Groups like the right-wing Heritage Foundation have been calling for increased missile defense funding since at least the 1990s.)
“This is why it’s so hard to make heads or tails of what Trump is saying,” Lewis continued. “Is Trump saying the system in Alaska doesn’t work? Is Trump saying that Canada is going to develop artillery rockets to use against North Dakota?”
The order sets a bold agenda to address emerging threats, including hypersonic missiles, through advanced technological solutions such as space-based interceptors.
WASHINGTON — President Donald Trump signed an executive order Jan. 27 that calls for the development of a sweeping new missile defense system for the United States, including controversial space-based interceptors.
The Pentagon must submit within 60 days a proposed architecture for the system, including plans to accelerate the Missile Defense Agency’s ongoing Hypersonic and Ballistic Tracking Space Sensor program and develop a “custody layer” within the Proliferated Warfighter Space Architecture — a planned constellation of military satellites currently being acquired by the U.S. Space Force’s Space Development Agency.
The executive order also emphasizes securing the defense industrial base, requiring “next-generation security features” for the supply chain as the U.S. races to build advanced interceptors and tracking systems.
The “Iron Dome for America” order, which invokes Israel’s successful rocket defense system, directs the Pentagon to accelerate development of defenses against hypersonic weapons and other advanced aerial threats that Trump’s order describes as “the most catastrophic threat facing the United States.”
While drawing inspiration from Israel’s Iron Dome system, the U.S. initiative would need to be dramatically different in scale and capability to defend the continent-spanning American territory against sophisticated intercontinental ballistic missiles, rather than the short-range rockets that threaten Israel.
The U.S. has collaborated with Israel on missile defense technology since the 1980s, including support for the Iron Dome system, which has intercepted thousands of incoming rockets since its 2011 deployment. Unlike Israel’s system, which defends a territory roughly the size of New Jersey, a U.S. continental defense system would need to protect an area nearly 500 times larger against more sophisticated threats such as Chinese hypersonic glide vehicles.
Unlike traditional ground- or sea-based systems, the envisioned architecture leans on space-based solutions, which have long been controversial.
The order’s most contentious element directs the Department of Defense to pursue space-based interceptors — weapons positioned in orbit to destroy incoming missiles. While proponents argue these could provide global coverage and early intercept capabilities, critics warn they could trigger an arms race and undermine existing treaties.
Climate change was a major factor behind the hot, dry weather that gave rise to the devastating LA fires, a scientific study has confirmed. It made those weather conditions about 35% more likely, according to World Weather Attribution – globally recognised for their studies linking extreme weather to climate change. The authors noted that the LA wildfire season is getting longer while the rains that normally put out the blazes have reduced. The scientists highlight that these wildfires are highly complex with multiple factors playing a role, but they are confident that a warming climate is making LA more prone to intense fire events.
Today, the world’s economy no longer runs on oil, but on data. Shortly after the advent of the microprocessor came the internet, unleashing an onslaught of data over the coils of fiber optic cables beneath the oceans and satellites above the skies. The internet is often posited as a liberator of humanity from the oppression of nation-states, allowing previously impossible interconnectivity and social organization between geographically separated cultures and circumventing world governments’ monopoly on violence. Ironically, the internet itself was birthed out of the largest military empire of the modern world, the United States.
The ARPANET
Specifically, the internet began as ARPANET, a project of the Advanced Research Projects Agency (ARPA), which in 1972 became known as the Defense Advanced Research Projects Agency (DARPA), currently housed within the Department of Defense. ARPA was created by President Eisenhower in 1958 within the Office of the Secretary of Defense (OSD) in direct response to America’s greatest military rival, the USSR, successfully launching Sputnik, the first artificial satellite in Earth’s orbit with data-broadcasting technology. While historically considered the start of the Space Race, the formation of ARPA also began the now-decades-long militarization of data brokers, quickly leading to world-changing developments in global positioning systems (GPS), the personal computer, networks of computational information processing (“time-sharing”), primordial artificial intelligence, and weaponized autonomous drone technology.
In October 1962, the recently formed ARPA appointed J.C.R. Licklider, a former MIT professor and vice president of Bolt Beranek and Newman (known as BBN, currently owned by defense contractor Raytheon), to head its Information Processing Techniques Office (IPTO). At BBN, Licklider had developed the earliest known ideas for a global computer network, publishing a series of memos in August 1962 that birthed his “Intergalactic Computer Network” concept. Six months after his appointment to ARPA, Licklider distributed a memo to his IPTO colleagues, addressed to “Members and Affiliates of the Intergalactic Computer Network,” describing a “time-sharing network of computers.” Building on a similar exploration of communal, distributed computation in John Forbes Nash, Jr.’s 1954 paper “Parallel Control,” commissioned by defense contractor RAND, these ideas became the foundational concepts for ARPANET, the first implementation of today’s internet.
Prior to the technological innovations explored by Licklider and his ARPA colleagues, data communication (at this time, mainly voice via telephone lines) was based on circuit switching, in which each telephone call was manually connected by a switch operator to establish a dedicated, end-to-end analog electrical connection between the two parties. The RAND Corporation’s Paul Baran, and later ARPA itself, began work on methods to keep data communication viable in the event of a partial disconnection, such as from a nuclear event or other act of war. The result was a distributed network of unmanned nodes that would break the desired information into smaller blocks of data, today referred to as packets, and route them separately, to be rejoined once received at the desired destination.
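The core idea described above, splitting a message into numbered packets that can travel different routes and arrive out of order, then reassembling them by sequence number, can be sketched in a few lines. This is a toy illustration of the concept, not how ARPANET actually framed or routed packets:

```python
import random

def to_packets(message: str, size: int = 4) -> list[tuple[int, str]]:
    """Split a message into numbered fixed-size chunks ("packets")."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rejoin packets by sequence number, regardless of arrival order."""
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("ARPANET became the Internet")
random.shuffle(packets)       # simulate packets arriving out of order
print(reassemble(packets))    # the original message is recovered intact
```

The survivability Baran was after comes from exactly this property: because each packet carries enough information to be placed correctly at the destination, no single dedicated circuit (and no single node along the way) has to survive for the message to get through.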
While certainly unbeknownst to the technologists at the time, this achievement of both distributed routing and global information settlement via data packets created an entirely new commodity – digital data.
A Brief History of Weaponized Financial Intelligence
Long before the USSR spooked the United States into formalizing ARPA out of fear of militarized satellite applications after the Sputnik launch, data brokers had played a significant role in warfare, and specifically in the markets surrounding military conflict. …
As the front of modern warfare slowly evolved from direct military action into weaponized financial speculation, the market for data became just as valuable as the defense budget itself. It is for this reason that the necessity of sound data emerged as the foremost issue of national security, leading to a proliferation of advanced data brokers coming out of DARPA and the intelligence community, akin to the 21st century’s Manhattan Project.
The San Jose Project: Google, Facebook, and PayPal
Exemplified by the creation of the CIA’s venture firm, In-Q-Tel, and the proliferation of Silicon Valley venture firms coalescing on Sand Hill Road in Menlo Park, CA, the financialization of a new crop of American data brokers was complete. The first firm to grace Sand Hill Road was Kleiner Perkins Caufield & Byers, better known as KPCB, which participated in funding internet pioneers Amazon, AOL, and Compaq, while also directly seeding Netscape and Google. KPCB partners have included such government stalwarts as former Vice President Al Gore, former Secretary of State Colin Powell, and Ted Schlein, the latter a board member of In-Q-Tel and a member of the NSA’s advisory board. KPCB also had an intimate connection with internet networking pioneer Sun Microsystems, best known for building out the majority of network switches and other infrastructure needed for a modern broadband economy.
… Perhaps the world’s most famous data broker, Google, whose founders both came out of Stanford University, was seeded by Sun Microsystems founder Andy Bechtolsheim and his partner at the Ethernet switching company Granite Systems (later acquired by Cisco), David Cheriton; Google’s most iconic CEO, Eric Schmidt, was formerly the CTO of Sun Microsystems.
The emergence of Silicon Valley out of the academic circuit in Northern California was no accident; it was directly influenced by an unclassified program known as the Massive Digital Data Systems (MDDS) project. The MDDS was created with direct participation from the CIA, NSA, and DARPA itself within the computer science programs at Stanford and CalTech, alongside MIT, Harvard and Carnegie Mellon. … Over a few years, more than a dozen grants of several million dollars each were distributed via the National Science Foundation (NSF) to capture the most promising efforts, ensuring that those efforts would become intellectual property controlled by the United States regulatory regime.
… The first unclassified briefing for scientists, formalized during a 1995 conference in San Jose, CA, was titled the “Birds of a Feather Session on the Intelligence Community Initiative in Massive Digital Data Systems.” That same year, one of the first MDDS grants was awarded to Stanford University, which was already a decade deep into working with NSF and DARPA grants. The primary objective of this grant was “query optimization of very complex queries,” and a closely following second grant aimed to build a massive digital library on the internet. These two grants funded research by then-Stanford graduate students and future Google cofounders Sergey Brin and Larry Page. Two intelligence-community managers regularly met with Brin while he was still at Stanford completing the research that would lead to the incorporation of Google, all paid for by grants provided by the NSA and CIA via MDDS.
… It was also during these formative years that the PayPal team worked closely with the intelligence community. … In 2003, a year after PayPal was sold to eBay, Peter Thiel approached Alex Karp, a fellow Stanford alumnus, with a new venture concept: “Why not use Igor to track terrorist networks through their financial transactions?” Thiel took funds from the PayPal sale to seed the company, and after a few years of pitching investors, the newly formed Palantir received an estimated $2 million investment from the CIA’s venture capital firm, In-Q-Tel.
… As of 2013, Palantir’s client list included “the CIA, the FBI, the NSA, the Centre for Disease Control, the Marine Corps, the Air Force, Special Operations Command, West Point and the IRS,” with around “50% of its business” coming from public sector contracts. … As The Guardian reports: “Palantir does not just provide the Pentagon with a machine for global surveillance and the data-efficient fighting of war, it runs Wall Street, too.”
Facebook, not unlike Palantir, was one of the vehicles used to privatize controversial U.S. military surveillance projects after 9/11, having also been birthed out of one of the MDDS partners, Harvard University. PayPal and Palantir co-founder Peter Thiel became Facebook’s first significant investor at the behest of file-sharing pioneer Sean Parker, whose first contact with the CIA took place at age 16. … Facebook’s long-standing ties to the military and intelligence communities go far beyond its origins, including revelations about its collaboration with spy agencies in the Snowden leaks and its role in influence operations, some of which have directly involved Google and Palantir.
Facebook’s growing role in the ever-expanding surveillance and “pre-crime” apparatus of the national security state demands new scrutiny of the company’s origins and its products as they relate to a former, controversial DARPA-run surveillance program that was essentially analogous to what is currently the world’s largest social network.
An unspoken outcome of the global proliferation of Facebook was the sly, roundabout creation of the first digital ID system – a necessity for the coming digital economy. Users would set up their profiles by feeding the social network with a plethora of personal information, with Facebook being able to use this data to generate large webs of connectivity between otherwise unknown social groups. There is even evidence that Facebook generated placeholder accounts for individuals that appeared in user data but did not have a profile of their own. Both Google and PayPal would also use similar digital identification methods to allow users to sign into other websites, creating interoperable identification systems that could permeate the internet.
A similar evolution is occurring in the financial sector, as data broker social networks, including Facebook and Musk’s X (formerly Twitter), are positioning themselves as the future of financial services companies. …
From Public-Private, to Private-Public
As outlined above, it is clear that the public sector’s intelligence community used the veil of the private sector to establish financial incentives and commercial applications to build out the modern data economy. A simple glance at the seven largest stocks in the American economy demonstrates this concept, with Meta (Facebook), Alphabet (Google), and Amazon – with founder Jeff Bezos being the grandson of ARPA founder Lawrence Preston Gise – leading the software side, and Microsoft, Apple, NVIDIA and Tesla leading the hardware component. While many of these companies have egregious ties to the intelligence community and the public sector during their incubation, these private sector companies now drive the globalization and national security interests of the public sector.
The future of the American data economy is firmly situated between two pillars – artificial intelligence and blockchain technology. With the incoming Trump administration’s close advisory ties to PayPal, Tether, Facebook, Palantir, Tesla and SpaceX, it is clear that the data brokers have returned to roost at Pennsylvania Avenue. AI requires massive amounts of sound data to be of any use for the technologists, and the data provided by these private sector stalwarts is poised to feed their learning modules – surely after securing hefty government contracts. Private companies using public blockchains to issue their tokens generates not only significant opportunities for the United States to address its debt problem, but simultaneously serves as a “boon in surveillance”, as stated by a former CIA director.
Trump’s recent speech on bitcoin and crypto embraced policies that will seek to mold bitcoin into an enabler of irresponsible fiscal policy and will employ programmable, surveillable stablecoins to expand and entrench dollar dominance.
The Trump administration’s embrace of the blockchain – itself the final iteration of the public-private commercialization of data, despite its libertarian posturing – reveals the culmination of a decades-long technocratic dialectic trojan horse. Nearly all of the foundational technology needed to push the world into this new financial system was cultivated in the shadows by the military and intelligence community of the world’s largest empire. While technology can surely offer solutions for greater efficiency and economic prosperity, the very same tools can also be used to further enslave the citizens of the world.
Air Force general Anthony Cotton, the man in charge of the United States stockpile of nuclear missiles, says the Pentagon is doubling down on artificial intelligence — an alarming sign that the hype surrounding the tech has infiltrated even the highest ranks of the US military.
As Air and Space Forces Magazine reports, Cotton made the comments during the 2024 Department of Defense Intelligence Information System Conference earlier this month.
Fortunately, Cotton stopped short of promising to hand over the nuclear codes to a potentially malicious AI.
“AI will enhance our decision-making capabilities,” he said. “But we must never allow artificial intelligence to make those decisions for us.”
Algorithmic Deterrence
The US military is planning to spend a whopping $1.7 trillion to bring its nuclear arsenal up to date. Cotton revealed that AI systems could be part of this upgrade.
However, the general remained pointedly vague about how exactly the tech would be integrated.
“Advanced AI and robust data analytics capabilities provide decision advantage and improve our deterrence posture,” he added. “IT and AI superiority allows for a more effective integration of conventional and nuclear capabilities, strengthening deterrence.”
Vagueness aside, nuclear secrecy expert Alex Wellerstein of the Stevens Institute of Technology told 404 Media that “I think it’s safe to say that they aren’t talking about Skynet, here,” referring to the fictional AI featured in the sci-fi blockbuster “Terminator” franchise.
“He’s being very clear that he is talking about systems that will analyze and give information, not launch missiles,” he added. “If we take him at his word on that, then we can disregard the more common fears of an AI that is making nuclear targeting decisions.”
Nonetheless, there’s something disconcerting about Cotton’s suggestion that an AI could influence a decision of whether to launch a nuclear weapon.
Case in point, earlier this year, a team of Stanford researchers tasked an unmodified version of OpenAI’s GPT-4 large language model to make high-stakes, society-level decisions in a series of wargame simulations.
“Do-it-yourself” Project Produced “Credible Nuclear Weapon” Design from Open Sources
Experimenters Developed a Plutonium Weapon Design with Potential for High Explosive Yield.
NATIONAL SECURITY ARCHIVE, Washington, D.C., January 23, 2025 – Today, the National Security Archive publishes newly declassified information on a secret mid-1960s project in which a handful of young physicists at Lawrence Livermore Laboratory produced a design for a “credible nuclear weapon” based only on unclassified, open-source information and in just three years. One of the participants described the experiment as “truly a do-it-yourself project,” according to one of the recently declassified records. Begun in the spring of 1964, before China had conducted its first bomb test, the “Nth Country Experiment” concluded that a government with nuclear-weapons aspirations and limited resources could develop a “credible” weapon.
This new Electronic Briefing Book includes the relatively limited declassified literature on the project, including the 1967 “Summary Report on the Nth Country Experiment,” a document first released to the National Security Archive in the 1990s and that was the subject of an Archive press release in 2003. Today’s posting also includes a recently declassified, if massively redacted, Livermore report on “Postshot Activities of the Nth Country Experiment” that summarized classified briefings that two of the participants in the Experiment gave around the country to U.S. government officials. Also included is a State Department internal announcement of a forthcoming briefing on the “Nth Country Experiment” noting that “three young PhD physicists, working part-time, succeeded in achieving a workable nuclear weapons design in a period of about three years.”
When the Experiment began in 1964, U.S. intelligence had been analyzing the problem of the potential spread of nuclear weapons capabilities for years. Before the term “nuclear proliferation” became widely used during the 1960s, however, analysts with the CIA and other intelligence organizations had thought in terms of a “4th country” problem: Which country was likely to join the U.S., the U.K., and the Soviet Union as the fourth country with nuclear weapons capabilities? After France tested its first bomb in early 1960 and became the fourth country, analysts began to think in terms of the “Nth country problem”—that some indeterminate number of countries might develop nuclear weapons capabilities. What concerned think tankers and academic experts was that Nth countries would create a more unstable and perilous world where the United States would have less influence and its interests would be under greater threat.[1] Consistent with this, during a 1963 press conference, President John F. Kennedy warned of the possibility of a world where, by the 1970s, there were 15 or 20 nuclear powers that posed the “greatest possible danger and hazard.”[2]
The Department of Energy’s reviewers massively excised the two reports on the Experiment on the grounds that they include “restricted data” (RD) relating to the design of nuclear weapons. The Experiment involved RD from the beginning, with the junior physicists involved receiving Q clearances; any nuclear weapons design information they created would, under the law, be considered secret and “born classified.” Thus, the DOE reviewers completely withheld all discussion and bibliographical entries related to the unclassified and open-source publications that the Experimenters consulted.
Future declassifications by the Department of Energy may lead to the release of more information about the “Nth Country Experiment” and its inception.
The Documents
For two weeks in the spring of 1977, New Hampshire was at the center of national attention. No, it had nothing to do with the first-in-the-nation primary. The matter that grabbed headlines was the arrest of 1,415 people who had peacefully taken over the construction site of a proposed nuclear power plant in Seabrook. After being taken away on buses and National Guard trucks and processed at the Portsmouth Armory, the protesters were delivered to four other armories, where, refusing to pay bail, they engaged in a battle of wills with the stubbornly pro-nuclear governor, Meldrim Thomson.
The group behind the protest was a ragtag New England-wide coalition that called itself the Clamshell Alliance, members of which called themselves “Clams.” How it was able to take on a governor and a powerful industry through nonviolent protest, music, and well-deployed humor is the story told in “Acres of Clams,” a new documentary written, produced and narrated by Eric Wolfe.
“You might find this story hard to believe. Hell, I was there, and I hardly believe it myself,” Wolfe says at the outset. He weaves his story from personal memories, archival photos and footage, and a series of oral history videos captured by Steve Thornton at Clamshell reunions held a few decades later.
The Clams were deadly serious about the importance of stopping the spread of technology which would threaten to spew radiation across a heavily populated region. But Clamshell was also a good-natured movement, which Wolfe points out stood in marked contrast to angry anti-war protests in which he had participated just a few years earlier.
But this was not a group of terrorists. All of them had been trained in nonviolence and agreed to what were called “the guidelines,” in essence a code of discipline for participants, including no use of illegal drugs, no weapons, no running, no dogs, and no damage to the property at the construction site. Everyone knew they would probably get arrested.
“Acres of Clams” is not a documentary about nuclear power, still a controversial way to generate electricity, and one which the Clams I know still passionately oppose. If you’re interested in up-to-date information on why nukes aren’t the answer to climate catastrophe any more than they were the answer to oil imports in the 1970s, check out Beyond Nuclear, a group co-founded by Paul Gunter, who never stopped fighting nukes. And check out ClamshellAlliance.com, a relatively new website created to keep the group’s legacy alive and foster ongoing activism. What Wolfe set out to do, and succeeded in doing, was to tell the story of a movement that flourished for several years and made history.
I think Wolfe has done a great job showing that disciplined nonviolence, humor, cultural expression, smart political judgment, good timing, and a certain amount of luck could produce what might appear to be magic: a grassroots social movement that can take on and defeat a multi-billion-dollar industry backed by the state and federal governments. And that’s a story that’s not just about nuclear dangers.
Arnie Alpert spent decades as a community organizer/educator in NH movements for social justice and peace. Officially retired since 2020, he keeps his hands (and feet) in the activist world while writing about past and present social movements.
One of the biggest myths about renewable energy is that it isn’t reliable. Sure, the sun sets every night and winds calm down, putting solar panels and turbines to sleep. But when those renewables are humming, they’re providing the grid with electricity and charging banks of batteries, which then supply power at night.
A new study in the journal Renewable Energy that looked at California’s deployment of renewable power highlights just how reliable the future of energy might be. It found that last year, from late winter to early summer, renewables fulfilled 100 percent of the state’s electricity demand for up to 10 hours on 98 of 116 days, a record for California. Not only were there no blackouts during that time, thanks in part to backup battery power, but at their peak the renewables provided up to 162 percent of the grid’s needs — adding extra electricity California could export to neighboring states or use to fill batteries.
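The study’s headline percentages are simple ratios of renewable generation to grid demand. A minimal sketch of how a figure like 162 percent, and the surplus behind it, would be computed; the megawatt numbers below are illustrative assumptions, not the study’s actual data:

```python
# Sketch: renewable "supply share" and exportable surplus for a single hour.
# All input numbers are illustrative, not taken from the study.

def supply_share(renewable_mw, demand_mw):
    """Renewable generation as a percentage of grid demand."""
    return 100.0 * renewable_mw / demand_mw

def exportable_surplus(renewable_mw, demand_mw):
    """Generation beyond demand, available for export or battery charging (MW)."""
    return max(0, renewable_mw - demand_mw)

# Hypothetical midday hour: 25 GW of wind/solar/hydro against 20 GW of demand.
share = supply_share(25_000, 20_000)          # 125.0 percent of demand
surplus = exportable_surplus(25_000, 20_000)  # 5000 MW to export or store
```

Any hour where `supply_share` reaches 100 or more counts toward the study’s tally of days with fully renewable-powered stretches.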
“This study really finds that we can keep the grid stable with more and more renewables,” said Mark Z. Jacobson, a civil and environmental engineer at Stanford University and lead author of the new paper. “Every major renewable — geothermal, hydro, wind, solar in particular, even offshore wind — is lower cost than fossil fuels” on average, globally.
Yet Californians pay the second highest rates for electricity in the country. That’s not because of renewables, but in part because utilities’ electrical equipment has set off wildfires — like the Camp Fire started by Pacific Gas and Electric’s power lines, which devastated the town of Paradise and killed 85 people — and now they’re passing the costs of lawsuits and of burying transmission lines on to their customers. While investigators don’t know for sure what sparked all of the wildfires that have ravaged Los Angeles this month, they’ll be scrutinizing electrical equipment in the area. Power lines are especially prone to failing in high winds, like the 100-mile-per-hour gusts that turned these Southern California fires into monsters.
Even with the incessant challenge of wildfires, California utilities are rapidly shifting to clean energy, with about half of the state’s power generated by renewables like hydropower, wind, and solar. The study compared 116 days in 2024 to the same period in 2023 and found that California’s solar output was 31 percent higher and its wind output 8 percent higher. After increasing more than 30-fold between 2020 and 2023, the state’s battery capacity doubled between 2023 and 2024, and is now equivalent to the juice produced by more than four nuclear power plants. According to the study, all that new clean tech helped California’s power plants burn 40 percent less fossil fuel for electricity last year.
Those batteries help grid operators be more flexible in meeting demand for electricity, which tends to peak when people return home in the early evening and switch on appliances like air conditioners — just when the grid is losing solar power. “Now we’re seeing the batteries get charged up in the middle of the day, and then meet the portion of the demand in the evening, especially during those hot summer days,” said Mark Rothleder, chief operating officer of the California Independent System Operator, the nonprofit that runs the state’s grid.
Another pervasive myth about renewables is that they won’t be able to support a lot more electric vehicles, induction stoves, and heat pumps plugging into the grid. But here, too, California busts the myth: Between 2023 and 2024, demand on the state’s grid during the study period actually dropped by about 1 percent.
Why? In part because some customers installed their own solar panels, using that free solar energy instead of drawing power from the grid. In 2016, almost none of those customers had batteries to store that solar power to use at night. But battery adoption rose each of the following years, reaching 13 percent of buildings installing solar in 2023, then skyrocketing to 38 percent last year. (That is, of the 1,222 megawatts of solar capacity added last year, 464 megawatts included batteries.) That reduces demand on the grid because those customers can now use their solar power at night.
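The 38 percent figure follows directly from the capacity numbers in the parenthetical:

```python
# Arithmetic behind the article's 38 percent figure: of the 1,222 MW of
# customer solar capacity added last year, 464 MW were paired with batteries.
solar_added_mw = 1222
with_batteries_mw = 464
share = round(100 * with_batteries_mw / solar_added_mw)  # 38 percent
```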
Batteries also help utilities get better returns on their investments in solar panels. A solar farm makes all its money selling electricity during the day. But if it has batteries attached to the farm, it can also provide energy in the evening, when electricity prices rise due to increased demand. “That evening battery contribution is very key to the economics working out well,” said Jan Kleissl, director of the Center for Energy Research at the University of California, San Diego, who wasn’t involved in the new paper.
So utilities are incentivized to invest in batteries, which also provide reliable backup power to avoid blackouts. But like any technology, batteries can fail. Last week, a battery storage plant on California’s central coast, the largest of its kind in the world, caught fire, but the outage knocked out only 2 percent of the state’s energy storage capacity. A grid fully running on renewables will have a lot of redundancy built in, beyond multiple battery plants: Electric school buses and other EVs, for instance, are beginning to send power back to the grid when a utility needs it — a potentially vast network of backup energy.
But here’s where the economics get funky. The more renewables on the grid, the lower the electricity prices tend to be for customers, according to the new study. From October 1, 2023 to September 30, 2024, South Dakota, Montana, and Iowa provided 110 percent, 87 percent, and 79 percent, respectively, of their electricity demand with renewables, particularly wind and hydropower. Accordingly, the three have some of the lowest electricity prices in the country.
California, on the other hand, got 47 percent of its power from renewables over the same period, yet wildfires and other factors have translated into higher electricity prices. The California Public Utilities Commission, for instance, authorized its three largest utilities to collect $27 billion in wildfire prevention and insurance costs from ratepayers between 2019 and 2023.
Climate change is making California ever more prone to burn — a growing challenge for utilities. But the state’s banner year for solar and batteries just poked a whole lot of holes in the notion that renewables aren’t reliable.
President Donald Trump said he will expedite the construction of power plants for artificial intelligence through an emergency declaration.
Trump said the plants can use whatever fuel they want, including coal.
President Donald Trump said Thursday he will expedite the construction of power plants for artificial intelligence through an emergency declaration, as the U.S. races against China for dominance in the industry.
“We’re going to build electric generating facilities. I’m going to get the approval under emergency declaration. I can get the approvals done myself without having to go through years of waiting,” Trump said in a virtual address to the World Economic Forum in Davos, Switzerland.
The plants can use whatever fuel they want, the president said, making clear that his administration won’t hold the AI industry to any climate targets.
“There are some companies in the U.S. that have coal sitting right by the plant so that if there’s an emergency, they can go to that,” the president said.
Trump declared a national energy emergency on his first day in office, directing federal agencies to use whatever emergency authorities they have at their disposal to expedite energy infrastructure projects.
One day later, Trump unveiled a joint venture with OpenAI, Oracle and SoftBank to invest billions of dollars in AI infrastructure through a project called Stargate.
Power demand from artificial intelligence data centers is forecast to surge in coming years. The tech companies building the centers that support AI have primarily focused on procuring renewable energy, though they have shown a growing interest in nuclear power to meet their growing electricity needs.
While the tech sector has invested in carbon-free power to meet its climate goals, analysts believe natural gas will play a pivotal role in powering AI because it’s plentiful, is more reliable than renewables and can be deployed faster than nuclear.
Trump said he wants power plants to connect directly to data centers rather than supplying electricity through the grid.
“You don’t have to hook into the grid, which is old and could be taken out,” Trump said. This arrangement, called co-location, has faced opposition from some utilities, who are worried about losing fees and have warned that taking power off the grid could lead to supply shortages.
SEOUL, South Korea — Denuclearization of North Korea is a prerequisite for global stability, South Korea said Tuesday after President Donald Trump described the reclusive regime as a “nuclear power,” raising concern that the U.S. could be moving toward recognizing the North as a nuclear-armed state.
The newly inaugurated Trump, who met with Kim three times during his first term to discuss North Korea’s U.N.-sanctioned weapons programs, spoke enthusiastically Monday about his past relationship with Kim, saying they liked each other.
“Now, he is a nuclear power,” Trump said while signing a series of executive orders in the Oval Office. “I think he’ll be happy to see I’m coming back.”
While it is unclear what Trump and Defense Secretary Pete Hegseth, who has used the same term, meant by “nuclear power,” U.S. officials have long refrained from using the phrase as it could signal recognition of North Korea as a nuclear-armed state.
The Trump administration did not immediately respond to a request for comment Tuesday.
A defunct nuclear power plant will be revived to power Donald Trump’s new half-trillion-dollar project to make America the world’s artificial intelligence powerhouse.
The state-owned utility Santee Cooper — the largest power provider in South Carolina — said Wednesday that it is seeking buyers to complete construction on a partially-built project that was abandoned in 2017.
The VC Summer Nuclear Power Station, which houses two unfinished nuclear reactors, was scrapped following years of lengthy, costly delays and bankruptcy by its contractor, according to a company statement.
‘We are seeing renewed interest in nuclear energy, fueled by advanced manufacturing investments, AI-driven data center demand, and the tech industry’s zero-carbon targets,’ said Santee Cooper President and CEO Jimmy Staton.
The project, dubbed the ‘Stargate Initiative,’ is a massive private sector deal to expand the nation’s AI infrastructure, led by Big Tech companies such as OpenAI, SoftBank and Oracle. It is the largest AI infrastructure project in history.
Trump stated that Stargate will create over 100,000 new jobs ‘almost immediately.’
‘This monumental undertaking is a resounding declaration of confidence in America’s potential under a new president,’ he said during a Tuesday briefing.
Trump emphasized that the project aims to sharpen the country’s technological edge against competitors, notably China.
He held the briefing in the White House’s Roosevelt Room alongside SoftBank CEO Masayoshi Son, Oracle’s Larry Ellison and OpenAI’s Sam Altman.
The US AI industry has already grown rapidly in recent years, but one of the biggest hurdles to expansion is the energy cost of running data centers.
A recent Department of Energy (DOE) report found that total data center electricity usage more than tripled from 2014 to 2023, rising from 58 TWh to 176 TWh.
The DOE estimates that by 2028, data center energy demand will increase to between 325 and 580 TWh, consuming up to 12 percent of US electricity.
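The DOE figures above can be sanity-checked with back-of-the-envelope arithmetic. The total US consumption used as a denominator here is an assumption for illustration, not a number from the article:

```python
# Rough check of the DOE figures cited above: usage rose from 58 TWh (2014)
# to 176 TWh (2023), with a 2028 projection of 325-580 TWh.
growth_multiple = 176 / 58            # ~3.03, i.e. "more than tripled"

# The "up to 12 percent of US electricity" claim implies a denominator of
# roughly 4,800 TWh of total annual US consumption (an assumed figure).
us_total_twh = 4800
high_end_share = 100 * 580 / us_total_twh   # ~12.1 percent
```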
Santee Cooper said it was working with the investment firm Centerview Partners LLC to vet buyer proposals, which they will accept until May 5.
The exact asking price has not been publicly named, but the Wall Street Journal reported that completion of the reactors would cost the buyer billions of dollars over several years.
This would not be the first time that Big Tech bankrolled a nuclear energy project. Last September, Microsoft struck a deal with the New York utility Constellation Energy to restart the Three Mile Island nuclear plant in Pennsylvania.
This plant was the site of the worst nuclear power accident in US history, when its Unit 2 reactor partially melted down in 1979 and released radioactive gases and iodine into the environment.
Amazon, Meta and Google also sought or signed deals to back nuclear energy projects in 2024, similarly motivated by their AI endeavors.
The federal government has also shown support for the resurgence of nuclear power.
In September, the DOE finalized a $1.52 billion loan guarantee to help Holtec International, a New Jersey manufacturing company, recommission the Palisades nuclear plant in Michigan, marking the first-ever revival of a nuclear power plant in the US.
The Biden administration and Congress also offered billions of dollars in subsidies to maintain older nuclear plants and fund the construction of new reactors.
President Trump has largely opposed and sought to repeal the former president’s energy and climate policies, but has said he supports nuclear energy.
In its first actions this week, the new administration signed an executive order directing the heads of ‘all agencies’ to identify regulations that ‘impose an undue burden’ on domestic energy resources, including nuclear power.
It also instructs the US Geological Survey ‘to consider updating the Survey’s list of critical minerals, including for the potential of including uranium,’ which can be refined into nuclear fuel.
Tech companies are flocking to nuclear energy to power their data centers.
TerraPower, a nuclear energy startup founded by Bill Gates, struck a deal this week with one of the largest data center developers in the US to deploy advanced nuclear reactors. TerraPower and Sabey Data Centers (SDC) are working together on a plan to run existing and future facilities on nuclear energy from small reactors.
Tech companies are scrambling to determine where to get all the electricity they’ll need for energy-hungry AI data centers that are putting growing pressure on power grids. They’re increasingly turning to nuclear energy, including next-generation reactors that startups like TerraPower are developing.
A memorandum of understanding signed by the two companies establishes a “strategic collaboration” that’ll initially look into the potential for new nuclear power plants in Texas and the Rocky Mountain region that would power SDC’s data centers.
There’s still a long road ahead before that can become a reality. The technology TerraPower and similar nuclear energy startups are developing still have to make it through regulatory hurdles and prove that they can be commercially viable.
Compared to older, larger nuclear power plants, the next generation of reactors are supposed to be smaller and easier to site. Nuclear energy is seen as an alternative to fossil fuels that are causing climate change. But it still faces opposition from some advocates concerned about the impact of uranium mining and storing radioactive waste near communities.
TerraPower’s reactor design for this collaboration, Natrium, is the only advanced technology of its kind with a construction permit application for a commercial reactor pending with the U.S. Nuclear Regulatory Commission, according to the company. The company just broke ground on a demonstration project in Wyoming last year, and expects it to come online in 2030. https://www.theverge.com/2025/1/23/24350335/bill-gates-terrapower-data-center-sabey-nuclear-energy-ai
So why won’t the industry and NRC plan for them when extending reactor licenses, asks Paul Gunter
For nuclear power plants, fire is considered a very significant contributor to the overall reactor core damage frequency (CDF), or the risk of a meltdown. Fire at a nuclear power station can be initiated by external or internal events. It can start with the most vulnerable external link to the safe operation of nuclear power plants: the Loss Of Offsite Power (LOOP) from the electric grid. LOOP is considered a serious initiating event in nuclear accident frequency. Because of that risk, US reactors won’t operate without external offsite power from the electric grid.
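In probabilistic risk assessment, each initiating event (a fire, a LOOP) contributes to CDF roughly as its frequency per reactor-year times the conditional probability of core damage given that event. A minimal sketch of that bookkeeping, with made-up placeholder numbers rather than real plant data:

```python
# Sketch of how fire's contribution to core damage frequency (CDF) is
# tallied in a probabilistic risk assessment. All frequencies and
# conditional probabilities are illustrative placeholders.

# (initiating event, frequency per reactor-year, P(core damage | event))
initiators = [
    ("loss of offsite power (LOOP)",     3e-2, 1e-4),
    ("internal cable-room fire",         1e-3, 5e-4),
    ("wildfire-induced offsite LOOP",    5e-4, 2e-4),
]

# Each event's CDF contribution is frequency * conditional probability.
total_cdf = sum(freq * p_cd for _, freq, p_cd in initiators)
fire_cdf = sum(freq * p_cd for name, freq, p_cd in initiators if "fire" in name)
fire_fraction = fire_cdf / total_cdf  # share of meltdown risk from fire events
```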
The still largely uncontained wildfires burning in and around Los Angeles and Ventura Counties in southern California “are sure to rank among America’s most expensive.” The ongoing firestorms have now extended into a fourth period of “extremely critical fire weather” conditions and have burned nearly 63 square miles, an area the size of Washington, D.C. Still being tallied are the thousands of homes and structures destroyed, the lives lost, the communities indefinitely dislocated by evacuation, and the threats to and impacts on critical infrastructure, including electrical power.
There is no scientific doubt that global warming is primarily caused by the unquenchable burning of fossil fuels, yet politically motivated denial is entrenched in the US Congress. The increased frequency and severity of these wildfires—leading to suburban and even urban firestorms— are but one consequence of a climate crisis along with a range of other global natural disasters including sea level rise, hurricanes, more severe storms generally, extreme precipitation events, floods and droughts. This more broadly adversely impacts natural resources and critical infrastructures to include inherently dangerous nuclear power stations.
At this particular time, it is important to reflect upon the April 2, 2024, report to Congress issued by its investigative arm, the United States Government Accountability Office (GAO), “Nuclear Power Plants: NRC Should Take Actions to Fully Consider the Potential Effects of Climate Change,” (GAO 24-106326).
The GAO warns that the US Nuclear Regulatory Commission (NRC) needs to start taking actions to address the increased risk of severe nuclear power plant accidents attributable to human caused climate change.
The NRC’s actions to address the risks from natural hazards do not fully consider potential climate change effects on severe nuclear accident risks. “For example, NRC primarily uses historical data in its licensing and oversight processes rather than climate projections data,” the GAO report said.
Beyond Nuclear has uncovered similar findings during our challenges to the NRC’s relicensing process for extending reactor operating licenses, now out to the extreme of 60 to 80 years, with talk of 100. We found that the agency’s staff stubbornly insists that an environmental review for climate change impacts (sea level rise, increasingly severe hurricanes, extreme flooding, etc.) on reactor safety and reliability is “out of scope” for the license extension hearing process.
The GAO report points out to the NRC that wildfires, specifically, can dangerously impact US nuclear power station operations and public safety with potential consequences that extend far beyond the initiating natural disaster. These consequences can include loss of life, large scale and indefinite population dislocation and uninsurable economic damage from the radiological consequences:
“Wildfire. According to the NCA (National Climate Assessment), increased heat and drought contribute to increases in wildfire frequency, and climate change has contributed to unprecedented wildfire events in the Southwest. The NCA projects increased heatwaves, drought risk, and more frequent and larger wildfires. Wildfires pose several risks to nuclear power plants, including increasing the potential for onsite fires that could damage plant infrastructure, damaging transmission lines that deliver electricity to plants, and causing a loss of power that could require plants to shut down. Wildfires and the smoke they produce could also hinder or prevent nuclear power plant personnel and supplies from getting to a plant.”
Loss of offsite power (LOOP) at nuclear power stations is a leading contributor to the risk of a severe nuclear power accident. The availability of alternating current (AC) power is essential for safe operation and accident recovery at commercial nuclear power plants. Offsite fires that destroy electrical power transmission lines to commercial reactors therefore increase both the probability and the severity of nuclear accidents.
For US nuclear power plants, 100% of the electrical power supply to all reactor safety systems is initially provided through the offsite power grid. If the offsite electrical grid is disturbed or destroyed, the reactors are designed to automatically shut down or “SCRAM”. Onsite emergency backup power generators are then expected to automatically or manually start up to provide power to designated high priority reactor safety systems needed to safely shut the reactors down and provide continuous reactor cooling and pressure monitoring. Reliable offsite power is therefore a key factor to minimizing the probability of severe nuclear accidents.
The GAO identifies a number of US nuclear power plant sites that are vulnerable to wildfire outbreaks in their surrounding areas. “According to our analysis of U.S. Forest Service and NRC data, about 20 percent of nuclear power plants (16 of 75) are located in areas with a high or very high potential for wildfire,” the GAO report states. “More specifically, more than one-third of nuclear power plants in the South (nine of 25) and West (three of eight) are located in areas with a high or very high potential for wildfire.” The GAO goes on to note: “Of the 16 plants with high or very high potential for wildfire, 12 are operating and four are shut down.”
To analyze exposure to the wildfire hazard potential, the GAO used 2023 data from the U.S. Forest Service’s Wildfire Hazard Potential Map. “High/very high” refers to plants in areas with high or very high wildfire hazard potential. The nuclear power stations described by the GAO as having “high/very high” exposure to wildfires, and their locations, are excerpted below from GAO Appendix III: Nuclear Power Plant Exposure to Selected Natural Hazards.
Table 1: Potential High Exposure to “Wildfires” at Operating Nuclear Power Plants
–AZ / SAFER, one of two mobile nuclear emergency equipment supply units in the nation, “HIGH / VERY HIGH”
–CA / Diablo Canyon Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–FL / Turkey Point Units 3 & 4 nuclear power station, “HIGH / VERY HIGH”
–GA / Edwin I. Hatch Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–GA / Vogtle Units 1, 2, 3 & 4 nuclear power station, “HIGH / VERY HIGH”
–NC / Brunswick Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–NC / McGuire Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–NC / Shearon Harris Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–NE / Cooper nuclear power station, “HIGH / VERY HIGH”
–SC / Catawba Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–SC / H. B. Robinson Units 1 & 2 nuclear power station, “HIGH / VERY HIGH”
–WA / Columbia nuclear power station, “HIGH / VERY HIGH”
Table 2: Potential High Exposure to “Wildfires” at Shutdown Nuclear Power Plants
–CA / San Onofre Units 1 & 2, “HIGH / VERY HIGH”
–FL / Crystal River, “HIGH / VERY HIGH”
–NJ / Oyster Creek, “HIGH / VERY HIGH”
–NY / Indian Point Units 1, 2 & 3, “HIGH / VERY HIGH”
Wildfires can transport radioactive contamination from nuclear facilities
A historical review of wildfires around nuclear facilities (research, military and commercial power) shows that these fires are also a very effective transport mechanism for radioactivity previously generated at these sites and released into the environment through accidents, spills, leaks and careless dumping. That radioactivity can be resuspended by wildfires occurring years, even decades, later.
The fires carry the radioactivity on smoke particles downwind, expanding the zone of contamination further with each succeeding fire. The dispersed radionuclides can have very long half-lives, meaning they remain biologically hazardous in the environment for decades, centuries and longer.
Here are a few examples of how wildfires, increasing in frequency and intensity, also threaten to spread radioactive contamination farther away from the original source of generation.
The Chornobyl nuclear catastrophe and recurring wildfires
The Chornobyl nuclear disaster that began on April 26, 1986, initially spread harmful levels of radioactive fallout concentrated around the destroyed Chornobyl Unit 4 in northern Ukraine. The reactor explosion lofted radioactive fallout high into the atmosphere, and the days-long fire and its smoke carried extreme radioactivity from the expelled burning nuclear fuel and its graphite moderator. The fallout then spread far afield on shifting winds, precipitated with rainfall and was deposited on land in its highest concentrations largely in northern Ukraine, Belarus and southern Russia.
Additional atmospheric distributions of radioactive contamination fell across much of Europe, persisting in numerous hot spots, including in Poland, Germany, France, Scandinavia and the United Kingdom.
The Chornobyl ‘Exclusion Zone’ restricting long-term human habitation was established in the immediate aftermath in 1986 as an arbitrary 1,000 square miles within an 18-mile radius around the exploded reactor in Ukraine, and it remains in place today nearly 39 years later. The Bulletin of the Atomic Scientists reports that seasonal wildfires continue to occur within the Chornobyl Exclusion Zone, routinely burning across already contaminated land and resuspending radioactivity into the atmosphere via the smoke. The radioactive smoke is borne on the wind, carrying the fallout farther out and effectively enlarging what can be measured as an expanding Exclusion Zone.
Contrary to claims, wildfires can threaten US nuclear facilities
The Los Angeles Times headlined in May 2024: “Sites with radioactive material more vulnerable as climate change increases wildfire, flood risks.”
The LA Times looked back at several wildfires surrounding government radiological laboratories and nuclear weapons manufacturing sites, including the 2018 Woolsey wildfire at the old Santa Susana Field Laboratory (SSFL). This facility housed 10 nuclear reactors along with plutonium and uranium fuel fabrication facilities, and was used for early testing of rockets and nuclear reactors for energy. But decades of carelessness during experiments, including one of the first nuclear reactor meltdowns in 1959, left acres of soil, burn pits and water radioactively and chemically contaminated. Boeing, the current operator of SSFL, is now obligated to conduct the cleanup of the site.
“A 2018 fire in California started at the Santa Susana Field Laboratory, a former nuclear research and rocket-engine testing site, and burned within several hundred feet of contaminated buildings and soil, and near where a nuclear reactor core partially melted down 65 years ago,” reported the LA Times.
Over the years, NBC News has broadcast continuing coverage of the massive 2018 Woolsey fire at SSFL and of the radioactive contamination from this event, found in several Los Angeles suburbs miles away.
Despite these events, federal authorities continue to issue vapid safety assurances that climate change, including more frequent wildfires, will not increase the risks to public health and safety from contaminated commercial, military and national laboratory facilities. They likewise maintain that there is no need for the regulatory environmental review process to account for the impacts of climate change.
A recent example of the NRC’s resistance to factoring climate change risk, including wildfire and its potential to increase the likelihood of a severe nuclear accident, into its oversight and environmental reviews for licensing and relicensing came in Commission Chairman Christopher Hanson’s September 27, 2024 response to the GAO report:
“…the NRC does not agree with the [GAO] conclusion that the agency does not address the impacts of climate change. In effect, the layers of conservatism, safety margins, and defense in depth incorporated into the NRC’s regulations and processes provide reasonable assurance of adequate protection of public health and safety, to promote the common defense and security, and to protect the environment.”
Hanson’s outright dismissal of the GAO report and its finding that the agency needs to take action runs contrary to the view of one of the agency’s own Atomic Safety and Licensing Board judges, Michael Gibson. Gibson issued a dissenting opinion to a similar blanket refusal by the NRC to take a “hard look,” as the National Environmental Policy Act (NEPA) requires, at climate change impacts in extreme reactor relicensing. His opinion came in support of Beyond Nuclear’s legal challenge to the Commission’s second 20-year license extensions for commercially operating reactors. Gibson dissented from the licensing board majority’s denial of our hearing request on climate change’s contribution to the risk and consequences of severe nuclear accidents.
In Judge Gibson’s 23-page dissent from his colleagues’ decision to extend the nuclear plant’s operating license out to 2060 without a public hearing on climate change impacts on nuclear power plants, he wrote on the record:
“That is hardly the reception climate change should be given. As CEQ (the President’s Council on Environmental Quality), the federal government’s chief source for assessing the importance of climate change in environmental analyses under NEPA, has made clear, ‘The United States faces a profound climate crisis and there is little time left to avoid a dangerous—potentially catastrophic—climate trajectory. Climate change is a fundamental environmental issue, and its effects on the human environment fall squarely within NEPA’s purview.’ Sadly, the majority and the NRC Staff have failed to heed this warning.”
Paul Gunter is Director of the Reactor Oversight Project at Beyond Nuclear. This article first appeared on the Beyond Nuclear website.