‘Artificial Escalation’: Imagining the future of nuclear risk
Bulletin of the Atomic Scientists, by Anthony Aguirre, Emilia Javorsky, and Max Tegmark | July 17, 2023
Imagine it’s 2032. The US and China are still rivals. To give their military commanders better intelligence and more time to make decisions, both powers have integrated artificial intelligence (AI) throughout their nuclear command, control, and communications (NC3) systems. But instead of buying decision-makers time, events take an unexpected turn and spin out of control, with catastrophic results.
This is the story told in a new short film called Artificial Escalation, produced by Space Film & VFX for The Future of Life Institute. The plot may sound like science fiction (and the story is fictional), but the possibility of AI integration into weapons of mass destruction is now very real. Some experts say that the United States should build an NC3 system using AI “with predetermined response decisions, that detects, decides, and directs strategic forces.” The US is already envisioning integration like this in conventional command and control systems: the Joint All-Domain Command and Control initiative proposes connecting sensors from all military services into a single network, using AI to identify targets and recommend the “optimal weapon.” But NC3-AI integration is a terrible idea.
The Stockholm International Peace Research Institute (SIPRI) explored key risks of AI integration into NC3, including: increased speed of warfare, accidental escalation, misperception of intentions and capabilities, erosion of human control, first-strike instability, the unpredictability of AI, the vulnerabilities of AI to adversary penetration, and arms race dynamics. The National Security Commission on AI cautioned that AI “will likely increase the pace and automation of warfare across the board, reducing the time and space available for de-escalatory measures.”
This accelerated pace of warfare would leave countries less time to signal their own capabilities and intentions, or to understand their opponents’ perspectives. That could lead to unintended conflict escalation, crisis instability, and even nuclear war.
As arms race dynamics push AI progress forward, prioritizing speed over safety, it is important to remember that in races toward mutual destruction, there is no winner. There is a point at which an arms race becomes a suicide race. The reasons not to integrate AI into comprehensive command, control, and communications systems are manifold:
Adversarial AI carries unpredictable escalation risk. Even if AI-NC3 systems are carefully tested and evaluated, they may be unpredictable by design. Two or more such systems interacting in a complex, adversarial environment can push each other to new extremes, greatly increasing the risk of accidental escalation (a toy numerical sketch of this feedback loop appears after this list). We have seen this dynamic before in the 2010 “flash crash,” when interacting trading algorithms briefly erased roughly a trillion dollars of US stock market value in under an hour. The military equivalent of that hour would be catastrophic.
No real training data. AI systems require large amounts of training data, whether real or simulated. But no real data on nuclear conflict exists, so systems would have to be trained on synthetic data generated from incomplete information, since the full extent of an adversary’s capabilities is unknown. This adds another element of dangerous unpredictability to the command and control mix.
Cyber vulnerabilities of networked systems. AI-integrated command, control, and communications systems would also be vulnerable to cyberattacks, hacking, and data poisoning. When all sensors and systems are networked, a failure in one place can cascade through the whole. These vulnerabilities must be weighed across the systems of every nuclear nation, because the overall system is only as strong as its weakest link.
Epistemic uncertainty. Widespread use of AI to create misinformation is already blurring the line between what is real and what is fake. The inability to discern truth is especially dangerous in the military context, where accurate information is crucial to the stability of command and control. Historically, reliable, trustworthy channels of communication have existed between adversaries, even as disinformation campaigns ran in the background. When we automate more and engage person-to-person less, those reliable channels erode and the risk of unnecessary escalation skyrockets.
Human deference to machines. If an algorithm makes a suggestion, people could defy it, but will they? […]
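To see how the feedback loop described in the first point above can run away, here is a minimal toy model, assuming nothing about the film or any real NC3 system: two automated agents each set an “alert level” in proportion to the other’s last observed level, read through a noisy sensor. The function name, gain values, and noise level are illustrative assumptions only; the point is that when each side’s response gain exceeds one, small perturbations amplify into maximum mutual alert.

```python
import random

def simulate_escalation(gain: float, noise: float = 0.05, steps: int = 20, seed: int = 0):
    """Toy model of two automated systems reacting to each other's posture.

    Each agent sets its alert level to `gain` times the other's last
    observed level, plus sensor noise. A gain above 1 means each side
    slightly over-responds, producing a positive feedback loop.
    """
    rng = random.Random(seed)
    a, b = 0.1, 0.1  # small initial alert levels
    history = [(a, b)]
    for _ in range(steps):
        # Each side observes the other through an imperfect (noisy) sensor,
        # and alert levels are clamped to the range [0, 1].
        a_next = min(1.0, gain * b + rng.gauss(0, noise))
        b_next = min(1.0, gain * a + rng.gauss(0, noise))
        a, b = max(0.0, a_next), max(0.0, b_next)
        history.append((a, b))
    return history

# With gain below 1, the same perturbations damp back toward calm;
# with gain just above 1, both sides ratchet to maximum alert.
for gain in (0.9, 1.2):
    final_a, final_b = simulate_escalation(gain)[-1]
    print(f"gain={gain}: final alert levels a={final_a:.2f}, b={final_b:.2f}")
```

Nothing in this sketch captures real command and control logic; it only illustrates why two over-responsive systems coupled to each other can escalate even when each one behaves sensibly in isolation.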
Integrating AI into the critical functions of command, control, and communication is reckless. The world cannot afford to give up control over something as dangerous as weapons of mass destruction. As the United Nations Security Council prepares to meet tomorrow to discuss AI and nuclear risk, now is the time to set hard limits, strengthen trust and transparency, and ensure that the future remains in human hands.
https://thebulletin.org/2023/07/artificial-escalation-imagining-the-future-of-nuclear-risk/