
Dangers of Artificial Intelligence with Nuclear Weapons

AI, CYBERSPACE, AND NUCLEAR WEAPONS. WAR ON THE ROCKS, JAMES JOHNSON AND ELEANOR KRABILL, JANUARY 31, 2020
COMMENTARY “………
A new generation of AI-augmented offensive cyber capabilities will likely exacerbate the military escalation risks associated with emerging technology, especially inadvertent and accidental escalation. Examples include the increasing vulnerability of nuclear command, control, and communication (NC3) systems to cyber attacks, and the challenges that remote sensing technology, autonomous vehicles, conventional precision munitions, and hypersonic weapons pose to hitherto concealed and hardened nuclear assets. Taken together, these trends might further erode the survivability of states’ nuclear forces.

AI, and the state-of-the-art capabilities it empowers, is a natural manifestation — not the cause or origin — of an established trend in emerging technology: the increasing speed of war, the shortening of decision-making timeframes, and the co-mingling of nuclear and conventional capabilities are leading states to adopt destabilizing launch postures.

The AI-Cyber Security Intersection

AI will make existing cyber warfare capabilities more powerful. Rapid advances in AI and increasing degrees of military autonomy could amplify the speed, power, and scale of future attacks in cyberspace. Specifically, there are three ways in which AI and cyber security converge in a military context.
First, advances in autonomy and machine learning mean that a much broader range of physical systems is now vulnerable to cyber attacks, including hacking, spoofing, and data poisoning (illustrated in the sketch below). ………
Second, cyber attacks that target AI systems can offer attackers access to machine learning algorithms, and potentially to vast amounts of data from facial recognition and intelligence collection and analysis systems. Such data could be used, for example, to cue precision munitions strikes and support intelligence, surveillance, and reconnaissance missions.
Third, AI systems used in conjunction with existing cyber offense tools might become powerful force multipliers, thus enabling sophisticated cyber attacks to be executed on a larger scale (both geographically and across networks), at faster speeds, simultaneously across multiple military domains, and with greater anonymity than before.
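
To make the “data poisoning” mentioned in the first point concrete, here is a minimal, purely hypothetical sketch (not drawn from the article): a trivial nearest-centroid classifier is trained on synthetic two-class “sensor readings,” and an attacker then injects mislabeled points into the training set, shifting the decision boundary and degrading accuracy on untouched data. All names and numbers are illustrative assumptions.

```python
# Toy illustration of training-data poisoning (hypothetical example, not from the article).
# A trivial nearest-centroid classifier is trained on clean synthetic data, then retrained
# after an attacker injects mislabeled points, shifting its decision boundary.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class "sensor readings": class 0 clusters near (0, 0), class 1 near (3, 3).
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)), rng.normal(3.0, 1.0, (200, 2))])
y = np.array([0] * 200 + [1] * 200)

def train_centroids(X, y):
    # One centroid per class; prediction picks the nearest centroid.
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def accuracy(centroids, X, y):
    classes = np.array(list(centroids.keys()))
    dists = np.stack([np.linalg.norm(X - mu, axis=1) for mu in centroids.values()])
    return float((classes[np.argmin(dists, axis=0)] == y).mean())

clean_model = train_centroids(X, y)
print("accuracy trained on clean data:   ", accuracy(clean_model, X, y))

# Poisoning: inject 100 crafted points deep inside class 1's region but labeled as class 0.
X_poison = rng.normal(8.0, 0.5, (100, 2))
X_train = np.vstack([X, X_poison])
y_train = np.concatenate([y, np.zeros(100, dtype=int)])

poisoned_model = train_centroids(X_train, y_train)
print("accuracy trained on poisoned data:", accuracy(poisoned_model, X, y))
```

With the injected points, the class-0 centroid is dragged toward class 1, so a sizeable share of genuine class-1 readings is misclassified even though the evaluation data itself was never touched.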
………. the speed and scope of the next generation of AI cyber tools will likely have destabilizing effects……..
A significant risk in the operation of autonomous systems is the time that passes between a system failure (i.e., performing in a manner other than how the human operator intended), and the time it takes for a human operator to take corrective action. ……..
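
The gap described above, between a system failure and a human operator’s corrective action, can be illustrated with a back-of-the-envelope Monte Carlo sketch. Every timing parameter below is an assumption invented for illustration (none comes from the article); the point is simply that as the machine’s action completes faster, the chance that human intervention arrives in time collapses.

```python
# Back-of-the-envelope Monte Carlo: how often would a human operator correct an
# autonomous system before its erroneous action completes? All timing parameters
# below are hypothetical assumptions, chosen only to illustrate the trend.
import random

random.seed(1)

def too_late_probability(machine_completes_s: float, trials: int = 100_000) -> float:
    late = 0
    for _ in range(trials):
        notice = random.lognormvariate(1.0, 0.5)   # seconds to notice the failure (assumed)
        decide = random.lognormvariate(1.5, 0.5)   # seconds to choose a correction (assumed)
        act    = random.uniform(1.0, 5.0)          # seconds to issue the correction (assumed)
        if notice + decide + act > machine_completes_s:
            late += 1
    return late / trials

# Faster autonomous action -> shrinking window for meaningful human intervention.
for machine_completes_s in (60.0, 30.0, 10.0, 5.0):
    p = too_late_probability(machine_completes_s)
    print(f"machine action completes in {machine_completes_s:>4.0f} s -> "
          f"probability intervention is too late ~ {p:.2f}")
```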

New Risks to the Security of Nuclear Systems

During the early stages of a cyber operation, it is generally unclear whether an adversary intends to collect intelligence or prepare for an offensive attack. The blurring of cyber offense-defense will likely compound an adversary’s fear of a preemptive strike and increase first-mover incentives. In extremis, strategic ambiguity caused by this issue may trigger use-them-or-lose-them situations………..
It is now thought possible that a cyber attack (e.g., spoofing, hacking, manipulation, or digital jamming) could infiltrate a nuclear weapons system, threaten the integrity of its communications, and ultimately (and possibly unbeknown to its target) gain control of both its nuclear and non-nuclear command and control systems………..

Pathways to Escalation

AI-enhanced cyber attacks against nuclear systems would be almost impossible to detect and authenticate, let alone attribute, within the short timeframe for initiating a nuclear strike. According to open sources, operators at the North American Aerospace Defense Command have less than three minutes to assess and confirm initial indications from early-warning systems of an incoming attack. This compressed decision-making timeframe could put political leaders under intense pressure to make a decision to escalate during a crisis, with incomplete (and possibly false) information about a situation……..
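
A simple time-budget sketch shows why that reported window of under three minutes leaves no room for detecting, authenticating, or attributing an AI-enhanced cyber intrusion. Only the roughly 180-second total reflects the open-source figure cited above; every stage duration and the attribution estimate below are hypothetical placeholders.

```python
# Time-budget sketch for the early-warning assessment window described above.
# The ~180 s total reflects the open-source figure cited in the text; every
# stage duration and the attribution estimate are hypothetical placeholders.
WINDOW_S = 180  # under three minutes to assess and confirm initial indications

routine_steps_s = {
    "correlate early-warning sensor tracks":   45,  # assumed
    "rule out known false-alarm sources":      60,  # assumed
    "confer and confirm up the command chain": 60,  # assumed
}

remaining_s = WINDOW_S - sum(routine_steps_s.values())
print(f"seconds left after routine confirmation steps: {remaining_s}")

# Forensic attribution of a sophisticated cyber intrusion is typically measured
# in days or weeks, not seconds -- it cannot fit inside whatever remains.
attribution_effort_s = 3 * 24 * 3600  # e.g. three days, purely illustrative
print(f"illustrative attribution effort: {attribution_effort_s // 3600} hours "
      f"vs. {max(remaining_s, 0)} seconds available")
```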
Ironically, new technologies designed to enhance the flow of information, such as 5G networks, machine learning, big-data analytics, and quantum computing, can also undermine its clear and reliable communication, which is critical for effective deterrence.

Advances in AI could also exacerbate this cyber security challenge by enabling improvements in cyber offense. By automating advanced persistent threat (or “hunting for weaknesses”) operations, machine learning and AI might dramatically reduce the extensive manpower and high levels of technical skill required to execute such operations, especially against hardened nuclear targets.
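
To ground the idea of automating the “hunting for weaknesses,” here is a deliberately harmless toy: a mutation-based fuzzer hammering a planted bug in an example parser. Everything here (the fragile_parse function, the seed message, the mutation scheme) is a hypothetical illustration of why automation shifts the economics of flaw-finding; it does not describe any tool or technique referenced in the article.

```python
# Toy mutation-based fuzzer: automated "hunting for weaknesses" against a
# deliberately buggy example parser. Entirely hypothetical and illustrative.
import random
import string

random.seed(7)

def fragile_parse(message: str) -> int:
    """Example target with a planted flaw: expects messages like 'CMD:<number>'."""
    header, _, payload = message.partition(":")
    if header == "CMD":
        return int(payload)          # planted bug: crashes on non-numeric payloads
    return -1

def mutate(seed: str) -> str:
    """Replace one random character of a known-good input with a random printable one."""
    pos = random.randrange(len(seed))
    return seed[:pos] + random.choice(string.printable[:94]) + seed[pos + 1:]

seed = "CMD:1234"
crashing_inputs = []
for _ in range(50_000):              # an automated loop tries tens of thousands of variants
    candidate = mutate(seed)
    try:
        fragile_parse(candidate)
    except ValueError:               # the planted weakness is triggered
        crashing_inputs.append(candidate)

print(f"inputs that crashed the parser: {len(crashing_inputs)}")
print("sample crashing inputs:", crashing_inputs[:5])
```

A human tester would need hours to try that many malformed variants by hand; an automated loop does it in seconds, which is the economic shift the paragraph above describes.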

Information Warfare Could Lead to Escalation……….

Conclusion 

Rapid advances in military-use AI and autonomy could amplify the speed, power, and scale of future attacks in cyberspace via several interconnected mechanisms — the ubiquitous connectivity between physical and digital information ecosystems; the creation of vast treasure troves of data and intelligence harvested via machine learning; the formation of powerful force multipliers for increasingly sophisticated, anonymous, and possibly multi-domain cyber attacks.

AI systems could have both positive and negative implications for cyber and nuclear security. On balance, however, several factors make this development particularly troublesome. These include the increasing number of attack vectors that threaten states’ NC3 systems; a new generation of destabilizing, AI-empowered offensive cyber capabilities (deepfakes, spoofing, and automated advanced persistent threat tools); the blurring of offense and defense at the AI-cyber intersection; uncertainty and strategic ambiguity about AI-augmented cyber capabilities; and, not least, a competitive and contested geo-strategic environment.

At the moment, AI’s impact on nuclear security remains largely theoretical. Now is the time, therefore, for positive intervention to mitigate (or at least manage) the potential destabilizing and escalatory risks posed by AI and help steer it toward bolstering strategic stability as the technology matures.

The interaction between AI and cyber technology and nuclear command and control raises more questions than answers.  What can we learn from the cyber community to help us use AI to preempt the risks posed by AI-enabled cyber attacks? And how might governments, defense communities, academia, and the private sector work together toward this end?

James Johnson is a Postdoctoral Research Fellow at the James Martin Center for Nonproliferation Studies at the Middlebury Institute of International Studies (MIIS), Monterey. His latest book project is entitled Artificial Intelligence & the Future of Warfare: USA, China, and Strategic Stability. Twitter: @James_SJohnson

Eleanor Krabill is a Master of Arts in Nonproliferation and Terrorism Studies candidate at the Middlebury Institute of International Studies (MIIS), Monterey. Twitter: @EleanorKrabill
https://warontherocks.com/2020/01/ai-cyberspace-and-nuclear-weapons/

February 1, 2020 - Posted by | 2 WORLD, weapons and war
