
Artificial Intelligence is already a serious problem in military systems

Worried about the autonomous weapons of the future? Look at what's already gone wrong, Bulletin of the Atomic Scientists, by Ingvild Bode and Tom Watts, April 21, 2021 … A close look at the history of one common type of weapons package, the air defense systems that militaries employ to defend against missiles and other airborne threats, illuminates how highly automated weaponry is actually a risk the world already faces … While many policymakers say they want to ensure humans remain in control over lethal force, the example of air defense systems shows that they face large obstacles.

Weapons like the US Army’s Patriot missile system, designed to shoot down missiles or planes that threaten protected airspace, include autonomous features that support targeting. These systems now come in many different shapes and sizes and can typically be operated in manual or various automatic modes. In automatic modes, the air defense systems can detect targets and fire on them on their own, relegating human operators to the role of supervising the system’s workings and, if necessary, aborting attacks. The Patriot air defense system, used by 13 countries, is “nearly autonomous, with only the final launch decision requiring human interaction,” according to research by the Center for Strategic and International Studies …
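To make that division of labor concrete, here is a minimal, purely illustrative sketch of the “human supervises, machine fires” pattern the article describes. Every name, classification rule, and timing below is invented for illustration; none of it reflects the software of the Patriot or any real air defense system.

```python
import time

VETO_WINDOW_S = 10.0  # assumed window in which a supervisor may abort

def classify_as_threat(track: dict) -> bool:
    """Toy stand-in for the system's automated threat classification."""
    return track["speed_mps"] > 300 and track["closing"]  # invented rule

def fire_interceptor(track: dict) -> None:
    print(f"engaging track {track['id']}")  # placeholder for the launch

def run_automatic_mode(tracks, operator_vetoed) -> None:
    """Human-on-the-loop: the machine decides; the human can only veto."""
    for track in tracks:
        if not classify_as_threat(track):
            continue  # the machine alone screened this track out
        # The machine has already "decided" to engage; the operator's
        # only remaining role is to abort before the deadline passes.
        deadline = time.monotonic() + VETO_WINDOW_S
        while time.monotonic() < deadline:
            if operator_vetoed(track):  # supervisor aborts the engagement
                break
            time.sleep(0.1)
        else:  # the veto window closed with no human intervention
            fire_interceptor(track)
```

Even in this toy version, the structure makes the article’s point visible: the human’s input is reduced to a timed veto over a decision the machine has already made.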

Our research on the character of human-machine interaction in air defense systems suggests that over time, their use has incrementally reduced the quality of human oversight in specific targeting decisions. More cognitive functions have been “delegated” to machines, and human operators face considerable difficulty in understanding how the complex computer systems make targeting decisions …

A study of air defense systems reveals three real-world challenges to human-machine interaction that automated and autonomous features have already created. These problems are likely to grow worse as militaries incorporate more AI into the high-tech weapons of tomorrow.

Targeting decisions are opaque.

The people who operate air defense systems already have trouble understanding how the automated and autonomous features on the weapons they control make decisions …

The history of Patriot systems operated by the US Army, for instance, includes several near-miss so-called “friendly fire” engagements during the First Gulf War in the 1990s and in training exercises … Rather than addressing the root causes of these deficiencies or communicating them to human operators, the military appears to have framed the issues as software problems that could be fixed through technical solutions.

Another problem that operators of air defense systems encounter is automation bias and over-trust: human operators can be overly confident in the reliability and accuracy of the information they see on their screens.

Operators can lose situational awareness … In real terms, the machines are now performing the bulk of the cognitive skills involved in operating an air defense system, not just the motor and sensory tasks …

The tragic 1988 downing of an Iran Air flight carrying 290 passengers and crew by a US Navy warship, the USS Vincennes, illustrates how human operators in the midst of combat can misinterpret computer outputs and make fatal mistakes …

Improvements in the speed and maneuverability of modern weaponry continue to reduce the time human operators have to decide whether to authorize the use of force. Take what happened to an unfortunate Ukraine International Airlines jet as a recent example. The Iranian operators of a Tor-M1 system near Tehran’s airport shot down the civilian plane carrying 176 passengers and crew members in January 2020, only minutes after the plane took off …
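As a rough back-of-the-envelope illustration of how speed compresses the decision window, consider the arithmetic below. The detection range and target speed are assumed round numbers chosen for illustration, not figures from any real incident or system.

```python
# Back-of-the-envelope decision window: all numbers are illustrative.
detection_range_km = 40.0   # assumed radar detection range
target_speed_mps = 1000.0   # roughly Mach 3, an assumed threat speed

window_s = detection_range_km * 1000.0 / target_speed_mps
print(f"Time from detection to impact: {window_s:.0f} seconds")  # -> 40
```

Double the threat’s speed, or halve the detection range, and the window shrinks to 20 seconds; it is in windows like these that operators must evaluate a machine’s classification and decide whether to intervene.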

Regulating autonomous weapons.  In our assessment, the decades-long process of integrating automated and autonomous features into the critical functions of air defense systems has contributed to an emerging norm governing their use: humans have a reduced role in use-of-force decisions …

Countries have been debating possible regulations on lethal autonomous weapons systems at the United Nations since 2014. Many states have agreed in principle that human responsibility for using weapons systems has to be retained to ensure that autonomous weapons systems are used in compliance with international humanitarian law. But this raises two questions. First, how can human control over the use of force be defined? And second, how can such control be measured, to ensure that it is people, not machines, who retain ultimate control over the use of force?

Almost a decade after a nonprofit called Article 36 introduced the concept of meaningful human control, there is no agreement on what exactly makes human control meaningful … The current crop of more-or-less autonomous weapons has created norms for human control over lethal force, and policymakers need to understand how these may undermine any potential international efforts to regulate autonomous weapons systems.  https://thebulletin.org/2021/04/worried-about-the-autonomous-weapons-of-the-future-look-at-whats-already-gone-wrong/

April 24, 2021 - Posted in 2 WORLD, technology
