
The Technology for Autonomous Weapons Exists. What Now?


In the future, humans may not be the only arbiters of who lives and dies in war, as weapons gain decision-making power.

UNDARK, By Sarah Scoles, 11.26.2024

One bluebird day in 2021, employees of Fortem Technologies traveled to a flat piece of Utah desert. The land was a good spot to try out the company’s latest innovation: an attachment for the DroneHunter — which, as the name halfway implies, is a drone that hunts other drones.

As the experiment began, DroneHunter, a sleek black-and-white rotored aircraft 2 feet tall with a wingspan as wide as a grown man is tall, started receiving data from ground-based radar indicating that an airplane-shaped drone was in the air — one that, in a different circumstance, might carry ammunition meant to harm humans.

“DroneHunter, go hunting,” said an unsettling AI voice, in a video of the event posted on YouTube. Its rotors spun up, and the view lifted above the desiccated ground.

The radar system automatically tracked the target drone, and software directed its chase, no driver required. Within seconds, the two aircraft faced each other head-on. A net shot out of DroneHunter, wrapping itself around its enemy like something out of Spider-Man. A connected parachute — the new piece of technology, designed to down bigger aircraft — ballooned from the end of the net, lowering its prey to Earth.

Target: defeated, with no human required outside of authorizing the hunt. “We found that, without exception, our customers want a human in that loop,” said Adam Robertson, co-founder and chief technology officer at Fortem, a drone-focused defense company based in Pleasant Grove, Utah.

While Fortem is still a relatively small company, its counter-drone technology is already in use on the battlefield in Ukraine, and it represents a species of system that the U.S. Department of Defense is investing in: small, relatively inexpensive systems that can act independently once a human gives the okay. The United States doesn’t currently use fully autonomous weapons, meaning ones that make their own decisions about human life and death.

With many users requiring the involvement of a human operator, Fortem’s DroneHunter would not quite meet the International Committee of the Red Cross’s definition of an autonomous weapon — “any weapons that select and apply force to targets without human intervention,” perhaps the closest to a standard explanation that exists in this still-loose field — but it’s one small step removed from that capability, although it doesn’t target humans.
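To make that division of labor concrete — a human authorizes the engagement, and software handles everything after that — here is a minimal sketch in Python. It is not Fortem’s code or any real counter-drone system: the class names, the fixed radar coordinate standing in for a live track, and the engagement ranges are all invented for illustration.

```python
# Illustrative sketch only: a human-authorization gate in front of an otherwise
# autonomous intercept loop. Every name and number here is hypothetical.
from dataclasses import dataclass
import math


@dataclass
class RadarTrack:
    """Position of a suspected hostile drone, as reported by ground radar."""
    x: float
    y: float


class InterceptorSim:
    """Toy interceptor that closes on a radar track until it is within net range."""

    NET_RANGE = 5.0   # meters; fire the capture net inside this distance
    STEP = 20.0       # meters covered per guidance update

    def __init__(self) -> None:
        self.x, self.y = 0.0, 0.0

    def pursue(self, track: RadarTrack) -> bool:
        """Take one guidance step toward the track; return True once in net range."""
        dx, dy = track.x - self.x, track.y - self.y
        distance = math.hypot(dx, dy)
        if distance <= self.NET_RANGE:
            print("Net fired; target captured and lowered by parachute.")
            return True
        step = min(self.STEP, distance)          # bounded step along the line of sight
        self.x += step * dx / distance
        self.y += step * dy / distance
        return False


def operator_authorizes(track: RadarTrack) -> bool:
    """The human-in-the-loop gate: nothing happens without an operator's okay."""
    answer = input(f"Unknown drone at ({track.x:.0f}, {track.y:.0f}). Authorize hunt? [y/N] ")
    return answer.strip().lower() == "y"


if __name__ == "__main__":
    track = RadarTrack(x=120.0, y=80.0)
    if operator_authorizes(track):
        hunter = InterceptorSim()
        # After authorization, the loop runs with no further human input.
        while not hunter.pursue(track):
            pass
    else:
        print("Hunt not authorized; no action taken.")
```

The design point is where the prompt sits: once the operator says yes, the pursuit loop runs to completion on its own. The human Robertson describes is at the front of the process, not inside every step of it.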

How autonomous and semi-autonomous technology will operate in the future is up in the air, and the U.S. government will have to decide what limitations to place on its development and use. Those decisions may come sooner rather than later—as the technology advances, global conflicts continue to rage, and other countries are faced with similar choices—meaning that the incoming Trump administration may add to or change existing American policy. But experts say autonomous innovations have the potential to fundamentally change how war is waged: In the future, humans may not be the only arbiters of who lives and dies, with decisions instead in the hands of algorithms.

For some experts, that’s a net-positive: It could reduce casualties and soldiers’ stress. But others claim that it could instead result in more indiscriminate death, with no direct accountability, as well as escalating conflicts between nuclear-armed nations. Peter Asaro, spokesperson for an anti-autonomy advocacy organization called Stop Killer Robots and vice chair of the International Committee for Robot Arms Control, worries about the innovations’ ultimate appearance on the battlefield. “How these systems actually wind up being used is not necessarily how they’re built,” he said.

Many American startups like Fortem aim to ultimately sell their technology to the U.S. Department of Defense because the U.S. has the best-funded military in the world — and so, ample money for contracts — and because it’s relatively simple to sell weapons to one’s own country, or to an ally. Selling their products to other nations does require some administrative work. For instance, in the case of the DroneHunters deployed in Ukraine, Fortem made an agreement with the country directly. The export of the technology, though, had to go through the U.S. Department of State, which is in charge of enforcing policies on what technology can be sold to whom abroad.

The company also markets the DroneHunter commercially — to, say, cargo-ship operators who want to stay safe in contested waters, or stadium owners who want to determine whether a drone flying near the big game belongs to a potential terrorist or just to a kid who wants to take pictures.

Because Fortem’s technology doesn’t target people and maintains a human as part of the decision-making process, the ethical questions aren’t necessarily about life and death.


In a situation that involves humans, whether an autonomous weapon could accurately tell civilian from combatant, every single time, is still an open question. So is whether military leaders would program the weapons to act conservatively, and whether that programming would remain in place regardless of whose hands a weapon fell into.

A weapon’s makers, after all, aren’t always in control of their creation once it’s out in the world — something the Manhattan Project scientists, many of whom had reservations about the use of nuclear weapons after they developed the atomic bomb, learned the hard way.

… escalation could come from robots’ errors. Autonomous systems based on machine learning may develop false or misleading patterns.
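As a purely illustrative example of that failure mode — and nothing like any real targeting system — the short NumPy sketch below trains a simple linear model on two inputs that happen to be perfectly correlated in its training data. The model spreads its weight across both, including the meaningless one, and its accuracy drops once that incidental correlation breaks in the field. The “features” are invented placeholders.

```python
# Toy demonstration of a model learning a spurious pattern. All data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Training set: feature 0 is genuinely relevant to the label; feature 1 is an
# incidental detail that just happened to track the label during data collection.
relevant = rng.integers(0, 2, n)
incidental = relevant.copy()                       # perfectly correlated -- by accident
X_train = np.column_stack([relevant, incidental]).astype(float)
y_train = relevant.astype(float)                   # the true label depends only on feature 0

# Least-squares fit: with two identical columns, the learned weight gets split
# between the real signal and the meaningless one.
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
print("learned weights:", np.round(w, 2))          # roughly [0.5, 0.5]

# Deployment: the incidental correlation no longer holds.
relevant_new = rng.integers(0, 2, n)
incidental_new = rng.integers(0, 2, n)
X_new = np.column_stack([relevant_new, incidental_new]).astype(float)
predictions = (X_new @ w) > 0.5
accuracy = (predictions == relevant_new.astype(bool)).mean()
print(f"accuracy once the pattern breaks: {accuracy:.0%}")  # noticeably below 100%
```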

… The hypothetical escalation that could result relates to another kind of weapon of mass destruction: the nuclear weapon. Some countries interested in autonomy are the same ones that have atomic arsenals. If two nuclear states are in a conflict, and start using autonomous weapons, “it just takes one algorithmic error, or one miscommunication within the same military, to cause an escalating scenario,” said Hehir. And escalation could lead to nuclear catastrophe.

… Hehir and the Future of Life Institute are working toward international agreements to regulate autonomous arms. The Future of Life Institute and the Campaign to Stop Killer Robots have been lobbying and presenting to the U.N. Future of Life has, for instance, largely pushed for inclusion of autonomous weapons in the Convention on Certain Conventional Weapons — an international agreement that entered into force in 1983 to restrict or ban particular kinds of weapons. But that path appears to have petered out. “This is a road to nowhere,” said Hehir. “No new international law has emerged from there for over 20 years.”

And so advocacy groups like hers have moved toward trying for an autonomy-specific treaty — like the ones that exist for chemical, biological, and nuclear weapons. This fall, that was a topic for the U.N. General Assembly.

Hehir and Future of Life aren’t advocating for a total ban on all autonomous weapons. “One arm will be prohibitions of the most unpredictable systems that target humans,” she said. “The other arm will be regulating those that can be used safely, with meaningful human control.”

… with the current lack of international regulation, nation-states are going ahead with their existing plans. And companies within their borders, like Fortem, are continuing to work on autonomous tech that may not be fully autonomous or lethal at the moment but could be in the future. …

Sarah Scoles is a science journalist based in Colorado, and a senior contributor to Undark. She is the author of “Making Contact,” “They Are Already Here,” and “Countdown: The Blinding Future of 21st Century Nuclear Weapons.” https://undark.org/2024/11/26/unleashed-autonomous-weapons/

December 2, 2024 | USA, weapons and war
