Ask one technologist and he or she might say that lethal autonomous weapons — machines that can select and destroy targets without human intervention — are the next step in modern warfare, a natural evolution beyond today’s remotely operated drones and unmanned ground vehicles.
Others will decry such systems as an abomination and a threat to International Humanitarian Law (IHL), or the Law of Armed Conflict.
The U.N. Human Rights Council has, for now, called for a moratorium on the development of killer robots. But activist groups like the International Committee for Robot Arms Control (ICRAC) want to see this class of weapon completely banned. The question is whether it is too early — or too late — for a blanket prohibition. Indeed, depending how one defines “autonomy,” such systems are already in use.
From stones to arrows to ballistic missiles, human beings have always tried to curtail their direct involvement in combat, said Ronald Arkin, a computer scientist at Georgia Institute of Technology. Military robots are just more of the same. With autonomous systems, people no longer do the targeting, but they still program, activate and deploy these weapons.
“There will always be a human in the kill chain with these lethal autonomous systems unless you’re making the case that they can go off and declare war like the Cylons,” said Arkin, referring to the warring cyborgs from “Battlestar Galactica.” He added, “I enjoy science fiction as much as the next person, but I don’t think that’s what this debate should be about at this point in time.”
Peter Asaro, however, is not impressed with this domino theory of agency. A philosopher of science at The New School, in New York, and co-founder of ICRAC, Asaro contends that autonomous weapons, by their nature, remove “meaningful human control” from decisions to use deadly force. As such, killer robots would take on the role of moral actors, a role he doubts they are capable of fulfilling under International Humanitarian Law. That, he says, is why these systems must be banned.
Choosing targets, ranking values
According to the Law of Armed Conflict, a combatant has a duty to keep civilian casualties to a minimum. This means using weapons in a discriminating fashion and making sure that, when civilians do get killed in action, their incidental deaths are outweighed by the importance of the military objective — a calculation that entails value judgments.
In terms of assessing a battlefield scene, no technology surpasses the ability of the human eye and brain. “It’s very aspirational to think that we’ll get a drone that can pick a known individual out of a crowd. That’s not going to happen for a long, long, long, long time,” said Mary “Missy” Cummings, director of MIT’s Humans and Automation Laboratory, and a former F-18 pilot.
Still, a fully autonomous aircraft would do much better than a person at, say, picking up the distinctive electronic signature of a radar signal or the low rumbling of a tank. In fact, pilots make most of their targeting errors when they try to do it by sight, Cummings told Live Science.
As for a robot deciding when to strike a target, Arkin believes that human ethical judgments can be programmed into a weapons system. In fact, he has worked on a prototype software program called the Ethical Governor, which promises to serve as an internal constraint on machine actions that would violate IHL. “It’s kind of like putting a muzzle on a dog,” he said.
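The idea of an internal constraint layer can be sketched in miniature. The snippet below is a hypothetical, drastically simplified illustration, not Arkin’s actual Ethical Governor: every candidate engagement must pass explicit checks modeled loosely on the IHL principles of discrimination and proportionality before the system is allowed to act. All names, fields and thresholds are invented for illustration.

```python
# Illustrative sketch of a "governor"-style constraint layer.
# All names and rules here are hypothetical, not a real weapons system.
from dataclasses import dataclass

@dataclass
class Engagement:
    target_is_military: bool        # discrimination: is the target a lawful one?
    expected_civilian_harm: float   # estimated incidental harm (arbitrary units)
    military_advantage: float       # estimated value of the objective (same units)

def governor_permits(e: Engagement) -> bool:
    """Return True only if the proposed engagement passes every constraint."""
    if not e.target_is_military:
        return False  # discrimination: never engage a non-military target
    # proportionality: incidental harm must not outweigh the military advantage
    return e.expected_civilian_harm <= e.military_advantage

# A strike on a lawful target with low collateral risk is permitted;
# the same strike with disproportionate expected civilian harm is vetoed.
print(governor_permits(Engagement(True, 0.2, 1.0)))   # True
print(governor_permits(Engagement(True, 5.0, 1.0)))   # False
```

The point of the sketch is that the governor acts as a veto, the “muzzle” in Arkin’s analogy: it does not choose targets, it only blocks actions that fail its encoded rules. The hard part, which critics seize on, is producing trustworthy estimates for fields like `expected_civilian_harm` in the first place.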
As expected, the Ethical Governor has drawn considerable skepticism, and Arkin himself supports “taking a pause” on building lethal autonomous weapons. But he doesn’t agree with a wholesale ban on research “until someone can show some kind of fundamental limitation, which I don’t believe exists, that the goals that researchers such as myself have established are unobtainable.”
Of robots and men
Citing the grisly history of war crimes, advocates of automated killing machines argue that, in the future, these cool and calculating systems might actually be more humane than human soldiers. A robot, for example, will not gun down a civilian out of stress, anger or racial hatred, nor will it succumb to bloodlust or revenge and go on a killing spree in some village.