March of the killer robots

Eugen Leitl eugen at leitl.org
Thu Jun 18 03:19:45 PDT 2009


http://www.telegraph.co.uk/scienceandtechnology/science/sciencenews/5543603/March-of-the-killer-robots.html


March of the killer robots

The development of mechanical soldiers and remote-controlled tanks and planes
is changing war for ever - but the moral consequences have often been
overlooked.

By Noel Sharkey

Published: 6:45PM BST 15 Jun 2009

Killing machine: one of America's unmanned Reaper hunter-killer aircraft (Photo: Philip Coburn)

It's the most realistic shoot-'em-up game ever. The player has a choice of
two planes: a Predator with two Hellfire missiles, or a Reaper with 14. The
action takes place in the Middle East, where you can attack villages and kill
the inhabitants with impunity. But don't bother looking for it in the shops:
to play this deadly game, you'll have to travel to Creech Air Force base in
the Nevada desert. That's because the planes are real, and so are the
casualties.

The first time a Predator made a kill was in Yemen, in 2002, when the CIA
used it to destroy a vehicle carrying an al-Qaeda leader and five of his
associates. The fleet now stands at around 200 craft, which have flown more
than 400,000 combat hours. The company that makes them, General Atomics,
can't keep up with the demand. The bigger, badder version - the Reaper
hunter-killer - is also flying off the shelves. There are now around 30 in
active service, with the first kill taking place in the mountains of
Afghanistan in October 2007.
 
In every field of warfare, mechanical soldiers are fighting alongside - or
instead of - human beings. Apart from unmanned combat air vehicles such as
Predators, the skies above Iraq, Afghanistan and Pakistan are filled with
drones carrying out surveillance operations. On the ground are between 6,000
and 12,000 robots, up from a mere 150 in 2004. Their role is mostly to
protect our soldiers by disrupting improvised explosive devices, or to carry
out surveillance of dangerous places such as caves and buildings.

Our image of such robots owes a great deal to films - most notably The
Terminator and Transformers, both of which have sequels out this month. But
the actual models being used are more like miniature tanks, similar to the
contraptions seen on the television series Robot Wars. The most popular is
the PackBot made by the US company iRobot, which is normally used for bomb
disposal. As the company started out making robotic vacuum cleaners known as
Roombas, the 18kg PackBot is sometimes jokingly referred to as the "Roomba of
doom" or "Doomba" b much to the displeasure of the firm's management, who
would clearly hope to keep the two brands separate.

Recently, iRobot joined forces with Taser International to mount the
allegedly non-lethal weapons on the "bots". But that pales in comparison with
the ordnance that comes with the Talon, a larger device made by
Foster-Miller, a US subsidiary of the British firm QinetiQ. It comes with
chemical, gas, temperature and radiation sensors and can be mounted with a
choice of grenade launcher, machine gun, incendiary weapon or .50-calibre
rifle. Its bigger brother, the MAARS robot, ups the stakes with a tanklike
turret.

Despite planned cutbacks in spending on conventional weapons, the Obama
administration is increasing its budget for robotics: in 2010, the US Air
Force will be given $2.13 billion for unmanned technology, including $489.24
million to procure 24 heavily armed Reapers. The US Army plans to spend $2.13
billion on unmanned vehicle technology, including 36 more Predators, while
the Navy and Marine Corps will spend $1.05 billion, part of which will go on
armed MQ-8B helicopters.

Of course, when the military describes such systems as "unmanned", it is
stretching the truth very slightly. At the moment, all the armed robots in
the Middle East are remote-controlled by humans - there is a "man in the
loop" to control them and to decide when and whether to apply lethal force.

But that makes very little difference to villagers in Waziristan, where there
have been repeated Predator strikes since 2006, many of them controlled from
Creech Air Force Base, thousands of miles away. According to reports coming
out of Pakistan, these have killed 14 al-Qaeda leaders and more than 600
civilians.

Such widespread collateral damage suggests that the human remote-controllers
are not doing a very good job of restraining their robotic servants. In fact,
the role of the "man in the loop" is becoming vanishingly small, and will
disappear. "Our decision power [as controllers] is really only to give a
veto," argues Peter Singer, a senior fellow at the Brookings Institution in
Washington DC. "And, if we are honest with ourselves, it is a veto power we
are often unable or unwilling to exercise because we only have a half-second
to react."

As Dyke Weatherington, deputy director of the Pentagon's Unmanned Aerial
Systems Task Force, points out: "There's really no way that a system that is
remotely controlled can effectively operate in an offensive or defensive air
combat environment. The requirement of that is a fully autonomous system."

Sure enough, plans are well under way to develop robots that can locate and
destroy targets without human intervention. There are already a number of
autonomous ground vehicles, such as the seven-ton "Crusher" developed by
DARPA, the US military's research agency. BAE Systems, a British defence
contractor, recently reported that it had "completed a flying trial which,
for the first time, demonstrated the co-ordinated control of multiple
Unmanned Aerial Vehicles autonomously completing a series of tasks". The
Israelis are already fielding autonomous radar-killer drones known as Harpy
and Harop, and the South Koreans use lethal autonomous systems to defend
their border with the North.

Many in the military are enthusiastic about such developments. "They don't
get hungry. They're not afraid. They don't forget their orders," says Dr
Gordon Johnson, of the Pentagon's Joint Forces Command. "Will they do a
better job than humans? Yes."

Dr Johnson insists that "there are no legal prohibitions against robots
making life-and-death decisions", adding: "The US military will have these
kinds of robots. It's not a question of if, it's a question of when."

The problem, however, is that no autonomous robots or artificial intelligence
systems have the necessary capabilities to discriminate between combatants
and innocents. Compared with the robots in the Terminator films, they suffer
from artificial stupidity. Allowing them to decide whom to kill falls foul of
the fundamental ethical precepts of the laws of war, which were set up to
protect civilians, the sick and wounded, the mentally ill and captives. We are
already overreaching the technology and stretching the laws of war.

"Unless we end war, end capitalism and end science, the use of robots will
only grow," says Peter Singer. "We are building and using machines with more
and more autonomy because they are viewed by militaries as useful for war,
and viewed by companies as profitable business." Spending on Unmanned Aerial
Vehicles is expected to exceed tens of billions of dollars over the next 10
years, and more than 40 countries - including Russia and China - now have
their own programmes.

Amid this robotic arms race, there is a sliver of hope. Professor Ron Arkin,
of the Georgia Institute of Technology, believes that humans do not have the
time to make rational ethical decisions in the modern battlefield. "There
appears to be little alternative," he says, "to the use of more dispassionate
autonomous decision-making machinery." He has funding from the US Army for
research on how to programme ethical rules into robots to stop them causing
excessive collateral damage. But this does not get around the problem of how
to discriminate between innocents and combatants - and Arkin admits that the
technology to fully support his system may not be available for 25 years.

The problem is that it is not just a matter of developing adequate sensors.
In complex wars, complex human reasoning is often needed to decide when it is
appropriate to kill. Robots do not feel anger or seek revenge - but they also
don't have sympathy, empathy, remorse or shame. Nor can they be held
accountable for their actions. In subcontracting our wars to our robotic
creations, we are abdicating moral responsibility, too.

Noel Sharkey is Professor of Artificial Intelligence and Robotics at the
University of Sheffield 




