
THE ETHICAL AND SOCIAL IMPLICATIONS OF ROBOTICS

EDITED BY

Patrick Lin, Keith Abney, and George A. Bekey

7 Killing Made Easy: From Joysticks to Politics

To fight from a distance is instinctive in man. From the first day he has worked to this end, and continues to do so.
Ardant du Picq1

Robots will change the way that wars are fought by providing distant "stand-ins" for combatants. Military robots are the fruit of a long chain of weapons development designed to separate fighters from their foes. Throughout the history of war, weapon technology has evolved to enable killing from ever-increasing distances. From stones to pole weapons to bows and arrows to cannon to aerial bombing to jet-propelled missiles, killing has become ever easier.

Not only have distance weapons led to a more effective killing technology, but attacking from a distance also gets around two of the fundamental obstacles that warfighters must face: fear of being killed and resistance to killing. Fear is one of the greatest obstacles to a soldier's effectiveness in battle (Daddis 2004). It is obvious that the greater the distance from the enemy, the less part fear will play in the action. Many battles throughout history have been lost by men bolting in panic as fear swept through the ranks, often from a misunderstanding of the action (Holmes 2003). Army historian Brigadier General Marshall ([1947] 2000), following after-action interviews with soldiers in the Pacific and European theaters of operation during World War II, claimed that only about 15 to 20 percent of riflemen were either able or willing to fire. This means that around 80 percent of the U.S. infantry in World War II were not firing their weapons when they could see their enemy, or were firing over the enemy soldiers' heads. There have been some very sharp criticisms of Marshall's methods, and the exact percentages may not be correct, but the nature of his finding, that many soldiers are unwilling to kill, has received general support from other analyses of historical battles. In his book Acts of War, Holmes (2003) argues that the hit rates in a number of battles show that many soldiers were not prepared to fire directly at the enemy even when they were in sight.


A group of British soldiers entirely surrounded by Zulu warriors fired at point-blank range, but had a hit rate of only one hit to every thirteen rounds fired. At the battle of Wissembourg in 1870, the French fired 48,000 rounds at the Germans advancing across open fields, but only managed to hit 404 of them. In the Vietnam War, it was estimated that over 50,000 bullets were fired for every enemy soldier killed. Holmes also tells the World War I story of a Lieutenant George who, to stop his men firing in the air, patrolled the trenches, hitting them with his sword and telling them to fire low.

The killologist, Lieutenant Colonel David Grossman, argues that this was "not cowardice, but really a compulsion of ordinary men not to kill" (Grossman 1995). Grossman gives several examples in his book, On Killing, from the U.S. Civil War of low hit rates from close-distance musket fire. In one instance, at the Battle of Gettysburg, of the 27,574 muskets retrieved from the battlefield, 90 percent were still loaded and many were multiply loaded; one musket had even been loaded twenty-three times without being fired. Grossman also points out that killing distance can be psychological as well as physical. He cites Clausewitz and du Picq for expounding at length on how the majority of deaths in battle occurred when the victors chased the losing side in retreat. He suggests that Alexander the Great lost fewer than seven hundred men in his battles because there never was a victorious enemy to pursue his men; his soldiers never retreated. Grossman argues that across the battlefields of the U.S. Civil War, the majority of casualties and deaths were inflicted by artillery. In his view, the greater the distance the artillery is from its targets, the greater its effectiveness will be. We see the same phenomena with increasingly high-altitude bombing and the use of long-range missiles.

Now we are embarking on new territory, where the new battlefield robots cannot be considered as distance weapons in the traditional sense. Yes, a cruise missile can be considered to be a robot, for after it is launched it can alter its course using built-in GPS. But it has a single purpose: to strike and destroy a target. The new battlefield robots are different. They can stand in directly for soldiers or pilots at greater distances. These robots are coming into their own as a new form of killing machine that may forever alter how war is waged. Unlike missiles or other projectiles, robots can carry multiweapon systems into the theater of operations; how they are to be deployed in the theater need not be decided in advance, as they can act flexibly once in place. Eventually, they may be able to take the place of human combatants without risk to the lives of their operators. Killing will become ever easier, but not without moral risk.

7.1 The Ultimate Distance Weapon Systems

Nowadays, so many robots are being deployed in the Middle East conflicts that it is difficult to get an accurate estimate of their numbers. The figures quoted range from 6,000 to 12,000. Even the lower figure shows the dramatic increase in their use from 2004, when there were only 150, and it testifies to their military usefulness. The robots have mainly been deployed for dull, dirty, and dangerous tasks, such as disrupting or detonating improvised explosive devices and for surveillance of dangerous environments such as caves and buildings that may be housing insurgents. Roadside bombs are the most common killer of allied soldiers, and robots are used to search cars or prod suspected packages. Robots have saved many lives.

The first blood drawn by a ground robot was actually by the small and relatively cheap four-wheeled MARCbot, which looks like a toy truck with a camera stalk (Singer 2009). Its main purpose was to inspect underneath cars and trucks for explosives. But one U.S. unit had a clever idea. Its soldiers started loading MARCbots with Claymore antipersonnel mines and went looking for insurgents hiding in alleyways to ambush them. When they found any, they killed them by exploding the mine. But this was an unofficial use of the robot, and it took time to surmount some of the legal and practical difficulties of using special-purpose armed ground robots. Nonetheless, if there is an opportunity to use armed robots to separate soldiers from danger, commanders are likely to use them.

In June 2007, the first three armed Talon SWORDS (Special Weapons Observation Reconnaissance Detection System) were sent to Iraq at a reported cost of $200,000 each. These can be equipped with M240 or M249 machine guns, Barrett 0.50 caliber rifles, 40mm grenade launchers, or antitank rocket launchers. As far as it is possible to tell, they were not deployed in action. One explanation given by Kevin Fahey (the U.S. Army's executive officer for ground forces) was that when the SWORDS was first switched on, the gun had begun to move when it should not have moved (Sofge 2008). Another explanation, given to the Defense Review journal by U.S. Special Forces, is that SWORDS is jokingly referred to as the TRV, or Taliban Re-supply Vehicle, because "Taliban fighters will hide and wait for the weaponized Talon robot/SWORDS to roll by, sneak up on it, tip it over, remove the machine gun (or any other weapon) and ammo from it, and then use it/them against U.S. forces" (Crane 2008).

The SWORDS was essentially a test of concept to try the robots with soldiers on the battlefield. It has influenced the development of the next generation of armed ground robots, which is well under way. More powerfully armed robots, such as the MAARS (Modular Advanced Armed Robotic System) from Foster-Miller, are set to replace the SWORDS.

But it is the robot planes and drones that are currently the ultimate in distance weapons systems. Missions are flown by "pilots" of the 432nd Air Expeditionary Wing at Creech Air Force Base in the Nevada desert, thousands of miles away from the action. The operators sit at game consoles, making decisions about when to apply lethal force. Sometimes, all the operator has to do is to decide (in a very short space of time) whether or not to veto the application of force.


The planes can be operated around the clock, as it is easy for pilots to take a break from "battle" at any time, or even go home to have dinner with their children. According to some reports, the contrast between home life and the battlefield within the same day is apparently causing a new kind of battle stress that has not been seen before.

The Unmanned Combat Air Vehicle (UCAV), the MQ-1 Predator, which carries a payload of two Hellfire missiles, flew 250,000 hours up until June 2007. As a measure of its military usefulness, it clocked an additional 150,000 hours in the Iraq and Afghan conflicts in the subsequent fourteen months, and passed the one-million-hour mark in 2010.

In October 2007, the Predator was joined by the much larger and more heavily armed MQ-9 Reaper. The MQ-9 Reaper carries a payload of up to fourteen Hellfire missiles, or a mixture of missiles and bombs. These "hunter-killer" unmanned aerial vehicles (UAVs) have conducted many decapitation strikes2 since they were first deployed in Afghanistan in October 2007. There is a demand to get many more into the air as soon as possible. The number of Reapers flying over the conflict zones reached twenty during their first year of operation (2007-2008), a year ahead of schedule, and there has been a push from the U.S. Air Force (USAF) for General Atomics to increase production levels above the current four per month. Further millions were subsequently added to the USAF budget for training more nonaerial pilots.

There was no change of direction under the Obama administration. While there were cutbacks to conventional weapons, the robot programs received more funding than predicted. In 2010, the Air Force aimed to spend $2.13 billion on unmanned technology, with $489.24 million to procure twenty-four new heavily armed Reapers. The Army planned to spend $2.13 billion on unmanned vehicle technology. This included the purchase of thirty-six more unmanned Predators. The U.S. Navy and Marine Corps targeted $1.05 billion for unmanned vehicles, including armed MQ-8B Fire Scouts.

Outside of these conventional forces, there is considerable Central Intelligence Agency (CIA) use of the drones for decapitation strikes. Indeed, it was the CIA that carried out the first missile strike from an armed Predator in Yemen in 2002, and it has now effectively got its own air force flying over Somalia, Yemen, and Pakistan. The legality of such attacks was questioned at the UN General Assembly meeting in October 2009 by Philip Alston, UN special rapporteur on extrajudicial executions. He made a request for U.S. legal justification for how the CIA is accountable for the targets that they are killing. The United States turned down the request, claiming that these are covert operations.

A rebuttal by Harold Koh, legal adviser, U.S. Department of State, insisted that "U.S. targeting practices, including lethal operations conducted by UAVs, comply with all applicable law, including the laws of war" (Koh 2010).


However, there is no independent means of determining how the targeting decisions are being made. A commander of a force belonging to a state acting against the United States would be a legitimate target. Intelligence errors made in the Vietnam War and its aftermath concerning the standard of evidence used for assassinations led to Presidential Order 12333, prohibiting the assassination of civilians. And it is now unclear what type and level of evidence is being used to sentence nonstate actors to death by Hellfire attack, without right to appeal or right to surrender. It sits behind the cloak of national secrecy. A subsequent report by Alston (2010) to the UN General Assembly discusses drone strikes as violating international and human rights laws, because both require transparency about the procedures and safeguards in place to ensure that killings are lawful and justified: "a lack of disclosure gives states a virtual and impermissible license to kill." The debate continues.

All of the armed drones are currently "man in the loop" combat systems. This makes very little difference to the collaterally damaged villagers in Waziristan, where there have been repeated Predator strikes since 2006. No one knows the true figures for civilian casualties, but according to reports coming from the Pakistani press, drone attacks have killed fourteen al-Qaeda leaders, and this may have been at the cost of over six hundred civilians (Sharkey 2009b).

7.2 In, On, or Out of the Loop

There is now massive spending going on, and plans are well under way to take the human "out of the loop," so that robots can operate autonomously to locate their own targets and destroy them without human intervention (Sharkey 2008a). This is high on the military agenda of all the U.S. forces: "the Navy and Marine Corps should aggressively exploit the considerable war-fighting benefits offered by autonomous vehicles (AVs) by acquiring operational experience with current systems, and using lessons learned from that experience to develop future AV technologies, operational requirements, and systems concepts" (Committee on Autonomous Vehicles in Support of Naval Operations, National Research Council 2005). There are now a number of autonomous ground vehicles, such as DARPA's "Unmanned Ground Combat Vehicle and Perceptor Integration System," otherwise known as the Crusher (Fox News 2008). BAE Systems recently reported in an industry briefing to United Press International (2008) that they have "completed a flying trial which, for the first time, demonstrated the coordinated control of multiple UAVs autonomously completing a series of tasks."

The move to autonomy is clearly required to fulfill the current U.S. military plans. Teleoperated systems are more expensive to manufacture and require many support personnel to run them. One of the main goals is to use robots as force multipliers, so that one soldier on the battlefield can be a nexus for initiating a large-scale robot attack from the ground and the air. Clearly, one soldier cannot remotely control several robots alone.


In the U.S. Air Force's Unmanned Aircraft Systems Flight Plan 2009-2047, there was also discussion of swarm technologies: "SWARM technology will allow multiple MQ-Mb aircraft to cooperate in a variety of lethal and nonlethal missions at the command of a single pilot" (United States Air Force 2009, 39). Such a move will result in decisions being made by the swarm; human decision making will be too slow, as a person is not able to react to the control of several aircraft at once.

There is also a considerable push to shrink the role of "the man in the loop." To begin with, autonomous operation will be mainly for tasks such as take-off, landing, and refueling. "As unmanned drones react in micro- or nano-seconds, the humans will no longer be 'in the loop' but rather 'on the loop,' monitoring the execution of certain decisions. Simultaneously, advances in AI will enable systems to make combat decisions and act within legal and policy constraints, without necessarily requiring human input" (United States Air Force 2009, 41).

The main ethical problems arise because no autonomous robots or artificial intelligence systems have the necessary sensing properties to allow for discrimination between combatants and innocents. This is also understood clearly by some in the military. Major Daniel Davis, a combat veteran of Iraq 1991 and Afghanistan, rejects the suggestion that within the next 12-plus years technology could permit life-and-death decisions to be made by algorithms: "A machine cannot sense something is wrong and take action when no orders have been given. It doesn't have intuition. It cannot operate within the commander's intent or use initiative outside its programming. It doesn't have compassion and mercy" (2007).

Davis quotes Colonel Lee Fetterman, training and doctrine capabilities manager for the Future Combat Systems (FCS), who has a high regard for the unmanned systems that he used in Afghanistan to search caves and buildings. However, he has strong reservations about robots making decisions about killing. "The function that robots cannot perform for us, that is, the function we should not allow them to perform for us, is the kill function. Men should decide to kill other men, not machines," he said (Davis 2007). "This is a moral imperative that we ignore at great peril to our humanity. We would be morally bereft if we abrogate our responsibility to make the life-and-death decisions required on a battlefield as leaders and soldiers with human compassion and understanding. This is not something we would do. It is not in concert with the [...] spirit" (Davis 2007).

Allowing robots to make decisions about who to kill could fall foul of the fundamental ethical precepts of a just war under jus in bello, as enshrined in the Geneva and Hague conventions and the various protocols set up to protect the innocent: only combatants/warriors are legitimate targets of attack; all others, including civilians, service workers, and retirees, should be immune from attack. In fact, the protections even extend to combatants that are wounded, have surrendered, or are mentally ill (but see also Ford 1944).

These protections have been in place for many centuries. Thomas Aquinas, in the thirteenth century, developed the "doctrine of double effect." Essentially, there is no moral penalty for killing innocents during a conflict provided that (1) you did not intend to do so, or (2) killing the innocents was not a means to winning, or (3) the importance to the defense of your nation is proportionally greater than the number of civilian deaths.

There are many circumstances in a modern war where it is extremely difficult, if not impossible, to fully protect noncombatants. For example, in attacking a warship, some noncombatants, such as chaplains and medical staff, may be unavoidably killed. Similarly, but less ethically justifiable, it is difficult to protect the innocent when large explosives are used near civilian populations, or when missiles get misdirected. In modern warfare, the equivalent of the doctrine of double effect is the principle of proportionality, which "requires that the anticipated loss of life and damage to property incidental to attacks must not be excessive in relation to the concrete and direct military advantage expected to be gained" (Petraeus and Amos 2006).

In the heat of battle, both the principles of discrimination and proportionality can be problematic, although their violation requires accountability and can lead to war crimes tribunals. But the new robot weapons, which could violate both of these principles, cannot be held accountable for their decisions (Sharkey 2008b). You cannot punish an inanimate object. It would be very difficult to allocate responsibility in the chain of command or to manufacturers, programmers, or designers, and being able to allocate responsibility is essential to the laws of war.

The problem is exacerbated further by not having a specification of "civilianness" (see Roberts, forthcoming, for the difficulties in trying to find a definition of a civilian). A computer can compute any given procedure that can be written down in a programming language. We could, for example, give the computer on a robot an instruction such as, "if civilian, do not shoot." This would be fine, if and only if there was some way to give the computer a precise definition of "civilian." We certainly cannot get one from the laws of war that could provide a machine with the necessary information. The 1949 Geneva Convention requires the use of common sense, while the 1977 Protocol 1 essentially defines a "civilian," in the negative sense, as someone who is not a combatant:

1. A civilian is any person who does not belong to one of the categories of persons referred to in Article 4 A (1), (2), (3), and (6) of the Third Convention and in Article 43 of this Protocol. In case of doubt whether a person is a civilian, that person shall be considered to be a civilian.

2. The civilian population comprises all persons who are civilians.


3. The presence within the civilian population of individuals who do not come within the definition of civilians does not deprive the population of its civilian character. (Protocol 1 Additional to the Geneva Conventions, 1977 (Article 50))
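To see why an instruction like "if civilian, do not shoot" is not a real specification, consider a minimal sketch of what such a rule would look like in code. Everything here is hypothetical: the predicate names and sensor fields are invented for illustration, and the point is that the crucial test, is_civilian, has nothing computable to stand on; Protocol 1 only defines a civilian negatively, as whoever is not a combatant, and tells us to presume civilian status in case of doubt.

```python
# Hypothetical illustration only: the rule is trivial to write,
# but its key predicate cannot be grounded in the laws of war.

from dataclasses import dataclass

@dataclass
class SensorTrack:
    """What a robot's sensors can actually report (per the text):
    roughly whether something is human-sized and warm, not much more."""
    is_probably_human: bool
    carrying_object: bool  # a rifle? an ice cream? sensors cannot say

def is_combatant(track: SensorTrack) -> bool:
    # No reliable test exists: uniforms, intent, and context are not
    # observable properties of a sensor track. This stub is the gap.
    raise NotImplementedError("no computable definition available")

def is_civilian(track: SensorTrack) -> bool:
    # Protocol 1, Article 50: a civilian is anyone who is not a combatant,
    # and in case of doubt the person must be presumed to be a civilian.
    try:
        return not is_combatant(track)
    except NotImplementedError:
        return True  # doubt, so presume civilian

def may_engage(track: SensorTrack) -> bool:
    # "If civilian, do not shoot": easy to state, impossible to ground.
    return not is_civilian(track)
```

Written this way, the rule collapses to "never shoot," because the doubt clause dominates whenever the combatant test cannot be computed, which is exactly the missing specification the text describes.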

And even if there were a clear computational definition of civilian, we would still need all of the relevant information to be made available from the sensing systems. All that is available to robots are sensors, such as cameras, infrared sensors, lasers, temperature sensors, ladars, and so on. These may be able to tell us whether something is a human or at least an animal, but not much else. In the labs there are systems that can identify someone's facial expression or that can recognize faces, but they do not work well on real-time moving people. And even if they did, how useful could they be in the fog of war? British teenagers beat the surveillance cameras simply by wearing hooded jackets.

In a conventional war where all of the enemy combatants wear clearly marked uniforms (or better yet, radio frequency tags), the problems might not be very different from those faced in conventional methods of bombardment. But modern warfare is increasingly making battle with insurgents the norm, and, in this case, sensors would not help in discrimination. Knowing whom to kill would have to be based on situational awareness and having a theory of mind, that is, understanding someone else's intentions and predicting their likely behavior in a particular situation. Humans understand one another in a way that machines cannot. Cues can be very subtle, and there are an infinite number of circumstances where lethal force is inappropriate. Just think of children being forced to carry empty rifles, or insurgents burying their dead.

7.3 An Ethical Code for Robots?

The military does consider the ethical implications of civilian deaths from autonomous robots, although this is not their primary concern. Their role is to defend their country in whatever way is required. In the United States, all weapons and weapons systems are subjected to a legal review to ensure compliance with the Law of Armed Conflict (LOAC). There are three main questions to be asked before a weapon is authorized:

1. Does the weapon cause suffering that is needless, superfluous, or disproportionate to the military advantage reasonably expected from the use of the weapon? It cannot be declared unlawful merely because it may cause severe suffering or injury.

2. Is the weapon capable of being controlled, so as to be directed against a lawful target?

3. Is there a specific treaty provision or domestic law prohibiting the weapon's acquisition or use?


Despite these rules, we have already seen a considerable number of collateral casualties resulting from the use of semi-autonomous weapon systems. The argument rests on proportionality, as stated in the first question, but there is no quantitative measure that can objectively determine military costs against civilian deaths. It is just a matter of political argument, as we have seen, time and time again.

Another concern is the question of what constitutes a new weapon. Take the case of the Predator UCAV. It was first passed for surveillance missions. Then, when it was armed with Hellfire missiles, the Judge Advocate General's office said that because both the Predator and Hellfires had previously been passed, their combination did not need to be (Canning et al. 2004). Thus, if we have a previously used autonomous robot and a previously used weapon, it may be possible to combine them without further permission.

Armed autonomous robots could also be treated in a legally similar way to submunitions such as the BLU-108 developed by Textron Defense Systems.3 The BLU-108 parachutes to near the ground, where an altitude sensor triggers a rocket that spins it upward. It then releases four Skeet warheads at right angles to one another. Each warhead has a dual-mode (active and passive) sensor system: the passive infrared sensor detects hot targets, such as vehicles, while the active laser sensor provides target profiling. They can hit hard targets with penetrators, or destroy soft targets by fragmentation.

The BLU-108 is not like other bombs because it has a method of target discrimination. If it had been developed in the 1940s or 1950s, there is no doubt that it would have been classified as a robot, and even now it is debatably a form of robot. The Skeet warheads have autonomous operation and use sensors to target their weapons. The sensors provide discrimination between hot and cold bodies of a certain height, but, like autonomous robots, they cannot discriminate between legitimate targets and civilians. If BLU-108s were dropped on a civilian area, they would destroy buses, cars, and trolleys. Like conventional bombs, discrimination between innocents and combatants requires accurate human targeting judgments. A key feature of the BLU-108 is that it has built-in redundant self-destruct logic modes that largely leave battlefields clean of unexploded warheads, and it is this that keeps it out of the 2008 international treaty banning cluster munitions (Convention on Cluster Munitions).

The likely use of robot technology over the next twenty-five years in warfare would, at best, be like using the BLU-108 submunition; in other words, it can sense a target, but cannot discriminate innocent from combatant (Sharkey 2008c). The big difference with the types of autonomous robots currently being planned and developed for aerial and ground warfare is that they are not perimeter-limited. The BLU-108 has a limited footprint all around. By way of contrast, mobile autonomous robots are limited only by the amount of fuel or battery power they can carry. They can potentially travel long distances and move out of line-of-sight communication.


In a recent sign of these future weapons, the U.S. Air Force sent out a request for proposals for "Guided, Smart Sub-munitions": "This concept requires a CBU (Cluster Bomb Unit) munition, or UAV capable of deploying guided smart sub-munitions that have the ability to engage and neutralize any targets of interest. The goal for the sub-munitions is very challenging when considering the mission of addressing moving and fixed targets of interest. The sub-munition has to be able to reacquire the target of interest it is intended to engage" (United States Air Force 2008). This sounds very much like an extended version of the BLU-108 that could pursue moving targets. Most worrying are the words "reacquire the target of interest." If a targeted truck happened, for example, to overtake a school bus, the weapons might acquire the bus as the target rather than the truck.

A naval presentation by Chief Engineer J. S. Canning, subtitled "The Difference between 'Winning the War' and 'Winning the Peace,'" discusses a number of the ethical issues involved in the deployment of autonomous weapons. The critical issue for Canning is that armed autonomous systems should have the ability to identify the legality of a target. His answer to the ethical problems is unnervingly simple: "let men target men" and "let machines target other machines" (Canning 2006). This would narrow the target set, and, Canning believes, may overcome the political objections and legal ramifications of using autonomous weapons.

While machines targeting machines sounds like a great ethical solution on the drawing table, the reality is that it belongs to mythical artificial intelligence, not real-world AI. In most circumstances, it would not be possible to pinpoint a weapon without also pinpointing the person using it, or even to discriminate between weapons and nonweapons. I have the mental image of a little girl being blown up because she points her ice cream at a robot to see if it would like some. And what if the enemy tricks the robot into killing innocent civilians by, for example, placing weapons on a school or hospital roof? Who will take the responsibility?

A different approach, suggested by Ronald Arkin from the Georgia Institute of Technology, is to equip the robotic soldier with an artificial conscience (Arkin and Moshkina 2007). Arkin had funding from the U.S. Army4 to work on a method for designing an ethical autonomous robot. At first glance, this sounds like a move in the right direction. At the very least, it has got the army to consider the ethical problems raised both by the deployment of autonomous machines and even those of the soldier on the ground. Another of Arkin's aims, that he addresses in a public survey, and it is a good one, is "to establish what is acceptable to the public and other groups, regarding the use of lethal autonomous systems" (Arkin and Moshkina 2007).

Despite the good intentions, I have grave doubts about the outcome of this project. No idea is presented about how this could be made to work reliably, and that is a key issue when it comes to human lives.


It is not just about having the right abilities, or being able to make appropriate discriminations. A robot armed with lethal weapons would have to make decisions in very complex circumstances that are entirely unpredictable. It turns out that the plan for this conscience is to create a mathematical decision space consisting of constraints, represented as prohibitions and obligations derived from the laws of war and rules of engagement (Arkin 2009). Essentially, this consists of complex conditionals (if-then statements). Reporting on Arkin's work, The Economist (2007) gives the example of a Predator UAV on its way to kill a car full of terrorists. If it sees the car overtaking a bus full of school children, it will wait until it has overtaken them before blasting the car into oblivion. But how will the robot discriminate between a bus full of school children and a bus full of guards? Admittedly, this is not one of the tasks that Arkin cites, but it is still the kind of ethical decision that an autonomous robot would have to make. The shadow of mythical AI looms large in the background.
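As a way of making the if-then character of such a "conscience" concrete, here is a minimal sketch of a constraint check of the kind described above: a list of prohibitions and obligations evaluated before any engagement is allowed. This is not Arkin's system; the constraint names and the situation fields are hypothetical, and the predicates they depend on (is the second vehicle a school bus or a bus full of guards?) are exactly the judgments the chapter argues cannot be computed.

```python
# Hypothetical sketch of a constraint-based pre-engagement check:
# prohibitions forbid firing, obligations must all hold before firing.

from typing import Callable, Dict, List

Situation = Dict[str, bool]  # e.g., {"target_identity_confirmed": True, ...}

# Prohibitions: if any of these are true, the engagement is forbidden.
PROHIBITIONS: List[Callable[[Situation], bool]] = [
    lambda s: s.get("civilians_in_blast_radius", True),   # default to worst case
    lambda s: s.get("target_is_surrendering", False),
    lambda s: s.get("protected_site_nearby", False),       # school, hospital, etc.
]

# Obligations: all of these must be true before the engagement is allowed.
OBLIGATIONS: List[Callable[[Situation], bool]] = [
    lambda s: s.get("target_identity_confirmed", False),
    lambda s: s.get("engagement_authorized_by_roe", False),
]

def engagement_permitted(situation: Situation) -> bool:
    """Return True only if no prohibition fires and every obligation holds."""
    if any(rule(situation) for rule in PROHIBITIONS):
        return False
    return all(rule(situation) for rule in OBLIGATIONS)

# The if-then machinery is trivial; nothing here says how a value such as
# "civilians_in_blast_radius" could ever be sensed reliably in the field.
print(engagement_permitted({"target_identity_confirmed": True,
                            "engagement_authorized_by_roe": True,
                            "civilians_in_blast_radius": False}))
```

The conditional logic itself is easy to write; everything contentious is hidden in how the boolean inputs would ever be established, which is the chapter's point.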

Arkin believes that a robot could be more ethical than a human because its ethics are strictly programmed into it and it has no emotional involvement with the action. The justification for this comes from a worrying survey, published by the Office of the Surgeon General (Mental Health Advisory Team 2006), that tells of the aberrant ethical behavior and attitudes of many U.S. soldiers and marines serving in Iraq. Arkin holds that a robot cannot feel anger or a desire for revenge, but neither can it feel sympathy, empathy, or remorse. Surely, a better way to spend the money would be on more thorough ethical training and monitoring of the troops.

Even if a robot was fully equipped with all of the rules from the Laws of War, and had, by some mysterious means, a way of making the same discriminations as humans make, it could not be ethical in the same way as is an ethical human. Ask any judge what they think about blindly following rules and laws. In most real-world situations, these are a matter of interpretation.

Arkin's anthropomorphism in saying, for example, that robots would be more humane than humans does not serve his cause well. To be humane is, by definition, to be characterized by kindness, mercy, and sympathy, or to be marked by an emphasis on humanistic values and concerns. These are all human attributes that are not appropriate in a discussion of software for controlling mechanical devices. More recently, Arkin has taken to talking about adding sympathy and guilt to robots. However, the real value of the work would be to add safety constraints to autonomous weaponized robots to help to cut down the number of civilian casualties. This is easy to understand, and may help the work to progress in a clearer way. The anthropomorphic terms create a more interesting narrative, but they only confuse the important safety issues and generate false expectations.

The number of possible moral and ethical problems in a military operations theater could be infinite, or at least run into extremely large numbers.


Many different circumstances can happen simultaneously and give rise to unpredictably chaotic robot behavior. From a perhaps cynical perspective, the "robot soldier with a conscience" could at some point be used by military public relations to allay opposition, amounting to lots of talk while innocent civilians keep on dying: "Don't worry, we'll figure out how to use the technology discriminately eventually."

As Davis says about other defense experts talking up robot warfare, "such statements are dangerous, because men disconnected from the realities of warfare could sway decision-makers regarding future force decisions and composition" (Davis 2007). On the same basis, the "artificial conscience" idea could perhaps also be employed as an argument to shift the burden of responsibility for collateral fatalities from the chain of command onto inanimate weapons.

No civilized person wishes to see their country's young soldiers die in foreign lands. The robot is certainly a great defensive weapon, especially when it comes to roadside bombs. It is the moral responsibility of military commanders to protect their soldiers, but there are a number of far-reaching consequences of "risk-free" war that we need to consider.

• Having more robots to reduce the "body bag count" could mean fewer disincentives to start wars. In the United States, since the Vietnam War, body-bag politics has been a major inhibitor of military action. Without bodies coming home, citizens will care a lot less about action abroad, except in terms of the expense to the taxpayer. It could mean, for example, that with greatly reduced public and political opposition (the so-called Dover test5), it is a lot easier for the military to start and run more "defensive" wars. This is an ethical and moral dilemma that should be engaging international thinking.

• Armstrong warns about the use of robots in "the last three feet" and asks if the United States really wants to have a robot represent the nation as a strategic corporal. You can't hope to win hearts and minds by sticking armed robots in the face of an occupied population (Armstrong 2007).

• It has been suggested that a country engaged in risk-free war will put its own civilian population more at risk from terrorist attacks at home and abroad (Kahn 2002).

• It is more like policing, a term used for the Kosovo war, but policing requires a different set of rules than war; for example, collateral civilian deaths are unacceptable for policing. Those suffering from policing need to be demonstrably morally guilty (Kahn 2002).

• There will clearly be proliferation (the indications are already there), and so the risk-free state could be short lived. As Chief Engineer Canning has pointed out: "What happens when another country sees what we've been doing, realizes it's not that hard, and begins to pursue it, too, but doesn't have the same moral structure we do? We will see a number of countries around the world begin to develop this technology on their own, but possibly without the same level of safeguards that we might build in. We could be facing our own distorted image on the battlefield" (Canning 2005).

A related concern is that when we say robot weapons save lives, we implicitly mean only the lives of our soldiers and their allies. Of course, in the middle of a vicious war, that is what we want. But let us not forget that such sentiments allow us to hide from ourselves the fact that the robot weapons could take a disproportionate toll of lives on the other side, including many innocent civilians. Autonomy could greatly increase the number of fatal errors.

7.4 The Problem of Proportionality

According to the laws of war, a robot could potentially be allowed to make lethal errors, providing that the noncombatant casualties were proportional to the military advantage gained. But how is a robot supposed to calculate what is a proportionate response? There is no sensing or computational capability that would allow a robot to make such a determination. As mentioned for the discrimination problem described earlier, computer systems need clear specifications in order to operate effectively. There is no known metric to objectively measure "needless, superfluous, or disproportionate suffering." It requires human judgment.

No clear objective means are given in any of the laws of war for how to calculate what is proportionate (Sharkey 2009a). The phrase "excessive in relation to the concrete and direct military advantage expected to be gained" is not a specification. How can such values be assigned, and how can such calculations be made? What could the metric be for assigning value to killing an insurgent, relative to the value of noncombatants, particularly children, who could not be accused of willingly contributing to insurgency activity? The military says that it is one of the most difficult decisions that a commander has to make, but that acknowledgment does not answer the question of what metrics should be applied. It is left to a military force to argue as to whether or not it has made a proportionate response, as has been evidenced in the recent Israeli-Gaza conflict (Human Rights Watch 2009).

Uncertainty needs to be a factor in any proportionality calculus. Is the intelligence correct, and is there really a genuine target in the kill zone? The target value must be weighted by a probability of presence/absence. This is an impossible calculation unless the target is visually identified at the onset of the attack. Even then, errors can be made. The investigative journalist Seymour Hersh gives the example of a man in Afghanistan being mistaken for bin Laden by CIA Predator operators. A Hellfire was launched, killing three people who were later reported to be local men scavenging in the woods for scrap metal (Hersh 2002, 66). This error was made using a robot plane with a human in the loop. There is also the problem of relying on informants.


The reliability of the informant needs to be taken into account, and so does that of each link in the chain of information reaching the informant before being passed on to the commander/operator/pilot. There can be deliberate deception anywhere along the information chain, as was revealed in investigations of Operation Phoenix, the U.S. assassination program, after the Vietnam War. As Hersh pointed out, many of the thousands on the assassination list had been put there by South Vietnamese officials for personal reasons, such as erasing gambling debts or settling quarrels.
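To see what the probability weighting mentioned above would demand in practice, here is a hedged sketch of the naive expected-value comparison a machine would have to perform. All of the numbers and function names are hypothetical; the chapter's argument is precisely that no objective source exists for any of these inputs, neither the value assigned to a target, nor the value of noncombatant lives, nor the probability that the target is actually present.

```python
# Hypothetical illustration: the arithmetic of a proportionality check is
# trivial; supplying defensible inputs for it is what is impossible.

def expected_military_advantage(target_value: float,
                                p_target_present: float) -> float:
    """Target value discounted by the probability the target is really there."""
    return target_value * p_target_present

def expected_civilian_harm(civilian_value: float,
                           expected_civilian_casualties: float) -> float:
    """'Cost' of the strike in noncombatant lives, on some assumed scale."""
    return civilian_value * expected_civilian_casualties

def strike_is_proportionate(target_value: float,
                            p_target_present: float,
                            civilian_value: float,
                            expected_civilian_casualties: float) -> bool:
    # "Not excessive in relation to the ... military advantage expected"
    # reduced to a comparison of two made-up quantities.
    return (expected_military_advantage(target_value, p_target_present)
            >= expected_civilian_harm(civilian_value,
                                      expected_civilian_casualties))

# Every argument below is an unanswerable question dressed up as a number:
# how many innocent lives "equal" one insurgent, and who decides?
print(strike_is_proportionate(target_value=10.0,
                              p_target_present=0.6,
                              civilian_value=1.0,
                              expected_civilian_casualties=3.0))
```

Written out like this, the calculation makes the objection visible: the comparison runs on any inputs at all, which is exactly why it cannot bear the moral weight placed on it.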

It is also often practically impossible to calculate a value for actual military advantage. This is not necessarily the same as the political advantage of creating a sense of military success by putting a face to the enemy to rally public support at home, or to boost the morale of the troops. Obviously there are gross calculations at the extreme, such as a military force carrying weapons sufficient to kill the population of a large city. Then, it could be possible to balance the number of lives taken against the number saved. Military advantage, at best, results in deterrence of the enemy from acting in a particular way, disruption of the social, political, economic, or military functions (or a combination of these), and destruction of the social, political, economic, or military functions (or a combination) (Hyder 2004, 5). Proportionality decisions should be based on the likely differences in military outcome if the mission risking the killing of innocents had not been taken (Chakwin, Voelkel, and Scott 2002).

Despite the impossibility of proportionality calculations, military commanders at war have a political mandate to make such decisions on an almost daily basis. Commanders have to weigh the circumstances before making a judgment, and ultimately it will be a subjective metric. Clearly the extremes of wiping out a whole city to eliminate even the highest-value target, say Osama bin Laden, would be out of the question. So there must be some subjective estimates about just how many innocent people killed equal the military value of the successful completion of a mission.

Yes, humans do make errors and can behave unethically, but they can be held accountable. Who is to be held responsible for the lethal mishaps of a robot? Robert Sparrow argues that it certainly cannot be the machine itself, and so it is not legitimate to use automated killing machines (Sparrow 2007). There is no way to punish a robot. We could just switch it off, but it would not care any more about that than my washing machine would care. Imagine telling your washing machine that if it does not remove stains properly, you will break its door off. Would you expect that to have any impact on its behavior? There is a long causal chain associated with robots: the manufacturer, the programmer, the designer, the Department of Defense, the generals or admirals in charge of the operation, and the operator. It is thus difficult to allocate responsibility for deliberate war crimes or even mishaps.


We discussed at the outset how killing is made easier for combatants when the distance between them and their enemies is increased. Soldiers throughout history have found it difficult to kill at close range when they can clearly see whom they are killing. Distance, whether physical or psychological, helps to overcome the twin problems of fear of being killed and resistance to killing that particularly dog the infantry.

Robots are set to change the way that wars are fought by providing flexible "stand-ins" for combatants. They provide the ultimate distance targeting that allows warriors to do their killing from the comfort of an armchair in their home country, even thousands of miles away from the action. Robots are developing as a new kind of fighting method different from what has come before. Unlike missiles or other projectiles, robots can carry multiweapon systems into the theater of operations, and act flexibly once in place. Eventually, they may be able to operate as flexibly as human combatants, without risk to the lives of the operators that control them. However, as we discussed, there is no such thing as risk-free warfare. Apart from the moral risks discussed, asymmetrical warfare can also lead to more insurgency and terrorist activity, threatening the citizens of the stronger power.

The biggest changes in warfare will come with the further development of autonomous military robots that can decide who, where, and when to kill, without human involvement. There are no current international guidelines or even discussions about the uses of autonomous robots in warfare. These are needed urgently, since robots simply cannot discriminate between innocents and combatants.

If there was a strong political will to use autonomous robot weapons, or even a serious threat to the state that has them, then legal arguments could be constructed that leave no room for complaints.7 This is especially the case if they could be released somewhere where there is a fairly high probability that they will kill a considerably greater number of enemy combatants (uniformed and nonuniformed) than innocents (i.e., the civilian death toll was not disproportionate to the military advantage).

At the very least, it should be discussed how to limit the range and action of autonomous robot weapons before their inevitable proliferation (forty-three countries now have military robot programs). Even if all of the elements discussed here could be accommodated within the existing laws of war, their application needs to be thought through properly, and specific new laws should be implemented not just to accommodate their use, but to constrain it as well. We don't know how autonomous robots will affect military strategy of the future, or if they will lead to more subjugation of weak nation-states and less public pressure to prevent wars.


Notes

1. See du Picq 1946. The book was compiled from notes left by Colonel Ardant du Picq after he was killed in battle by a Prussian projectile in 1870.

2. Decapitation is a euphemism for assassination of suspected insurgent leaders. The word decapitation was used to indicate cutting off the head (leader) from the body of the insurgents.

3. Thanks to Richard Moyes of Landmine Action for pointing me to the BLU-108 and to Westerberg and Robert Buckley from Textron Defense Systems for their careful reading and comments on my description.

4. Contract #W911NF-06-1-0252 from the U.S. Army Research Office.

5. Dover, Delaware, is the U.S. Air Force base where the bodies of soldiers are returned from the front line in flag-draped coffins. The Dover test concerns how much the electoral chances of a national political administration are affected by the numbers of dead.

6. Bugsplat software and its successors have been used to help calculate the correct weapon to use to destroy a target and calculate the impact. It is only used to help in the human decision-making process, and it is unclear how successful this approach has been in limiting civilian casualties.

7. Regardless of treaties and agreements, any weapon that has been developed may be used if the survival of a state is in question. The International Court of Justice Nuclear Weapons Advisory Opinion (1996) decided that it could not definitively conclude that in every circumstance the threat or use of nuclear weapons was axiomatically contrary to international law; see … and Lewis 2005.

References

Alston, Philip. 2010. Report of the Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions. UN Human Rights Council, fourteenth session, A/HRC/14/24/Add.6, May 28.

Arkin, Ronald. 2009. Governing Lethal Behavior in Autonomous Robots. Boca Raton, FL: Chapman and Hall/CRC Press.

Arkin, Ronald, and Lilia Moshkina. 2007. Lethality and autonomous robots: An ethical stance. Paper presented at the IEEE International Symposium on Technology and Society, June, Las Vegas.

Armstrong, Matthew. 2007. Unintended consequences of unmanned warfare. Presented at the Proteus Management Group Futures Workshop, U.S. Army War College, Carlisle, Pennsylvania, August 15.

Canning, John. 2005. A definitive work on factors impacting the arming of unmanned vehicles. Dahlgren Division, Naval Surface Warfare Center, report NSWCDD TR-05/36.

Canning, John. 2006. A concept of operations for armed autonomous systems. Presentation, Naval Surface Warfare Center, Dahlgren Division.

Canning, John, G. W. Riggs, O. T. Holland, and C. J. Blake. 2004. A concept for the operation of armed autonomous systems on the battlefield. Paper presented at the Association for Unmanned Vehicle Systems International conference, Anaheim, CA, August 17.

Chakwin, Mark, Dieter Voelkel, and Enright Scott. 2002. Leaders as targets. Joint Forces Staff College seminar #08, Norfolk, VA.

Committee on Autonomous Vehicles in Support of Naval Operations, National Research Council. 2005. Autonomous Vehicles in Support of Naval Operations. Washington, DC: The National Academies Press.

Crane, David. 2008. G-NIUS Guardium UGV: World's first operational autonomous security robot. Defense Review, August 23. <http://www.defensereview.com/g-nius-guardium-ugv-worlds-first-operational-autonomous-security-robot/> (accessed April 3, 2011).

Daddis, Gregory. 2004. Understanding fear's effect on unit effectiveness. Military Review (July-August): 22-27.

Davis, Daniel. 2007. Who decides: Man or machine? Armed Forces Journal. <http://www.armedforcesjournal.com/2007/11/3036753> (accessed April 2, 2011).

du Picq, Ardant. 1946. Battle Studies. Part 1, chapter 1. Harrisburg, PA: Military Service Publishing Co.

The Economist. 2007. Robot wars: An attempt to build an ethical robotic soldier. April 17.

Ford, John C. 1944. The morality of obliteration bombing. Theological Studies 5: 261-309.

Fox News. 2008. Pentagon's "Crusher" robot vehicle nearly ready to go. February 27. <http://www.foxnews.com/story/0,2933,332755,00.html> (accessed November 27, 2010).

Grossman, Dave. 1995. On Killing: The Psychological Cost of Learning to Kill in War and Society. New York: Little, Brown and Co.

Hersh, Seymour. 2002. Manhunt: The Bush administration's new strategy in the war against terrorism. New Yorker (December).

Holmes, Richard. 2003. Acts of War: The Behaviour of Men in Battle. London: Cassell.

Human Rights Watch. 2009. Precisely Wrong: Gaza Civilians Killed by Israeli Drone-Launched Missiles. New York: Human Rights Watch.

Hyder, Victor. 2004. Decapitation Operations: Criteria for Targeting Enemy Leadership. Monograph report. Fort Leavenworth, KS: School of Advanced Military Studies, United States Army Command and General Staff College.

Kahn, Paul. 2002. The paradox of riskless war. Philosophy and Public Policy Quarterly 22 (3).

