Ethics Assignment
WHEN ROBOTS ATTACK!!
A Discussion on the Ethics of Autonomous Weapons by the Members of Team #27
In 2015, at the 24th International Joint Conference on Artificial Intelligence, a group of AI and robotics researchers put forth an open letter urging the scientific and military communities to come together to head off a new kind of arms race, one of artificial intelligence, by pledging to ban autonomous weapons. The letter drew popular support from several big-name signatories, including Stephen Hawking, Elon Musk, Steve Wozniak, and Cornell University President Martha E. Pollack. “Autonomous weapons,” the letter begins, “select and engage targets without human intervention” (Campaign to Stop Killer Robots, 2015). While memories of Philip K. Dick and Isaac Asimov flood the minds of science-fiction fans, the reality is slowly taking hold around us: unmanned, armed drones can eliminate targets around the globe with limited human involvement, sentry guns can identify and target aggressors autonomously, and today’s disarmament treaties struggle to gain traction among the biggest players on the world stage. With the odds stacked against it, a ban on “offensive autonomous weapons beyond meaningful human control” seems destined to fail in the long run.
It is important to note that any ban agreed upon by a group necessitates a common definition of autonomy in the context of autonomous weapons. Currently, under Directive 3000.09, the Department of Defense defines an autonomous weapon system as “a weapon system that, once activated, can select and engage targets without further intervention by a human operator. This includes human-supervised autonomous weapon systems that are designed to allow human operators to override operation of the weapon system, but can select and engage targets without further human input after activation” (Department of Defense, 2017). The phrase “select and engage” comes up frequently in discussions of autonomous weaponry. Engagement is a term that leaves little room for ambiguity, encompassing any act of general military aggression; “autonomous” engagement can then be defined likewise as an act of military aggression in which the human operator does not directly participate. Is a landmine an autonomous weapon, or simply its grandfathered predecessor? To get a clearer definition, we must consider the first qualifier for autonomy, selection, in tandem. Selecting a target implies first identifying candidates from an available pool; yet what counts as a “target” depends on the weapon system’s own interpretation. Mark Gubrud, an adjunct professor in the Curriculum in Peace, War & Defense at the University of North Carolina, addresses the issue: “Fortunately for present purposes, the word ‘target’ appears in the Pentagon’s definition only as an object of the verbs ‘select’ and ‘engage.’ Therefore, we can say that an object becomes a target when it is selected; before that it may be a ‘potential target,’ another term which appears in DoDD 3000.09… The word ‘select’ is where the serious issues hide, and where the clarity of DoDD 3000.09 begins to break down” (Gubrud, 2014).
Regardless, we can make out the picture the Department paints of a given autonomous weapons system, and of the decisions such a system must continually make: to interpret sensor data as an identified target, to select one of a number of identified targets, to engage the selected target, and to abort engagement when necessary. All or most of these traits constitute an offensive autonomous weapon beyond meaningful human control, the target of our potential blanket ban. While it is easy to imagine the dangers surrounding such platforms, predicting the effectiveness of a ban will depend on the strength of the opposing argument.
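To make the directive’s decision cycle concrete, the loop it describes can be sketched in code. The sketch below is purely illustrative, not a model of any real weapon system: every name (`control_loop`, `PotentialTarget`, `operator_override`) is hypothetical, and it captures only the abstract sense–select–engage–abort structure discussed above.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class PotentialTarget:
    """An object interpreted from sensor data; a 'potential target' in DoDD 3000.09 terms."""
    track_id: int
    confidence: float  # system's confidence that this is a valid military objective


def control_loop(
    sense: Callable[[], List[PotentialTarget]],
    select: Callable[[List[PotentialTarget]], Optional[PotentialTarget]],
    engage: Callable[[PotentialTarget], None],
    operator_override: Callable[[], bool],
) -> None:
    """One tick of the sense-select-engage cycle described by the directive."""
    candidates = sense()          # interpret sensor data as potential targets
    target = select(candidates)   # a potential target becomes a *target* here
    if target is None:
        return                    # nothing selected; no engagement
    if operator_override():
        return                    # abort engagement when a supervising human intervenes
    engage(target)                # engagement proceeds without further human input
```

Notice that, under this abstraction, a human-supervised system and a fully autonomous one share the same loop; the only difference is whether `operator_override()` is ever consulted in time, which is precisely where “meaningful human control” lives or dies.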
Returning to the landmine analogy, we can predict the shortcomings of a generalized autonomous weapons ban by examining the more specific Ottawa Treaty of 1997. The Mine Ban Treaty, as it is colloquially known, sought to “put an end to the suffering and casualties caused by anti-personnel mines” (Ottawa Treaty, 1997) worldwide. Landmines are generally classed as defensive autonomous weapons, as opposed to strictly offensive autonomous weapons such as drones and unmanned combat vehicles. To date, 164 states have signed the treaty, while 32 remain outside it, including Russia, China, and the US. While the majority of the abstaining states “do not actually use or produce antipersonnel mines,” according to the International Campaign to Ban Landmines, the organization that kickstarted the treaty, the US very much continues to deploy anti-personnel mines in active duty today and has stood by its justification. In June 2014, the Obama White House announced its intention to comply with the majority of the treaty’s provisions, with the glaring exception of the Korean Peninsula. “Even as we take these further steps, the unique circumstances on the Korean Peninsula and our commitment to the defense of the Republic of Korea preclude us from changing our anti-personnel landmine policy there at this time,” the White House announced in September of the same year (Obama admin, 2014). The case of Korea is unique and complex, and even as Supreme Leader Kim Jong-un makes overtures toward peace with President Moon Jae-in, the Demilitarized Zone between the two Koreas remains the most heavily militarized border in the world. On the Republic of Korea’s side, “there is one guard post every 50 meters, two guards per post, and twelve shifts per day. With about 5,000 guard posts, in theory there are 120,000 man-years spent on guard duty each year” (GlobalSecurity.org, 2011).
Setting the politics of the situation aside, this staffing burden becomes an economic need that the ROK military must meet, and one that can be optimized. Enter a practical example of an autonomous weapons platform: the SGR-A1 sentry gun, developed by Samsung Techwin. The sentry gun is a relatively simple, practical implementation of the autonomous weapons system definition laid out above, in which sensors identify targets before a firearm (specifically the Daewoo K3 light machine gun) is aimed and fired autonomously. While the total number of units produced is classified, the reported specifications for the platform are striking: “The SGR-A1 has a CCD and an infrared camera allowing it to detect and track targets at ranges of up to 4 km during the day and 2 km during nighttime. The SGR-A1 robot uses a low-light camera and pattern recognition software to distinguish humans from animals or other objects” (GlobalSecurity.org, 2011). The platform’s sensor apparatus has begun to parallel our own perceptual capabilities. With the need to defend the DMZ ongoing, South Korea faces a problem for which it may have developed a solution that could benefit millions: autonomous weaponry.
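The reported detect-then-classify pipeline can be sketched as a simple gate. Everything below is a hypothetical illustration: the function names and the 0.8 confidence threshold are invented for the sketch, and only the 4 km / 2 km ranges come from the cited specifications.

```python
# Hypothetical sketch of the detect-then-classify gating the SGR-A1
# description implies; names and thresholds are illustrative only.

DAY_RANGE_KM = 4.0    # reported daytime detection range (CCD camera)
NIGHT_RANGE_KM = 2.0  # reported nighttime detection range (infrared camera)


def in_detection_range(distance_km: float, is_daytime: bool) -> bool:
    """The effective tracking envelope differs between day and night sensors."""
    return distance_km <= (DAY_RANGE_KM if is_daytime else NIGHT_RANGE_KM)


def classify(pattern_score_human: float, threshold: float = 0.8) -> str:
    """Stand-in for the pattern-recognition step that separates humans
    from animals or other objects."""
    return "human" if pattern_score_human >= threshold else "non-human"


def should_track(distance_km: float, is_daytime: bool,
                 pattern_score_human: float) -> bool:
    """Only in-range detections classified as human become potential targets."""
    return (in_detection_range(distance_km, is_daytime)
            and classify(pattern_score_human) == "human")
```

The ethically loaded step is the classifier: everything downstream of `classify` treats its one-word verdict as ground truth, which is exactly the kind of delegation a ban on autonomous selection would target.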
Before our ban can be put to the floor, we have to consider and classify the differences between offensive and defensive autonomous weapon systems, along with the myriad other challenges that would affect the feasibility of such legislation. Because landmines and anti-personnel explosives are generally installed in a specific location and are immobile, it is easier to argue that they serve to protect property rather than to invade, and so fall into the ‘defensive’ category. The SGR-A1 is defined as a stationary system, as its normal operation does not involve locomotion, and a quick interpretation would likewise place it among defensive autonomous weapons; but we must also consider that these sentries can be deactivated and redeployed elsewhere. Here, our idea of autonomous weapons becomes confused. How do we determine which weapons are strictly defensive and which strictly offensive, and how does this distinction carry over into legislation?
Of course, when we dissect the term “offensive autonomous weapon,” the image of unmanned armed drones and other vehicles comes to mind most immediately. In many senses, these have been a boon to modern militaries: soldiers are continually removed from the battlefield, sparing lives in combat as well as sparing families from grief and future veterans from depression and PTSD. Unfortunately, this removal from combat may bring an emotional detachment as well. PAX, a founder of the Campaign to Stop Killer Robots, comments on the problem: “By being physically removed from the action, humans could become more detached from decisions to kill. A study in the psychology of killing shows that the absence of a psychological effect of combat can lower, or even neutralize, a soldier’s inhibition to kill” (Ekelhof & Struyk, 2014). President Obama expressed his concern for such a detachment at the highest executive level: “I think you could see, over the horizon, a situation in which, without Congress showing much interest in restraining actions with authorizations that were written really broadly, you end up with a president who can carry on perpetual wars all over the world, and a lot of them covert, without any accountability or democratic debate” (Devereaux & Emmons, 2016). Assuming US military technology advances accordingly, it is reasonable to expect that the capabilities of an offensive autonomous weapons fleet will only increase over time, potentially evolving into a necessity of future warfare that will be hard to give up as other states compete in the autonomous arms race. As a result, even if legislation to ban offensive autonomous weapons were laid on the international table, the US would likely have no qualms about going against the grain set by the majority of states, as seen in the case of Ottawa. Without the full, unified support of the UN Security Council member states, the efficacy of such a treaty would be very limited.
As a result of the numerous challenges associated with such a ban, any international legislation preventing the proliferation and use of offensive autonomous weapons would not be effective at limiting the potential harm of this new fleet of armaments. While states may agree to such a treaty on a global scale, total unity among the necessary superpowers seems unlikely to effect change in meaningful proportion. Special cases exist, and always will, for which global powers will expect exceptions to be made, and no nation will accept sitting out the current arms race while its rivals press ahead. These tools exist, and we can only limit who will use them. As Gubrud concludes, “The point of such negotiations should not be to draw a line defining autonomy; that has already been done… Philosophical clarity is not the issue, for we have already achieved that. From this point forward, it’s just old-fashioned horse trading” (Gubrud, 2014).
Bibliography:
Campaign to Stop Killer Robots. (2015, July 8). Autonomous weapons: An open letter from AI & robotics researchers. Retrieved from https://futureoflife.org/open-letter-autonomous-weapons/
Department of Defense. (2017, May 8). Autonomy in Weapon Systems (DoD Directive 3000.09). Washington, DC. Retrieved from https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf
Gubrud, M. (2014, May 9). Autonomy without mystery: Where do you draw the line? [Blog post]. Retrieved from http://gubrud.net/?p=272
Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction [Ottawa Treaty], Dec. 3, 1997.
Obama admin creates Korean landmine exception. (2014, September 23). AP News. Retrieved from https://apnews.com/83a4cdf73a544b84b318efbc2636afc8
GlobalSecurity.org. (2011, November 7). Samsung Techwin SGR-A1 sentry guard robot. Retrieved from https://www.globalsecurity.org/military/world/rok/sgr-a1.htm
Ekelhof, M., & Struyk, M. (2014). Deadly decisions: 8 objections to killer robots. PAX. Retrieved from https://www.paxforpeace.nl/publications/all-publications/deadly-decisions
Devereaux, R., & Emmons, A. (2016, October 3). Obama worries future presidents will wage perpetual, covert drone war. The Intercept. Retrieved from https://theintercept.com/2016/10/03/obama-worries-future-presidents-will-wage-perpetual-covert-drone-war/