"Human out of the Loop" Military Systems

The Loitering Attack Munition (LAM) is capable of flying over a defined area and autonomously seeking out targets.

"Human out of the loop" military systems are those that are capable of full operation without the involvement of a human operator. Such systems can also be referred to as "man out of the loop" or "fully autonomous." Although this term can be applied to any autonomous military system, it is more commonly applied to systems that are capable of operating in a offensive or defensive manner, as opposed to autonomous military reconnaissance systems. This term is likely not to be used to describe semi-autonomous military systems, such as Predator UAVs, which are incapable of preforming many critical operations without operator input.

History

During WWII, Norbert Wiener and other engineers were working on a new weapon that required no human intervention. The weapon was an anti-aircraft cannon that, as sketched in the example following the list below, was able to:

  1. Perceive the presence of an airplane
  2. Gather information about its speed and trajectory
  3. Predict its future position a few seconds later
  4. Decide whether to aim and when to fire the shell
  5. Carry out that decision. [1]

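In modern terms, the cannon's five steps form a sense-predict-decide-act loop. The sketch below is illustrative only: the linear-extrapolation model, the constants, and every name in it are assumptions made for this article, not Wiener's actual fire-control mathematics.

  # Illustrative sketch of the five-step fire-control loop described above.
  # The linear-extrapolation model and all thresholds are assumptions for
  # illustration, not Wiener's actual fire-control mathematics.
  from dataclasses import dataclass

  SHELL_FLIGHT_TIME_S = 3.0        # assumed time for a shell to reach the target
  MAX_ENGAGEMENT_RANGE_M = 5000.0  # assumed maximum range of the cannon

  @dataclass
  class Track:
      x: float                     # step 1: perceived position (m)
      y: float
      vx: float                    # step 2: estimated velocity (m/s)
      vy: float

  def predict(track: Track, dt: float) -> tuple[float, float]:
      """Step 3: predict the aircraft's position dt seconds in the future."""
      return track.x + track.vx * dt, track.y + track.vy * dt

  def decide_and_fire(track: Track) -> bool:
      """Steps 4 and 5: decide whether and where to fire, then carry it out."""
      aim_x, aim_y = predict(track, SHELL_FLIGHT_TIME_S)
      in_range = (aim_x ** 2 + aim_y ** 2) ** 0.5 <= MAX_ENGAGEMENT_RANGE_M
      if in_range:
          print(f"Firing at predicted position ({aim_x:.0f}, {aim_y:.0f})")
      return in_range

  decide_and_fire(Track(x=2000.0, y=1500.0, vx=-120.0, vy=0.0))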

After completing this weaponry project, Wiener stated that the emerging field behind such systems, which in WWII was referred to as cybernetics, had "enormous potential for good and for evil." [2]

While not truly autonomous systems, research on unmanned aircraft in the United States during the Cold War helped fuel the rise of autonomous military systems. [3] By the Gulf War in 1991, unmanned systems had found their way into the United States military. By 1995, "unmanned systems were integrated with the Global Positioning System (GPS)," which, as retired Air Force Colonel Tom Ernhart described, was "when it really came together." [4] Unmanned drones were able to operate much more efficiently, as the human operator could know exactly where the vehicle was at all times. Between 2002 and 2008, the United States' military budget increased by 74%, with significant amounts of the additional spending going towards research on robotic systems. [5]

Current Trends

Within the past decade there has been a tremendous increase in the number of autonomous military systems being developed and deployed into areas of conflict.[6] Unlike the previous generation of semi-autonomous systems, which were designed to take action only if given clearance by a human operator (human in the loop), many of these newer systems are capable of using lethal force without operator input. Currently, nearly all "human out of the loop" systems operate in a defensive role, in which their autonomous use of lethal force is justified by their employment in situations where human operators cannot react quickly enough to take meaningful action.

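The distinction between "human in the loop" and "human out of the loop" can be made concrete: in the former, the engagement routine blocks on explicit operator authorization, while in the latter, the system's own classification is the only gate. The Python sketch below illustrates that structural difference only; the names and the authorization mechanism are assumptions, not drawn from any real fire-control software.

  # Illustrative contrast between "human in the loop" and "human out of the
  # loop" engagement logic. All names and the authorization mechanism are
  # assumptions made for this sketch, not taken from any deployed system.

  class Operator:
      def authorizes(self, threat: str) -> bool:
          # Stand-in for a console prompt to a human operator.
          return input(f"Engage {threat}? [y/N] ").strip().lower() == "y"

  def fire_at(threat: str) -> None:
      print(f"Engaging {threat}")

  def engage_in_the_loop(threat: str, operator: Operator) -> bool:
      """Semi-autonomous: lethal action requires explicit operator clearance."""
      if operator.authorizes(threat):
          fire_at(threat)
          return True
      return False

  def engage_out_of_the_loop(threat: str, looks_hostile: bool) -> bool:
      """Fully autonomous: the system's own classification is the only gate."""
      if looks_hostile:
          fire_at(threat)
          return True
      return False

The only structural difference between the two routines is the source of the final go/no-go decision, which is exactly what the "in/out of the loop" terminology describes.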

Types of Systems

Sea-Based Automated Defenses

US Phalanx Close-In Weapons System (CIWS)

Sea-based automated defense systems were among the first modern automated defense systems to be developed. They are mounted on warships, use relatively simple fire control systems, and serve a variety of functions. In the United States Navy, every ship is equipped with Phalanx, a Close-In Weapons System (CIWS) that functions as a last-resort mechanism to destroy incoming cruise missiles aimed at the ship. [7] The US Navy has also developed the Aegis Combat System, which it has implemented on numerous cruisers and destroyers. The system, which was designed to protect a fleet against airborne threats, can track up to 100 threats at a time and shoot down those it deems most dangerous without human involvement. [8] Improvements to the initial system have led to the development of the Aegis Ballistic Missile Defense System. This system, currently implemented on 22 U.S. warships and 4 Japanese warships, has the ability to intercept short- and medium-range ballistic missiles fired at land-based targets. It can also track and provide advance warning of intercontinental ballistic missiles. [9]

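The track-and-prioritize behavior attributed to Aegis above can be illustrated with a toy example: score every tracked contact, then engage the highest-scoring contacts first. The scoring heuristic, field names, and capacity figures below are assumptions made for illustration; they do not reflect Aegis's actual doctrine or algorithms.

  # Toy illustration of the track-and-prioritize pattern described above:
  # score each tracked contact, then engage the highest-scoring ones first.
  # The heuristic and all constants are assumptions for this sketch only.
  from dataclasses import dataclass

  MAX_TRACKS = 100         # the article cites roughly 100 simultaneous tracks
  INTERCEPTORS_READY = 2   # assumed number of ready interceptors

  @dataclass
  class Contact:
      ident: str
      range_km: float
      closing_speed_ms: float   # positive means the contact is approaching

  def danger_score(c: Contact) -> float:
      """Crude heuristic: faster-closing, nearer contacts score higher."""
      if c.closing_speed_ms <= 0:
          return 0.0
      return c.closing_speed_ms / max(c.range_km, 0.1)

  def select_engagements(contacts: list[Contact]) -> list[Contact]:
      tracked = contacts[:MAX_TRACKS]
      hostile = [c for c in tracked if danger_score(c) > 0]
      hostile.sort(key=danger_score, reverse=True)
      return hostile[:INTERCEPTORS_READY]

  contacts = [
      Contact("sea-skimming missile", range_km=12.0, closing_speed_ms=850.0),
      Contact("patrol aircraft", range_km=80.0, closing_speed_ms=-50.0),
      Contact("slow drone", range_km=30.0, closing_speed_ms=60.0),
  ]
  for c in select_engagements(contacts):
      print(f"Engaging {c.ident} (score {danger_score(c):.1f})")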
Currently Deployed Systems:

  • Phalanx Close-In Weapons System (CIWS)
  • Aegis Combat System
  • Aegis Ballistic Missile Defense System

Ethical Implications

Of the automated defensive weaponry systems, the sea-based variants are perhaps the most likely to function as intended with minimal accidental human casualties, as the open ocean is a relatively simple environment, free from structures, terrain, and other non-ship objects that can confuse automated tracking systems. In spite of this, the possibility of error is still present. The true ethical issue with these sea-based systems lies in their ability to automatically destroy incoming projectiles. The primary problem with automated response is that incoming friendly aircraft are quite capable of flying in a manner that could be mistaken for an incoming projectile. If the right circumstances aligned, such as malfunctioning tracking systems, disabled Friend-or-Foe systems, or environmental interference, such a system could fire on friendly forces.

Stationary Land-Based Automated Defenses

US Patriot Missile Defense System

Much like their sea-based counterparts, stationary land-based defenses have the ability to automatically identify, track, and destroy incoming threats. However, unlike the sea-based defenses, these systems operate within a far more complex environment and have been developed to engage more types of targets, such as rockets and mortars. Israel's Iron Dome missile defense system is thought to be the most advanced system currently in use. It functions by detecting incoming missiles and reporting information about each missile's trajectory to a Battle Management and Control unit, which determines whether the missile is a threat. If the missile is determined to be a threat, an interceptor is launched to destroy it. Each battery in the system is effective against missiles fired from up to 70 kilometers away. [10] The system successfully shot down 85% of the 400 missiles fired by militant groups in Gaza that it identified as threats during the November 2012 conflict between these groups and Israel. Israel and many others worldwide considered the system a resounding success. [11]

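The detect, predict, and decide-to-intercept flow described above can be sketched in a few lines. This is a minimal illustration assuming a flat-earth ballistic model and a single protected area; it is not Iron Dome's actual Battle Management and Control logic, and all names and numbers are assumptions.

  # Minimal sketch of the detect / predict-impact / decide-to-intercept flow
  # described above. The ballistic model, coordinates, and protected-area
  # radius are assumptions for illustration, not Iron Dome's actual logic.
  import math

  G = 9.81  # gravitational acceleration, m/s^2

  def predicted_impact_x(x0: float, vx: float, vz: float, z0: float = 0.0) -> float:
      """Where (in a 1-D ground coordinate) a simple ballistic rocket lands."""
      t_flight = (vz + math.sqrt(vz ** 2 + 2 * G * z0)) / G
      return x0 + vx * t_flight

  def should_intercept(impact_x: float, protected_center: float,
                       protected_radius: float) -> bool:
      """Launch an interceptor only if the rocket threatens the protected area."""
      return abs(impact_x - protected_center) <= protected_radius

  impact = predicted_impact_x(x0=0.0, vx=300.0, vz=400.0)
  print("intercept" if should_intercept(impact, protected_center=25_000.0,
                                        protected_radius=3_000.0) else "ignore")

The design point mirrored here is the one the article describes: an interceptor is launched only for rockets judged to be threats, while predicted impacts outside the protected area are ignored.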
Currently Deployed Systems:

  • Iron Dome (Israel)
  • Patriot Missile Defense System (United States)

Ethical Implications

Relative to sea-based systems, there is an increased probability of these systems mistakenly firing due to the general increase in human activity over land. Of further note is the possibility of collateral damage even in the best-case scenario of the destruction of a hostile target. Many of these systems have precautions to prevent their own expended ordnance from falling back to earth and causing unintended damage.[12] However, as all of these systems destroy their targets mid-flight, it is possible for debris from the target to rain down on innocents below. Currently, these automated systems do not account for such circumstances, and it would be possible for a perfectly functioning system to indirectly kill innocents who happened to be in the path of falling target debris.

The difference between countries that have a land-based automated defense system and those that do not presents another potential ethical issue. As is the case in all military conflicts, technology plays an important role in determining the winner. In a case where one side has such a system in place and the other does not, the side with a functional missile defense system could potentially fire missiles at the other side indiscriminately, without having to worry about the other side's easiest means of retaliation. While Gaza militants fired first in the November 2012 conflict between Israel and Gaza militant groups, the number of deaths in this conflict shows the potential issues these kinds of technologies raise for the future. Israel, with its Iron Dome system in place, had only 5 citizens killed, while Gaza, without any kind of automated defense system, had at least 140 citizens killed by Israeli airstrikes. [13]

Mobile Land-Based Automated Military Systems

Israeli Trophy Active Protection System intercepting an RPG.

There exist numerous types of mobile land-based automated military systems with potential for a wide range of uses. Some automated defense systems are mounted on high-value ground vehicles, such as main battle tanks, that are likely targets of RPGs, missiles, or other anti-vehicle rounds. These vehicles commonly operate within populated areas and are often supported by friendly infantry. These conditions matter because of how such systems destroy incoming enemy rounds: by detonating missiles near the projectile, or by firing shotgun-like rounds so that the incoming projectile is peppered with shrapnel.

Currently Deployed Systems:

  • Trophy Active Protection System (Israel)

Ethical Implications

Due to their mobile, vehicle-mounted nature, operation in extremely complex environments, and intended targets, these are the most ethically ambiguous automated defense systems.[14] Such systems would likely be deactivated when operating near friendly forces, yet this may not be the case when operating in urban environments, where such vehicles are far more likely to be attacked. As such, if fired upon, such vehicle-mounted systems could cause a great deal of harm to bystanders while attempting to defend the vehicle.[15]

Ethical Implications of Automated Military Systems

Groups such as the International Committee for Robot Arms Control argue that automated military systems pose a significant threat to "peace and international security." [16] Such organizations argue that automation accelerates warfare and undermines people's ability to make rational decisions. Wendell Wallach, a member of the Yale Interdisciplinary Center for Bioethics, argues that "wars will be started very easily and with little cost." Others point out the difficulty humans have in distinguishing between soldiers and civilians and argue that automated systems will have similar issues. [17]

Responsibility

One of the central issues with autonomous systems is determining who is responsible for mistakes. Autonomous military systems share similar characteristics with autonomous vehicles, where the question is whether the consequences of an autonomous car's actions should fall on manufacturers, owners, or others. This debate is further complicated when cars are considered autonomous agents with the ability to learn and change themselves. Product liability law is one avenue of research on this dilemma; it places responsibility for defects, injuries, and other unforeseen negative effects of sold products on the entities involved in the manufacturing or selling processes. Because autonomous cars are a tangible product, these laws may provide a framework for how responsibility can be regulated. [18]

Trolley Problem

The trolley problem is a thought experiment used by ethicists and designers to illustrate the sacrifices one must make in problematic scenarios. The problem is as follows:

There is a runaway trolley heading down a track towards a fork. You are standing on the side of the track next to a lever that controls which track the trolley will take at the fork. On its current trajectory, the trolley will move straight along the track and kill five people that have been rendered immobile on the track. If you pull the lever, the trolley will switch tracks and kill one person that has also been rendered immobile on the other track. You may either:
  1. Do nothing, and let five people die.
  2. Switch the trajectory of the trolley, where it will kill one person.

What is the most ethical decision?

This thought experiment illustrates a dilemma facing autonomous military systems. Designers of such systems assume accountability for programming that will potentially take the life of a human being. Consequently, designers seeking to make ethically sound products must anticipate how such products will make decisions, such as whether five human lives are worth saving at the expense of one.

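To see how such a choice might be encoded, consider the deliberately naive sketch below. It is meant only to illustrate how much moral weight can end up in a single comparison; it is not a recommendation for how autonomous systems should decide.

  # A deliberately naive encoding of the trolley dilemma described above.
  # Illustrative only: real designers must justify whether such a purely
  # utilitarian comparison is an acceptable basis for lethal decisions.

  def pull_lever(deaths_if_nothing: int, deaths_if_switched: int) -> bool:
      """Purely utilitarian rule: act only if acting kills fewer people."""
      return deaths_if_switched < deaths_if_nothing

  print(pull_lever(deaths_if_nothing=5, deaths_if_switched=1))  # True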
Reported Incidents Involving Automated Weaponry:

  • During a 2007 South African National Defense Force firing exercise, a MK5 automated anti-aircraft cannon malfunctioned and opened fire on friendly soldiers, killing nine and seriously wounding 14.[19]
  • In 2003, a US Patriot missile site automatically fired upon a British fighter jet, killing both pilots.[20] Days later, another Patriot site locked onto a US fighter, causing the pilot to fire upon and destroy the site’s radar.[21]
  • In 1991, a Phalanx system on the USS Jarrett automatically fired on countermeasures deployed from the USS Missouri, causing four rounds to strike USS Missouri.[22] No injuries resulted.

Autonomous Military Systems in the Media

Popular culture tends to portray "human out of the loop" systems negatively, as in films such as the Terminator series, I, Robot, and Eagle Eye. In each of these films, the automated system malfunctions, and the machines built for defense end up threatening humanity with unrelenting violence.

References

  1. Bynum, T.W., "Milestones in the History of Information Ethics," in Himma and Tavani (2008), pp.25-48.
  2. Bynum, T.W., "Milestones in the History of Information Ethics," in Himma and Tavani (2008), pp.25-48.
  3. Singer, P.W., "Military Robots and the Laws of War." The New Atlantis. <http://www.thenewatlantis.com/publications/military-robots-and-the-laws-of-war>
  4. Singer, P.W., "Military Robots and the Laws of War." The New Atlantis. <http://www.thenewatlantis.com/publications/military-robots-and-the-laws-of-war>
  5. Singer, P.W., "Military Robots and the Laws of War." The New Atlantis. <http://www.thenewatlantis.com/publications/military-robots-and-the-laws-of-war>
  6. "Meeting the Increased Demand for Military Robots." Robotic Trends. 22 Apr 2009. Web. 09 Dec 2009. <http://www.roboticstrends.com/security_defense_robotics/entry/meeting_the_increased_demand_for_military_robots_an_interview_with_foster_m/>.
  7. Stoner, Robert. "The Story of the Phalanx Close-In Weapons System (CIWS)." NavWeaps . 30 Oct 2009. Web. 09 Dec 2009. <http://www.navweaps.com/index_tech/tech-103.htm>.
  8. "Aegis Combat System" Unofficial U.S. Navy Site. <http://navysite.de/weapons/aegis.htm>
  9. "Aegis Ballistic Missile Defense." Lockheed Martin. <http://www.lockheedmartin.com/us/products/aegis/aegis-bmd.html>
  10. "Iron Dome." Rafael. <www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=6&cad=rja&ved=0CE8QFjAF&url=http%3A%2F%2Fwww.rafael.co.il%2Fmarketing%2FSIP_STORAGE%2FFILES%2F6%2F946.pdf&ei=7UTAUMbQI5Hr0QHZ2oEw&usg=AFQjCNHho94dDKYLbqEj_RJMxg4ERGRmQA>
  11. Londoño, Ernesto. "For Israel, missile defense system represents breakthrough." The Washington Post. <http://www.washingtonpost.com/world/national-security/for-israel-iron-dome-missile-defense-system-represents-breakthrough/2012/12/01/24c3dc26-3b32-11e2-8a97-363b0f9a0ab3_story.html>
  12. "Active Protection Systems." Web. 5 Dec 2009. <http://www.globalsecurity.org/military/systems/ground/aps.htm>.
  13. Fisher, Max. "Who 'won' the Israel-Gaza conflict?" Washington Post. 21 November 2012. <http://www.washingtonpost.com/blogs/worldviews/wp/2012/11/21/won-won-the-israel-gaza-conflict/>
  14. "C-RAM: The Art of Winning the Peace?." Think Defense. 11 Apr 2009. Web. 11 Dec 2009. <http://www.thinkdefence.co.uk/2009/04/c-ram-the-art-of-winning-the-peace/>.
  15. "Trophy Active Protection System." Defense Update. Web. 09 Dec 2009. <http://defense-update.com/products/t/trophy.htm>.
  16. "Who We Are." International Committee for Robot Arms Control. <http://icrac.net/who/>
  17. Markoff, John. "War Machines: Recruiting Robots for Combat." The New York Times. <http://www.nytimes.com/2010/11/28/science/28robot.html?pagewanted=all>
  18. Villasenor, John. "Products Liability and Driverless Cars: Issues and Guiding Principles for Legislation." Brookings Institution. April 2014. <http://www.brookings.edu/research/papers/2014/04/products-liability-driverless-cars-villasenor#ftn26>
  19. Engelbrecht, Leon. "Did Software Kill Soldiers?." IT Web. 16 Oct 2007. Web. 05 Dec 2009. <http://www.itweb.co.za/index.php?option=com_content&view=article&id=6157&catid=96:defence-and-aerospace-technology >.
  20. "Patriot Missiles Seemingly Falter For Second Time; Glitch in Software Suspected." GolbalSecurity.org. 26 Mar 2003. The Washington Post, Web. 05 Dec 2009. <Patriot Missiles Seemingly Falter For Second Time; Glitch in Software Suspected>.
  21. Guynn, Jessica. "2 friendly fire incidents renew concern over reliability of Patriot missiles." GlobalSecurity.org. 29 Mar 2003. Knight Ridder/Tribune News Service, Web. 05 Dec 2009. <http://www.globalsecurity.org/org/news/2003/030329-patriot01.htm>.
  22. "TAB H -- Friendly-fire Incidents." GulfLINK. Web. 08 Dec 2009. <http://www.gulflink.osd.mil/du_ii/du_ii_tabh.htm >.
