Banality of Simulated Evil

From SI410
Revision as of 01:43, 11 September 2011 by WikiSysop (Talk | contribs) (1 revision)

The problems associated with treating simulated evil like actual evil were demonstrated in the 1983 film "WarGames."

"The banality of simulated evil: designing ethical gameplay" is a scholarly article by Miguel Sicart. From the abstract, it "offers an analytical description of the ethics of game design and its influence in the ethical challenges computer games (and virtual environments) present."

Summary of Article

File:Adolf Eichmann.jpg
Eichmann on trial, from Wikimedia Commons

Sicart begins by summarizing a 1963 article in The New Yorker by Hannah Arendt, who documented the trial of Otto Adolf Eichmann, a bureaucrat in the government of Nazi Germany during the Holocaust. Acting as a cog in the bureaucratic machine, Eichmann never directly witnessed the horrible consequences of his actions; he simply worked hard to make sure his tasks were completed. It is that separation of act and consequence that defines the banality of evil in Arendt's account. Per Sicart: "the Banality of Evil can be defined as a designed limitation of ethical agency in complex multi-agent, hierarchical systems." Eichmann thus faced no real moral decision to make, since he would never see the final outcome of his work, the death of millions of people, and so he continued it dutifully and efficiently. By analogy, no single drop of water feels personally responsible for creating a flood.

This leads to Sicart's question: "Are computer games systems of this kind?" The question is focused on popular violent and suggestively themed games such as Grand Theft Auto and Defcon. After analyzing or referencing several other games and virtual environments, violent or otherwise, he draws the following conclusions:

  • Games limit their players' ethical agency by isolating them from judging the ethicality of certain actions. Dependence, or in some cases trust, is placed on a closed system, the game itself, for that judgment.
  • Consequently, players' knowledge of the ethicality of their actions, present and future, is constrained to the limits of the game, which he declares a type of Infosphere.
  • Players' preconceived positions on moral issues remain in play while they play games.

Sicart goes into considerable detail about how this works, making multiple references to aspects of Information Ethics. He then ends his report with five criteria for ethical gameplay design:

  • Create an ethically relevant game world.
  • Have a world that reacts to player actions, rather than one that merely quantifies the players' being.
  • Rigorously test the ethical capability of the player.
  • Have players/characters of similar complexity and ability to adapt.
  • Challenge players' creativity by introducing new scenarios that open and/or close doors.

Mechanics

File:BanalChart.jpg
Illustration of Sicart's concept of Information Ethics w.r.t. Gaming

In the context of Information Ethics, Sicart defines a video game as an Information System and/or Infosphere constrained by its design. The player is one agent among many within the game. Any scripting within the game manipulates the game's moral state. Both internal and external values play a critical role, as games typically attempt to be a simulation (unless they are pure fiction).

He defines two Gradients of Abstraction:

  1. Constrained to "direct interaction between agents and the state machine by means of game mechanics." (Procedural GoA)
  2. "The game system as simulation and agents as ethical agents." (Semantic GoA)

Here, the Semantic GoA is composed from the Procedural GoA, and all Levels of Abstraction can then be viewed through these two.
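This layering can be loosely sketched in code: a procedural layer exposes raw game mechanics acting on the state machine, and a semantic layer wraps those same mechanics with the ethical meaning the simulation assigns to them. The class and method names below are illustrative assumptions, not from Sicart's paper; this is a minimal sketch of the composition, not an implementation of his model.

```python
# Illustrative sketch (hypothetical names, not Sicart's): two layers of abstraction.

class GameState:
    """Bare state machine; the procedural layer sees only this."""
    def __init__(self):
        self.npc_health = {"guard": 100}

class ProceduralLayer:
    """Procedural GoA: direct interaction between agents and the
    state machine by means of game mechanics."""
    def __init__(self, state):
        self.state = state

    def attack(self, target, damage):
        # A mechanic is just a state transition; it carries no moral meaning.
        self.state.npc_health[target] = max(0, self.state.npc_health[target] - damage)
        return self.state.npc_health[target]

class SemanticLayer:
    """Semantic GoA: the game system as simulation, agents as ethical
    agents. Composed from the procedural layer beneath it."""
    def __init__(self, mechanics):
        self.mechanics = mechanics
        self.moral_log = []

    def attack(self, target, damage):
        remaining = self.mechanics.attack(target, damage)
        # The same state transition, re-described in ethical terms.
        event = "killed" if remaining == 0 else "harmed"
        self.moral_log.append((event, target))
        return remaining

world = SemanticLayer(ProceduralLayer(GameState()))
world.attack("guard", 100)
print(world.moral_log)  # [('killed', 'guard')]
```

The point of the sketch is that the procedural layer alone, like Eichmann's paperwork, records only mechanical transitions; ethical description exists only at the higher gradient built on top of it.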

Variants

The following are other aspects of the model laid out above. The parallels may not be exact, but each example features an element of separation between the agent(s) and the receiver of the agent(s)' actions. Often that separation involves the use of technology.

  • In the military: the recent proliferation of autonomous or remote-controlled aerial drones and robots to perform military operations has created a new dimension on the battlefield. As an example, the recent Predator drone attacks in Pakistan carried out by the United States military present a double-edged sword. On one edge, the enemy is dealt with without risking a single soldier on the ground; on the other, there is the risk of civilian casualties, which Pakistan has reported. Given how both countries handle news, the reality likely will not be known for a while yet. A clearer example would be the ground robots sent out to detonate explosives, where the ethicality of the action is very much known.
  • In some combat situations where human reaction time is not sufficient to perform the required tasks, fully autonomous "human out of the loop" military systems are employed. Such systems pose a range of ethical dilemmas, as the decision to use deadly force can be left solely to the system's governing algorithms.
  • Telecommunications: phone phreaking and phone calls made to sexually harass or threaten a target are a couple of examples.
    • Obscene Phone Calls: An article from the Psychiatric Times explains possible rationales for people who make sexually harassing phone calls. It notes that it is difficult to trace the caller for arrest, and that 1 in 5 complaints filed at police departments are from women reporting men making such calls. Such behavior becomes clinical when it persists for six months and involves a "preoccupation" with the activity. "Most obscene callers are troubled, immature men who are not dangerous." The anonymity is key, as it provides a layer of defense; that is precisely why this fits into the discussion of the banality of evil and technology in general.
    • Phreaking: the hacking of the telephone system, which grew into an obscure fad amongst the technically adept in the latter half of the 20th century. The hobby was popularized by an article in Esquire magazine and is itself a parent of modern-day computer hacking. Coincidentally, two of the earliest well-known phreakers were the founders of Apple Computer: Steve Jobs and Steve Wozniak. Later, several hacking groups developed, some eventually shut down by the Secret Service's Operation Sundevil. One of the most well-known hackers, Kevin Mitnick, who remained at large until 1995, gained access to several private databases and even obtained free bus rides through social engineering (though the latter is not related to this topic). How this activity got off the ground has much to do with anonymity and the separation of the target from the agent.

See Also

External Links and Sources