Information Vandalism


Information vandalism is the willful destruction or defacement of public or private information. Just as vandalism refers to the destruction of public or private property, information vandalism is the same act carried out in the world of information. It can take the form of adding, deleting, or modifying text, pictures, and other content, and it may be motivated by an attempt to mislead, by malice, or by satirical entertainment. The prominence of information vandalism is correlated with the popularity of wikis: Wikipedia pages and other collaborative editing sites are where information vandalism originates, but other websites and media companies then perpetuate the false information. As wikis rise in popularity, regulating their content and preventing this vandalism have proven incredibly important. Celebrities, companies, politicians, and others have all become victims of these acts, which can have serious consequences. Information vandalism raises a variety of ethical issues, including anonymous users, bots, editor bias, and damage to reputation.

Wikipedia

Background

As the world's largest wiki and the fifth most popular site both globally and in the US, Wikipedia controls a tremendous amount of information [1]. More and more people rely on Wikipedia as a credible source of information, and the success of its pages is contingent on that information being up to date and accurate. Over time, Wikipedia has been the site of major information vandalism incidents: its large network combined with its open-edit platform has made it a target for vandals. Wikipedia has incorporated systems both to combat vandalism and to prevent attacks from happening in the first place.

Prevention

Autonomous Agents

One of the most important tools Wikipedia employs to protect the wiki from vandalism is autonomous agents: intelligent agents that act according to guidelines given by an owner but without the owner's direct interference. Wikipedia deploys such agents as bots that edit, review, and detect possible vandalism. One of these bots is ClueBot NG. Using predefined instructions, the bot looks for inappropriate language, tries to maintain proper style and formatting, and performs other behind-the-scenes work [2]. Wikipedia employs hundreds of bots, and although they have a high success rate at correcting errors, they sometimes flag false positives, which can cause problems if human editors do not review them quickly.
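
The source does not describe ClueBot NG's internals, so the following is only a minimal sketch of the general rule-based approach: score an incoming edit against a handful of predefined signals and revert it once the score crosses a threshold. The word list, signals, and cutoff here are invented for illustration.

 # Hypothetical rule-based vandalism check (not ClueBot NG's real rules).
 PROFANITY = {"stupid", "idiot"}      # toy word list standing in for a real lexicon
 SCORE_THRESHOLD = 0.5                # assumed cutoff for reverting an edit
 
 def vandalism_score(old_text: str, new_text: str) -> float:
     """Score an edit from 0 (benign) to 1 (likely vandalism)."""
     score = 0.0
     added = set(new_text.lower().split()) - set(old_text.lower().split())
     if added & PROFANITY:                     # inserted flagged words
         score += 0.4
     if len(new_text) < 0.2 * len(old_text):   # large unexplained deletion
         score += 0.4
     if new_text.isupper():                    # all-caps defacement
         score += 0.3
     return min(score, 1.0)
 
 def review_edit(old_text: str, new_text: str) -> str:
     if vandalism_score(old_text, new_text) >= SCORE_THRESHOLD:
         return "revert and flag for human review"
     return "accept"
 
 print(review_edit("The 1989 disaster killed 96 people.", "STUPID IDIOT"))

Real systems weigh many more signals than this (ClueBot NG reportedly relies on machine learning rather than fixed rules), which is also why false positives arise and human review remains necessary.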

User Moderation

User moderation is the practice of allowing users to moderate other users' contributions. It is heavily utilized on platforms such as Reddit and Wikipedia, where content relies upon (semi-)anonymous participation. Moderation is usually implemented hand in hand with a system that measures and records a user's influence or prestige. On Reddit, this is quantified as "karma", points awarded by other users in return for quality content. Users with more clout are generally seen as more trustworthy and more deserving of moderation abilities. Depending on the platform, either a select few users have the ability to moderate or the power is given democratically to all users.
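
As a concrete illustration, a karma-gated permission check might look like the sketch below; the threshold and the fully democratic model are assumptions rather than any specific platform's actual rules.

 # Sketch of karma-gated moderation: prestige earned from other users'
 # votes unlocks the ability to moderate. The threshold is hypothetical.
 from dataclasses import dataclass
 
 MOD_KARMA_THRESHOLD = 1000   # assumed prestige required to moderate
 
 @dataclass
 class User:
     name: str
     karma: int = 0           # points awarded by other users for quality content
 
 def record_vote(author: User, delta: int) -> None:
     """Other users' votes on a contribution adjust the author's karma."""
     author.karma += delta
 
 def can_moderate(user: User) -> bool:
     """Democratic model: any user above the threshold may moderate."""
     return user.karma >= MOD_KARMA_THRESHOLD
 
 alice, bob = User("alice", karma=1500), User("bob", karma=20)
 print(can_moderate(alice), can_moderate(bob))   # True False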

Rules and Policies

To sustain its open-edit policy, Wikipedia maintains a number of rules, policies, and guidelines that keep the website running smoothly. Some of the most fundamental are formatting and content rules for all articles, including length and styling requirements and guidance on what each article should cover. One major rule is the transparent change history: because every change is documented and attributed, it is easy to identify where vandalism originates so that appropriate action can be taken [3]. Additionally, a hierarchy on Wikipedia grants more control to involved, trusted users. Users can be promoted to a higher status based on previous contributions and involvement, but they can also be demoted if consensus finds they are not fulfilling the duties of their role. This "checks and balances" approach lets appropriate users gain more control while still giving general users influence as well.
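
MediaWiki's actual schema is not described here; the minimal sketch below simply shows the idea behind a transparent change history: every edit is stored with its author, so a vandal's contribution remains attributable even after it is reverted. The class and field names are illustrative assumptions.

 # Toy revision history: edits are appended, never erased, so vandalism
 # can always be traced back to its author.
 from datetime import datetime, timezone
 
 class Article:
     def __init__(self, title: str, text: str):
         self.title = title
         self.history = [("initial", text, datetime.now(timezone.utc))]
 
     @property
     def text(self) -> str:
         return self.history[-1][1]      # current revision
 
     def edit(self, user: str, new_text: str) -> None:
         self.history.append((user, new_text, datetime.now(timezone.utc)))
 
     def revert(self) -> None:
         """Restore the prior revision; the vandal's edit stays logged."""
         _, old_text, _ = self.history[-2]
         self.history.append(("revert", old_text, datetime.now(timezone.utc)))
 
 page = Article("Hillsborough disaster", "A human crush killed 96 people.")
 page.edit("203.0.113.7", "Blame Liverpool fans")   # vandalism, attributed
 page.revert()
 for user, text, _ in page.history:
     print(user, "->", text)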

Notable Incidents

John Seigenthaler, Sr.

John Seigenthaler, Sr. was a prominent writer and political figure in the United States who served as the editorial director of USA Today from 1982 to 1991. In 2005, an anonymous user vandalized Seigenthaler's Wikipedia page with the false claim that he had been a suspect in the Kennedy assassinations. The vandalism went unnoticed for four months and led Seigenthaler to publicly denounce Wikipedia for defamation [4]. The case was a shocking insight into how incorrect information can remain online for long periods when it occurs on less popular pages. Although Seigenthaler declined to pursue a lawsuit against the anonymous user who vandalized his page, the incident resulted in stricter policies for biographies and a more intense editorial process for articles, including reference requirements.


Hillsborough Disaster

The Hillsborough Disaster was a human crush that occurred in 1989 during a football match between Liverpool and Nottingham Forest, killing 96 people. As a result of ineffective crowd control at the football stadium, too many fans were let into the stands, resulting in a severe crush [5]. In the aftermath of the disaster, as scrutiny of law enforcement increased, revisions were made to the Hillsborough Disaster Wikipedia page in order to shift the blame: statements such as "Blame Liverpool Fans" appeared on the page. Investigations by the Liverpool Echo traced the comments to government computers [6]. The vandalism was done to shift public perception and relieve pressure on law enforcement officials who had acted improperly. The influence this vandalism had was a red flag signaling how much monitoring must be in place on wikis.

Ethical Issues

Anonymous Users

The ability of wiki users to remain anonymous is a major debate within wiki communities. Pinpointing vandalism and preventing past perpetrators from regaining access can be difficult when their identities are unknown. A major function of anonymity is that it allows people to act in ways they would not if their identity were attached; on a wiki, the ability to vandalize without being identified means there are little to no consequences for the user. On the other hand, anonymity may allow users to edit freely without being questioned or targeted over preconceived negative assumptions or prejudice. It is difficult to strike a balance between creating an environment that lets editors act without bias or outside influence and ensuring they follow all rules and guidelines. A potential improvement to the current structure may be adjusting anonymity according to the level of editorial control or the content being altered. Kathleen Wallace writes that such situational approaches can be successful when dealing with anonymity, explaining that "Because there are many forms of anonymous communication and activity, and a variety of purposes that anonymity may serve, it may be important to distinguish what type of communication or activity is involved, rather than have a single legal policy or ethical stance toward anonymity." [7]
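
As a rough illustration of Wallace's point, a platform could key its identity requirements to the sensitivity of the action rather than applying a single blanket policy. The tiers and actions below are entirely hypothetical.

 # Hypothetical context-dependent anonymity policy: the identity a user
 # must reveal scales with the sensitivity of the editorial action.
 ID_REQUIREMENTS = {
     "fix_typo":             "anonymous editing allowed",
     "edit_biography":       "registered account required",
     "edit_protected_page":  "verified identity required",
 }
 
 def required_identity(action: str) -> str:
     """Look up how much identity a given editorial action demands."""
     return ID_REQUIREMENTS.get(action, "registered account required")
 
 print(required_identity("fix_typo"))          # anonymous editing allowed
 print(required_identity("edit_biography"))    # registered account required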

Bots

The use of bots can help crack down on vandalism; however, their use raises questions about how they are regulated and whether there are biases in the algorithms they use. As bots take over the majority of anti-vandalism work, it is important to examine their design as well as the way they execute their tasks. Bots can easily be designed to catch grammatical mistakes, but it is far more complicated to code a bot that checks content for accuracy and respectfulness. There is debate over how effective bots are at recognizing online misconduct, given their inability to read sarcasm or anything more obscure than a few flagged profanity words. It is also difficult to know exactly how a bot will interact with information on a case-by-case basis: one study showed that simple grammar bots employed by Wikipedia have gotten into editing conflicts with other bots [8]. This demonstrates the potential for unintended bot behavior and the need for thorough testing and regulation. Wikipedia has a bot policy, which requires bots to get approval for performing specific tasks. In the context of vandalism, bots perform surveillance on users in order to prevent future vandalism. Through profiling, reputation analysis, and other techniques, bots can target certain users and monitor their behavior to make sure they do not vandalize [9]. For example, new users may be more likely to be flagged as vandals, and the effect is that certain types of users may be discouraged from editing and writing articles because their edits were reverted.
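
The study's actual bots are not reproduced here; the toy simulation below simply shows how two bots with conflicting but individually reasonable style rules can revert each other indefinitely. Both rules are invented for illustration.

 # Toy version of a bot-vs-bot editing conflict: each bot "corrects" the
 # other's spelling preference, so neither version ever sticks.
 def bot_a(text: str) -> str:
     """Bot A enforces British spelling."""
     return text.replace("color", "colour")
 
 def bot_b(text: str) -> str:
     """Bot B enforces American spelling."""
     return text.replace("colour", "color")
 
 text = "The team's color is red."
 for step in range(6):
     bot = bot_a if step % 2 == 0 else bot_b
     new_text = bot(text)
     if new_text != text:
         print(f"step {step}: {bot.__name__} rewrote the sentence")
     text = new_text
 # Without a shared conflict-resolution policy, neither bot ever "wins".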

Editor Bias

Although the ultimate goal of Wikipedia is to have all people, regardless of age, gender, race, or other background, contribute to an encyclopedia that all can trust and enjoy, at the moment Wikipedia's editors are far from diverse. Wikipedia is governed, monitored, and edited by its users, and currently less than 10 percent of those users are female [10]. As a result of this disproportion, there have been several reports of harassment against female editors, and control rests firmly in the hands of the male population. Beyond these direct harms, the content produced also suffers from this gender bias. In her paper Critical Surveillance Literacy in Social Media: Interrogating Black Death and Dying Online [11], Safiya Noble outlines how black death has been normalized, as opposed to white death, which is typically hidden or less reported. She attributes this perpetuation and normalization of black death to the predominantly white makeup of media organizations. The case study parallels the situation at Wikipedia: a stark gender bias carries inherent ramifications, whether intentional or not. Without multiple perspectives and opinions, information shifts and begins to harm people of other backgrounds. It is important that Wikipedia and other wikis ensure involvement from people with diverse backgrounds in order to maintain a neutral point of view.

Reputation

As Wikipedia and other wikis have grown, the success of their methodology has been noticed. The "invisible hand" approach taken by these wikis has produced a product that stays well regulated, which builds trust among readers. Unfortunately, the approach is a double-edged sword of sorts, as wikis are liable to vandalism that can spread misinformation. Many pundits feel that Wikipedia and other prominent wikis need more restrictions or a stricter editorial process. Others believe that wikis are unique precisely because of their self-regulation, and that changing this structure would make them no different from any other media or information source. With Wikipedia being the sixth-most visited website in the world and Wikipedia articles showing up on the first page of 99% of internet searches, vandalized information on Wikipedia can spread very rapidly [12].

The threat posed by information vandalism has evolved rapidly with the increased connectivity of the 21st century. In contrast to decades past, when misinformation could be addressed at a slower pace as it leaked out, modern communications teams must be prepared to address harm to brand reputations in real time. Richard Webber identifies the key characteristics a 'modern' communications team needs to fight information vandalism in real time [13]: flexibility, preparedness, awareness, dialogue, transparency, authenticity, and humor. Webber suggests that such teams must be willing to own up to a brand's actions, whether positive or negative, because the world's newfound connectivity tends to expose actions in the long run, so brands are better off controlling the narrative.

Given the pace at which information spreads after the rapid technological advances of the last hundred years, it is more important than ever to emphasize the accuracy of information. Misinformation spreads extremely quickly and can cause reputational damage to individuals, communities, and companies. Regardless of the truthfulness of the information, reputational damage can become a permanent fixture of an entity's image and is not easily undone. Information vandalism can tank reputations beyond repair, making it extremely dangerous to leave offenders unpunished and undeterred.

In Jonathan Zittrain's "The Lessons of Wikipedia," Wikipedia's structure is compared to the traffic policy of the Dutch city of Drachten [14]. In Drachten, all traffic signs and rules were removed, and as a result, the number of fatalities and injuries from traffic accidents decreased. Zittrain argues that Wikipedia is in a similar situation: with fewer rules and direct editing, the net result is a product of well-researched, thorough articles with little bias. A lot of the trust Wikipedia enjoys stems from its processes. Its hands-off yet well-regulated approach is respected by readers and supports the belief that the content they consume is, at worst, a good starting point for research on a topic. However, without proper vandalism protection and constant monitoring of articles, it is concerning that Wikipedia has become the de facto source of information globally for so many. For people to look to Wikipedia as a trusted and valuable source, it is important that they know how the site operates and what information they are actually consuming.

References

  1. Conway, Paul. “Wikipedia as an Infosphere.” 12 Mar. 2019, Ann Arbor, Michigan
  2. Nasaw, Daniel. “Meet the 'Bots' That Edit Wikipedia.” BBC News, BBC, 25 July 2012, www.bbc.com/news/magazine-18892510.
  3. Conway, Paul. “Wikipedia as an Infosphere.” 12 Mar. 2019, Ann Arbor, Michigan
  4. “John Seigenthaler, Sr.” www.cs.mcgill.ca/~rwest/wikispeedia/wpcd/wp/j/John_Seigenthaler%252C_Sr..htm.
  5. “How the Hillsborough Disaster Unfolded.” BBC News, BBC, 26 Apr. 2016, www.bbc.com/news/uk-19545126.
  6. Duggan, Oliver. “Exclusive: Shocking Hillsborough Insults Added on Wikipedia from Government Computers.” Liverpoolecho, 25 Apr. 2014, www.liverpoolecho.co.uk/news/liverpool-news/hillsborough-wikipedia-insults-added-government-7029881.
  7. Wallace, Kathleen. "Online Anonymity." In The Handbook of Information and Computer Ethics, edited by Kenneth Einar Himma and Herman T. Tavani, 165-188. Wiley, 2008. https://umich.instructure.com/courses/273552/files/9617526?module_item_id=590875.
  8. Tsvetkova, Milena, Ruth García-Gavilanes, Luciano Floridi, and Taha Yasseri. “Even Good Bots Fight: The Case of Wikipedia.” PLOS ONE, 2017, journals.plos.org/plosone/article?id=10.1371/journal.pone.0171774.
  9. De Laat, Paul B. Ethics and Information Technology, vol. 3, no. 4, 2001.
  10. Paling, Emma. “Wikipedia's Hostility to Women.” The Atlantic, Atlantic Media Company, 30 Oct. 2015, www.theatlantic.com/technology/archive/2015/10/how-wikipedia-is-hostile-to-women/411619/.
  11. Noble, Safiya Umoja. “Critical Surveillance Literacy in Social Media: Interrogating Black Death and Dying Online.” 2018, muse.jhu.edu/article/694972.
  12. "When Wikipedia tarnishes your online reputation." Reputation Defender. 31 July 2017. https://www.reputationdefender.com/blog/orm/when-wikipedia-tarnishes-your-online-reputation
  13. Webber, Richard. "Brand vandals: Reputation wreckers and how to build better defences." Journal of Direct, Data and Digital Marketing Practice. 19 June 2014. https://link.springer.com/article/10.1057/dddmp.2014.14
  14. Zittrain, Jonathan L. The Future of the Internet and How to Stop It. Yale University Press & Penguin UK, 2008.