Information Vandalism
Information Vandalism is the willful destruction or defacement of public or private information. Just as vandalism is the destruction of public or private property, information vandalism is its equivalent in the world of information. It can take the form of adding, deleting, or modifying text, pictures, and other content, and its purpose may be to mislead, to act maliciously, or to satirize. The prominence of information vandalism is correlated with the popularity of wikis. Wikis and other collaborative editing sites are where information vandalism originates, but other websites and media companies may perpetuate the vandalized information. As wikis rise in popularity, regulating their content and preventing this vandalism has proven incredibly important. Celebrities, companies, politicians, and others have become victims of these acts, and the consequences can be serious. Information vandalism raises a variety of ethical issues, including anonymous users, bots, editor bias, and reputation damage.
Wikipedia
Background
As the world's largest wiki and the 5th most popular site both globally and in the US, Wikipedia controls a tremendous amount of information [1]. More and more people rely on Wikipedia as a credible source of information, and the success of its pages is contingent on that information being up to date and accurate. Over time, Wikipedia has been the site of major information vandalism incidents. Its combination of a large audience and an open edit platform has made it an attractive target for vandals. In response, Wikipedia has incorporated systems both to combat vandalism and to prevent attacks from happening in the first place.
Prevention
Autonomous Agents
One of the most important tools Wikipedia employs to protect the wiki from vandalism is autonomous agents. Autonomous agents are intelligent agents that act according to guidelines given by an owner but without the owner's direct interference. Wikipedia employs autonomous agents in the form of bots that edit, review, and detect possible vandalism. One of these bots is ClueBot NG. Using predefined instructions, a bot can look for inappropriate language, maintain proper style and formatting, and perform other behind-the-scenes work[2]. Wikipedia employs hundreds of bots, and although they have a high success rate in correcting errors, they sometimes flag false positives, which can cause problems if human editors do not review them quickly.
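As an illustration of how predefined-instruction screening can work, the Python sketch below scores an edit with a few simple heuristics. The word list, weights, and threshold are invented for this example; a production bot such as ClueBot NG relies on a trained classifier with far richer features, not rules this simple.

```python
import re

# Hypothetical word list and threshold, for illustration only.
BAD_WORDS = {"stupid", "idiot", "loser"}
SCORE_THRESHOLD = 0.5

def vandalism_score(old_text: str, new_text: str) -> float:
    """Score an edit from 0 (benign) to 1 (likely vandalism)."""
    words_before = set(re.findall(r"[a-z']+", old_text.lower()))
    words_after = set(re.findall(r"[a-z']+", new_text.lower()))
    added = words_after - words_before
    score = 0.0
    if added & BAD_WORDS:                    # inserted insults or profanity
        score += 0.4
    if re.search(r"(.)\1{5,}", new_text):    # character flooding, e.g. "aaaaaaa"
        score += 0.3
    if len(new_text) < 0.2 * len(old_text):  # large unexplained deletion
        score += 0.3
    return min(score, 1.0)

def review(old_text: str, new_text: str) -> str:
    """Decide whether a bot should revert the edit."""
    return "revert" if vandalism_score(old_text, new_text) >= SCORE_THRESHOLD else "keep"
```

Even this toy version shows why false positives arise: a legitimate edit that quotes an insult or trims a bloated article can trip the same rules that catch real vandalism, which is why human review remains necessary.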
Human Prevention
Part of what makes Wikipedia so unique is its open edit platform, which allows anyone to make changes to Wikipedia pages. While this lets malicious users insert false information and vandalize pages, it also lets anyone correct a page and restore accurate information.
Rules and Policies
To sustain the "anyone can edit" policy, Wikipedia maintains a number of rules, policies, and guidelines that keep the website running smoothly. Some of the most fundamental policies are formatting and content rules for all articles, including length and styling rules and guidance on what each article should cover. One of the major safeguards is the transparent change history: every change is documented, which makes it possible to identify where vandalism originates and to take appropriate action against offending users [3]. Additionally, Wikipedia has a hierarchy that grants more control to users who are more involved with and trusted by the project. Users can be promoted to a higher status based on previous contributions and involvement, but they can also be demoted if a consensus forms that they are not fulfilling the duties of their role. This "checks and balances" approach gives appropriate users more control while always leaving general users the ability to have influence as well.
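The transparent change history is itself machine-readable. As a minimal sketch, the Python snippet below uses the public MediaWiki API to pull the most recent revisions of a page, showing who edited it, when, and with what edit summary; the page title and revision limit are arbitrary example values.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"  # public MediaWiki API endpoint

def recent_revisions(title: str, limit: int = 10):
    """Fetch the most recent revisions of a page: who, when, and why."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvlimit": limit,
        "rvprop": "timestamp|user|comment",
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=10).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

# Print an audit trail for an example article.
for rev in recent_revisions("Hillsborough disaster"):
    print(rev["timestamp"], rev["user"], rev.get("comment", ""))
```

This is the same audit trail that editors and bots use to trace vandalism back to a specific account or IP address.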
Notable Incidents
John Seigenthaler, Sr.
John Seigenthaler, Sr. was a prominent writer and political figure in the United States who served as the editorial director of USA Today from 1982 to 1991. In 2005, an anonymous user vandalized Seigenthaler's Wikipedia page with false information claiming he had been a suspect in the Kennedy assassinations. The vandalism went unnoticed for four months and led Seigenthaler to publicly accuse Wikipedia of defamation [4]. The incident was a shocking insight into how incorrect information can remain up for long periods when it does not occur on popular pages. Although Seigenthaler declined to pursue a lawsuit against the anonymous user who vandalized his page, the incident resulted in stricter policies regarding biographies and a more intense editorial process for articles, including reference requirements.
Hillsborough Disaster
The Hillsborough Disaster was a human crush that occurred in 1989 during a football match between Liverpool and Nottingham Forest, resulting in the deaths of 96 people. As a result of ineffective crowd control at the stadium, an influx of fans entered already overcrowded standing areas, producing a severe crush [5]. In the aftermath of the disaster, as scrutiny of law enforcement increased, revisions were made to the Hillsborough Disaster Wikipedia page in order to shift the blame. Statements such as "Blame Liverpool Fans" appeared on the page. After an investigation by the Liverpool Echo, the source of the comments was traced to government computers [6]. The vandalism was done to shift public perception and was an attempt to relieve pressure on law enforcement officials who had acted improperly. The incident was a revealing demonstration of the power such vandalism can have, and a red flag about the monitoring that must be in place on wikis.
Ethical Issues
Anonymous Users
The ability of wiki users to stay anonymous is a major source of debate within wiki communities. Pinpointing vandalism and preventing past perpetrators from regaining access can be difficult, especially when their identities are unknown. A major function of anonymity is that it lets people act in ways they would not if their identity were attached. On wikis, the ability to vandalize without being identified means there are little to no consequences for the user. On the other hand, anonymity may allow users to edit freely without being questioned or unfairly targeted. It is difficult to strike a balance between creating an environment where editors can act without bias or outside influence and ensuring they follow all rules and guidelines. One potential improvement to the current structure would be to adjust anonymity depending on the level of editorial control a user holds or on what content is being altered. Adapting policies to the situation is an approach that can help resolve the complications anonymity creates. Kathleen Wallace, in Information Privacy: Concepts, Theories and Controversies, writes about how approaches like these can be successful when dealing with anonymity: "Because there are many forms of anonymous communication and activity, and a variety of purposes that anonymity may serve, it may be important to distinguish what type of communication or activity is involved, rather than have a single legal policy or ethical stance toward anonymity."
Bots
The use of bots can help crack down on vandalism; however, it raises questions about how bots are regulated and whether there are biases in the algorithms they use. As bots take over the majority of anti-vandalism work, it is important to examine their design as well as the way they execute their tasks. One study has shown that even simple grammar bots employed by Wikipedia have gotten into editing conflicts with other bots[7]. This demonstrates the potential for unintended bot behavior and the need for thorough testing and regulation. Wikipedia has a bot policy that requires bots to get approval before performing specific tasks. In the context of vandalism, bots perform surveillance on users in order to prevent future vandalism. Through profiling and reputation analysis, among other techniques, bots can target certain users and monitor their behavior to make sure they do not vandalize [8]. For example, new users may be more likely to be flagged as vandals. The effect is that certain types of users may be discouraged from editing and writing articles because their edits keep being reverted.
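To make the bias concern concrete, here is a toy Python sketch of the kind of reputation scoring such profiling might use. Every feature, weight, and saturation point here is hypothetical, invented for illustration rather than drawn from any actual Wikipedia bot.

```python
from dataclasses import dataclass

@dataclass
class Editor:
    account_age_days: int  # 0 for anonymous/IP editors
    edit_count: int
    reverted_count: int    # past edits that other editors undid

def reputation(e: Editor) -> float:
    """Toy reputation score in [0, 1]; higher means more trusted."""
    tenure = min(e.account_age_days / 365, 1.0)  # saturates after one year
    experience = min(e.edit_count / 500, 1.0)    # saturates after 500 edits
    revert_rate = e.reverted_count / max(e.edit_count, 1)
    return max(0.0, min(1.0, 0.5 * tenure + 0.5 * experience - revert_rate))

# A brand-new anonymous editor scores 0 before making a single bad edit.
print(reputation(Editor(account_age_days=0, edit_count=0, reverted_count=0)))      # 0.0
print(reputation(Editor(account_age_days=800, edit_count=600, reverted_count=30))) # 0.95
```

Note how the heuristic structurally penalizes newcomers: an anonymous editor with no history scores 0 regardless of the quality of their edits, which is exactly the kind of baked-in bias described above.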
Editor Bias
Although the ultimate goal of Wikipedia is to have all people, regardless of age, gender, race, or other background, contributing to an encyclopedia that everyone can trust and enjoy, at the moment Wikipedia's editors are far from diverse. Wikipedia is governed, monitored, and edited by its users, and currently less than 10 percent of those users are female [9]. As a result of this imbalance, there have been several reports of harassment against female editors, and control rests largely in the hands of the male population. Beyond these direct harms, the gender bias also affects the content that gets produced. In her paper Critical Surveillance Literacy in Social Media: Interrogating Black Death and Dying Online[10], Safiya Noble outlines how black death has been normalized while white death is typically hidden or less reported; she attributes this perpetuation and normalization of black death to the predominantly white makeup of media organizations. This case study parallels the situation at Wikipedia: a stark gender bias carries inherent ramifications, whether intentional or not. Without multiple perspectives and opinions, information becomes skewed and can be harmful toward people of other backgrounds. It is important that Wikipedia and other wikis involve people from diverse backgrounds in order to maintain a neutral point of view.
Reputation
As Wikipedia and other wikis have grown, the success of their methodology has been noticed. The "invisible hand" approach these wikis take has produced a product that stays well regulated, which builds trust among readers. Unfortunately, the approach is a double-edged sword: the same openness leaves wikis liable to vandalism, which can spread misinformation. Many pundits feel that Wikipedia and other prominent wikis need more restrictions or a stricter editorial process. On the other side is the belief that wikis are unique precisely because of their self-regulation, and that changing this structure would make them no different from any other media or information source. In Jonathan Zittrain's Lessons of Wikipedia, Wikipedia's structure is compared to the traffic policy of the Dutch city of Drachten, where all traffic signs and rules were removed and, as a result, the number of fatalities and injuries from traffic accidents decreased. Zittrain argues that Wikipedia is in a similar situation: with fewer rules and direct editing, the net result is well-researched, thorough articles without much bias. Much of the trust Wikipedia enjoys stems from its processes; its hands-off yet well-regulated approach is respected by readers and supports the belief that the content they consume is, at worst, a good starting point for research on a topic. However, without proper vandalism protection and constant monitoring of articles, it is concerning to many that Wikipedia has become the de facto global source of information. Many look to Wikipedia as a leader in trust and value, but it is important to understand how the site operates and what information you are actually consuming.
References
1. Conway, Paul. “Wikipedia as an Infosphere.” 12 Mar. 2019, Ann Arbor, Michigan.
2. Nasaw, Daniel. “Meet the 'Bots' That Edit Wikipedia.” BBC News, BBC, 25 July 2012, www.bbc.com/news/magazine-18892510.
3. Conway, Paul. “Wikipedia as an Infosphere.” 12 Mar. 2019, Ann Arbor, Michigan.
4. “John Seigenthaler, Sr.” www.cs.mcgill.ca/~rwest/wikispeedia/wpcd/wp/j/John_Seigenthaler%252C_Sr..htm.
5. “How the Hillsborough Disaster Unfolded.” BBC News, BBC, 26 Apr. 2016, www.bbc.com/news/uk-19545126.
6. Duggan, Oliver. “Exclusive: Shocking Hillsborough Insults Added on Wikipedia from Government Computers.” Liverpool Echo, 25 Apr. 2014, www.liverpoolecho.co.uk/news/liverpool-news/hillsborough-wikipedia-insults-added-government-7029881.
7. Tsvetkova, Milena, et al. “Even Good Bots Fight: The Case of Wikipedia.” PLOS ONE, 2017, journals.plos.org/plosone/article?id=10.1371/journal.pone.0171774.
8. De Laat, Paul B. Ethics and Information Technology, vol. 3, no. 4, 2001.
9. Paling, Emma. “Wikipedia's Hostility to Women.” The Atlantic, Atlantic Media Company, 30 Oct. 2015, www.theatlantic.com/technology/archive/2015/10/how-wikipedia-is-hostile-to-women/411619/.
10. Noble, Safiya Umoja. “Critical Surveillance Literacy in Social Media: Interrogating Black Death and Dying Online.” 2018, muse.jhu.edu/article/694972.