Parental Controls


Image: Label that was used to warn parents about explicit content in music. [1]

Parental control software gives parents the power to control which sites their children can visit, to set when internet access is available, and to track their children's internet history. Before the widespread use of the internet, children did not have the access to information that they do now. Explicit music, for example, came on CDs with large warning labels, which made it straightforward for parents to see what content their children were consuming. Now that many children have access to the internet, the way parents monitor the content their children see has changed as well. Several companies have entered the business of parental control software as it is adopted by parents worldwide. Increasingly, concerns are being raised in academia about the potential for parents to abuse parental control software, given the near-constant oversight it affords them.


Overview

Parental control software typically works by customizing internet permissions on different accounts. [2] For example, parents' accounts can usually make changes for the entire household and are themselves subject to no restrictions, while children are assigned accounts that the parents control.
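
The account model described above can be pictured roughly as follows. This is a minimal, hypothetical sketch in Python; the class and field names are illustrative and are not taken from any particular product. The idea it shows is simply that a parent account holds the household's child accounts and is the only place where a child's restrictions can be changed.

  # Hypothetical sketch of an account-based permission model for
  # parental control software. Names and structure are illustrative only.
  from dataclasses import dataclass, field

  @dataclass
  class Restrictions:
      blocked_sites: set[str] = field(default_factory=set)  # domains the child cannot visit
      internet_enabled: bool = True                          # False = complete disablement

  @dataclass
  class ChildAccount:
      name: str
      restrictions: Restrictions = field(default_factory=Restrictions)

  @dataclass
  class ParentAccount:
      """Parent accounts are unrestricted and manage the whole household."""
      name: str
      children: dict[str, ChildAccount] = field(default_factory=dict)

      def block_site(self, child_name: str, domain: str) -> None:
          # Only the parent account can change a child's permissions.
          self.children[child_name].restrictions.blocked_sites.add(domain)

      def set_internet(self, child_name: str, enabled: bool) -> None:
          self.children[child_name].restrictions.internet_enabled = enabled

  # Example: the parent blocks a site for one child and disables
  # internet access entirely for another.
  parent = ParentAccount("parent", {
      "alice": ChildAccount("alice"),
      "ben": ChildAccount("ben"),
  })
  parent.block_site("alice", "example.com")
  parent.set_internet("ben", enabled=False)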


There are two ways in which parental control software functions (a rough sketch of both follows the list below):

  • Complete disablement of the internet gives parents the ability to turn off wifi for their children entirely. This can range from disabling the wifi within a certain time frame every day to turning off the internet indefinitely.
  • Monitoring means that parents can have access to the complete browsing history of their children at any time. This allows parents to monitor how their children navigate the internet without hard boundaries.
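
The two modes above could be sketched, purely for illustration, as follows. This is a hypothetical Python example, not the implementation of any real product: the first function enforces complete disablement inside an assumed daily time window, and the second implements monitoring by logging each visit for later review by a parent.

  # Hypothetical sketch of the two modes described above:
  # (1) complete disablement inside a daily time window, and
  # (2) monitoring by logging every domain a child visits.
  from datetime import datetime, time

  # Assumed example schedule: internet is disabled between 21:00 and 07:00.
  BLOCK_START = time(21, 0)
  BLOCK_END = time(7, 0)

  def internet_allowed(now: datetime) -> bool:
      """Return False while the daily blocking window is active."""
      t = now.time()
      # The window wraps around midnight, so it is active when the current
      # time is after the start OR before the end.
      return not (t >= BLOCK_START or t < BLOCK_END)

  browsing_log: list[tuple[datetime, str, str]] = []

  def record_visit(child: str, domain: str) -> None:
      """Monitoring mode: append every visit so a parent can review it later."""
      browsing_log.append((datetime.now(), child, domain))

  # Usage: a request is either blocked outright or allowed and logged.
  if internet_allowed(datetime.now()):
      record_visit("alice", "example.org")
  else:
      print("Internet access is currently disabled by the schedule.")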


Ethical Issues

The ethical issues surrounding content moderation include how it is carried out, the possible bias of content moderators, and the negative effects the job has on the moderators themselves. The core problem is that content moderation cannot be carried out by an autonomous program alone, since many cases are highly nuanced and can only be judged by understanding their context and how humans might perceive them. Not only is the job often ill-defined in terms of policy, but moderators are also expected to make very difficult judgments while being allowed few to no mistakes.


Virtual Sweatshops

Companies often outsource their content moderation tasks to third parties. Because this work is too nuanced to be done by computer algorithms, it is frequently performed in virtual sweatshops, which enlist workers to complete mundane tasks for a small monetary reward. While some view this as a new, highly flexible market for human labor, it also raises concerns about labor law: little policy yet exists to regulate internet labor, and the work often falls to teams of overseas workers who are underpaid for what they perform. Companies frequently overlook, or choose not to acknowledge, the hands-on human effort their platforms require. Human error is also inevitable, raising concerns about privacy and trust when information is sent to these third-party moderators. [3]


Google's Content Moderation & the Catsouras Scandal

Google is home to a practically endless amount of content and information, most of which is not regulated. In 2006, a young teenager in Southern California named Nikki Catsouras crashed her car, which resulted in her gruesome death and decapitation. Members of the police force were tasked with photographing the scene; as a Halloween joke, a few of the officers who took the photos sent them around to various friends and family members. The picture of Nikki's mutilated body was then passed around the internet and became easily accessible via Google. The Catsouras family was devastated that these pictures of their daughter were being viewed by millions and desperately attempted to get the photos removed from the platform, but Google refused to comply with the family's plea. This is a clear ethical dilemma involving content moderation: the pictures were never meant to be released to the public and were deeply painful for the family, yet Google did nothing because it did not want to begin moderating specific content on its platform. The case raises the ethical question of whether people have "The Right to Be Forgotten." [4]


Another major ethical issue with the moderation of online content is that the owners of the content or platform decide what is and is not moderated. Thousands of people and companies claim that Google purposefully demotes content that competes directly with its own products. Shivaun Moeran and Adam Raff are two computer scientists who together created a powerful search site called Foundem.com. The site was useful for finding many kinds of information and was particularly good at finding the cheapest items for sale on the internet. Its key component was a vertical search algorithm, an approach that searches deeply within a specific segment of content and that, for this purpose, was significantly more powerful than Google's general-purpose horizontal search algorithm. When the couple launched the site, it initially attracted many visitors, but after a few days the visitor rate dropped sharply. They discovered that their site had been pushed several pages back in Google's results because it competed directly with Google's own "Google Shopping" product. Moeran and Raff filed numerous complaints and lawsuits and met with people at Google and other large companies to find out what the issue was and how it could be fixed, but they were met with silence or ambiguity. Foundem.com never became the next big search engine, partly because of the ethical issues seen in Google's content moderation. [5]


References

  1. Schilling, D. R. (2017, February 7). Parental Control for the Internet, TV & Voice Controlled Assistants. Industry Tap. http://www.industrytap.com/parental-control-internet-tv-voice-controlled-assistants/40715
  2. U-M Weblogin. (2020, September 18). Weblogin.umich.edu. https://go-gale-com.proxy.lib.umich.edu/ps/i.do?p=STND&u=umuser&id=GALE%7CA635821966&v=2.1&it=r&sid=summon
  3. Zittrain, J. (2009, December 7). The Internet Creates a New Kind of Sweatshop. Newsweek. https://www.newsweek.com/internet-creates-new-kind-sweatshop-75751
  4. Toobin, J. (2014, September 22). The Solace of Oblivion. The New Yorker. https://www.newyorker.com/
  5. Duhigg, C. (2018, February 20). The Case Against Google. The New York Times. https://www.nytimes.com/