Parental Controls

From SI410
[[File:Pa.jpeg|600px|thumbnail|Label that was used to warn parents about explicit content in music.]]
 
 
{{Nav-Bar|Topics#A}}<br>
'''Parental control software''' provides parents with the ability to control what sites their children can see, determine when internet access is available, and track their children's internet history. Before the widespread use of the internet, children did not have the access to information that they do now. For example, explicit music came on CDs with large warning labels, which made it straightforward for parents to see what content their children were consuming. Now that many children have access to the internet, the way parents monitor the content their children see has changed as well. Several companies have entered the business of parental control software as it becomes widely adopted by parents worldwide. Increasingly, concerns are being raised in academia about the potential for abuse of parental control software, given the omnipresence the software allows parents.  
'''Parental controls''' allow parents to control what websites their children can see, determine at what times their children have access to the internet, track their children's internet history, limit accessible content, block outgoing content, and more. <ref>J. D. Biersdorfer. [https://www.nytimes.com/2018/03/02/technology/personaltech/setting-up-parental-controls-on-pcs-and-macs.html "Tools to Keep Your Children in Line When They’re Online"] (2018, March 02). ''New York Times''. Retrieved April 01, 2021. </ref> <ref name=FTC> Federal Trade Commission. (2018, March 13). Parental controls. Retrieved April 06, 2021, from https://www.consumer.ftc.gov/articles/0029-parental-controls </ref> The introduction of parental control software has raised ethical concerns over the last decade. <ref name = who>Thierer, Adam. (2019). Parental Controls & Online Child Protection. The Progress & Freedom Foundation, 45-51. https://poseidon01.ssrn.com/delivery.php?ID=396031102089030101016075082091103088058033095009026094104027126098086093111106075097029031099028051096054088127017025008122107111073000085023006114113101117080113006077037031064068080020002099009101067004068104007107105022107078007111120115083101094&EXT=pdf&INDEX=TRUE</ref> Parental controls are most often used for children between the ages of 7 and 16, while parents with “very young children or older teens often have very little need for parental control technologies.” <ref name="who"/> Ethical concerns include loss of trust between parents and children <ref> Managing Screen Time and Privacy | Could Parental Control Apps Do More Harm than Good? https://techden.com/blog/screen-time-privacy-parental-control-apps/ </ref> and a decreased sense of autonomy that leads to reduced opportunities for self-directed learning. <ref> Controlling Parents – The Signs And Why They Are Harmful https://www.parentingforbrain.com/controlling-parents/ </ref> Additionally, various monitoring tools may be used with or without the child's consent. <ref name="FTC"/>
  
==Overview==
[[File:Phone.png|400px|thumbnail|Apple's built-in parental controls. <ref>‌amaysim. (2020, November 3). How to set up parental controls on your iPhone, iPad or Android device. amaysim. https://www.amaysim.com.au/blog/world-of-mobile/set-up-parental-controls-apple-android </ref>]]
  
  
Parental control software typically allows parents to customize internet permissions on their child's devices or accounts on shared devices. <ref>‌The Business Insider. (2020, September 18). The best internet parental control systems. Newstex LLC. https://go-gale-com.proxy.lib.umich.edu/ps/i.do?p=STND&u=umuser&id=GALE%7CA635821966&v=2.1&it=r&sid=summon</ref> The account administrator, typically a parent, can change internet permissions for the entire household, while accounts under the administrator do not have that capability. This allows parents to establish rules for their children without having to physically enforce them.  
  
Parental control software has become more prevalent in recent years. For example, basic parental controls now come bundled with standard operating systems, such as Windows. <ref>‌Microsoft. (n.d.). Parental consent and Microsoft child accounts. Microsoft. https://support.microsoft.com/en-us/account-billing/parental-consent-and-microsoft-child-accounts-c6951746-8ee5-8461-0809-fbd755cd902e</ref>
  
There are three main ways that parental control software functions:
  
* '''Complete disablement''' of the internet allows parents to cut off their child's connection to wifi entirely during chosen time intervals. This can range from disabling wifi during a scheduled window, such as at bedtime, to turning off internet access indefinitely, for instance as a punishment.  
  
* '''Content blocking''' focuses on filtering the content that children can see; different accounts might have different age-appropriate preferences set. For larger households with several devices, this allows different children to view age-appropriate content as determined by the account administrator.  
  
* '''Monitoring''' means that parents can have access to the complete browsing history of their children at any time. This allows parents to observe how their children navigate the internet without setting hard boundaries.
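The three mechanisms above can be illustrated with a minimal sketch. All class and field names here are invented for illustration; real products implement these controls at the router or operating-system level, not in a few lines of application code.

```python
from datetime import time

class ChildAccount:
    """Toy model of one child's profile under a parental-control system."""

    def __init__(self, name, bedtime_start, bedtime_end, blocked_categories):
        self.name = name
        self.bedtime_start = bedtime_start            # complete disablement window
        self.bedtime_end = bedtime_end                #   (assumed to cross midnight)
        self.blocked_categories = blocked_categories  # content blocking
        self.history = []                             # monitoring

    def internet_enabled(self, now: time) -> bool:
        # Complete disablement: no connectivity inside the bedtime window,
        # e.g. 21:00 through 07:00 the next morning.
        in_bedtime = self.bedtime_start <= now or now < self.bedtime_end
        return not in_bedtime

    def request(self, url: str, category: str, now: time) -> bool:
        self.history.append(url)  # monitoring: every request is logged
        if not self.internet_enabled(now):
            return False          # complete disablement
        return category not in self.blocked_categories  # content blocking
```

A parent (the account administrator) would hold a `ChildAccount` per child, each with its own blocklist and schedule, and could inspect `history` at any time.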
  
==History==
Parental controls first emerged in the form of content filters. <ref> Clark, N. Content Filter Technology. Retrieved 1 April 2021, from https://www.britannica.com/technology/content-filter </ref> The term content filter is interchangeable with internet filter: software that assesses and blocks online content containing specific words or images. Although the Internet was created to make information more accessible, open access to all kinds of information was deemed problematic in instances where children or younger age groups could view obscene or offensive material. Content filters restrict what users may view on their computers by screening web pages and e-mail messages for category-specific content. Such filters can be used by individuals, businesses, or even countries to regulate Internet usage. <ref> Web Content Filtering. Retrieved 1 April 2021, from https://www.webroot.com/us/en/resources/glossary/what-is-web-content-filtering </ref>
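The keyword-screening approach described above can be sketched in a few lines. The category names and word lists here are invented for illustration; real filters use much larger curated lists and, today, machine-learned classifiers.

```python
import re

# Hypothetical category-specific word lists (illustrative only).
CATEGORY_KEYWORDS = {
    "gambling": {"casino", "poker", "betting"},
    "violence": {"gore", "weapon"},
}

def classify(page_text, keyword_map=CATEGORY_KEYWORDS):
    """Return the set of categories whose keywords appear in the page text."""
    words = set(re.findall(r"[a-z]+", page_text.lower()))
    return {cat for cat, kws in keyword_map.items() if words & kws}

def should_block(page_text, blocked_categories):
    """Block the page if it matches any category the administrator blocked."""
    return bool(classify(page_text) & blocked_categories)
```

This naive word matching also shows why early filters were notorious for both over-blocking (innocent pages containing a flagged word) and under-blocking (objectionable pages that avoid the listed words).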
  
===Uses for Parental Control Software===
====Protection from Harmful Content====
The vastness of the internet places a heavy burden on parents trying to protect their children from harmful content. Because children might not have the skills to navigate online environments safely, parental controls can be a helpful guiding tool. While the internet is an integral part of children's schooling, it also exposes them to potentially traumatic content that they would not otherwise see. Parental control software offers parents the ability to control what content their children can access, even when they are not physically present to monitor them. There is evidence that parents who are involved in their children’s internet use in some way are more likely to encourage safe internet practices. <ref>Gallego, Francisco, A. (2020, August). Parental monitoring and children's internet use: The role of information, control, and cues. ScienceDirect. https://www-sciencedirect-com.proxy.lib.umich.edu/science/article/pii/S0047272720300724?via%3Dihub</ref>
====Recommendations for Controlling Content for Children====
There are various third-party parental control services, such as Bark, Qustodio, and NetNanny, that allow parents to keep track of their children's devices. Prices for these services range anywhere from $50 to more than $100 if there are several children to monitor. These costs include 24/7 device monitoring and full visibility into how children are using their devices. <ref>Knorr, Caroline. (2021, March). Parents' Ultimate Guide to Parental Controls. https://www.commonsensemedia.org/blog/parents-ultimate-guide-to-parental-controls#What%20are%20the%20best%20parental%20controls%20for%20setting%20limits%20and%20monitoring%20kids?</ref> However, these services can only monitor accounts that parents know their children are using, and in some instances they need passwords in order to monitor activity. <ref>Orchilles, Jorge. (2010, April). Parental Control. https://www.sciencedirect.com/topics/computer-science/parental-control</ref>
====Health Benefits====
Parental controls can also help children live a healthier lifestyle. A study from 2016 found that about 59% of parents believed their children to be addicted to their cellular and/or electronic devices. <ref> Teenage Cellphone Addiction https://www.psycom.net/cell-phone-internet-addiction </ref> As children receive smartphones at younger and younger ages, limiting device usage can lower a child's chance of developing a phone addiction later on. Addiction to cellular and other electronic devices has several negative symptoms, ranging from psychological (anxiety and depression) to physical (eye strain and neck strain). <ref> Signs and Symptoms of Cell Phone Addiction https://www.psychguides.com/behavioral-disorders/cell-phone-addiction/signs-and-symptoms/ </ref> Less time spent on phones leads to increased physical activity and more in-person social interaction, which makes for a more well-rounded lifestyle. The American Academy of Pediatrics found that limiting children's screen time improves their physical and mental health and helps them develop academically, creatively, and socially. <ref name=benefits> Miller, M. (2020, February 24). Benefits Of Limiting Screen Time For Children. Retrieved from https://web.magnushealth.com/insights/benefits-of-limiting-screen-time-for-children#:~:text=It%20is%20amazing%20to%20see,online%20and%20age%2Dinappropriate%20videos. </ref>
  
==Parental Control Apps==
[[File:Qustodio.jpeg|250px|thumbnail|An example of what information parents could view about their children's device usage with Qustodio. <ref name="spy"/>]]
  
Aside from standard operating systems’ built-in parental controls, parents can also download apps to set up these restrictions and monitoring systems. Life360 is a popular app that gives parents access to their children's location at all times. The app also offers driving reports, so parents can see whether their teenager is speeding. Life360 has been controversial, with some people calling it the "Big Brother" of apps, and teens have said it has ruined their relationship with their parents. <ref> Lenore Skenazy, Life 360 Should Be Called “Life Sentence 360”</ref> Net Nanny is an app that can block inappropriate content online and “can also record instant messaging programs, monitor Internet activity, and send daily email reports to parents.” <ref>‌Kanable, Rebecca. (2004, November). Policing Online: From Internet Safety to Employee Management and Parolee Monitoring, Technology Can Help. U.S. Department of Justice. https://www.ojp.gov/ncjrs/virtual-library/abstracts/policing-online-internet-safety-employee-management-and-parolee </ref>  
  
One app called Qustodio provides parents with activity summaries for each of their children. <ref name = spy>‌Teodosieva, Radina. (2015, October 16). Spy me, please! The University of Amsterdam. http://mastersofmedia.hum.uva.nl/blog/2015/10/16/spy-me-please-the-self-spying-app-that-you-need/ </ref> These summaries include total screen time, a breakdown of the screen time that shows how much time was spent on different apps, a list of words that the child searched on the internet, and a tab alerting parents to possibly questionable activity. <ref name="spy"/>  
  
Another, more invasive, option is a keylogger service. A keylogger tracks all keystrokes made on a device and provides the parent with a file, usually via email, of everything typed on the child's device. These services can include access to contact lists and internet search histories. <ref name=key> Reporter, S. (2019, November 29). Parenting benefits of keylogger program. Retrieved April 08, 2021, from https://www.sciencetimes.com/articles/24367/20191129/parenting-benefits-of-keylogger-program.htm </ref>
  
About “16% of parents report using parental control apps to monitor and restrict their teens’ mobile online activities,” and some parents are more likely than others to download these apps. <ref name = safety>‌Ghosh, Arup K, et al. (2018, April 26). A Matter of Control or Safety? Examining Parental Use of Technical Monitoring Apps on Teens’ Mobile Devices. Association for Computing Machinery Digital Library. Association for Computing Machinery. https://dl.acm.org/doi/pdf/10.1145/3173574.3173768 </ref> Two factors that correspond to higher rates of parental control app usage are if the parents are “low autonomy granting” and if the child is being “victimized online” or has had “peer problems.” <ref name="safety"/>
  
==Bypassing Parental Control==
[[File:Vpn.png|400px|thumbnail|How a VPN works as the middle man between the client and the host server. <ref>Namecheap. [https://www.namecheap.com/vpn/how-does-vpn-virtual-private-network-work/ "How does VPN work?"] Retrieved April 01, 2021. </ref>]]
While parental controls like Life360 are quite comprehensive, tech-savvy teens have found ways to bypass some of them. Most software has bugs or loopholes that users can exploit to get around the rules set for them. For example, on the iPhone parents can set screen time limits for certain apps, such as limiting iMessage to two hours per day. <ref>Lance Whitney. [https://www.pcmag.com/how-to/how-to-use-apples-screen-time-on-iphone-or-ipad#:~:text=Set%20App%20Limits,entire%20category%20or%20specific%20apps. "How to Use Apple's Screen Time on iPhone or iPad"] (January 3, 2020). ''PC Mag''. Retrieved April 01, 2021. </ref> However, kids have figured out that many apps have a built-in share feature that can be used to send messages, providing a loophole even when the iMessage app is locked. <ref> Jellies App. [https://jelliesapp.com/blog/kids-bypassing-screen-time "Are Your Kids or Teens Unlocking Apple Screen Time Limits?"] (January 3, 2020). Retrieved April 01, 2021. </ref> If done carefully, parents can remain unaware that their kids have effectively removed their time limits. Currently, parents cannot fully close this loophole themselves; a complete solution would need to come from developers at Apple.  
  
Additionally, teens can download a VPN (virtual private network) to browse the internet freely. A VPN creates an encrypted connection between a computer and the server it is trying to reach. <ref>Mark Smirniotis. [https://www.nytimes.com/wirecutter/guides/what-is-a-vpn/. "What Is a VPN and What Can (and Can’t) It Do?"] (March 3, 2021). ''New York Times''. Retrieved April 01, 2021. </ref> In essence, even if a website is blocked by parental controls on a teenager's device, the VPN can relay the request through another computer that is not subject to those controls and return the desired content, creating a middle man between the blocked content and the teenager. There is no complete solution for stopping someone from accessing a website once they have a VPN; because the main use of VPNs is to protect people’s privacy, they are built to be extremely secure. <ref>David Pierce. [https://www.wsj.com/articles/why-you-need-a-vpnand-how-to-choose-the-right-one-1537294244 . "Why You Need a VPN—and How to Choose the Right One"] (September 18, 2018). ''Wall Street Journal''. Retrieved April 01, 2021. </ref> The only sure way to prevent VPN use is to stop teenagers from downloading one in the first place, which can be done by monitoring the app store. Additionally, most good VPNs charge a monthly fee, so there is a financial barrier to entry. <ref>Yael Grauer. [https://www.nytimes.com/wirecutter/reviews/best-vpn-service/. "The Best VPN Service"] (February 25, 2021). ''New York Times''. Retrieved April 01, 2021. </ref>
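A minimal sketch shows why a device-local blocklist fails once traffic is relayed: the filter only ever sees the relay's address, while the real destination travels inside the (in a real VPN, encrypted) payload. All hostnames and function names here are invented for illustration.

```python
# Hypothetical illustration of relay-based filter evasion.
BLOCKLIST = {"blocked-site.example"}

def local_filter_allows(destination_host):
    """The parental-control filter only inspects the host being contacted."""
    return destination_host not in BLOCKLIST

def fetch_direct(host):
    # Without a relay, the filter sees the real destination and blocks it.
    if not local_filter_allows(host):
        return "BLOCKED"
    return f"content of {host}"

def fetch_via_relay(relay_host, real_host):
    # With a relay (VPN), the device only connects to relay_host; the
    # filter checks that hostname, never the real destination.
    if not local_filter_allows(relay_host):
        return "BLOCKED"
    return f"content of {real_host}"
```

This is why blocking the VPN itself (or its download) is the only lever left to the filter once the tunnel is established.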
  
A recent TikTok trend revealed how teens escape potential punishment for their true whereabouts or activities by using creative methods to avoid setting off any triggers. One tactic is placing their phones in a friend's mailbox so their reported location matches where they said they would be, then traveling without a phone to their actual destination. Teens have also figured out various cellular and wifi settings that let them restrict which of their activities are visible to their parents through the app. <ref name=tiktok> Meisenzahl, M. (2019, November 08). Teens are Finding sneaky and clever ways to outsmart their parents' location-tracking apps, and it's turning into a meme ON TIKTOK. Retrieved April 08, 2021, from https://www.businessinsider.com/life360-location-tracker-teens-tiktok-memes-tips-2019-11#as-more-teens-and-their-parents-use-life360-the-community-on-tiktok-has-made-a-meme-out-of-it-videos-are-set-to-the-song-fly-by-still-lonely-which-has-the-lyrics-life-360-in-it-2 </ref>
  
===Bias===
+
==Adoption==
  
Due to its inherently subjective nature, content moderation can suffer from various kinds of bias. Algorithmic bias is possible when automated tools are used to remove content. For example, YouTube's automated Content ID tools may flag reviews of films or games that feature clips or gameplay as copyright violations, despite being Fair Use when used to criticize<ref name="Injustice"></ref>. When a youtube content is flagged they lose out on any ad revenue from that video during the time their content is flagged. Even if a content creator is able to fight the claim and has their video unflagged by Youtube they don't receive any of the revenue from their video while it was flagged. The algorithm bias thus serious financial effects for creators and especially for small channel who can't afford to fight the copyright claim <ref name = "Romano"> Romano, Aja. “YouTube's ‘Ad-Friendly’ Content Policy May Push One of Its Biggest Stars off the Website.” Vox, Vox, 2 Sept. 2016, www.vox.com/2016/9/2/12746450/youtube-monetization-phil-defranco-leaving-site.</ref>. Moderation may also suffer from cultural bias, when something considered objectionable by one group may be considered fine to another. For example, moderators tasked with removing content that depicts minors engaging in violence may disagree over what constitutes a minor. Classification of obscenity is also culturally biased, with different societies around the world having different standards of modesty.<ref name="Secret History"></ref><ref name="Gatekeepers"></ref> Moderation, both from the perspective of humans and automated systems, may be inherently flawed in that the subjective nature that comes along with deciding what is right versus what is wrong can be difficult to lay out in concrete terms. While there is no uniform solution to issues of bias in content moderation, some have suggested that approaching these issues with a utilitarian approach may serve as guiding ethical standard. <ref>Mandal, Jharna, et al. 
“Utilitarian and Deontological Ethics in Medicine.” Tropical Parasitology, Medknow Publications & Media Pvt Ltd, 2016, www.ncbi.nlm.nih.gov/pmc/articles/PMC4778182/.</ref>
+
About 32% of U.S. households have children, but that doesn’t mean all of them to utilize parental controls. <ref name = who>Thierer, Adam. (2019). Parental Controls & Online Child Protection. The Progress & Freedom Foundation, 45-51. https://poseidon01.ssrn.com/delivery.php?ID=396031102089030101016075082091103088058033095009026094104027126098086093111106075097029031099028051096054088127017025008122107111073000085023006114113101117080113006077037031064068080020002099009101067004068104007107105022107078007111120115083101094&EXT=pdf&INDEX=TRUE</ref> Parental controls are most likely used between the ages of 7 and 16, but parents with “very young children or older teens often have very little need for parental control technologies.<ref name="who"/> Other factors that influence whether or not a family chooses to use parental controls include aversions to these technologies, beliefs that these technologies are ineffective, and “alternative methods of controlling media content and access in the home,” such as “household media rules.” <ref name="who"/> Sometimes, parents might elect not to deal with parental controls simply because they’re too out of energy. <ref name="who"/> However, parents who do choose to monitor their children’s technology use can do so in a variety of ways.
  
===Free Speech and Censorship===
+
==Ethical Issues==
 
+
In academia, there is a debate about whether or not parental control software leads to healthy outcomes. Some say that greater parental involvement in children's device usage allows for better internet safety practices. Others contend that parental control software enables parental behaviors that negatively affect family dynamics and internet safety practices. However, there is a consensus that the obstacles parents face when trying to protect their children from harmful content are largely shaped by how much information is easily accessible on the internet.
Content moderation often finds itself in conflict with the principles of free speech, especially when the content it moderates is of a political, social or controversial nature.<ref name="Gatekeepers"></ref>. One the one hand, internet platforms are private entities with full control over what they can allow their users to post. On the other hand, large, dominant social media platforms like Facebook and Twitter have significant influence over the public discourse and act as effective monopolies on audience engagement. The ethical dilemma comes in when discussing who has the right to control what the public has to say and what gives them this right. In this sense, centralized platforms act as a modern day ''agoras'', where [[Utilitarian_Philosophy#John_Stuart_Mill|John Stuart Mill's]] "marketplace of ideas" allows good ideas to be separated from the bad without top-down influence.<ref name="Garbage"></ref> When corporations are allowed to decide with impunity what is or isn't allowed to be discussed in such a space, they circumvent this process and stifle free speech on the primary channel individuals use to express themselves.<ref name="Impossible Choices"></ref>
+
 
+
Other, similar anonymous apps include [[Yik Yak|Yik Yak]], [https://en.wikipedia.org/wiki/Secret_(app) Secret] (now defunct), and [https://en.wikipedia.org/wiki/Whisper_(app) Whisper]. Learning from their predecessors and competition, YikYak and Whisper have also taken a more active approach at Content Moderation and have not just employed people to moderate content, but also algorithms <ref>Deamicis, Carmel. “Meet the Anonymous App Police Fighting Bullies and Porn on Whisper, Yik Yak, and Potentially Secret.” Gigaom – Your Industry Partner in Emerging Technology Research, Gigaom, 8 Aug. 2014, gigaom.com/. </ref>.
+
  
=== Bots ===
+
===Trust===
Although a lot of content moderation cannot be dealt with using computer algorithms and must be outsourced to "virtual sweatshops", a lot of content is still moderated through the use of computer bots. The use of these computer bots naturally comes with many ethical concerns <ref>Bengani, Priyanjana. “Controlling the Conversation: The Ethics of Social Platforms and Content Moderation.” Columbia University in the City of New York, Apr. 2018, www.columbia.edu/content/.</ref>. The largest concern lies among academics, an increasing portion of whom are worried that auto-moderation cannot be effectively implemented on a global scale<ref>Newton, Casey. "Facebook’s content moderation efforts face increasing skepticism." The Verge. 24 August 2018. https://www.theverge.com/2018/8/24/17775788/facebook-content-moderation-motherboard-critics-skepticism</ref> UCLA Professor Assistant Professor Sarah Roberts said in an interview with ''Motherboard'' regarding Facebook's attempt at global auto-moderation, "it’s actually almost ridiculous when you think about it... What they’re trying to do is to resolve human nature fundamentally."<ref name=motherboard>Koebler, Jason. "The Impossible Job: Inside Facebook’s Struggle to Moderate Two Billion People." Motherboard. 23 August 2018. https://motherboard.vice.com/en_us/article/xwk9zd/how-facebook-content-moderation-works</ref> The article's objective of making clear that auto-moderation isn't feasible includes a report that Facebook CEO Mark Zuckerberg and COO Sheryl Sandberg often have to weigh in on content moderation themselves, a testament to how situational and subjective the job is.<ref name=motherboard></ref>
+
While some parents can potentially harm their children by failing to teach them safe internet practices, research has shown that the reverse is also true. Because it provides parents with greater control over their children’s internet access, the parental control software can enable parents who may already struggle with being overcontrolling in their relationships with their children. This can lead to broken trust within families and leave the children without any of their own experience practicing safe internet practices. A study from the University of Central Florida found that two-thirds of teens' relationships with their parents soured after the installation of a parental control application. <ref> Managing Screen Time and Privacy | Could Parental Control Apps Do More Harm than Good? https://techden.com/blog/screen-time-privacy-parental-control-apps/ </ref> It is thought that perhaps parents may replace meaningful conversations about safe internet practices with parental control software.
  
Tech companies such as Microsoft's Azure cloud service have begun offering automated content moderation packages for purchase by companies.<ref name=microsoft>"Content Moderator." Microsoft Azure. https://azure.microsoft.com/en-us/services/cognitive-services/content-moderator/</ref> The Microsoft Azure content moderator advertises expertise in image moderation, text moderation in over 100 languages that monitors for profanity and contextualized offensive language, video moderation including recognizing "racy" content, as well as a human review tool for situations where the automated moderator is unsure of what to do.<ref name=microsoft></ref>
+
The teens who are bypassing parental controls also adjust the trust balance between parents and their children. <ref name=tiktok> </ref> Many of the children believe they are outsmarting their parents. Consequently, parents are not as alert to the safety of their children because they trust the app to do its job. Because of some of these behaviors, such as not having their phones present with them, teens may not be able to properly contact their parents or emergency services in the case that something goes wrong. This leads to an increased safety concern, when parents intended to decrease safety concerns.  
  
 +
=== Independence ===
 +
As children enter adulthood, some have trouble adjusting to having autonomy of their internet practices due to heavy supervision in the home. <ref>Cetinkaya, L. (2019). The Relationship between Perceived Parental Control and Internet Addiction: A Cross-sectional study among Adolescents. Contemporary Educational Technology, 10(1), 55–74. https://doi.org/10.30935/cet.512531</ref> An important thing for children to develop is the ability to learn from their mistakes and solve their problems on their own. One study suggested that some children whose parents used parental controls were less likely to want to approach their parents about problems they had run into, both on- and offline, leading to these children being less able to solve their problems through collaboration with others. <ref> Managing Screen Time and Privacy | Could Parental Control Apps Do More Harm than Good? https://techden.com/blog/screen-time-privacy-parental-control-apps/ </ref> Additionally, there have been extensive studies done on the effects that overbearing parents can have on children. Children with controlling parents demonstrate lower self-esteem, act out more, and have lower academic performance. <ref> Controlling Parents – The Signs And Why They Are Harmful https://www.parentingforbrain.com/controlling-parents/ </ref> As parental controls can lead to controlling parenting, they need to be treated with great care.
  
 
==References==
 
==References==
  
 
<references/>
 
<references/>

Latest revision as of 00:50, 9 April 2021


Parental controls allow parents to control which websites their children can visit, determine when their children have access to the internet, track their children's browsing history, limit accessible content, and block outgoing content. [1] [2] The introduction of parental control software has raised ethical concerns over the last decade. [3] Parental controls are most likely used between the ages of 7 and 16, but parents with “very young children or older teens often have very little need for parental control technologies.” [3] Ethical concerns include loss of trust between parents and children [4] and a decreased sense of autonomy that leads to reduced opportunities for self-directed learning. [5] Additionally, various monitoring tools may be used with or without the child's consent. [2]

Apple's built-in parental controls. [6]


Overview

Parental control software typically allows parents to customize internet permissions on their child's devices or accounts on shared devices. [7] The account administrator, typically a parent, can change internet permissions for the entire household, while accounts under the administrator do not have that capability. This allows parents to establish rules for their children without having to physically enforce them.

Parental control software has become more prevalent in recent years. For example, basic parental controls now come with standard operating systems such as Windows. [8]

There are three main ways that parental control software functions:

  • Complete disablement of the internet allows parents to cut off their child's connection to wifi entirely during chosen time intervals. This can range from disabling wifi during a scheduled window, such as at bedtime, to turning off internet access indefinitely, as in instances of punishment.
  • Content blocking focuses on filtering the content that children can see, and different accounts might have different age-appropriate preferences set. For larger households with several devices, this allows for different children to view age-appropriate content as determined by the account administrator.
  • Monitoring means that parents can have access to the complete browsing history of their children at any time. This allows parents to monitor how their children navigate the internet without hard boundaries.
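As a sketch, the three mechanisms above can be combined into a single request check. This is a hypothetical illustration, assuming a simple (url, category, timestamp) request model; the category labels, allowed window, and function names are made up for the example and are not any vendor's actual implementation:

```python
from datetime import datetime, time

# Illustrative assumptions: category labels and schedule are invented.
BLOCKED_CATEGORIES = {"violence", "gambling"}   # content blocking
ALLOWED_WINDOW = (time(7, 0), time(21, 0))      # disablement outside this window
history_log = []                                # monitoring: every request is recorded

def check_request(url, category, now):
    """Return True if the child's request is allowed; log it either way."""
    history_log.append((now.isoformat(timespec="minutes"), url))
    start, end = ALLOWED_WINDOW
    if not (start <= now.time() <= end):        # internet disabled at this hour
        return False
    if category in BLOCKED_CATEGORIES:          # age-inappropriate content
        return False
    return True
```

Note that monitoring happens unconditionally: even blocked requests land in the history log, mirroring how parental dashboards report attempted visits.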

History

Parental controls first emerged as content filters. [9] "Content filter" is used interchangeably with "internet filter": software that assesses and blocks online content containing specific words or images. Although the Internet was created to make information more accessible, open access to every kind of information proved problematic where children or younger age groups could encounter obscene or offensive material. Content filters restrict what users may view by screening web pages and e-mail messages for category-specific content, and can be used by individuals, businesses, or even countries to regulate Internet usage. [10]
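The word-based screening described above can be sketched in a few lines. The flagged-word list is an illustrative assumption; real filters use much larger category databases and more sophisticated matching:

```python
# Minimal sketch of early keyword-based content filtering.
# FLAGGED_WORDS is an invented example list, not a real filter database.
FLAGGED_WORDS = {"gore", "casino"}

def screen_page(text):
    """Return True if the page text is safe to display, False if it
    contains any flagged word (case-insensitive, punctuation-stripped)."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not (words & FLAGGED_WORDS)
```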

Uses for Parental Control Software

Protection from Harmful Content

The vastness of the internet places a heavy burden on parents trying to protect their children from harmful content. Because children might not have the skills to navigate online environments successfully and safely, parental controls can be a helpful guide. While the internet is an integral part of children's schooling, it also makes available potentially traumatic content that children would not otherwise see. Parental control software lets parents control what content their children can access even when they are not physically present to supervise. There is evidence that parents who are involved in their children's internet use in some way are more likely to encourage safe internet practices. [11]

Recommendations for Controlling Content for Children

There are various third-party parental control services, such as Bark, Qustodio, and Net Nanny, that allow parents to keep track of their children's devices. Prices for these services range anywhere from $50 to more than $100 if there are several children to monitor. These costs include 24/7 device monitoring and full visibility into how children are using their devices. [12] However, these services can only monitor accounts that parents know their children are using, and in some instances they require account passwords in order to monitor activity. [13]

Health Benefits

Parental controls can also help children live a healthier lifestyle. A study from 2016 found that about 59% of parents believed their children to be addicted to their cellular and/or electronic devices. [14] As children receive smartphones at younger and younger ages, limiting device usage can lower a child's chance of becoming addicted to their phone. Addiction to cellular and other electronic devices has several negative symptoms, ranging from psychological (anxiety and depression) to physical (eye strain and neck strain). [15] Less time spent on phones leads to more physical activity and more genuine social interaction, making for a more well-rounded lifestyle. The American Academy of Pediatrics found that limiting children's screen time improves their physical and mental health and supports their academic, creative, and social development. [16]

Parental Control Apps

An example of what information parents could view about their children's device usage with Qustodio. [17]

Aside from standard operating systems’ built-in parental controls, parents can also download apps to set up these restrictions and monitoring systems. Life360 is a popular app that gives parents access to their children's location at all times. The app also offers driving reports, so parents can see whether their teenager is speeding. Life360 has been controversial, with some calling it the "Big Brother" of apps, and teens have said it has ruined their relationships with their parents. [18] Net Nanny is an app that can block inappropriate content online and “can also record instant messaging programs, monitor Internet activity, and send daily email reports to parents.” [19]

One app called Qustodio provides parents with activity summaries for each of their children. [17] These summaries include total screen time, a breakdown of the screen time that shows how much time was spent on different apps, a list of words that the child searched on the internet, and a tab alerting parents to possibly questionable activity. [17]
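An activity summary like this could, in principle, be aggregated from raw usage events. The (app, minutes) event format below is an assumption for illustration, not Qustodio's actual data model:

```python
from collections import defaultdict

def summarize(events):
    """Aggregate hypothetical (app, minutes) usage events into a total
    screen time figure and a per-app breakdown."""
    per_app = defaultdict(int)
    for app, minutes in events:
        per_app[app] += minutes
    return {"total_minutes": sum(per_app.values()), "by_app": dict(per_app)}
```

A parent-facing dashboard would then render the `by_app` breakdown as the kind of per-app chart the article describes.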

Another, more invasive, option is a keylogger service. A keylogger tracks all keystrokes made on a device and provides the parent with a file, usually via email, of everything typed on the child's device. These services can include access to contact lists and internet search histories. [20]

About “16% of parents report using parental control apps to monitor and restrict their teens’ mobile online activities,” and some parents are more likely than others to download these apps. [21] Two factors that correspond to higher rates of parental control app usage are if the parents are “low autonomy granting” and if the child is being “victimized online” or has had “peer problems.” [21]

Bypassing Parental Control

How VPN works as the middle man between the client and the host server. [22]

While parental controls like Life360 are quite comprehensive, tech-savvy teens have found ways to bypass some of them. Most software has bugs or loopholes that users can exploit to escape the rules set for them. For example, on the iPhone, parents can set screen time limits for certain apps, such as limiting iMessage to two hours per day. [23] However, many apps have a built-in share feature that can be used to send messages, a loophole that works even when the iMessage app is locked. [24] If done carefully, parents may remain unaware that their kids have effectively removed the time limit. There is currently no complete fix that parents can implement themselves; one would need to come from developers at Apple.

Additionally, teens can download a VPN (virtual private network) to bypass restrictions and browse the internet freely. A VPN creates an encrypted connection between a computer and the server it is trying to reach. [25] Even if a website is blocked by parental controls, the VPN can relay the request through another computer that is not subject to those controls, and that computer returns the desired content, acting as a middleman between the teenager and the blocked content. There is no complete way to stop someone from accessing a website once they have a VPN: the main use of VPNs is to protect people’s privacy, so they are built to be extremely secure. [26] The only sure prevention is to keep teenagers from downloading a VPN in the first place, for example by monitoring the app store so a VPN app cannot be installed. Additionally, most good VPNs charge a monthly fee, which creates a financial barrier to entry. [27]
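The middleman effect can be sketched in a few lines: a filter that only inspects the hostname a device connects to directly never sees the real destination once traffic is tunneled. All hostnames and function names below are illustrative assumptions:

```python
# Invented example blocklist; real filters use large category databases.
BLOCKLIST = {"blockedsite.example"}

def filter_allows(visible_host):
    """A parental-control filter that can only inspect the hostname the
    device connects to directly."""
    return visible_host not in BLOCKLIST

def fetch(destination, via_vpn=None):
    """Return whether a request reaches its destination. With a VPN, the
    filter sees only the tunnel endpoint, not the real destination."""
    visible_host = via_vpn if via_vpn else destination
    return filter_allows(visible_host)
```

Because the VPN endpoint itself is rarely on the blocklist, the filtered destination slips through; this is why blocking VPN installation, rather than VPN traffic, is the more practical countermeasure the article describes.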

A recent TikTok trend revealed how teens were escaping the potential punishments of their parents finding out their true whereabouts or activities by using creative methods to avoid setting off any triggers. Some such tactics are placing their phones in their friends' mailboxes to ensure their location is where they said they are, and then traveling without a phone to their actual destination. Teens have also figured out various cellular and wifi settings that allow them to restrict which of their activities are available to their parents via the app. [28]

Adoption

About 32% of U.S. households have children, but that doesn’t mean all of them utilize parental controls. [3] Parental controls are most likely used between the ages of 7 and 16, but parents with “very young children or older teens often have very little need for parental control technologies.” [3] Other factors that influence whether a family uses parental controls include aversions to these technologies, beliefs that they are ineffective, and “alternative methods of controlling media content and access in the home,” such as “household media rules.” [3] Sometimes, parents simply lack the energy to deal with parental controls. [3] Parents who do choose to monitor their children’s technology use can do so in a variety of ways.

Ethical Issues

In academia, there is a debate about whether parental control software leads to healthy outcomes. Some say that greater parental involvement in children's device usage allows for better internet safety practices. Others contend that parental control software enables parental behaviors that negatively affect family dynamics and internet safety. There is, however, a consensus that the obstacles parents face when trying to protect their children from harmful content are largely shaped by how much information is easily accessible on the internet.

Trust

While some parents can harm their children by failing to teach them safe internet practices, research has shown that the reverse is also true. Because it gives parents greater control over their children’s internet access, parental control software can enable parents who already struggle with being overcontrolling in their relationships with their children. This can break trust within families and leave children without firsthand experience of safe internet practices. A study from the University of Central Florida found that two-thirds of teens' relationships with their parents soured after the installation of a parental control application. [29] One possible explanation is that parents replace meaningful conversations about internet safety with parental control software.

Teens who bypass parental controls also shift the balance of trust between parents and children. [28] Many children believe they are outsmarting their parents, and parents, trusting the app to do its job, become less alert to their children's safety. Some of these behaviors, such as leaving their phones behind, can prevent teens from contacting their parents or emergency services if something goes wrong. The result is an increased safety risk, precisely what parents intended to reduce.

Independence

As children enter adulthood, some have trouble adjusting to autonomy over their internet use after heavy supervision in the home. [30] An important developmental task for children is learning from their mistakes and solving problems on their own. One study suggested that some children whose parents used parental controls were less likely to approach their parents about problems they had run into, both online and offline, leaving them less able to solve problems through collaboration with others. [31] Additionally, extensive studies on overbearing parenting show that children with controlling parents demonstrate lower self-esteem, act out more, and have lower academic performance. [32] Because parental controls can slide into controlling parenting, they need to be used with great care.

References

  1. J. D. Biersdorfer. "Tools to Keep Your Children in Line When They’re Online" (2018, March 02). New York Times. Retrieved April 01, 2021.
  2. 2.0 2.1 Federal Trade Commission. (2018, March 13). Parental controls. Retrieved April 06, 2021, from https://www.consumer.ftc.gov/articles/0029-parental-controls
  3. 3.0 3.1 3.2 3.3 3.4 3.5 Thierer, Adam. (2019). Parental Controls & Online Child Protection. The Progress & Freedom Foundation, 45-51. https://poseidon01.ssrn.com/delivery.php?ID=396031102089030101016075082091103088058033095009026094104027126098086093111106075097029031099028051096054088127017025008122107111073000085023006114113101117080113006077037031064068080020002099009101067004068104007107105022107078007111120115083101094&EXT=pdf&INDEX=TRUE
  4. Managing Screen Time and Privacy | Could Parental Control Apps Do More Harm than Good? https://techden.com/blog/screen-time-privacy-parental-control-apps/
  5. Controlling Parents – The Signs And Why They Are Harmful https://www.parentingforbrain.com/controlling-parents/
  6. ‌amaysim. (2020, November 3). How to set up parental controls on your iPhone, iPad or Android device. amaysim. https://www.amaysim.com.au/blog/world-of-mobile/set-up-parental-controls-apple-android
  7. ‌The Business Insider. (2020, September 18). The best internet parental control systems. Newstex LLC. https://go-gale-com.proxy.lib.umich.edu/ps/i.do?p=STND&u=umuser&id=GALE%7CA635821966&v=2.1&it=r&sid=summon
  8. ‌Microsoft. (n.d.). Parental consent and Microsoft child accounts. Microsoft. https://support.microsoft.com/en-us/account-billing/parental-consent-and-microsoft-child-accounts-c6951746-8ee5-8461-0809-fbd755cd902e
  9. Clark, N. Content Filter Technology. Retrieved 1 April 2021, from https://www.britannica.com/technology/content-filter
  10. Web Content Filtering. Retrieved 1 April 2021, from https://www.webroot.com/us/en/resources/glossary/what-is-web-content-filtering
  11. Gallego, Francisco, A. (2020, August). Parental monitoring and children's internet use: The role of information, control, and cues. ScienceDirect. https://www-sciencedirect-com.proxy.lib.umich.edu/science/article/pii/S0047272720300724?via%3Dihub
  12. Knorr, Caroline. (2021, March). Parents' Ultimate Guide to Parental Controls. https://www.commonsensemedia.org/blog/parents-ultimate-guide-to-parental-controls#What%20are%20the%20best%20parental%20controls%20for%20setting%20limits%20and%20monitoring%20kids?
  13. Orchilles, Jorge. (2010, April). Parental Control. https://www.sciencedirect.com/topics/computer-science/parental-control
  14. Teenage Cellphone Addiction https://www.psycom.net/cell-phone-internet-addiction
  15. Signs and Symptoms of Cell Phone Addiction https://www.psychguides.com/behavioral-disorders/cell-phone-addiction/signs-and-symptoms/
  16. Miller, M. (2020, February 24). Benefits Of Limiting Screen Time For Children. Retrieved from https://web.magnushealth.com/insights/benefits-of-limiting-screen-time-for-children#:~:text=It%20is%20amazing%20to%20see,online%20and%20age%2Dinappropriate%20videos.
  17. 17.0 17.1 17.2 ‌Teodosieva, Radina. (2015, October 16). Spy me, please! The University of Amsterdam. http://mastersofmedia.hum.uva.nl/blog/2015/10/16/spy-me-please-the-self-spying-app-that-you-need/
  18. Lenore Skenazy, Life 360 Should Be Called “Life Sentence 360”
  19. ‌Kanable, Rebecca. (2004, November). Policing Online: From Internet Safety to Employee Management and Parolee Monitoring, Technology Can Help. U.S. Department of Justice. https://www.ojp.gov/ncjrs/virtual-library/abstracts/policing-online-internet-safety-employee-management-and-parolee
  20. Reporter, S. (2019, November 29). Parenting benefits of keylogger program. Retrieved April 08, 2021, from https://www.sciencetimes.com/articles/24367/20191129/parenting-benefits-of-keylogger-program.htm
  21. 21.0 21.1 ‌Ghosh, Arup K, et al. (2018, April 26). A Matter of Control or Safety? Examining Parental Use of Technical Monitoring Apps on Teens’ Mobile Devices. Association for Computing Machinery Digital Library. Association for Computing Machinery. https://dl.acm.org/doi/pdf/10.1145/3173574.3173768
  22. Namecheap. "How does VPN work?" (January 3, 2020). PC Mag. Retrieved April 01, 2021.
  23. Lance Whitney. "How to Use Apple's Screen Time on iPhone or iPad" (January 3, 2020). PC Mag. Retrieved April 01, 2021.
  24. Jellies App. "Are Your Kids or Teens Unlocking Apple Screen Time Limits?" (January 3, 2020). Retrieved April 01, 2021.
  25. Mark Smirniotis. "What Is a VPN and What Can (and Can’t) It Do?" (March 3, 2021). New York Times. Retrieved April 01, 2021.
  26. David Pierce. "Why You Need a VPN—and How to Choose the Right One" (September 18, 2018). Wall Street Journal. Retrieved April 01, 2021.
  27. Yael Grauer. "The Best VPN Service" (February 25, 2021). New York Times. Retrieved April 01, 2021.
  28. 28.0 28.1 Meisenzahl, M. (2019, November 08). Teens are Finding sneaky and clever ways to outsmart their parents' location-tracking apps, and it's turning into a meme ON TIKTOK. Retrieved April 08, 2021, from https://www.businessinsider.com/life360-location-tracker-teens-tiktok-memes-tips-2019-11#as-more-teens-and-their-parents-use-life360-the-community-on-tiktok-has-made-a-meme-out-of-it-videos-are-set-to-the-song-fly-by-still-lonely-which-has-the-lyrics-life-360-in-it-2
  29. Managing Screen Time and Privacy | Could Parental Control Apps Do More Harm than Good? https://techden.com/blog/screen-time-privacy-parental-control-apps/
  30. Cetinkaya, L. (2019). The Relationship between Perceived Parental Control and Internet Addiction: A Cross-sectional study among Adolescents. Contemporary Educational Technology, 10(1), 55–74. https://doi.org/10.30935/cet.512531
  31. Managing Screen Time and Privacy | Could Parental Control Apps Do More Harm than Good? https://techden.com/blog/screen-time-privacy-parental-control-apps/
  32. Controlling Parents – The Signs And Why They Are Harmful https://www.parentingforbrain.com/controlling-parents/