http://si410wiki.sites.uofmhosting.net/api.php?action=feedcontributions&user=Nfigue&feedformat=atom
SI410 - User contributions [en]
2024-03-29T12:02:33Z
User contributions
MediaWiki 1.25.2
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102261
Google Photos
2021-04-16T11:07:55Z
<p>Nfigue: </p>
<hr />
<div><br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015, when it was spun off from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed to give people a single platform for storing and easily accessing all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying features present in them, such as lakes, nighttime scenes, or birthdays.<ref name="JacobK" /> It can also group photos by the faces of the people and pets in them, even as those faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos and videos simple, such as generating a web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March 2021, Google Photos started rolling out an advanced version of its video editor for Android users. In addition to trimming, stabilizing, and rotating videos, users can crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature on iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that month's best photos.<ref>Perez, S. (2020, October 20). [https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a [https://en.wikipedia.org/wiki/Machine_learning machine learning] algorithm, but users can alter any of the choices. After negative feedback from user testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows greater customization of the look and finish of the prints, along with a reduction in the subscription cost from $7.99 to $6.99. Users can cancel the service at any time and can also skip a month if they choose. Customers can order prints in multiple sizes and can request same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent criticism of the service is its "outrageous cost,"<ref name="feedback"/> charging many times the price of printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
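The tier rules above amount to a simple pixel cap per upload mode. The following sketch illustrates them using the resolution figures cited in this section; the tier names match the article, but the function and constants are illustrative, not any real Google API:

```python
# Illustrative sketch of the upload-tier rules described above.
# Pixel caps come from the figures cited in this section.

TIERS = {
    "original": None,        # no cap; counts against the 15 GB account quota
    "high": 16_000_000,      # photos downscaled to at most 16 megapixels
    "express": 3_000_000,    # photos compressed to at most 3 megapixels
}

def uploaded_pixels(width: int, height: int, tier: str) -> int:
    """Return a photo's pixel count after the tier's downscaling."""
    cap = TIERS[tier]
    pixels = width * height
    if cap is None or pixels <= cap:
        return pixels
    return cap  # downscaled so the total pixel count fits the tier's cap

# A ~20 MP photo (5472x3648) is kept intact at "original" quality
# but reduced to 16 MP under "high quality".
```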
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed [https://en.wikipedia.org/wiki/Training,_validation,_and_test_sets#training_set training data] can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref><br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision] API, which uses AI to analyze images in order to identify people, places, and things. The change eliminated gender labels on images, because a person's gender cannot be determined solely by how they look in a photo. Instead of using terms like "man" or "woman", Google tags images with labels such as "person" in order to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref><br />
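The policy change described above can be pictured as a post-processing step on a classifier's output. This is a hypothetical sketch, not the Cloud Vision API: the label list and function name are made up for illustration.

```python
# Hypothetical sketch of the labeling policy described above: gendered
# person labels returned by an image classifier are replaced with the
# neutral label "person". Not the actual Cloud Vision API.

GENDERED = {"man", "woman", "boy", "girl"}

def degender_labels(labels: list[str]) -> list[str]:
    """Replace gendered person labels with the neutral label 'person'."""
    neutral = []
    for label in labels:
        neutral.append("person" if label.lower() in GENDERED else label)
    return neutral

print(degender_labels(["woman", "bicycle", "park"]))
# prints ['person', 'bicycle', 'park']
```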
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google''. Retrieved April 6, 2021.</ref><br />
<br />
===Misidentification===<br />
<br />
A computer that analyzes and captions photos through a machine learning algorithm must be trained on a data set, and this is where a large amount of bias originates, in the form of capture, category, and negative bias.<ref name="tommasi">Tommasi, T., Patricia, N., Caputo, B., & Tuytelaars, T. (2015). "A Deeper Look at Dataset Bias". ''Pattern Recognition'', Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias refers to the variables involved in acquiring images, such as lighting conditions or the camera used. Category bias refers to a poor definition of the categories in which an image can exist, while negative bias is the inability to train a model on categories that are not included in the data set.<ref name="tommasi"/> Machine learning models only know what they are taught, so any of these biases in Google's training data will resurface when categorizing photos uploaded by users.<ref name="debrusk">DeBrusk, C. (2018). "The Risk of Machine-Learning Bias (and How to Prevent It)". ''MIT Sloan Blogs''. Cambridge, MA: Massachusetts Institute of Technology.</ref><br />
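Negative bias in particular can be shown with a toy example: a classifier trained only on certain categories has no way to say "none of the above", so anything outside its training set is forced into the nearest known category. The nearest-centroid classifier and the two-dimensional feature vectors below are entirely made up for illustration:

```python
# Toy illustration of "negative bias": a classifier can only answer
# with categories it was trained on, so an out-of-set image is forced
# into the nearest known category. Features and centroids are made up.

def nearest_category(features, centroids):
    """Assign the category whose centroid is closest (squared distance)."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: sqdist(features, centroids[c]))

# Trained only on "cat" and "dog" (toy 2-D feature vectors).
centroids = {"cat": (1.0, 1.0), "dog": (5.0, 5.0)}

# A photo of a fox (no "fox" category exists) must still receive
# one of the two known labels.
print(nearest_category((4.0, 4.5), centroids))  # prints dog
```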
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
Shortly after its release in 2015, Google Photos came under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A Google spokesperson stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has intensified its search for qualified people of color to hire in order to correct issues stemming from the underrepresentation of minorities in technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes To Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|150px|thumb|right|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
In an effort to quantify the accuracy of Google Photos's image recognition model, Nieuwenhuysen performed a study by uploading a wide range of commonly recognizable photos and recording how they were categorized. The test was set up by uploading over one hundred photos, with both descriptive and arbitrary file names, and placing them into albums organized by each image's country of origin. After one day, the photos had been organized into 9 broad categories with varying degrees of accuracy. The least accurate were predominantly blue photos, incorrectly identified as the sky, while the most accurate were mountains, cars, and sunsets. Nieuwenhuysen states that the system is satisfactory considering it can be used free of charge, reporting that it excels at assigning any photo to at least a vaguely specific category, even when the image is named arbitrarily.<ref>Nieuwenhuysen, P. (2018). "Information Discovery and Images: A Case Study of Google Photos". ''2018 5th International Symposium on Emerging Trends and Technologies in Libraries and Information Services (ETTLIS)'' (pp. 16–21). IEEE.</ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To prevent such incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
A seldom-acknowledged facet of large-scale data storage, as seen in Google Photos, is that unknown persons at the physical locations housing the massive collection of photos and videos may have access to private user data. At the physical storage level, a cloud service provider must ensure that data is properly segregated, prepared for data recovery situations, and inaccessible to anyone who is not authorized to locally access the database storing sensitive or private information.<ref name="teneyuca">Teneyuca, D. (2011). [https://www.sciencedirect.com/science/article/pii/S1363412711000501 "Internet cloud security: The illusion of inclusion"]. ''Information Security Technical Report'', 16(3–4), 102–107. https://doi.org/10.1016/j.istr.2011.08.005</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is some uncertainty surrounding Google collecting the information embedded in these photos and selling it to third parties or using it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
<br />
Even without geotags, Google Photos is capable of inferring a photo’s location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but at the same time assures users that this information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To accommodate these concerns, Google Photos gives its users the capability to turn off location history, remove location information from already existing photos, and choose whether or not to share a photo's location when shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
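The "remove location information" control described above boils down to deleting the GPS fields from a photo's embedded metadata. The sketch below models EXIF metadata as a plain dictionary for illustration; working with real image files would require an EXIF-capable library such as Pillow, and the field names shown are assumptions:

```python
# Minimal sketch of stripping location information from photo metadata,
# mirroring the user control described above. EXIF is modeled as a plain
# dict; real photos would need an EXIF library such as Pillow.

def strip_location(exif: dict) -> dict:
    """Return a copy of the metadata with all GPS fields removed."""
    return {k: v for k, v in exif.items() if not k.startswith("GPS")}

photo_exif = {
    "Make": "Google",
    "Model": "Pixel 5",
    "GPSLatitude": 48.8584,    # Eiffel Tower coordinates, for illustration
    "GPSLongitude": 2.2945,
    "DateTime": "2021:03:12 10:30:00",
}

print(strip_location(photo_exif))
# GPS fields are gone; the rest of the metadata is untouched
```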
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102227
Google Photos
2021-04-16T08:19:06Z
<p>Nfigue: /* Ethical Implications */ added to security section</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a [https://en.wikipedia.org/wiki/Machine_learning machine learning] algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed [https://en.wikipedia.org/wiki/Training,_validation,_and_test_sets#training_set training data] can create or exacerbate unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref><br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google changed its API tool [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision], which uses AI to analyze images and identify people, places, and things. The change eliminated gender labels on images, on the grounds that a person's gender cannot be determined from their appearance in a photo alone. Instead of terms like "man" or "woman", Google now tags such images with labels like "person" to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref><br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google | The Keyword''. Retrieved April 6, 2021.</ref><br />
<br />
===Misidentification===<br />
<br />
A computer that analyzes and captions photos with a machine learning algorithm must be trained on a data set, and the data set is where much of the bias originates, in the form of capture, category, and negative bias.<ref name="tommasi">Tommasi, T., Patricia, N., Caputo, B., & Tuytelaars, T. (2015). A Deeper Look at Dataset Bias. In ''Pattern Recognition'', Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias covers the variables involved in acquiring images, such as lighting conditions or the camera used. Category bias refers to poorly defined categories into which an image can fall, while negative bias is a model's inability to recognize categories absent from the data set.<ref name="tommasi"/> Machine learning models only know what they are taught, so any of these biases present in Google's training data will resurface when the service categorizes photos uploaded by users.<ref name="debrusk">DeBrusk, C. (2018). The Risk of Machine-Learning Bias (and How to Prevent It). MIT Sloan blog. Cambridge, MA: Massachusetts Institute of Technology.</ref><br />
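Negative bias in particular can be illustrated with a toy classifier: a model trained only on certain categories is forced to map every input onto one of them. The sketch below (all names and data are invented for this example) makes the point with a minimal nearest-centroid classifier:

```python
# Toy illustration of "negative bias": a classifier can only predict
# categories present in its training set. All names and data are invented.
from collections import defaultdict

def train_centroids(samples):
    """samples: list of ((x, y), label). Returns label -> centroid."""
    sums, counts = defaultdict(lambda: [0.0, 0.0]), defaultdict(int)
    for (x, y), label in samples:
        sums[label][0] += x
        sums[label][1] += y
        counts[label] += 1
    return {lab: (s[0] / counts[lab], s[1] / counts[lab]) for lab, s in sums.items()}

def predict(centroids, point):
    """Nearest-centroid prediction: the answer is always a trained label."""
    return min(centroids, key=lambda lab: (centroids[lab][0] - point[0]) ** 2
                                        + (centroids[lab][1] - point[1]) ** 2)

# The training data contains only "cat" and "dog" -- no other category.
model = train_centroids([((0, 0), "cat"), ((1, 0), "cat"),
                         ((5, 5), "dog"), ((6, 5), "dog")])
# A point from an unseen category is still forced into a known label.
print(predict(model, (20, 20)))  # dog -- the model cannot answer "unknown"
```

Production image classifiers are vastly more complex, but the structural limitation is the same: whatever is missing from the training data cannot be predicted.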
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
Shortly after its release in 2015, Google Photos came under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend had been classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A Google spokesperson stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has intensified its search for qualified people of color to hire in order to correct issues stemming from the underrepresentation of minorities in technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images with high accuracy, but it cannot easily go beyond the experience of that training; even the best algorithms lack the human ability to refine their interpretation of the world.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes to Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
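The reported stopgap amounts to filtering model output against a blocklist before labels reach users. A minimal sketch of that idea (the blocked labels are the ones named in the reporting above; the classifier output shown is invented sample data):

```python
# Sketch of a label blocklist as a post-processing step, the kind of stopgap
# reportedly applied here. The "raw" classifier output below is invented.

BLOCKED = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predicted):
    """Drop any blocked label before it reaches the user-facing gallery."""
    return [(label, score) for label, score in predicted
            if label.lower() not in BLOCKED]

raw = [("monkey", 0.91), ("outdoors", 0.80), ("tree", 0.64)]
print(filter_labels(raw))  # [('outdoors', 0.8), ('tree', 0.64)]
```

The trade-off is visible even in this sketch: the filter suppresses the label regardless of whether it is correct, which is why photos of actual gorillas reportedly became unsearchable.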
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|150px|thumb|right|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
In an effort to quantify the accuracy of Google Photos's image recognition model, Nieuwenhuysen uploaded a wide range of commonly recognizable photos and recorded how the service categorized them. The test was set up by uploading over one hundred photos, with both significant (descriptive) and insignificant (arbitrary) file names, placed into albums organized by each image's country of origin. After one day, the images had been organized into nine broad categories with varying degrees of accuracy: the least accurate were predominantly blue photos, incorrectly identified as sky, while the most accurate were mountains, cars, and sunsets. Nieuwenhuysen deems the system satisfactory given that it can be used free of charge, reporting that it excels at assigning any photo to at least a roughly appropriate category, even when the image is named arbitrarily.<ref name="nieuwenhuysen">Nieuwenhuysen, P. (2018). Information Discovery and Images: A Case Study of Google Photos. In ''2018 5th International Symposium on Emerging Trends and Technologies in Libraries and Information Services (ETTLIS)'' (pp. 16–21). IEEE.</ref><br />
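Tallying per-category accuracy in a study of this kind reduces to counting matches between expected and predicted categories. A minimal sketch, with invented sample data standing in for the study's actual results:

```python
# Minimal sketch of tallying per-category accuracy, as in the study above.
# The (expected, predicted) pairs below are invented sample data.
from collections import Counter

def per_category_accuracy(pairs):
    """pairs: list of (expected_category, predicted_category) tuples."""
    totals, correct = Counter(), Counter()
    for expected, predicted in pairs:
        totals[expected] += 1
        correct[expected] += (expected == predicted)
    return {cat: correct[cat] / totals[cat] for cat in totals}

results = [("mountain", "mountain"), ("mountain", "mountain"),
           ("sunset", "sunset"), ("sky", "sea"), ("sky", "sky")]
print(per_category_accuracy(results))
# {'mountain': 1.0, 'sunset': 1.0, 'sky': 0.5}
```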
<br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
A seldom acknowledged facet of large-scale data storage such as Google Photos is that unknown persons at the physical locations housing the massive collection of photos and videos may have access to private user data. At the physical storage level, a cloud service provider must ensure that data is properly segregated and prepared for data-recovery situations, and that no unauthorized person can locally access the database storing sensitive or private information.<ref name="teneyuca">Teneyuca, D. (2011). Internet cloud security: The illusion of inclusion. ''Information Security Technical Report'', 16(3–4), 102–107. https://doi.org/10.1016/j.istr.2011.08.005</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Some have speculated that Google collects the information embedded in photos and sells it to third parties or uses it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
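The geotags involved here are EXIF GPS fields, which store latitude and longitude as degrees, minutes, and seconds plus a hemisphere reference; placing photos on a map requires converting them to signed decimal degrees. A small sketch of that conversion (the coordinates below are approximate and used only for illustration):

```python
# Sketch of the geotag math: EXIF stores GPS coordinates as
# degrees/minutes/seconds plus a hemisphere reference ("N"/"S"/"E"/"W"),
# which must be converted to signed decimal degrees for map placement.

def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert an EXIF-style DMS coordinate to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value  # south/west are negative

# The Eiffel Tower sits near 48°51'30"N, 2°17'40"E.
lat = dms_to_decimal(48, 51, 30, "N")
lon = dms_to_decimal(2, 17, 40, "E")
print(round(lat, 4), round(lon, 4))  # 48.8583 2.2944
```

Landmark-based inference, described below in this section, supplies coordinates by a different route, but feeds the same map features once a location is known.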
<br />
Even without geotags, Google Photos is capable of inferring a photo’s location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but assures users that the information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos lets users turn off location history, remove location information from existing photos, and choose whether a photo's location is included when it is shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102068
Google Photos
2021-04-15T20:31:14Z
<p>Nfigue: Moved some pictures around</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a [https://en.wikipedia.org/wiki/Machine_learning machine learning] algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed [https://en.wikipedia.org/wiki/Training,_validation,_and_test_sets#training_set training data] can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender"> Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ “Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias”]. “Business Insider”. Retrieved April 6, 2021. </ref> <br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its API tool [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision], which uses AI to analyze images in order to identify people, places, and things. The change involved eliminating gender labels on images, because a person’s gender can’t be determined just by how they look in a photo. Instead of using terms like “man” or “woman”, Google will tag images with labels such as “person” in order to avoid instilling AI algorithms with human bias.<ref> Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people “Google Ai tool will no longer use gendered labels like ‘woman’ or ‘man’ in photos of people”]. “The Verge”. Retrieved April 6, 2021. </ref> <br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref> Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ “AI at Google: our principles”]. “Google”. Retrieved April 6, 2021. </ref><br />
<br />
===Misidentification===<br />
<br />
The nature of a computer analyzing and captioning a photo through a machine learning algorithm involves training from a data set. This is where a large amount of bias comes from, in the form of capture, category and negative bias. <ref name="tommasi">Tommasi, Tatiana, Patricia, Novi, Caputo, Barbara, & Tuytelaars, Tinne. (2015). A Deeper Look at Dataset Bias. Pattern Recognition, Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias refers to all the variables involved with acquiring images, such as lighting conditions or camera used. Category bias refers to a poor definition of categories in which the image can exist, while negative bias involves the inability to train a model on categories which are not included in the data set. <ref name="tommasi"/> Machine learning models only know what they are taught, and as such, any of these biases appearing in Googles training data will resurface when categorizing photos uploaded by users. <ref name="debrusk">Chris DeBrusk. (2018). The Risk of Machine-Learning Bias (and How to Prevent It). MIT Sloan Blogs. Blog, Cambridge: Massachusetts Institute of Technology, Cambridge, MA.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Mr. Alcine, Google has intensified their search for qualified people of color to hire in order to correct issues stemming from under representation of minorities in technology companies. <ref> (2015, July 1). [https://www.bbc.com/news/technology-33347866 “Google apologises for Photo app’s racist blunder”]. “BBC”. Retrieved April 6, 2021. </ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it can’t easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref> Simonite, T. (2018, January 11). 
[https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ “When It Comes To Gorillas, Google Photos Remains Blind”]. “Wired”. Retrieved April 6, 2021. </ref><br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|150px|thumb|right|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
In an effort to quantify the accuracy of Google Photos image recognition model, a study was performed by uploading a wide range of commonly recognizable photos and recording the results of categorization. The test was setup by uploading over one hundred photos, both with significant (descriptive) and insignificant (arbitrary) file names and placing them into albums organized by country of origin for that image. After leaving the images for one day, they had been organized into 9 broad categories with varying degrees of accuracy. The least accurate were photos of blue, incorrectly identified as the sky, while the most accurate were mountains, cars and sunsets. Nieuwenhuysen states that the system is satisfactory considering it can be used free of charge, reporting that it excels at recognizing any photo to at least a vaguely specific category, even when the image is named arbitrarily. <ref name=""> Nieuwenhuysen, Paul. (2018). Information Discovery and Images A Case Study of Google Photos. 2018 5th International Symposium on Emerging Trends and Technologies in Libraries and Information Services (ETTLIS) (pp. 16–21). orig-research, IEEE.</ref><br />
<br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is lingering uncertainty about whether Google collects the information embedded in photos and sells it to third parties or uses it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
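As an illustration of the geotag data involved, a photo's EXIF metadata stores GPS coordinates as degree/minute/second rational pairs plus a hemisphere reference. The sketch below (a hypothetical example, not Google's code; the coordinate values are illustrative) shows how such a geotag converts to the decimal degrees a mapping service would use:

```python
# Minimal sketch of decoding an EXIF-style GPS geotag into decimal degrees.
# EXIF stores latitude/longitude as (degrees, minutes, seconds) rationals,
# each a (numerator, denominator) pair, plus a hemisphere reference letter.

from fractions import Fraction

def dms_to_decimal(dms, ref):
    """Convert EXIF-style (deg, min, sec) rational pairs to signed decimal degrees."""
    degrees, minutes, seconds = (float(Fraction(*pair)) for pair in dms)
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

# Illustrative values: roughly the Eiffel Tower, 48 deg 51' 29.6" N, 2 deg 17' 40.2" E
lat = dms_to_decimal([(48, 1), (51, 1), (296, 10)], "N")
lon = dms_to_decimal([(2, 1), (17, 1), (402, 10)], "E")
print(round(lat, 4), round(lon, 4))  # ≈ 48.8582 2.2945
```

A stripped geotag simply means these EXIF fields are absent, which is why removing location data (described below) prevents this kind of lookup for shared copies.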
<br />
Even without geotags, Google Photos is capable of inferring a photo’s location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but assures users that the information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos gives users the ability to turn off location history, remove location information from existing photos, and choose whether to include a photo's location when sharing it.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102067
Google Photos
2021-04-15T20:27:44Z
<p>Nfigue: /* Bias in Algorithms */ added link</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a [https://en.wikipedia.org/wiki/Machine_learning machine learning] algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed [https://en.wikipedia.org/wiki/Training,_validation,_and_test_sets#training_set training data] can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender"> Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ “Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias”]. “Business Insider”. Retrieved April 6, 2021. </ref> <br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its API tool [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision], which uses AI to analyze images in order to identify people, places, and things. The change involved eliminating gender labels on images, because a person’s gender can’t be determined just by how they look in a photo. Instead of using terms like “man” or “woman”, Google will tag images with labels such as “person” in order to avoid instilling AI algorithms with human bias.<ref> Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people “Google Ai tool will no longer use gendered labels like ‘woman’ or ‘man’ in photos of people”]. “The Verge”. Retrieved April 6, 2021. </ref> <br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref> Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ “AI at Google: our principles”]. “Google”. Retrieved April 6, 2021. </ref><br />
<br />
===Misidentification===<br />
<br />
The nature of a computer analyzing and captioning a photo through a machine learning algorithm involves training from a data set. This is where a large amount of bias comes from, in the form of capture, category and negative bias. <ref name="tommasi">Tommasi, Tatiana, Patricia, Novi, Caputo, Barbara, & Tuytelaars, Tinne. (2015). A Deeper Look at Dataset Bias. Pattern Recognition, Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias refers to all the variables involved with acquiring images, such as lighting conditions or camera used. Category bias refers to a poor definition of categories in which the image can exist, while negative bias involves the inability to train a model on categories which are not included in the data set. <ref name="tommasi"/> Machine learning models only know what they are taught, and as such, any of these biases appearing in Googles training data will resurface when categorizing photos uploaded by users. <ref name="debrusk">Chris DeBrusk. (2018). The Risk of Machine-Learning Bias (and How to Prevent It). MIT Sloan Blogs. Blog, Cambridge: Massachusetts Institute of Technology, Cambridge, MA.</ref><br />
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Mr. Alcine, Google has intensified their search for qualified people of color to hire in order to correct issues stemming from under representation of minorities in technology companies. <ref> (2015, July 1). [https://www.bbc.com/news/technology-33347866 “Google apologises for Photo app’s racist blunder”]. “BBC”. Retrieved April 6, 2021. </ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it can’t easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref> Simonite, T. (2018, January 11). 
[https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ “When It Comes To Gorillas, Google Photos Remains Blind”]. “Wired”. Retrieved April 6, 2021. </ref><br />
<br />
In an effort to quantify the accuracy of Google Photos image recognition model, a study was performed by uploading a wide range of commonly recognizable photos and recording the results of categorization. The test was setup by uploading over one hundred photos, both with significant (descriptive) and insignificant (arbitrary) file names and placing them into albums organized by country of origin for that image. After leaving the images for one day, they had been organized into 9 broad categories with varying degrees of accuracy. The least accurate were photos of blue, incorrectly identified as the sky, while the most accurate were mountains, cars and sunsets. Nieuwenhuysen states that the system is satisfactory considering it can be used free of charge, reporting that it excels at recognizing any photo to at least a vaguely specific category, even when the image is named arbitrarily. <ref name=""> Nieuwenhuysen, Paul. (2018). Information Discovery and Images A Case Study of Google Photos. 2018 5th International Symposium on Emerging Trends and Technologies in Libraries and Information Services (ETTLIS) (pp. 16–21). orig-research, IEEE.</ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is some uncertainty surrounding Google collecting the information embedded in the photos and selling it to third parties or using it to display more relevant advertisements<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref>, even though Google ensures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
<br />
Even without geotags, Google Photos is capable of intuiting a photo’s location by analyzing for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and if it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but at the same time assures users that this information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To accommodate these concerns, Google Photos gives its users the capability to turn off location history, remove location information from already existing photos, and choose whether or not to share a photo's location when shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102066
Google Photos
2021-04-15T20:13:10Z
<p>Nfigue: /* Ethical Implications */</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a [https://en.wikipedia.org/wiki/Machine_learning machine learning] algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
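The tier rules above can be summarized in a small function. This is an illustrative sketch only (the function name and return shape are invented, not Google's code), using the published caps of 16 megapixels/1080p for "high quality" and 3 megapixels/480p for "express", and reflecting the pre-June-2021 policy under which the compressed tiers did not count against the 15GB quota.

```python
# Illustrative model of the three upload tiers (not Google's actual logic).
TIER_CAPS = {
    "original": None,                # no downscaling; counts against quota
    "high":     (16_000_000, 1080),  # max photo pixels, max video height
    "express":  (3_000_000, 480),
}

def process_upload(tier, megapixels, video_height):
    """Return (photo_megapixels, video_height, uses_quota) after upload."""
    cap = TIER_CAPS[tier]
    if cap is None:
        return megapixels, video_height, True
    max_pixels, max_height = cap
    return (min(megapixels, max_pixels / 1_000_000),
            min(video_height, max_height),
            False)

# A 24 MP photo and a 4K (2160p) video uploaded at "high quality":
print(process_upload("high", 24, 2160))  # (16.0, 1080, False)
```

For example, a 24-megapixel photo uploaded at "high quality" is downscaled to 16 megapixels and, under the pre-2021 policy, consumes no quota.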
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that the service was intended to outsource the work of training Google's visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref><br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref><br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision] API, which uses AI to analyze images and identify people, places, and things: it eliminated gender labels on images, on the grounds that a person's gender cannot be determined from their appearance in a photo alone. Instead of terms like "man" or "woman", Google now tags images with labels such as "person" to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref><br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google | The Keyword''. Retrieved April 6, 2021.</ref><br />
<br />
===Misidentification===<br />
<br />
When a computer analyzes and captions a photo, its machine learning algorithm draws on what it learned from a training data set, and that data set is a major source of bias, in the form of capture, category, and negative bias.<ref name="tommasi">Tommasi, T., Patricia, N., Caputo, B., & Tuytelaars, T. (2015). A Deeper Look at Dataset Bias. ''Pattern Recognition'', Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias covers the variables involved in acquiring images, such as lighting conditions or the camera used. Category bias arises from poorly defined categories into which an image can fall, while negative bias is a model's inability to handle categories that are absent from its training set.<ref name="tommasi"/> Machine learning models only know what they are taught, so any of these biases present in Google's training data will resurface when the service categorizes photos uploaded by users.<ref name="debrusk">DeBrusk, C. (2018). The Risk of Machine-Learning Bias (and How to Prevent It). ''MIT Sloan Blogs''. Cambridge, MA: Massachusetts Institute of Technology.</ref><br />
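Negative bias in particular is easy to demonstrate: a classifier can only answer with categories it was trained on, so anything outside the training set is forced into the nearest known category. The toy nearest-centroid classifier below is a minimal sketch of this failure mode; the feature vectors and labels are invented for illustration.

```python
import math

def nearest_centroid(train, sample):
    """Assign `sample` to the label whose training centroid is closest."""
    best_label, best_dist = None, math.inf
    for label, vectors in train.items():
        dim = len(vectors[0])
        centroid = [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]
        dist = math.dist(centroid, sample)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Trained only on "cat" and "car" examples (2-D toy features):
train = {
    "cat": [(0.9, 0.1), (0.8, 0.2)],
    "car": [(0.1, 0.9), (0.2, 0.8)],
}
# A photo from a category never seen in training still gets one of the
# known labels -- the model cannot answer "something else":
print(nearest_centroid(train, (0.7, 0.3)))  # prints "cat"
```

However large the training set, any upload from a category it omits is guaranteed to be mislabeled in this way.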
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech"]. ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has intensified its search for qualified people of color to hire in order to correct issues stemming from the underrepresentation of minorities in technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC News''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training; even the best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes to Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
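The reported stop-gap amounts to filtering the model's output rather than retraining the model. A minimal sketch (the function name is invented; the blocked labels are those reported by The Verge) might look like:

```python
# Labels reportedly blocked on the platform as a temporary fix.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def visible_labels(predicted):
    """Drop blocked labels from a model's predictions before display."""
    return [label for label in predicted if label.lower() not in BLOCKED_LABELS]

print(visible_labels(["Beach", "Gorilla", "Sunset"]))  # ['Beach', 'Sunset']
```

The model still produces the offending prediction internally; only its visibility changes, which is why critics described the fix as removing the labels rather than solving the underlying recognition problem.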
<br />
In an effort to quantify the accuracy of Google Photos's image recognition model, a study was performed by uploading a wide range of commonly recognizable photos and recording how they were categorized. The test was set up by uploading over one hundred photos, with both significant (descriptive) and insignificant (arbitrary) file names, and placing them into albums organized by each image's country of origin. After one day, the images had been organized into 9 broad categories with varying degrees of accuracy. The least accurate were predominantly blue photos, incorrectly identified as the sky, while the most accurate were mountains, cars, and sunsets. Nieuwenhuysen states that the system is satisfactory considering it can be used free of charge, reporting that it excels at assigning nearly any photo to at least a broadly relevant category, even when the image is named arbitrarily.<ref>Nieuwenhuysen, P. (2018). Information Discovery and Images: A Case Study of Google Photos. ''2018 5th International Symposium on Emerging Trends and Technologies in Libraries and Information Services (ETTLIS)'' (pp. 16–21). IEEE.</ref><br />
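The tallying step of such a study can be sketched as follows; the sample data here is invented for illustration and is not taken from Nieuwenhuysen's results.

```python
from collections import defaultdict

def per_category_accuracy(results):
    """results: list of (true_category, assigned_category) pairs.
    Returns the fraction of correct assignments per true category."""
    correct, total = defaultdict(int), defaultdict(int)
    for truth, assigned in results:
        total[truth] += 1
        correct[truth] += (truth == assigned)
    return {cat: correct[cat] / total[cat] for cat in total}

# Invented sample: blue-dominated photos misread as "sky".
results = [("mountain", "mountain"), ("mountain", "mountain"),
           ("sky", "sky"), ("blue water", "sky")]
print(per_category_accuracy(results))
# {'mountain': 1.0, 'sky': 1.0, 'blue water': 0.0}
```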
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is some uncertainty about whether Google collects the information embedded in these photos and sells it to third parties or uses it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
<br />
Even without geotags, Google Photos is capable of inferring a photo's location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but assures users that it is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos lets users turn off location history, remove location information from existing photos, and choose whether to include a photo's location when sharing it.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
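Removing location information before sharing amounts to stripping the GPS fields from a photo's metadata. A minimal sketch follows; the metadata dict and tag set are illustrative (the tag names follow common EXIF conventions, but real EXIF handling requires a proper parser).

```python
# Common EXIF GPS tag names (illustrative subset).
GPS_TAGS = {"GPSLatitude", "GPSLongitude", "GPSAltitude", "GPSTimeStamp"}

def strip_location(metadata):
    """Return a copy of the metadata with all GPS tags removed."""
    return {tag: value for tag, value in metadata.items() if tag not in GPS_TAGS}

photo = {"Model": "Pixel 5", "GPSLatitude": 42.2808, "GPSLongitude": -83.7430}
print(strip_location(photo))  # {'Model': 'Pixel 5'}
```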
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102064
Google Photos
2021-04-15T20:00:42Z
<p>Nfigue: /* Subscription Service */ added link</p>
<hr />
<div><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed from any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying features present in them, such as lakes, nighttime scenes, or birthdays.<ref name="JacobK" /> It can organize photos by the faces of the people and pets in them, even as those faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos can also group photos and videos by location.<ref name="Vox" /> It determines a photo's location either from its embedded geotagging data or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos and videos simple, such as generating a web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos also allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat-map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March 2021, Google Photos began rolling out an advanced version of its video editor for Android users, allowing them not only to trim, stabilize, and rotate videos, but also to crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to bring this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Google Photos for Android getting advanced video editor with cropping, filters"]. ''9To5Google''. Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that month's best photos.<ref>Perez, S. (2020, October 20). [https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options"]. ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a [https://en.wikipedia.org/wiki/Machine_learning machine learning] algorithm, but users can override any of its choices. After negative feedback during user testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial Google Photos subscription feature"]. ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows greater customization of the look and finish of the prints and has reduced the subscription cost from $7.99 to $6.99. Users can cancel the service at any time or skip a month. Customers can order prints in multiple sizes and request same-day pickup at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent criticism of the service has been its "outrageous cost": many times the price of printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that the service was intended to outsource the work of training Google's visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref><br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref><br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision] API, which uses AI to analyze images and identify people, places, and things: it eliminated gender labels on images, on the grounds that a person's gender cannot be determined from their appearance in a photo alone. Instead of terms like "man" or "woman", Google now tags images with labels such as "person" to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref><br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google | The Keyword''. Retrieved April 6, 2021.</ref><br />
<br />
===Misidentification===<br />
<br />
When a computer analyzes and captions a photo, its machine learning algorithm draws on what it learned from a training data set, and that data set is a major source of bias, in the form of capture, category, and negative bias.<ref name="tommasi">Tommasi, T., Patricia, N., Caputo, B., & Tuytelaars, T. (2015). A Deeper Look at Dataset Bias. ''Pattern Recognition'', Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias covers the variables involved in acquiring images, such as lighting conditions or the camera used. Category bias arises from poorly defined categories into which an image can fall, while negative bias is a model's inability to handle categories that are absent from its training set.<ref name="tommasi"/> Machine learning models only know what they are taught, so any of these biases present in Google's training data will resurface when the service categorizes photos uploaded by users.<ref name="debrusk">DeBrusk, C. (2018). The Risk of Machine-Learning Bias (and How to Prevent It). ''MIT Sloan Blogs''. Cambridge, MA: Massachusetts Institute of Technology.</ref><br />
<br />
Shortly after its release in 2015, Google Photos came under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has intensified its search for qualified people of color to hire in order to correct issues stemming from the underrepresentation of minorities in technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training; even the best algorithms lack a human's ability to refine their interpretation of the world.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes to Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
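The blocked labels described by Google's spokesperson amount to a blocklist applied after prediction. A minimal sketch of how such a stopgap can work (the sources name the blocked labels, not Google's implementation, so this is purely illustrative):

```python
# Illustrative sketch of a post-prediction label blocklist.
# The four labels come from the article's sources; everything else
# about the implementation is assumed.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def visible_labels(predicted_labels):
    """Drop blocked labels before they are shown to the user."""
    return [label for label in predicted_labels
            if label.lower() not in BLOCKED_LABELS]

print(visible_labels(["Sunset", "Monkey", "Beach"]))  # ['Sunset', 'Beach']
```

A blocklist of this kind suppresses the offensive output without changing the underlying model, which is why it is described as a temporary fix rather than a correction.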
<br />
In an effort to quantify the accuracy of Google Photos' image recognition model, Nieuwenhuysen (2018) uploaded a wide range of commonly recognizable photos and recorded how the service categorized them. The test was set up by uploading over one hundred photos, some with significant (descriptive) file names and some with insignificant (arbitrary) ones, and placing them into albums organized by each image's country of origin. After one day, the images had been organized into nine broad categories with varying degrees of accuracy. The least accurate were photos dominated by the color blue, which were incorrectly identified as the sky, while the most accurate were mountains, cars, and sunsets. Nieuwenhuysen states that the system is satisfactory considering it can be used free of charge, reporting that it excels at assigning almost any photo to at least a broadly relevant category, even when the image is named arbitrarily.<ref>Nieuwenhuysen, Paul. (2018). Information Discovery and Images: A Case Study of Google Photos. 2018 5th International Symposium on Emerging Trends and Technologies in Libraries and Information Services (ETTLIS) (pp. 16–21). IEEE.</ref><br />
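The study's measurement amounts to tallying, per category, how many uploads the service labeled correctly. A hypothetical reconstruction of that bookkeeping (not the study's actual code):

```python
# Per-category accuracy from (true_category, predicted_category) pairs.
# Hypothetical reconstruction of the study's bookkeeping.
from collections import defaultdict

def accuracy_by_category(results):
    """results: iterable of (true_category, predicted_category) pairs."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for truth, predicted in results:
        total[truth] += 1
        if predicted == truth:
            correct[truth] += 1
    return {cat: correct[cat] / total[cat] for cat in total}

# Invented sample data illustrating the kind of result reported:
# mountains labeled reliably, a blue flower mistaken for the sky.
results = [("mountain", "mountain"), ("mountain", "mountain"),
           ("sunset", "sunset"), ("flower", "sky")]
print(accuracy_by_category(results))
```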
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To prevent similar incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Concerns have been raised that Google collects the information embedded in photos and sells it to third parties or uses it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
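A geotag is stored in a photo's EXIF metadata as degree/minute/second values plus hemisphere letters; converting it to the signed decimal latitude/longitude used for map search is a small calculation. An illustrative sketch of that standard conversion (not Google Photos' code; the sample coordinates are approximate):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert EXIF-style degrees/minutes/seconds plus a hemisphere
    letter ('N', 'S', 'E', 'W') into a signed decimal coordinate."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# The Eiffel Tower's geotag, roughly 48°51'30" N, 2°17'40" E:
lat = dms_to_decimal(48, 51, 30, "N")
lon = dms_to_decimal(2, 17, 40, "E")
print(round(lat, 4), round(lon, 4))  # prints 48.8583 2.2944
```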
<br />
Even without geotags, Google Photos is capable of intuiting a photo’s location by analyzing for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and if it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but assures users that it is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos lets users turn off location history, remove location information from existing photos, and choose whether a photo's location is included when the photo is shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102060
Google Photos
2021-04-15T19:45:44Z
<p>Nfigue: /* Misidentification */ added to section</p>
<hr />
<div><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying features present in them, such as lakes, night scenes, or birthdays.<ref name="JacobK" /> It can organize photos by the faces of the people and pets in them, even as those faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos can also group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either from its embedded geotagging data or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers several ways to share photos and videos, such as generating a web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos also allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March 2021, Google Photos began rolling out an advanced version of its video editor to Android users, who can not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and adjust brightness, contrast, highlights, and shadows. Google plans to bring this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that month's best photos.<ref>Perez, S. (2020, October 20). [https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a machine learning algorithm, but users can override its choices. After negative feedback during user testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows greater customization of the prints' look and finish, and reduced the subscription cost from $7.99 to $6.99. Users can cancel the service at any time or skip a month. Customers can order prints in multiple sizes and request same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent criticism of the service is its "outrageous cost": many times the price of printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are uploaded to Google Photos at one of three quality settings: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels for photos and 1080p for videos.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels for photos and 480p for videos.<ref name="size" /> The free tier of Google Photos allows unlimited uploads at resolutions up to "high quality."<ref name="size" /><br />
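The resolution cap described above can be sketched with a little arithmetic: to fit a photo under a megapixel budget while preserving its aspect ratio, both dimensions are multiplied by the square root of the pixel ratio. The sketch below is illustrative only; it is not Google's actual resizing pipeline.<br />

```python
import math

def fit_to_megapixel_cap(width: int, height: int, cap_mp: float = 16.0) -> tuple[int, int]:
    """Scale dimensions down so total pixels do not exceed cap_mp megapixels,
    preserving the aspect ratio. Images already under the cap are unchanged."""
    cap_pixels = cap_mp * 1_000_000
    pixels = width * height
    if pixels <= cap_pixels:
        return width, height
    scale = math.sqrt(cap_pixels / pixels)  # same factor for both dimensions
    return int(width * scale), int(height * scale)

# A 24 MP photo (6000x4000) is scaled down; a 12 MP photo is untouched.
print(fit_to_megapixel_cap(6000, 4000))  # (4898, 3265), just under 16 MP
print(fit_to_megapixel_cap(4000, 3000))  # (4000, 3000)
```

The same computation with a 3 MP budget models the "express" tier.<br />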
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that the service was intended as a way of outsourcing the work of training Google's visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref><br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref><br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its API tool [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision], which uses AI to analyze images in order to identify people, places, and things. The change eliminated gender labels on images because a person's gender cannot be determined solely from how they look in a photo. Instead of terms like "man" or "woman", Google now tags images with labels such as "person" in order to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref><br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: "We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control."<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google''. Retrieved April 6, 2021.</ref><br />
<br />
===Misidentification===<br />
<br />
A computer that analyzes and captions photos via a machine learning algorithm must first be trained on a data set, and much of the resulting bias originates there, in the form of capture bias, category bias, and negative bias.<ref name="tommasi">Tommasi, T., Patricia, N., Caputo, B., & Tuytelaars, T. (2015). A Deeper Look at Dataset Bias. Pattern Recognition, Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref> Capture bias refers to the variables involved in acquiring images, such as lighting conditions or the camera used. Category bias refers to a poor definition of the categories an image can belong to, while negative bias is the inability to train a model on categories that are not included in the data set.<ref name="tommasi"/> Machine learning models only know what they are taught, so any of these biases present in Google's training data will resurface when the service categorizes photos uploaded by users.<ref name="debrusk">DeBrusk, C. (2018). The Risk of Machine-Learning Bias (and How to Prevent It). MIT Sloan Blogs. Cambridge, MA: Massachusetts Institute of Technology.</ref><br />
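The effect of negative bias can be sketched with a toy closed-set classifier: given only a fixed label set, it must commit to ''some'' label even when nothing fits, which is how a photo of a meal can end up tagged "bike". The label set and confidence scores below are hypothetical, and real systems are far more complex; a rejection threshold is shown as one common mitigation.<br />

```python
def classify(scores: dict[str, float], threshold: float = 0.0) -> str:
    """Pick the highest-scoring label from a fixed, closed label set.
    With no rejection option (threshold 0), every input receives *some*
    label, even when all scores are low -- the essence of negative bias."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    return label if score >= threshold else "unknown"

# Hypothetical scores for an out-of-distribution photo: nothing fits well,
# but the closed-set classifier still commits to its least-bad label.
scores = {"lake": 0.21, "birthday": 0.14, "bike": 0.33, "night": 0.12}
print(classify(scores))                 # "bike" -- a confident-looking mistake
print(classify(scores, threshold=0.5))  # "unknown" -- rejection mitigates it
```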
<br />
Shortly after its release in 2015, Google Photos came under fire for a particularly offensive error: user Jacky Alciné reported that he and his friend had been classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A Google spokesperson stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google 'fixed' its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has intensified its search for qualified people of color to hire in order to correct issues stemming from the underrepresentation of minorities in technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes To Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos work in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To help prevent future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is speculation that Google collects the information embedded in photos and sells it to third parties or uses it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
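Geotags embedded in photos follow the EXIF convention of storing latitude and longitude as degree/minute/second values plus a hemisphere reference ('N'/'S'/'E'/'W'). A minimal sketch of the standard conversion to the signed decimal degrees a map or location search would use (the coordinates are illustrative):<br />

```python
def dms_to_decimal(degrees: float, minutes: float, seconds: float, ref: str) -> float:
    """Convert an EXIF-style GPS coordinate (degrees, minutes, seconds plus
    a hemisphere reference 'N'/'S'/'E'/'W') to signed decimal degrees."""
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

# The Eiffel Tower sits near 48°51'30" N, 2°17'40" E.
lat = dms_to_decimal(48, 51, 30, "N")
lon = dms_to_decimal(2, 17, 40, "E")
print(round(lat, 4), round(lon, 4))  # 48.8583 2.2944
```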
<br />
Even without geotags, Google Photos is capable of inferring a photo's location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but it assures users that the information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos lets users turn off location history, remove location information from existing photos, and choose whether to include a photo's location when sharing it.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102052
Google Photos
2021-04-15T19:00:47Z
<p>Nfigue: Split Bias in Algorithms into two sections for specificity</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a machine learning algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender"> Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ “Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias”]. “Business Insider”. Retrieved April 6, 2021. </ref> <br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its API tool [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision], which uses AI to analyze images in order to identify people, places, and things. The change involved eliminating gender labels on images, because a person’s gender can’t be determined just by how they look in a photo. Instead of using terms like “man” or “woman”, Google will tag images with labels such as “person” in order to avoid instilling AI algorithms with human bias.<ref> Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people “Google Ai tool will no longer use gendered labels like ‘woman’ or ‘man’ in photos of people”]. “The Verge”. Retrieved April 6, 2021. </ref> <br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref> Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ “AI at Google: our principles”]. “Google”. Retrieved April 6, 2021. </ref><br />
<br />
===Misidentification===<br />
<br />
The nature of a computer analyzing and captioning a photo through a machine learning algorithm involves training from a data set. This is where a large amount of bias comes from, both in the form of capture and category bias. <ref name="tommasi">Tommasi, Tatiana, Patricia, Novi, Caputo, Barbara, & Tuytelaars, Tinne. (2015). A Deeper Look at Dataset Bias. Pattern Recognition, Lecture Notes in Computer Science (Vol. 9358, pp. 504–516). Cham: Springer International Publishing.</ref><br />
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive misidentification: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has also intensified its search for qualified people of color to hire in order to correct issues stemming from the underrepresentation of minorities at technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC News''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes To Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
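The blocklist described above is a post-processing step: rather than retraining the model, the offending labels are filtered out of its output before users ever see them. A minimal sketch of that approach (the function name and the (label, confidence) interface are assumptions; Google's actual implementation is not public):

```python
# Labels Google reportedly blocked as a stopgap, per the cited reports.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predicted_labels):
    """Drop blocked labels from a classifier's output before display.

    `predicted_labels` is a list of (label, confidence) pairs, a common
    shape for image-classifier output; this interface is an assumption.
    """
    return [(label, conf) for label, conf in predicted_labels
            if label.lower() not in BLOCKED_LABELS]

print(filter_labels([("dog", 0.91), ("monkey", 0.40)]))  # [('dog', 0.91)]
```

The trade-off is visible in the code: the filter guarantees the slur can never appear, but at the cost of the model being unable to label genuine photos of those animals at all.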
<br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can view that content and control access to it.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by incidents like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To prevent similar incidents, Google continuously works to improve its security infrastructure so that its stated privacy goals are upheld.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
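That design goal, that the uploader alone decides who can view a photo, can be expressed as a simple owner-based access-control rule. A minimal sketch (illustrative class and method names only, not Google's authorization system):

```python
class Photo:
    """Toy access-control model: only the uploader controls who can view.

    Illustrative only -- this mirrors the stated design goal, not Google's
    actual authorization infrastructure.
    """

    def __init__(self, owner):
        self.owner = owner
        self.shared_with = set()

    def grant(self, actor, viewer):
        # Only the owner may extend access; anyone else is rejected.
        if actor != self.owner:
            raise PermissionError("only the owner can share this photo")
        self.shared_with.add(viewer)

    def can_view(self, actor):
        return actor == self.owner or actor in self.shared_with

photo = Photo(owner="alice")
photo.grant("alice", "bob")
print(photo.can_view("bob"), photo.can_view("mallory"))  # True False
```

The 2019 incident amounts to a violation of exactly this invariant: content became visible to an account that was neither the owner nor in the owner's share list.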
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is some uncertainty about whether Google collects the information embedded in photos and sells it to third parties or uses it to display more relevant advertisements<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref>, even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
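Face grouping of this kind is commonly built on face embeddings: numeric vectors that land close together for photos of the same person. A rough sketch of the idea, with toy vectors and a deliberately simple greedy clustering (real systems like Photos use learned embeddings and far more robust methods):

```python
import math

def cosine(u, v):
    # Cosine similarity between two vectors: 1.0 means identical direction.
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def group_faces(embeddings, threshold=0.9):
    """Greedy grouping: join a group if similar enough to its first member,
    otherwise start a new group. Returns lists of embedding indices."""
    groups = []
    for i, emb in enumerate(embeddings):
        for group in groups:
            if cosine(emb, embeddings[group[0]]) >= threshold:
                group.append(i)
                break
        else:
            groups.append([i])
    return groups

# Toy "face embeddings": two near-duplicate directions and one distinct one.
faces = [(1.0, 0.0, 0.1), (0.98, 0.05, 0.1), (0.0, 1.0, 0.0)]
print(group_faces(faces))  # [[0, 1], [2]]
```

The threshold is where errors creep in: set it too low and distinct people are merged into one group, which is the kind of misgrouping users are asked to correct manually.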
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
<br />
Even without geotags, Google Photos is capable of intuiting a photo’s location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but at the same time assures users that this information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos gives users the ability to turn off location history, remove location information from existing photos, and choose whether a photo's location is included when it is shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
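Searching a gallery by location, as described above, reduces to a distance query over geotags. A minimal sketch using the haversine great-circle formula (the gallery data and function names are hypothetical; a real service would use a spatial index rather than a linear scan):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def photos_near(photos, lat, lon, radius_km=50.0):
    """Return photo names whose geotag falls within radius_km of a point."""
    return [name for name, (plat, plon) in photos.items()
            if haversine_km(plat, plon, lat, lon) <= radius_km]

# Hypothetical gallery: photo name -> (latitude, longitude) from EXIF geotags.
gallery = {
    "eiffel.jpg": (48.8584, 2.2945),     # Paris
    "liberty.jpg": (40.6892, -74.0445),  # New York
}
print(photos_near(gallery, 48.85, 2.35))  # ['eiffel.jpg']
```

Stripping location from a shared photo, as the settings allow, would simply mean omitting the (lat, lon) pair from the copy that leaves the user's account.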
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102033
Google Photos
2021-04-15T17:05:51Z
<p>Nfigue: /* Computer Vision */ deleted some wordiness, reworded</p>
<hr />
<div><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying features present in them, such as lakes, night scenes, or birthdays.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as those faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
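Organizing and searching by detected features, while letting users remove wrong labels, can be modeled as a small inverted index from labels to photos. A minimal sketch (hypothetical names and structure; not Google's implementation):

```python
from collections import defaultdict

class PhotoIndex:
    """Toy label index: search photos by detected features, and let users
    remove incorrect labels, as the app allows."""

    def __init__(self):
        self.labels_by_photo = defaultdict(set)

    def tag(self, photo, *labels):
        # In the real service these labels come from visual recognition.
        self.labels_by_photo[photo].update(labels)

    def remove_label(self, photo, label):
        # Users can manually fix or remove incorrect labels.
        self.labels_by_photo[photo].discard(label)

    def search(self, label):
        return sorted(p for p, ls in self.labels_by_photo.items() if label in ls)

idx = PhotoIndex()
idx.tag("img1.jpg", "lake", "night")
idx.tag("img2.jpg", "birthday", "bike")   # suppose "bike" is a misdetection
idx.remove_label("img2.jpg", "bike")
print(idx.search("lake"), idx.search("bike"))  # ['img1.jpg'] []
```

The user correction in the last step matters beyond tidier search results: as noted under Motivations, such manual fixes can double as training signal for the recognition algorithms.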
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos and videos simple, such as generating a web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March 2021, Google Photos started rolling out an advanced version of its video editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that month's best photos.<ref>Perez, S. (2020, October 20). [https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a machine learning algorithm, but users can change the selections. After some negative feedback from user testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows greater customization of the look and finish of the prints, along with a reduction in subscription cost from $7.99 to $6.99. Users can cancel the service at any time or skip a month if they choose. Customers can order prints in multiple sizes and can request same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent criticism of the service is its "outrageous cost"<ref name="feedback"/>, many times the price of printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are uploaded to Google Photos at one of three quality settings: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads keep their original resolution and use part of the associated Google account's 15 GB of storage shared across all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads are downscaled to 16 megapixels for photos and 1080p for videos.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads at resolutions up to "high quality."<ref name="size" /><br />
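The photo side of these tiers can be sketched as a proportional downscaling rule. The megapixel caps below come from the cited help page; the `stored_resolution` helper is a hypothetical illustration (a real upload also recompresses the file, which this ignores).

```python
import math

# Megapixel caps per upload setting, per the tiers described above.
CAPS_MP = {"original": None, "high": 16_000_000, "express": 3_000_000}

def stored_resolution(width, height, tier):
    """Return the (width, height) a photo would be stored at under a tier.

    Photos over the tier's pixel cap are downscaled proportionally, so the
    aspect ratio is preserved; 'original' keeps full resolution (and counts
    against the shared 15 GB quota).
    """
    cap = CAPS_MP[tier]
    pixels = width * height
    if cap is None or pixels <= cap:
        return width, height
    scale = math.sqrt(cap / pixels)  # shrink both sides by the same factor
    return int(width * scale), int(height * scale)

# A 24 MP photo (6000x4000) under "high quality" is cut to roughly 16 MP.
print(stored_resolution(6000, 4000, "high"))  # (4898, 3265)
```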
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15 GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref><br />
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive misidentification: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google 'fixed' its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Alciné, Google has mentioned stepping up its efforts to recruit candidates of color, but only time will tell whether that will help correct the underrepresentation of minorities in technology companies.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> With enough data and computing power, software can be trained to categorize images to a high degree of accuracy, but it cannot easily generalize beyond that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes To Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
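The temporary fix described above amounts to post-processing the classifier's output with a blocklist rather than retraining the model. A minimal sketch, using the four labels named by the Google spokesperson; the `filter_labels` helper itself is hypothetical, not Google's code.

```python
# Labels Google reportedly blocked as a stopgap after the 2015 incident.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predicted):
    """Drop blocked labels from a classifier's output.

    `predicted` maps label -> confidence. Note that this removes whole
    categories from the product instead of fixing the underlying model,
    which is exactly the limitation the cited Wired piece points out.
    """
    return {label: conf for label, conf in predicted.items()
            if label.lower() not in BLOCKED_LABELS}

raw = {"person": 0.97, "gorilla": 0.61, "outdoors": 0.88}
print(filter_labels(raw))  # {'person': 0.97, 'outdoors': 0.88}
```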
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision] API, which uses AI to analyze images to identify people, places, and things. The change eliminated gender labels on images, on the grounds that a person's gender cannot be determined just by how they look in a photo. Instead of terms like "man" or "woman", Google now tags images with labels such as "person" in order to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref><br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: "We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control."<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google''. Retrieved April 6, 2021.</ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To prevent future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is enabled by the user, Photos will also start to categorize images by the people and pets who appear in them.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> There is some concern that Google may collect the information embedded in these photos and sell it to third parties or use it to display more relevant advertisements,<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> even though Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
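Face grouping of this kind is commonly built on face embeddings: numeric vectors that land close together for photos of the same person. The sketch below greedily clusters embeddings by cosine similarity; it is a toy illustration of the general technique (the threshold and the greedy scheme are assumptions, not Google's algorithm).

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def group_faces(embeddings, threshold=0.8):
    """Assign each face embedding to the first group whose representative
    it resembles closely enough, or start a new group for it.

    Returns a list of group indices, one per input embedding.
    """
    groups = []       # each group is a list of embeddings; [0] is the representative
    assignments = []
    for emb in embeddings:
        for i, group in enumerate(groups):
            if cosine(emb, group[0]) >= threshold:
                group.append(emb)
                assignments.append(i)
                break
        else:  # no existing group matched
            groups.append([emb])
            assignments.append(len(groups) - 1)
    return assignments

# Two near-identical face vectors group together; the third starts a new group.
faces = [(1.0, 0.0), (0.95, 0.1), (0.0, 1.0)]
print(group_faces(faces))  # [0, 0, 1]
```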
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
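Searching a gallery by location, as described above, reduces to a distance query: compute the great-circle distance from each photo's embedded latitude/longitude to the searched point and keep the photos within some radius. A minimal sketch using the standard haversine formula (the photo-tuple layout is a hypothetical illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def photos_near(photos, lat, lon, radius_km):
    """Return names of photos whose geotag falls within radius_km of a point.

    `photos` is a list of (name, latitude, longitude) tuples, standing in
    for geotags read from each file's EXIF data.
    """
    return [name for name, plat, plon in photos
            if haversine_km(lat, lon, plat, plon) <= radius_km]

library = [
    ("eiffel.jpg", 48.8584, 2.2945),
    ("louvre.jpg", 48.8606, 2.3376),
    ("home.jpg", 42.2808, -83.7430),  # nowhere near Paris
]
# Search within 10 km of central Paris.
print(photos_near(library, 48.8566, 2.3522, radius_km=10))  # ['eiffel.jpg', 'louvre.jpg']
```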
<br />
Even without geotags, Google Photos is capable of inferring a photo's location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but assures users that the information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos lets users turn off location history, remove location information from existing photos, and choose whether to include a photo's location when sharing it.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102032
Google Photos
2021-04-15T17:01:56Z
<p>Nfigue: /* Security */ reworded for understanding</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a machine learning algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of collecting more information about users through visual data.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender"> Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ “Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias”]. “Business Insider”. Retrieved April 6, 2021. </ref> <br />
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive misidentification: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> According to Mr. Alcine, Google has mentioned a more intensified search into getting people of colour candidates through the door, but only time will tell that will help correct the problem of under representation of minorities in technology companies. <ref> (2015, July 1). [https://www.bbc.com/news/technology-33347866 “Google apologises for Photo app’s racist blunder”]. “BBC”. Retrieved April 6, 2021. </ref> With enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it can’t easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref> Simonite, T. (2018, January 11). 
[https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ “When It Comes To Gorillas, Google Photos Remains Blind”]. “Wired”. Retrieved April 6, 2021. </ref><br />
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its API tool [https://en.wikipedia.org/wiki/Google_Cloud_Platform Cloud Vision], which uses AI to analyze images in order to identify people, places, and things. The change involved eliminating gender labels on images, because a person’s gender can’t be determined just by how they look in a photo. Instead of using terms like “man” or “woman”, Google will tag images with labels such as “person” in order to avoid instilling AI algorithms with human bias.<ref> Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people “Google Ai tool will no longer use gendered labels like ‘woman’ or ‘man’ in photos of people”]. “The Verge”. Retrieved April 6, 2021. </ref> <br />
<br />
<br />
Regarding its own AI principles, Google acknowledges that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref> Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ “AI at Google: our principles”]. “Google”. Retrieved April 6, 2021. </ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos work in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
A foremost concern regarding Google Photos is ensuring that the user who uploads photos and videos is the only one who can control access to and view that content.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These worries are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate future incidents, Google continuously works to improve its security infrastructure to ensure that its privacy goals are upheld for the benefit of all users.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is turned on by the user, it will also start to group together photos of individual people and pets.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Hence, there is cause for concern about Google collecting the information embedded in the photos (including, but not limited to, hobbies, family members, and pets) and selling it to third parties or using it to display more relevant advertisements.<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> To alleviate these concerns, Google ensures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
<br />
Even without geotags, Google Photos is capable of intuiting a photo’s location by analyzing for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and if it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but it assures users that the information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos lets users turn off location history, remove location information from existing photos, and choose whether a photo's location is included when the photo is shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
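Functionally, the control over sharing a photo's location amounts to filtering location fields out of the photo's metadata before it leaves the account. The sketch below illustrates the idea with an invented metadata layout; the key names are hypothetical, not Google's actual format:<br />

```python
# Toy sketch: drop location fields from a photo's metadata before sharing.
# The dict layout and key names are invented for illustration only.
LOCATION_KEYS = {"gps_latitude", "gps_longitude", "place_name"}

def prepare_for_sharing(metadata, share_location=False):
    """Return a copy of the metadata, with location fields removed
    unless the user opted in to sharing them."""
    if share_location:
        return dict(metadata)
    return {k: v for k, v in metadata.items() if k not in LOCATION_KEYS}

photo = {"filename": "IMG_001.jpg", "taken": "2021-03-12",
         "gps_latitude": 48.8582, "gps_longitude": 2.2945}
print(prepare_for_sharing(photo))
# location fields are gone; filename and date remain
```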
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102030
Google Photos
2021-04-15T16:11:12Z
<p>Nfigue: /* Storage */ Fixed links, minor rewording</p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [https://en.wikipedia.org/wiki/Google%2B Google+].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [https://en.wikipedia.org/wiki/IOS iOS], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying any number of features present in the photo such as lakes, night, birthday, etc.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos also can group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data, or by analyzing the photo for major landmarks (such as the [https://en.wikipedia.org/wiki/Eiffel_Tower Eiffel Tower]).<ref name="Vox" /><br />
<br />
The service includes a native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos/videos easy and simple, such as generating web link to content that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March of 2021, Google Photos started rolling out an advanced version of its video-editor for Android users. Users will be able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and control brightness, contrast, highlights, and shadows. Google plans to release this feature to iOS in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. 
Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that ships Google Photos users 10 prints of that months best photos.<ref>Perez, S. (2020, October 20).[https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> The images are selected by a machine learning algorithm, but users can alter any choices. After some negative feedback through user-testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows higher customization of the look, and finishing of the prints along with a reduction in subscription costs from $7.99 to $6.99. Users are able to cancel the service at any time, and are also able to skip a month, if they so choose to. Customers can order prints in multiple sizes, and also ask for a same-day pick-up at their local [https://en.wikipedia.org/wiki/Walgreens Walgreens]. The most prominent feedback received for the service is the "outrageous cost"<ref name="feedback"/>, charging many times the price for printing at a traditional store.<ref name="feedback"/><br />
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [https://en.wikipedia.org/wiki/Gmail Gmail], [https://en.wikipedia.org/wiki/Google_Drive Google Drive], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
<br />
In November 2020, in an effort to increase the number of [https://en.wikipedia.org/wiki/Google_One Google One] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a [https://en.wikipedia.org/wiki/Google_Pixel Google Pixel] 5 or older device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of getting a foothold in the landscape of personal data in the form of visual imagery.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can create or exacerbate unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender">Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ "Google AI will no longer use gender labels like 'woman' or 'man' on images of people to avoid bias"]. ''Business Insider''. Retrieved April 6, 2021.</ref> <br />
<br />
Shortly after its release in 2015, Google Photos came under fire for a particularly offensive misidentification: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> Google continues to improve its visual recognition algorithms.<ref name="improve" /> According to Alciné, Google has also spoken of intensifying its efforts to recruit people of color, though it remains to be seen whether this will happen and help correct the reputation Silicon Valley companies have with intersectional diversity, that is, uniting multiple groups of disadvantaged people so that their voices are heard rather than muted.<ref>BBC. (2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> The "gorilla workaround" illustrates the difficulties Google and other tech companies face in advancing image-recognition technology, which they hope to use in self-driving cars, personal assistants, and other products. Google's caution also highlights an important shortcoming of existing machine learning algorithms: with enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes To Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
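The temporary fix described above amounts to post-processing a classifier's output rather than retraining the model. A minimal sketch of that idea (illustrative Python only, not Google's actual implementation):<br />

```python
# Illustrative sketch: suppressing a fixed set of sensitive labels from a
# classifier's output as a post-processing step. The blocklist mirrors the
# labels Google reportedly blocked; the function and data shapes are assumed.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predictions):
    """Drop any (label, confidence) pair whose label is on the blocklist.

    `predictions` is assumed to be a list of (label, confidence) tuples,
    as a generic image classifier might return them.
    """
    return [(label, conf) for label, conf in predictions
            if label.lower() not in BLOCKED_LABELS]

predictions = [("Gorilla", 0.91), ("outdoors", 0.88), ("person", 0.75)]
print(filter_labels(predictions))  # [('outdoors', 0.88), ('person', 0.75)]
```

Note that such filtering removes the symptom rather than the underlying bias: the classifier still produces the offensive label internally, it is simply never shown to the user.<br />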
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its Cloud Vision API, a tool that uses AI to analyze images and identify faces, landmarks, explicit content, and other recognizable features. The change eliminated gender labels on images, because a person's gender cannot be determined just by how they look in a photo. Instead of terms like "man" or "woman", Google now tags images with labels such as "person" in order to avoid instilling AI algorithms with human bias.<ref>Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people "Google AI tool will no longer use gendered labels like 'woman' or 'man' in photos of people"]. ''The Verge''. Retrieved April 6, 2021.</ref> <br />
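The label change described above can be pictured as a simple substitution pass over a classifier's output. A minimal sketch (illustrative Python; the label vocabulary and function names are hypothetical, not the Cloud Vision API):<br />

```python
# Illustrative sketch: replacing gendered labels with the neutral "person"
# label, as described above. The set of gendered terms here is an assumption.
GENDERED_TO_NEUTRAL = {"man": "person", "woman": "person",
                       "boy": "person", "girl": "person"}

def neutralize_labels(labels):
    """Replace gendered labels with 'person', deduplicating the result."""
    out = []
    for label in labels:
        mapped = GENDERED_TO_NEUTRAL.get(label.lower(), label)
        if mapped not in out:  # avoid emitting 'person' twice
            out.append(mapped)
    return out

print(neutralize_labels(["woman", "smile", "man"]))  # ['person', 'smile']
```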
<br />
In its own AI principles, Google notes that algorithms and datasets can reinforce bias: "We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control."<ref>Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ "AI at Google: our principles"]. ''Google | The Keyword''. Retrieved April 6, 2021.</ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos works in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
An obvious concern for a product like Google Photos is users' privacy and the security of their photos and videos from other individuals, third-party institutions, and even the government.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These concerns are substantiated by incidents like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate these concerns, Google continuously works to improve its security infrastructure to ensure that the only people who can see an account’s photos are the account's owner and those the owner has shared them with.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once the user turns on "face grouping", the service also starts to group together photos of individual people and pets.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Hence, there is cause for concern about Google collecting the information embedded in the photos (including, but not limited to, hobbies, family members, and pets) and selling it to third parties or using it to display more relevant advertisements.<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> To alleviate these concerns, Google assures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
<br />
Even without geotags, Google Photos is capable of intuiting a photo’s location by analyzing it for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and whether it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
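A geotag itself is ordinarily stored in a photo's EXIF metadata as degree/minute/second values plus a hemisphere reference (the standard EXIF GPS tags GPSLatitude, GPSLatitudeRef, and so on). A minimal, illustrative sketch of converting such a tag to the decimal coordinates a gallery could index by:<br />

```python
# Illustrative sketch: converting an EXIF-style DMS geotag (degrees, minutes,
# seconds plus an N/S/E/W hemisphere reference) to signed decimal degrees.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert one EXIF-style DMS coordinate to signed decimal degrees."""
    decimal = degrees + minutes / 60 + seconds / 3600
    # South latitudes and west longitudes are negative by convention.
    return -decimal if ref in ("S", "W") else decimal

# An approximate geotag for the Eiffel Tower: 48°51'30"N, 2°17'40"E
lat = dms_to_decimal(48, 51, 30, "N")
lon = dms_to_decimal(2, 17, 40, "E")
print(round(lat, 4), round(lon, 4))  # 48.8583 2.2944
```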
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but at the same time assures users that it is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To address these concerns, Google Photos gives users the ability to turn off location history, remove location information from existing photos, and choose whether a photo's location is included when it is shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
http://si410wiki.sites.uofmhosting.net/index.php?title=Google_Photos&diff=102026
Google Photos
2021-04-15T15:48:09Z
<p>Nfigue: </p>
<hr />
<div><span style="color: red; font-size: 50px;">Currently Being Edited</span><br />
<br />
<br />
[[File:Google_Photos_Logo_new.png|256px|thumb|right|The Google Photos logo. Copyright ©️: Google]]<br />
{{Nav-Bar|Topics##}}<br><br />
'''Google Photos''' is a photo and video storage and sharing service developed by [[Google]].<ref name="about">Google. (2021). [https://www.google.com/photos/about/ "Google Photos | About"]. ''Google Photos''. Retrieved March 26, 2021.</ref> Google Photos was released in May 2015 as it separated from Google’s former social network, [[Google+]].<ref>Spradlin, L. (2015, May 24). [https://www.androidpolice.com/2015/05/24/an-exclusive-early-look-at-the-new-google-photos-app/ "An Exclusive Early Look At The New Google Photos App"]. ''Android Police''. Retrieved March 12, 2021.</ref> Google Photos was designed with the intent of creating a platform that allows people to store and easily access all of their pictures and videos from any device.<ref>Amateur Photographer. (2015, June 2). [https://www.amateurphotographer.co.uk/latest/photo-news/google-photos-service-raises-privacy-concerns-52721 "Google Photos Service Raises Privacy Concerns"]. ''Amateur Photographer''. Retrieved March 12, 2021.</ref><br />
<br />
After its release, Google Photos's user base rocketed to 200 million after one year, 500 million after two years, and surpassed 1 billion after four years.<ref>Porter, J. (2019, July 24). [https://www.theverge.com/2019/7/24/20708328/google-photos-users-gallery-go-1-billion "Google Photos passes the 1 billion users mark"]. ''The Verge''. Retrieved March 12, 2021.</ref> As of 2020, Google reports that about 28 billion photos and videos are uploaded every week and that the service is home to more than 4 trillion photos.<ref name="future">Ben-Yair, S. (2020, November 11). [https://blog.google/products/photos/storage-changes/ "Updating Google Photos' storage policy to build for the future"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref><br />
<br />
==Features==<br />
Google Photos is available on [[Android]], [[iOS]], and [https://photos.google.com/ online].<ref name="JacobK">Kastrenakes, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8678629/google-photos-app-announced "Google announces unlimited picture and video storage with new Photos app"]. ''The Verge''. Retrieved March 12, 2021.</ref> Photos and videos can be uploaded and accessed via any of these platforms.<ref name="Vox" /> The service organizes photos and videos by identifying features present in them, such as lakes, nighttime scenes, or birthdays.<ref name="JacobK" /> It is capable of organizing photos by the faces of people and pets in them, even as the faces age.<ref name="Vox" /><ref name="JacobK" /> Users can manually fix or remove incorrect labels.<ref name="Vox" /> Google Photos can also group photos and videos by location.<ref name="Vox" /> It can determine a photo's location either by its embedded geotagging data or by analyzing the photo for major landmarks (such as the [[Eiffel Tower]]).<ref name="Vox" /><br />
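The label-based organization described above can be pictured as an inverted index from detected labels to photos. The following is a minimal, illustrative sketch (not Google's actual implementation; the photo names and labels are made up):<br />

```python
from collections import defaultdict

def build_label_index(photo_labels):
    """Map each detected label to the set of photos carrying it."""
    index = defaultdict(set)
    for photo, labels in photo_labels.items():
        for label in labels:
            index[label].add(photo)
    return index

# Hypothetical library: each photo already has labels from a recognizer.
photos = {
    "IMG_001.jpg": {"lake", "sunset"},
    "IMG_002.jpg": {"birthday", "cake", "night"},
    "IMG_003.jpg": {"lake", "boat"},
}
index = build_label_index(photos)
print(sorted(index["lake"]))  # → ['IMG_001.jpg', 'IMG_003.jpg']
```

With such an index, a search for "lake" is a single dictionary lookup, which is why precomputed labels make feature search fast.<br />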
<br />
The service includes native photo and video editing software that can be used on any platform.<ref>Lowensohn, J. (2015, May 28). [https://www.theverge.com/2015/5/28/8673471/google-photos-hands-on-cloud-storage-io-2015 "Hands-on with Google's new Photos service"]. ''The Verge''. Retrieved March 12, 2021.</ref> Additionally, Google Photos offers a variety of ways to make sharing photos and videos simple. One method that Google Photos offers is generating web links that both users and non-users can access.<ref name="Vox">Mossberg, W. (2015, June 2). [https://www.vox.com/2015/6/2/11563182/the-new-google-photos-free-at-last-and-very-smart "The New Google Photos: Free at Last, and Very Smart"]. ''Vox''. Retrieved March 12, 2021.</ref> Google Photos also allows users to share albums with someone directly via their Google account.<ref>Mathur, S. (2020, May 19). [https://blog.google/products/photos/new-controls-how-you-share-albums-google-photos/ "New controls for how you share albums in Google Photos"]. ''Google | The Keyword''. Retrieved March 12, 2021.</ref> In 2020, Google added a heat map feature that displays the concentration of photos in the library as a function of location.<ref>Newton, C. (2020, June 25). [https://www.theverge.com/2020/6/25/21301932/google-photos-redesign-map-view-memories-search "Google Photos gets a map view as part of a big new redesign"]. ''The Verge''. Retrieved March 12, 2021.</ref> In March 2021, Google Photos started rolling out an advanced version of its video editor for Android users. Users are able to not only trim, stabilize, and rotate videos, but also crop to any aspect ratio, change perspective, and adjust brightness, contrast, highlights, shadows, and more. Google plans to roll this feature out to iOS later in 2021.<ref>Li, A. (2021, March 29). [https://9to5google.com/2021/03/29/google-photos-video-editor/ "Update: Rolling out Google Photos for Android getting advanced video editor with cropping, filters."] ''9To5Google''. Retrieved April 2, 2021.</ref><br />
<br />
===Subscription Service===<br />
In late 2020, Google rolled out a subscription-based service that lets Google Photos users get 10 prints per month of their most recent photos.<ref>Perez, S. (2020, October 20). [https://techcrunch.com/2020/10/20/google-photos-revives-its-prints-subscription-service-expands-same-day-print-options/ "Google Photos revives its prints subscription service, expands same-day print options."] ''TechCrunch''. Retrieved April 2, 2021.</ref> These photos are automatically selected by a machine learning algorithm, but users can edit the selection if they wish. After negative feedback from user testing,<ref name="feedback">Nagaraj, P. (2020, June 21). [https://newslanded.com/2020/06/21/google-finally-cancels-controversial-google-photos-subscription-feature/ "Google finally cancels controversial google photos subscription feature."] ''News Landed''. Retrieved April 2, 2021.</ref> Google now allows greater customization of the look and finish of the prints. Google also reduced the subscription cost from $7.99 to $6.99. Users can cancel the service at any time or skip a month if they choose. Customers can order prints in multiple sizes and can ask for same-day pick-up at their local Walgreens. One critique of this service concerns its "outrageous cost":<ref name="feedback"/> $6.99 for 10 photos (excluding shipping) is a much higher cost than printing a single photo at a traditional store.<ref name="feedback"/><br />
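The "outrageous cost" critique comes down to simple arithmetic on the figures above (the per-print price at a traditional store varies by retailer, so it is left out here):<br />

```python
# Effective per-print price of the subscription, from the figures above.
SUBSCRIPTION_PRICE = 6.99  # USD per month, excluding shipping
PRINTS_PER_MONTH = 10

per_print = SUBSCRIPTION_PRICE / PRINTS_PER_MONTH
print(f"Effective cost per print: ${per_print:.2f}")  # → Effective cost per print: $0.70
```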
<br />
[[File:Google's_Revenue_Breakdown_2017-20.jpeg|300px|thumb|right|A breakdown of Google's sources of revenue from 2017 to 2020.<ref name="pivot" /> (Advertising revenue is in blue)]]<br />
<br />
[[File:Incorrectly-Labeled_Bikes-cropped.jpg|216px|thumb|left|This meal has been incorrectly labeled as a "bike" by Google's visual recognition algorithms. Copyright ©️: Cooper Stevens]]<br />
<br />
==Storage==<br />
<br />
Photos and videos are each uploaded to Google Photos in one of three ways: "original quality", "high quality", or "express."<ref name="size" /> "Original quality" uploads maintain their original resolution and use part of the associated Google account's 15GB of storage shared between all Google products (including [[Gmail]], [[Google Drive]], etc.).<ref name="size">Google. (2021). [https://support.google.com/photos/answer/6220791?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cexpress-backup%2Coriginal-quality%2Chigh-quality "Choose the upload size of your photos & videos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Meanwhile, "high quality" uploads have their resolution downgraded to 16 megapixels and 1080p.<ref name="size" /> Finally, "express" uploads are compressed to 3 megapixels and 480p.<ref name="size" /> The free tier of Google Photos allows unlimited uploads in resolutions up to "high quality."<ref name="size" /><br />
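The three upload tiers can be summarized as a small lookup of resolution caps. The tier names and caps below come from the passage above; the capping function itself is a simplified sketch, not Google's actual pipeline:<br />

```python
def upload_resolution(tier, megapixels, video_height):
    """Return the (megapixels, video_height) an upload is stored at, per tier."""
    caps = {
        "original quality": (float("inf"), float("inf")),  # counts toward the 15 GB quota
        "high quality": (16, 1080),
        "express": (3, 480),
    }
    mp_cap, height_cap = caps[tier]
    return min(megapixels, mp_cap), min(video_height, height_cap)

print(upload_resolution("high quality", 48, 2160))  # → (16, 1080)
print(upload_resolution("express", 12, 720))        # → (3, 480)
```

Note that uploads already below a tier's cap are unchanged, which is why only "original quality" uploads consume the shared 15GB of storage.<br />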
<br />
In November 2020, in an effort to increase the number of [[Google One]] subscriptions and reduce Google's reliance on ad-based revenue, Google announced that Google Photos will no longer offer free unlimited storage at "express" or "high quality" starting June 1, 2021.<ref name="pivot">Newman, J. (2020, November 28). [https://www.fastcompany.com/90579872/google-photos-no-unlimited-storage "The end of unlimited Google Photos storage is part of a bigger pivot"]. ''Fast Company''. Retrieved March 12, 2021.</ref><ref>Bohn, D. (2020, November 11). [https://www.theverge.com/2020/11/11/21560810/google-photos-unlimited-cap-free-uploads-15gb-ending "Google Photos will end its free unlimited storage on June 1st, 2021"]. ''The Verge''. Retrieved March 12, 2021.</ref> Existing photos and videos will remain unaffected.<ref name="future" /> After all 15GB of account storage have been used, users will either have to maintain a Google One subscription or upload from a Pixel 5 or earlier device.<ref>Google. (2021). [https://support.google.com/photos/answer/10100180 "Storage changes for Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><ref name="size" /> Future Pixel devices will be unable to upload in "high quality" for free.<ref>Coberly, C. (2020, November 12). [https://www.techspot.com/news/87584-future-pixel-phone-owners-reportedly-wont-retain-access.html "Future Pixel phone owners may not retain unlimited Google Photos storage access"]. ''TechSpot''. Retrieved March 12, 2021.</ref><br />
<br />
[[File:Alcine1-cropped.png|216px|thumb|left|Jacky Alciné's Twitter post about the "Gorillas" misidentification.<ref name="gorillas" />]]<br />
<br />
==Motivations==<br />
<br />
There have been suggestions that Google released Google Photos in the interest of getting a foothold in the landscape of personal data in the form of visual imagery.<ref>Lomas, N. (2015, June 1). [https://techcrunch.com/2015/06/01/google-photos-reminder-smile-its-free-youre-the-product/ "Google Photos Reminder: Smile, It's Free - You're The Product!"]. ''TechCrunch''. Retrieved March 12, 2021.</ref> It has also been speculated that it was intended as a method of outsourcing work to train their visual recognition algorithms; the app frequently asks its users to manually verify (or reject) its proposed tags.<ref name="improve">Perrigo, M. (2020, November 9). [https://chromeunboxed.com/google-photos-crowdsource-feature "Google Photos wants you to answer questions to help improve its image recognition capabilities"]. ''Chrome Unboxed''. Retrieved March 12, 2021.</ref> <br />
<br />
===Bias in Algorithms===<br />
Bias in artificial intelligence is a widely discussed topic, with researchers voicing concerns that flawed training data can exacerbate or create unfair assumptions. One example is facial recognition algorithms misidentifying people of color more frequently than white people.<ref name="gender"> Ghosh, S. (2020, February 20). [https://www.businessinsider.nl/google-cloud-vision-api-wont-tag-images-by-gender-2020-2/ “Google AI will no longer use gender labels like ‘woman’ or ‘man’ on images of people to avoid bias”]. “Business Insider”. Retrieved April 6, 2021. </ref> <br />
<br />
Shortly after its release in 2015, Google Photos found itself under fire for a particularly offensive misidentification: user Jacky Alciné reported that he and his friend were classified as "Gorillas" by Google Photos's visual recognition algorithms.<ref name="gorillas">Mullen, J. (2015, July 2). [https://www.cnn.com/2015/07/02/tech/google-image-recognition-gorillas-tag "Google rushes to fix software that tagged photo with racial slur"]. ''CNN''. Retrieved March 19, 2021.</ref> In response, Google Photos promptly removed all labels relating to primates as a temporary fix.<ref>Hern, A. (2018, January 12). [https://www.theguardian.com/technology/2018/jan/12/google-racism-ban-gorilla-black-people "Google's solution to accidental algorithmic racism: ban gorillas"]. ''The Guardian''. Retrieved March 19, 2021.</ref> A spokesperson from Google stated that the labels "gorilla", "chimp", "chimpanzee", and "monkey" were blocked on the platform.<ref>Vincent, J. (2018, January 12). [https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai "Google ‘fixed’ its racist algorithm by removing gorillas from its image-labeling tech."] ''The Verge''. Retrieved April 2, 2021.</ref> Google continues to improve its visual recognition algorithms.<ref name="improve" /> According to Alciné, Google has mentioned intensifying its search for candidates of color, but only time will tell whether that happens and helps correct the image Silicon Valley companies have with intersectional diversity, the practice of unifying multiple fronts of disadvantaged people so that their voices are heard rather than muted.<ref>(2015, July 1). [https://www.bbc.com/news/technology-33347866 "Google apologises for Photo app's racist blunder"]. ''BBC''. Retrieved April 6, 2021.</ref> The "gorilla workaround" illustrates the difficulties Google and other tech companies face in advancing image-recognition technology, which the companies hope to use in self-driving cars, personal assistants, and other products. Google's caution highlights an important shortcoming of existing machine learning algorithms: with enough data and computing power, software can be trained to categorize images to a high level of accuracy, but it cannot easily go beyond the experience of that training. Even the very best algorithms lack the ability to refine their interpretation of the world as humans do.<ref>Simonite, T. (2018, January 11). [https://www.wired.com/story/when-it-comes-to-gorillas-google-photos-remains-blind/ "When It Comes To Gorillas, Google Photos Remains Blind"]. ''Wired''. Retrieved April 6, 2021.</ref><br />
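The reported workaround can be pictured as an output filter rather than a change to the model itself. The sketch below is illustrative only (the blocked label list comes from the reporting above; the filtering mechanics are assumed):<br />

```python
# Labels Google reportedly blocked as a temporary fix.
BLOCKED_LABELS = {"gorilla", "chimp", "chimpanzee", "monkey"}

def filter_labels(predicted_labels):
    """Drop any blocked label before it reaches the user-facing gallery."""
    return [label for label in predicted_labels
            if label.lower() not in BLOCKED_LABELS]

print(filter_labels(["person", "Gorilla", "outdoors"]))  # → ['person', 'outdoors']
```

Filtering outputs this way suppresses the offensive labels without fixing the underlying classifier, which is why it is described as a workaround rather than a solution.<br />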
<br />
[[File:Gendergoogle.png|thumbnail|right|Google's API no longer uses gender labels<ref name="gender"/>]]<br />
<br />
In 2020, Google made a change to its API tool Cloud Vision, which uses AI to analyze images and identify faces, landmarks, explicit content, and other recognizable features. The change involved eliminating gender labels on images, because a person’s gender can’t be determined just by how they look in a photo. Instead of using terms like “man” or “woman”, Google will tag images with labels such as “person” in order to avoid instilling AI algorithms with human bias.<ref> Lyons, K. (2020, February 20). [https://www.theverge.com/2020/2/20/21145356/google-ai-images-gender-bias-labels-people “Google Ai tool will no longer use gendered labels like ‘woman’ or ‘man’ in photos of people”]. “The Verge”. Retrieved April 6, 2021. </ref> <br />
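This change amounts to collapsing gendered person labels into a neutral one. A minimal illustrative sketch (the mapping idea comes from the passage above; the function itself is assumed, not Cloud Vision's API):<br />

```python
# Gendered labels to collapse into the neutral label "person".
GENDERED = {"man", "woman"}

def neutralize(labels):
    """Replace gendered person labels with 'person', leaving others intact."""
    return ["person" if label.lower() in GENDERED else label
            for label in labels]

print(neutralize(["Woman", "bicycle", "street"]))  # → ['person', 'bicycle', 'street']
```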
<br />
<br />
In its own AI principles, Google notes that algorithms and datasets can reinforce bias: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief. We will design AI systems that provide appropriate opportunities for feedback, relevant explanations, and appeal. Our AI technologies will be subject to appropriate human direction and control.”<ref> Pichai, S. (2018, June 7). [https://www.blog.google/technology/ai/ai-principles/ “AI at Google: our principles”]. “Google”. Retrieved April 6, 2021. </ref><br />
<br />
==Ethical Implications==<br />
[[File:July_13,_2018_Timeline-light.jpg|205px|thumb|right|Google Photos work in conjunction with Google Maps to create an extremely precise summary of this user’s day. Copyright ©️: Cooper Stevens]]<br />
===Security===<br />
An obvious concern for a product like Google Photos is users' privacy and the security of their photos and videos from other individuals, third party institutions, and even the government.<ref>Hill, S. (2015, June 16). [https://www.androidauthority.com/google-photos-worried-privacy-616339/ "Google Photos: Should you be worried about privacy?"]. ''Android Authority''. Retrieved March 12, 2021.</ref> These concerns are substantiated by occurrences like Google accidentally sending users’ private videos to strangers in November 2019.<ref>Warren, T. (2020, February 4). [https://www.theverge.com/2020/2/4/21122044/google-photos-privacy-breach-takeout-data-video-strangers "Google admits it sent private videos in Google Photos to strangers"]. ''The Verge''. Retrieved March 12, 2021.</ref> To alleviate these concerns, Google continuously works to improve its security infrastructure to ensure that the only people that can see an account’s photos are the owner of the account and those that the owner has shared their photos with.<ref>Google. (2021). [https://safety.google/photos/#photos-advanced-infraestructure "Google Photos Safety &amp; Privacy Features"]. ''Google Safety Center''. Retrieved March 12, 2021.</ref><br />
<br />
===Computer Vision===<br />
Google Photos automatically runs every photo and video through visual recognition algorithms to identify objects and places.<ref>Google. (2021). [https://support.google.com/photos/answer/6220402?hl=en&amp;ref_topic=6128818 "Get started with Google Photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Once "face grouping" is turned on by the user, it will also start to group together photos of individual people and pets.<ref name="searchPTP">Google. (2021). [https://support.google.com/photos/answer/6128838?hl=en&co=GENIE.Platform%3DDesktop&oco=0#zippy=%2Clearn-about-face-models "Search by people, things & places in your photos"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref> Hence, there is cause for concern about Google collecting the information embedded in the photos (including, but not limited to, hobbies, family members, and pets) and selling it to third parties or using it to display more relevant advertisements.<ref>Luckerson, V. (2017, May 25). [https://www.theringer.com/2017/5/25/16043842/google-photos-data-collection-e8578b3256e0 "Why Google Is Suddenly Obsessed With Your Photos"]. ''The Ringer''. Retrieved March 12, 2021.</ref> To alleviate these concerns, Google ensures users that "face groups and labels in your account are only visible to you."<ref name="searchPTP" /><br />
<br />
===Geotagging===<br />
<br />
If a user allows Google to keep track of their location history under "Your Timeline" in Google Maps, Google Photos uses this alongside geotags embedded in photos to allow users to search their gallery by location and even review past trips minute-by-minute.<ref name="Vox" /><ref>Bonifacic, I. (2020, December 16). [https://www.engadget.com/google-photos-adds-maps-timeline-233633756.html "Google Photos lets you relive a day through your Maps timeline"]. ''Engadget''. Retrieved March 12, 2021.</ref><br />
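The minute-by-minute trip review described above amounts to ordering geotagged, timestamped photos chronologically. The following toy sketch assumes a minimal data model (file name, timestamp, geotag-derived place), which is an illustration rather than Google's actual schema:<br />

```python
from datetime import datetime

# Assumed minimal data model for geotagged photos.
photos = [
    {"file": "b.jpg", "time": datetime(2018, 7, 13, 14, 5), "place": "Ann Arbor"},
    {"file": "a.jpg", "time": datetime(2018, 7, 13, 9, 30), "place": "Detroit"},
]

def day_timeline(photos):
    """Order photos chronologically to reconstruct the day's movements."""
    return [(p["time"].strftime("%H:%M"), p["place"], p["file"])
            for p in sorted(photos, key=lambda p: p["time"])]

print(day_timeline(photos))  # → [('09:30', 'Detroit', 'a.jpg'), ('14:05', 'Ann Arbor', 'b.jpg')]
```

Combining this ordering with Maps location history is what makes the resulting summary of a user's day so precise, and so privacy-sensitive.<br />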
<br />
Even without geotags, Google Photos is capable of intuiting a photo’s location by analyzing for major landmarks.<ref name="Vox" /> As a result, concerns have been raised about how this information is used and if it is secure.<ref>Komando, K. (2021, January 11). [https://www.usatoday.com/story/tech/columnist/komando/2021/01/07/google-map-knows-your-location-photos-you-took-how-turn-off/4113403001/ "Hidden map on your phone shows everywhere you’ve been and the photos you took there"]. ''USA Today''. Retrieved March 12, 2021.</ref><br />
<br />
Google is upfront that it uses this information to show users more relevant advertisements, but at the same time assures users that this information is never shared with advertisers.<ref>Google. (2021). [https://policies.google.com/technologies/location-data?hl=en-US "How Google uses location information"]. ''Privacy &amp; Terms''. Retrieved March 12, 2021.</ref> To accommodate these concerns, Google Photos gives its users the capability to turn off location history, remove location information from already existing photos, and choose whether or not to share a photo's location when shared.<ref>Google. (2021). [https://support.google.com/photos/answer/6153599?co=GENIE.Platform%3DAndroid&oco=1#zippy=%2Cview-your-timeline "Understand, find and edit your photos' locations"]. ''Google Photos Help''. Retrieved March 12, 2021.</ref><br />
<br />
==See also==<br />
*[[Computer Vision]]<br />
*[[Facial Recognition]]<br />
*[[Geotag]]<br />
<br />
==References==<br />
<references/></div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101626
Algorithmic Audits
2021-04-10T17:20:15Z
<p>Nfigue: Sorry, forgot to remove the editing banner</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process involving the collection and analysis of an [https://en.wikipedia.org/wiki/Algorithm algorithm] in some context to determine if its behavior is negatively affecting the interests or rights of any person it influences<ref name="brown">Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big data & society, 8. SAGE Publishing.</ref>.<br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that the majority of people either don't understand or assume to be neutral (unbiased), meaning the internet can function relatively unchecked<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the Pew Research Center in early 2021 found that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/.</ref>.<br />
<br />
This statistic, mirrored by the billions of people online worldwide, emphasizes how pervasive algorithms are in daily life despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they produce, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>.<br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. For example, [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identified bias in facial recognition technologies, and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraged its recommendation algorithm to rank paying airlines above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform, then observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method to gather a large quantity of data to be analyzed, however this audit still encounters the issue of securing a random sample. <br />
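The repeated-query approach can be sketched in a few lines. Below, `fetch_results` is a deterministic stand-in for a real API call or page scrape (a real audit would query the platform itself, with the legal caveats this section discusses); the audit simply tallies which result is ranked first across many trials:<br />

```python
from collections import Counter
import random

def fetch_results(query, seed):
    """Stand-in for a platform API call or page scrape (hypothetical)."""
    rng = random.Random(seed)  # deterministic placeholder results
    items = ["airline_A", "airline_B", "airline_C"]
    rng.shuffle(items)
    return items

def scraping_audit(query, trials=100):
    """Issue the same query repeatedly and tally which result ranks first."""
    first_place = Counter()
    for t in range(trials):
        first_place[fetch_results(query, t)[0]] += 1
    return first_place

print(scraping_audit("detroit to chicago"))
```

A skewed tally over many trials would be the kind of evidence of ranking bias a scraping audit looks for, though establishing a representative sample of queries remains the hard part.<br />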
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer <ref name="Whitehouse"> Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington law review, 84(6), 1437. George Washington Law Review.</ref>. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, while a Terms of Service document on a website may not have the force of law, some scholarly institutions hold that Internet researchers must not violate the Terms of the platforms they study, making results from this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains], and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. This method is still susceptible to CFAA violations, presenting danger to the researchers conducting the study.<br />
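The core of a sock puppet audit is issuing the same query from differently scripted accounts and diffing what each is shown. The toy sketch below uses a wholly hypothetical `serve` function in place of the platform under audit:<br />

```python
def serve(query, persona):
    """Hypothetical platform under audit that personalizes by profile."""
    base = ["result_1", "result_2", "result_3"]
    return base[:2] if persona["profile"] == "restricted" else base

def sock_puppet_audit(query, personas):
    """Issue the same query from each scripted persona and compare outputs."""
    shown = {p["name"]: serve(query, p) for p in personas}
    results = list(shown.values())
    return {"divergent": any(r != results[0] for r in results), "shown": shown}

report = sock_puppet_audit("housing ads", [
    {"name": "puppet_a", "profile": "default"},
    {"name": "puppet_b", "profile": "restricted"},
])
print(report["divergent"])  # → True
```

In practice, many puppets with many profile variations are needed before such divergences are statistically meaningful, which is the scaling problem noted above.<br />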
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, but employs real humans instead of computer programs imitating users. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine the strengths of the four previously mentioned designs: with a large enough pool of testers and semi-automated tactics, it becomes feasible to generate a random, representative sample to investigate. An example is [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], which provides a large enough group of testers to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Audits in Practice==<br />
The purpose of algorithmic auditing is to ensure that some given software is not violating any legal or moral code which exists to protect people utilizing the internet. There are only a handful of historical and modern cases of transparent algorithm audits.<br />
<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, known as "screen science": recommending flights based on private criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and the Department of Justice.<br />
<br />
One instance which could have been circumvented with an algorithm audit is the Volkswagen emissions scandal of 2015<ref name="wagner" />. Wagner describes how this is a prime example of a situation in which a standardized, scalable system for governing algorithms proves its necessity. He goes on to describe the ways in which technology is certainly not a neutral player in modern social infrastructure, and how regulation will have to improve in the future to avoid damage to society.<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace; as a result, only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, new ways emerge for the system to be leveraged. Algorithm audits have therefore become the most efficient way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits offer direct ways to check digital platforms and create protections for auditors; however, the methods are difficult to employ and fall short in many areas of regulation.<ref>O'neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections against lawsuits have been established through interpretation of the CFAA and by the U.S. Department of Justice. These protections resulted from the 2018 court ruling in Sandvig et al. v. Sessions, where four university researchers attempted to scrape information on housing discrimination.<br />
[[File:SandvigSessions.jpg|thumbnail| Image of Sadvig Sessions ruling, sourced from Google]]<br />
===== SANDVIG et al V. SESSIONS<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Performing any sort of audit requiring interaction with a platform can be dangerous because technology companies have outlined in their privacy policies restrictions on how users can engage on the forum. Researchers can be flagged for breaking these policies and banned from the platform. This can inhibit audits of large, developed tech companies if they're detected and can lead to lawsuits and further complications when the research is published. These large companies monitor data traffic and can identify a user based on their access point or device metadata, making it hard even to attempt many of the audit types.<br />
<br />
== Future of Algorithmic Audits ==<br />
A research project conducted by Jack Bandy explores past research performed on algorithmic audits for the purpose of plotting future trajectory in the field. They outline four types of behavior revealed by studied audits: discrimination, distortion, exploitation, and misjudgment, where the main target of investigations has been discrimination. Bandy describes how very few audits focused on the exploitation of personal information relative to the high number of scholars who have voiced their concerns about the issue appearing in algorithms. Despite the high number of studies (N = 62), they state that there is remarkable room for improvement in the future. <br />
<br />
The consensus from studying prior audits, Bandy states, is that every type of behavior that is attempting to be identified could benefit from greater specificity when performing the research. For example, algorithm research into discriminatory pricing focused greatly on the identification of such a practice, but not much beyond that. Bandy suggests that each unique instance which could draw misbehavior from an algorithm deserves to be its own field of research. <br />
<br />
Another important facet of the report is the recommendations for methodology in the future, where Bandy describes the skew between number of audits performed on influential companies and how they should be more evenly distributed. Google had the most audits at 30 while similarly powerful corporations such as Twitter and Spotify only underwent 3. Replication is the last important piece mentioned for future audits, under the reasoning that algorithms can change very frequently and drastically so implementation of open science practices is vital to future auditing results <ref name="Bandy"> Bandy, Jack. (2021). Problematic Machine Behavior: A Systematic Literature Review of Algorithm Audits.</ref>.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101069
Algorithmic Audits
2021-04-08T20:31:09Z
<p>Nfigue: Added Future of Audits section</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is the process of collecting and analyzing an [https://en.wikipedia.org/wiki/Algorithm algorithm]'s behavior in some context to determine whether it negatively affects the interests or rights of the people it influences<ref name="brown"> Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big data & society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that most people either do not understand or assume to be neutral (unbiased), which allows it to function relatively unchecked <ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the Pew Research Center in early 2021 showed that 85% of adults in the U.S. use the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/.</ref>. <br />
<br />
This statistic represents billions of people worldwide and underscores how pervasive algorithms are in daily life, despite a general lack of understanding of them. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are so deeply integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to its data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid higher than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that not all interest in a platform's code is benign. On many platforms, the algorithm designers constantly battle those who try to game the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
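The analysis step of a noninvasive user audit can be sketched in a few lines of Python; the record fields, group labels, and "approved" outcome below are hypothetical stand-ins for whatever data participants donate:<br />

```python
def positive_rate(records, group):
    """Share of donated records from `group` with a favorable outcome."""
    hits = [r for r in records if r["group"] == group]
    if not hits:
        return 0.0
    return sum(1 for r in hits if r["outcome"] == "approved") / len(hits)

def disparity_ratio(records, group_a, group_b):
    """Ratio of favorable-outcome rates between two groups; values far
    from 1.0 suggest the algorithm treats the groups differently."""
    rate_b = positive_rate(records, group_b)
    return positive_rate(records, group_a) / rate_b if rate_b else float("inf")

# Hypothetical donated data: each record pairs a user's group with the
# outcome the platform's algorithm gave that user.
donated = [
    {"group": "A", "outcome": "approved"},
    {"group": "A", "outcome": "approved"},
    {"group": "B", "outcome": "approved"},
    {"group": "B", "outcome": "denied"},
]
print(disparity_ratio(donated, "A", "B"))  # 1.0 / 0.5 = 2.0
```

A disparity ratio far from 1.0 flags the algorithm for closer inspection, though, as noted above, a non-representative sample makes causal claims difficult.<br />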
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, employing programs that perform API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method for gathering a large quantity of data to analyze; however, it still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the risk of prosecution under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as too vague because it criminalizes any unauthorized access to a computer <ref name="Whitehouse"> Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington law review, 84(6), 1437. George Washington Law Review.</ref>. Consent to perform a scraping audit is difficult to obtain from a platform operator.<br />
<br />
Similarly, a website's Terms of Service may not have the force of law, but some scholarly institutions hold that Internet researchers must not violate a platform's terms, making the results of this form of research difficult to publish without proper permissions.<br />
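The collection loop of a scraping audit can be sketched as follows; the fetch function is a hypothetical stand-in for a real API client or page scraper, which in practice would require the platform operator's permission:<br />

```python
import time

def scraping_audit(fetch, queries, repeats=3, delay=1.0):
    """Issue each query several times and record what the platform
    returns; `fetch` wraps the platform's API client or page scraper."""
    observations = []
    for _ in range(repeats):
        for q in queries:
            observations.append((q, fetch(q)))  # e.g. top-ranked results
            time.sleep(delay)  # throttle requests to stay unobtrusive
    return observations

# A stubbed fetch stands in for a real (rate-limited, permissioned) call.
fake_fetch = lambda q: [f"{q}-result-{i}" for i in range(2)]
obs = scraping_audit(fake_fetch, ["flights", "hotels"], repeats=1, delay=0)
print(len(obs))  # 2 observations, one per query
```

Passing the fetch function in keeps the audit logic separate from the platform-specific client, so the same loop can drive an API miner or a webpage scraper.<br />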
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe features of systems that are not public and penetrate groups that are difficult to identify and reach. A large number of sock puppets is required to derive significant findings from the audit. This method is still susceptible to CFAA violations, presenting danger to the researchers conducting the study.<br />
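The core idea behind sock puppets is matched profiles that differ in exactly one attribute, so that any difference in treatment can be traced to that attribute. A minimal sketch, with hypothetical profile fields:<br />

```python
def build_personas(base, attribute, values):
    """Create matched sock-puppet profiles that differ in exactly one
    attribute, so differing treatment can be traced to that attribute."""
    return [dict(base, **{attribute: v}) for v in values]

# Hypothetical profile fields; only "gender" varies between the puppets.
personas = build_personas({"location": "Detroit", "age": 30},
                          "gender", ["female", "male"])
print(personas[0]["gender"], personas[1]["gender"])  # female male
```

In a real audit, many such matched pairs would interact with the platform and the treatment each pair receives would then be compared.<br />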
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit but employs real humans instead of computer programs imitating users. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine positive aspects of the four previously mentioned designs: with a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random, representative sample to investigate. An example is [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], which can recruit a group of testers large enough to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
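Aggregating what recruited participants report seeing can be sketched as follows; the item names and reports are hypothetical examples of what crowdworkers might submit:<br />

```python
from collections import Counter

def exposure_counts(reports):
    """Aggregate the top results each participant reports seeing into a
    count of how often the platform showed each item overall."""
    counts = Counter()
    for top_results in reports:
        counts.update(top_results)
    return counts

# Hypothetical reports from three recruited participants.
reports = [["ad1", "ad2"], ["ad1", "ad3"], ["ad1", "ad2"]]
print(exposure_counts(reports)["ad1"])  # shown to all 3 participants
```

Counting exposures across participants can reveal, for example, whether certain ads or results are shown disproportionately to certain users.<br />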
<br />
== Audits in Practice==<br />
The purpose of algorithmic auditing is to ensure that some given software is not violating any legal or moral code which exists to protect people utilizing the internet. There are only a handful of historical and modern cases of transparent algorithm audits.<br />
<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision in the Code of Federal Regulations]. This regulatory provision resulted from a bias that airlines using SABRE's software identified, known as "screen science": recommending airlines based on private criteria and financial incentives. The airlines noticed that American Airlines received priority in SABRE's search results, which led to an investigation by the USCAB and the Department of Justice (DOJ).<br />
<br />
One instance which could have been prevented with an algorithm audit is the Volkswagen emissions scandal of 2015 <ref name="wagner"/>. Wagner describes this as a prime example of a situation in which a standardized, scalable system for governing algorithms proves its necessity. He goes on to argue that technology is certainly not a neutral player in modern social infrastructure and that regulation will have to improve in the future to avoid damage to society.<br />
<br />
== Ethical Implications ==<br />
Regulation cannot keep pace with the scale of technology companies; as a result, only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, new ways emerge for the system to be exploited. Algorithm audits have therefore become the most efficient way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits offer direct ways to check digital platforms and have prompted protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, the courts have begun to limit how the CFAA can be used against them. A key step was the 2018 ruling in Sandvig v. Sessions, a case in which four university researchers sought to scrape data from websites to study discrimination in online housing markets. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On March 27, 2018, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the researchers' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Performing any audit that requires interacting with a platform carries risk, because technology companies' privacy policies and terms of service restrict how users may engage with the platform. Researchers can be flagged for breaking these policies and banned from the platform. Detection can inhibit audits of large, established tech companies and can lead to lawsuits and further complications when the research is published. These companies monitor data traffic and can identify a user from their access point or device metadata, making many of the audit types difficult even to attempt.<br />
<br />
== Future of Algorithmic Audits ==<br />
A research project by Jack Bandy reviews past research on algorithmic audits in order to chart the field's future trajectory. The review outlines four types of problematic behavior revealed by the audits studied: discrimination, distortion, exploitation, and misjudgment, with discrimination the main target of investigation. Bandy notes that very few audits focused on the exploitation of personal information, relative to the large number of scholars who have voiced concerns about this issue appearing in algorithms. Despite the high number of studies reviewed (N = 62), the review finds considerable room for improvement. <br />
<br />
The consensus from studying prior audits, Bandy states, is that research into every type of behavior under investigation could benefit from greater specificity. For example, algorithm research into discriminatory pricing focused largely on identifying the practice, but not much beyond that. Bandy suggests that each unique way an algorithm can misbehave deserves to be its own field of research. <br />
<br />
Another important facet of the report is its recommendations for future methodology. Bandy describes the skew in the number of audits performed across influential companies and argues that audits should be distributed more evenly: Google had the most audits at 30, while similarly powerful corporations such as Twitter and Spotify each underwent only 3. Replication is the last important point raised for future audits: because algorithms can change frequently and drastically, implementing open science practices is vital to future auditing results <ref name="Bandy"> Bandy, Jack. (2021). Problematic Machine Behavior: A Systematic Literature Review of Algorithm Audits.</ref>.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101049
Algorithmic Audits
2021-04-08T19:20:01Z
<p>Nfigue: Modified and added to audits in practice section</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process involving the collection and analysis of an [https://en.wikipedia.org/wiki/Algorithm algorithm] in some context to determine if its behavior is negatively affecting the interests or rights of any person it influences<ref name="brown> Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big data & society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that the majority of people don't understand, or carry the impression that the algorithms are neutral (unbiased), meaning the internet can function relatively unchecked <ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the PEW research center in February of 2021 showed that 85% of adults in the U.S utilize the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out their regulations and laws to keep the Internet safe for their citizens. However, the these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into humanity, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment, proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness about algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality. This awareness has sparked activism for ethics in technology. For example, [https://en.wikipedia.org/wiki/J Joy Buolamwini] identifying the bias in facial recognition technologies or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging their recommendation algorithm to place airlines who paid, higher on the recommendation than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform, then observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method to gather a large quantity of data to be analyzed, however this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer <ref name="Whitehouse"> Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington law review, 84(6), 1437. George Washington Law Review.</ref>. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, a Terms of Service document on a Website may not have the force of law but some scholarly institutions pose that Internet researchers must not violate the Terms of Internet platforms, making the results from this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains], and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. This method is still susceptible to CFAA violations, presenting danger to the researchers conducting the study.<br />
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, employing real humans instead of a computer program imitating a user. This change circumvents the CFAA in most cases as long as the platform being audited allows anyone to create an account. This form of audit has the ability to utilize positive aspects of the four previously mentioned designs. With a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random representative sample to investigate with. An example of this is [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk] which allows a large enough testers group to produce significant results . The only drawback is a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Audits in Practice==<br />
The purpose of algorithmic auditing is to ensure that some given software is not violating any legal or moral code which exists to protect people utilizing the internet. There are only a handful of historical and modern cases of transparent algorithm audits.<br />
<br />
In 1984 USCAB declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information in the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science." and recommending airlines based on privatized criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's algorithms search results, which led to an investigation by the USCAB and DOJ.<br />
<br />
One instance that could have been prevented by an algorithm audit is the Volkswagen emissions scandal of 2015 <ref name="wagner">Wagner, Ben. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk i praksis, 10(1), 5. Norwegian University of Science and Technology Library.</ref>. Wagner describes it as a prime example of a situation demonstrating the necessity of a standardized, scalable system for governing algorithms. He goes on to argue that technology is certainly not a neutral player in modern social infrastructure, and that regulation will have to improve to avoid future damage to society.<br />
<br />
== Ethical Implications ==<br />
Regulation cannot keep pace with the scale of technology companies, so digital interactions are governed only by general guidelines and laws. As the infrastructure continues to grow, the system can be leveraged in new ways. Algorithm audits have therefore become the most effective way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits provide direct ways to check digital platforms and have led to protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections against lawsuits have been established through the courts' interpretation of the CFAA. These protections resulted from the 2018 ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information from websites to study housing discrimination. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On Mar. 27, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Performing any audit that requires interacting with a platform can be risky, because technology companies' privacy policies restrict how users may engage with the service. Researchers who break these policies can be flagged and banned from the platform, which can inhibit audits of large tech companies; detection can also lead to lawsuits and further complications when the research is published. These companies monitor data traffic and can identify a user from their access point or device metadata, making many of the audit types difficult even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101041
Algorithmic Audits
2021-04-08T19:08:52Z
<p>Nfigue: /* Example Of Audit */</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process of collecting and analyzing an [https://en.wikipedia.org/wiki/Algorithm algorithm]'s behavior in some context to determine whether it negatively affects the interests or rights of any person it influences<ref name="brown">Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big Data & Society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that most people either do not understand or assume to be neutral (unbiased), meaning the internet can function relatively unchecked <ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk i praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A Pew Research Center study conducted in early 2021 found that 85% of adults in the U.S. use the internet daily <ref>Perrin, Andrew, and Sara Atske. "About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online". Pew Research Center, 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and underscores how pervasive algorithms are in daily life, despite a lack of general understanding of them. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches on online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into society, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to its data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007). "Who Controls the Internet? Illusions of a Borderless World". Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Enrollment in computer science and similar higher-education programs has increased in proportion to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of how algorithms and digital platforms are built enables the public to identify ethical breaches in these systems, and this awareness has sparked activism for ethics in technology: for example, [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
A further issue with this method is that public access to software can be exploited maliciously. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
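As a sketch of how donated data might be analyzed (all users, inputs, and outputs below are hypothetical), an auditor can flag inputs whose outputs varied across users as a sign of personalization:<br />

```python
# Sketch of a noninvasive user audit analysis: participants donate
# (user, input, output) records they observed, and the auditor flags
# inputs that produced different outputs for different users.
from collections import defaultdict

donations = [
    ("user1", "query: loans", "offer A"),
    ("user2", "query: loans", "offer B"),
    ("user3", "query: news", "story X"),
]

# Group every observed output by the input that produced it.
outputs_by_input = defaultdict(set)
for _user, given_input, observed_output in donations:
    outputs_by_input[given_input].add(observed_output)

# Inputs with more than one distinct output suggest per-user treatment.
personalized = sorted(inp for inp, outs in outputs_by_input.items() if len(outs) > 1)
print(personalized)
```

Such a comparison only suggests differential treatment; without a random sample, the cause of the variation cannot be isolated, as noted above.<br />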
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, using programs that employ API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective way to gather a large quantity of data for analysis; however, this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the risk of prosecution under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law criticized as too vague because it criminalizes any unauthorized access to a computer<ref name="Whitehouse">Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington Law Review, 84(6), 1437. George Washington Law Review.</ref>. Consent to perform a scraping audit is difficult to obtain from a platform operator.<br />
<br />
Similarly, a website's Terms of Service may not have the force of law, but some scholarly institutions posit that internet researchers must not violate the terms of the platforms they study, making results from this form of research difficult to publish without proper permissions.<br />
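The collection loop of a scraping audit can be sketched as follows. This is a hypothetical, offline example: the function names are invented, and a real audit would issue HTTP or API requests subject to the legal caveats above:<br />

```python
# Sketch of a scraping audit's collection loop. `fetch` here is a stand-in
# so the example runs offline; a real audit would query the platform's API
# or pages, ideally with the operator's consent.
import time

def run_scraping_audit(queries, fetch, delay_seconds=0):
    """Issue each query through `fetch` and record the query/result pairs."""
    records = []
    for query in queries:
        records.append({"query": query, "results": fetch(query)})
        time.sleep(delay_seconds)  # rate-limit politely between requests
    return records

# Placeholder for a real HTTP call, so the sketch is self-contained.
fake_fetch = lambda q: [f"result for '{q}'"]
log = run_scraping_audit(["flights DTW-LGA", "flights DTW-JFK"], fake_fetch)
print(len(log), log[0]["query"])
```

The recorded query/result pairs can then be analyzed offline for systematic bias, without further interaction with the platform.<br />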
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe non-public features of systems and penetrate groups that are difficult to identify and reach. A large number of sock puppets is required to derive significant findings, and the method is still susceptible to CFAA violations, presenting legal risk to the researchers conducting the study.<br />
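The panel-construction step of a sock puppet audit can be sketched as a full factorial cross of persona attributes (the attribute fields and values here are hypothetical):<br />

```python
# Sketch of building a balanced sock-puppet panel: synthetic personas whose
# attributes vary systematically, so that differences in results can be
# attributed to those attributes. The CFAA caveat above still applies.
import itertools

attributes = {
    "gender": ["female", "male"],
    "zip_code": ["48104", "60601"],
}

# One persona per combination of attribute values (a full factorial design).
names = list(attributes)
panel = [dict(zip(names, combo)) for combo in itertools.product(*attributes.values())]
print(len(panel))  # 2 genders x 2 zip codes = 4 personas
```

Each persona would then issue the same queries, and differences in what the platform returns across the panel indicate which attributes the algorithm conditions on.<br />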
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, but employs real humans instead of computer programs imitating users. This circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine the strengths of the four previously mentioned designs: with a large enough test sample and semi-automated tactics, it becomes feasible to generate a random, representative sample to investigate. For example, [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk] allows researchers to recruit a group of testers large enough to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Examples Of Audits ==<br />
The purpose of algorithmic auditing is to ensure that a given piece of software does not violate the legal or moral codes that exist to protect people using the internet. There are only a handful of historical and modern cases of transparent algorithm audits.<br />
<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be made known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, known as "screen science": recommending flights according to privatized criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and the DOJ.<br />
<br />
== Ethical Implications ==<br />
Regulation cannot keep pace with the scale of technology companies, so digital interactions are governed only by general guidelines and laws. As the infrastructure continues to grow, the system can be leveraged in new ways. Algorithm audits have therefore become the most effective way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits provide direct ways to check digital platforms and have led to protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections against lawsuits have been established through the courts' interpretation of the CFAA. These protections resulted from the 2018 ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information from websites to study housing discrimination. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On Mar. 27, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Performing any audit that requires interacting with a platform can be risky, because technology companies' privacy policies restrict how users may engage with the service. Researchers who break these policies can be flagged and banned from the platform, which can inhibit audits of large tech companies; detection can also lead to lawsuits and further complications when the research is published. These companies monitor data traffic and can identify a user from their access point or device metadata, making many of the audit types difficult even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101035
Algorithmic Audits
2021-04-08T18:57:33Z
<p>Nfigue: Edited for understanding</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process of collecting and analyzing an [https://en.wikipedia.org/wiki/Algorithm algorithm]'s behavior in some context to determine whether it negatively affects the interests or rights of any person it influences<ref name="brown">Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big Data & Society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that most people either do not understand or assume to be neutral (unbiased), meaning the internet can function relatively unchecked <ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk i praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A Pew Research Center study conducted in early 2021 found that 85% of adults in the U.S. use the internet daily <ref>Perrin, Andrew, and Sara Atske. "About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online". Pew Research Center, 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and underscores how pervasive algorithms are in daily life, despite a lack of general understanding of them. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches on online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into society, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to its data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007). "Who Controls the Internet? Illusions of a Borderless World". Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Enrollment in computer science and similar higher-education programs has increased in proportion to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of how algorithms and digital platforms are built enables the public to identify ethical breaches in these systems, and this awareness has sparked activism for ethics in technology: for example, [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
A further issue with this method is that public access to software can be exploited maliciously. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, using programs that employ API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective way to gather a large quantity of data for analysis; however, this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the risk of prosecution under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law criticized as too vague because it criminalizes any unauthorized access to a computer<ref name="Whitehouse">Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington Law Review, 84(6), 1437. George Washington Law Review.</ref>. Consent to perform a scraping audit is difficult to obtain from a platform operator.<br />
<br />
Similarly, a website's Terms of Service may not have the force of law, but some scholarly institutions posit that internet researchers must not violate the terms of the platforms they study, making results from this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe non-public features of systems and penetrate groups that are difficult to identify and reach. A large number of sock puppets is required to derive significant findings, and the method is still susceptible to CFAA violations, presenting legal risk to the researchers conducting the study.<br />
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, but employs real humans instead of computer programs imitating users. This circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine the strengths of the four previously mentioned designs: with a large enough test sample and semi-automated tactics, it becomes feasible to generate a random, representative sample to investigate. For example, [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk] allows researchers to recruit a group of testers large enough to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Example Of Audit ==<br />
The purpose of algorithmic auditing is to ensure that a given piece of software does not violate the legal or moral codes that exist to protect people using the internet. One situation involving early computing systems exemplifies this purpose.<br />
<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be made known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, known as "screen science": recommending flights according to privatized criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and the DOJ.<br />
<br />
== Ethical Implications ==<br />
Regulation cannot keep pace with the scale of technology companies, so digital interactions are governed only by general guidelines and laws. As the infrastructure continues to grow, the system can be leveraged in new ways. Algorithm audits have therefore become the most effective way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits provide direct ways to check digital platforms and have led to protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections against lawsuits have been established through the courts' interpretation of the CFAA. These protections resulted from the 2018 ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information from websites to study housing discrimination. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On Mar. 27, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Performing any audit that requires interacting with a platform can be risky, because technology companies restrict in their privacy policies how users may engage with the service. Researchers who break these policies can be flagged and banned from the platform. Detection can thus inhibit audits of large, established tech companies and can lead to lawsuits and further complications when the research is published. These companies monitor data traffic and can identify a user from their access point or device metadata, making many of the audit types hard even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101030
Algorithmic Audits
2021-04-08T18:50:21Z
<p>Nfigue: Removed most of the legislation section for being off-topic</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process involving the collection and analysis of an [https://en.wikipedia.org/wiki/Algorithm algorithm] in some context to determine if its behavior is negatively affecting the interests or rights of any person it influences<ref name="brown">Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big Data & Society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that the majority of people do not understand, or assume to be neutral (unbiased), meaning the internet can function relatively unchecked<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A survey conducted by the Pew Research Center in early 2021 showed that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/.</ref>. <br />
<br />
This statistic represents billions of people worldwide and underscores the persistence of algorithms in our daily lives, despite a lack of general understanding. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are so heavily integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Computer science and similar higher-education programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and the infrastructure of digital platforms enables the public to identify ethical breaches in these systems' functionality, and has sparked activism for ethics in technology: for example, [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
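The analysis step of a noninvasive user audit can be sketched as follows. This is a hypothetical illustration, not a method from the cited study: it assumes researchers have already hand-coded records donated by volunteer users into a group label and a boolean outcome (for example, "was the top result a promoted listing?").<br />

```python
from collections import defaultdict

def outcome_rates(donations):
    """Compute, per user group, how often a flagged outcome appeared
    in records donated by end users."""
    total = defaultdict(int)
    flagged = defaultdict(int)
    for record in donations:
        total[record["group"]] += 1
        if record["flagged_outcome"]:
            flagged[record["group"]] += 1
    return {group: flagged[group] / total[group] for group in total}

# Toy donated data: group labels and hand-coded outcomes.
donations = [
    {"group": "A", "flagged_outcome": True},
    {"group": "A", "flagged_outcome": True},
    {"group": "B", "flagged_outcome": True},
    {"group": "B", "flagged_outcome": False},
]
rates = outcome_rates(donations)  # {'A': 1.0, 'B': 0.5}
```

A large gap between groups would motivate further investigation, though, as noted above, causal claims are hard to support without a representative sample.<br />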
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method to gather a large quantity of data for analysis; however, this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer <ref name="Whitehouse"> Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington law review, 84(6), 1437. George Washington Law Review.</ref>. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, a Terms of Service document on a website may not have the force of law, but some scholarly institutions posit that Internet researchers must not violate the terms of Internet platforms, making the results of this form of research difficult to publish without proper permissions.<br />
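In outline, the mechanics of a scraping audit can be as simple as the loop below. The platform query here is a local stand-in function, and all names are hypothetical; a real audit would call an API or scrape pages, raising the CFAA and terms-of-service concerns described above.<br />

```python
import time
from collections import Counter

def query_platform(term):
    # Hypothetical stand-in for an API call or page scrape.
    catalog = {
        "flights": ["AirA", "AirB", "AirC"],
        "hotels": ["ChainA", "ChainB"],
    }
    return catalog.get(term, [])

def scraping_audit(terms, delay=0.0):
    """Issue one query per term and count which provider ranks first --
    the kind of tally used to spot systematic prioritization."""
    first_place = Counter()
    for term in terms:
        results = query_platform(term)
        if results:
            first_place[results[0]] += 1
        time.sleep(delay)  # throttle requests to respect the platform
    return first_place

tally = scraping_audit(["flights", "hotels", "flights"])
```

A provider that appears in first place far more often than its market share would predict is the sort of signal that prompted the SABRE investigation discussed below.<br />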
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains], and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. This method is still susceptible to CFAA violations, presenting danger to the researchers conducting the study.<br />
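The core idea of a sock puppet audit — identical queries issued from fabricated profiles that differ in one attribute — can be sketched as below. The platform response is simulated locally (deliberately so, given the CFAA risk just noted), and every name is hypothetical.<br />

```python
def platform_response(persona, query):
    # Simulated platform that personalizes on a profile attribute --
    # exactly the behavior a sock puppet audit tries to surface.
    if persona["region"] == "urban":
        return ["premium_offer", "standard_offer"]
    return ["standard_offer"]

def sock_puppet_audit(personas, query):
    """Send the same query from each fabricated profile; divergent
    responses to identical queries flag personalization for scrutiny."""
    return {p["name"]: platform_response(p, query) for p in personas}

personas = [
    {"name": "puppet_urban", "region": "urban"},
    {"name": "puppet_rural", "region": "rural"},
]
responses = sock_puppet_audit(personas, "loan offers")
```

Because the two puppets differ only in one attribute, any divergence in responses can be attributed to that attribute, which is what makes this design analytically powerful despite its legal exposure.<br />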
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, but employs real humans instead of computer programs imitating users. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine positive aspects of the four previously mentioned designs: with a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random, representative sample to investigate with. An example is [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], which can supply a large enough group of testers to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
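Aggregating the reports that paid participants submit is the semi-automated part of a crowdsourced audit. A minimal sketch, assuming each participant reports the top result they saw for a query (all names hypothetical):<br />

```python
from collections import Counter, defaultdict

def aggregate_reports(reports):
    """reports: (query, top_result) pairs, one per participant.
    Majority-vote the top result per query to smooth individual noise."""
    by_query = defaultdict(Counter)
    for query, top_result in reports:
        by_query[query][top_result] += 1
    return {q: counts.most_common(1)[0][0] for q, counts in by_query.items()}

reports = [
    ("flights", "AirA"), ("flights", "AirA"), ("flights", "AirB"),
    ("hotels", "ChainB"),
]
consensus = aggregate_reports(reports)  # {'flights': 'AirA', 'hotels': 'ChainB'}
```

Majority voting is one simple aggregation choice; a real study would also weigh participant demographics to approximate a representative sample.<br />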
<br />
== Example Of Audit ==<br />
The purpose of algorithmic auditing is to ensure that a given piece of software is not violating any legal or moral code that exists to protect people using the internet. One case involving early computing systems exemplifies this purpose.<br />
<br />
In 1984, the USCAB declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science": recommending airlines based on private criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and DOJ.<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace, leaving only general guidelines and laws for digital interactions. As the infrastructure continues to grow, new ways emerge for these systems to be leveraged. As a result, algorithmic audits have become the most practical way to assess the ethical impact that digital platforms and providers have on their consumers and users. These audits provide direct ways to check digital platforms, and protections for auditors have developed alongside them. However, these methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Researchers attempting to audit a platform or provider risk being targeted by large tech companies under the Computer Fraud and Abuse Act (CFAA). Protection has come through the courts' interpretation of that statute rather than new legislation, most notably the 2018 ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information on housing discrimination. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On March 27, 2018, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with the digital platform directly. This can be risky because technology companies restrict in their privacy policies how users may engage with the service. Researchers who break these policies can be flagged and banned from the platform. Detection can thus inhibit researchers' ability to audit large, established tech companies and can lead to lawsuits and further complications when the research is published. These companies handle vast amounts of data and can identify a user from their access point or device metadata, making many of the audit types hard even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101025
Algorithmic Audits
2021-04-08T18:31:02Z
<p>Nfigue: minor rewording</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process involving the collection and analysis of an [https://en.wikipedia.org/wiki/Algorithm algorithm] in some context to determine if its behavior is negatively affecting the interests or rights of any person it influences<ref name="brown">Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big Data & Society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on software that the majority of people do not understand, or assume to be neutral (unbiased), meaning the internet can function relatively unchecked<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A survey conducted by the Pew Research Center in early 2021 showed that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/.</ref>. <br />
<br />
This statistic represents billions of people worldwide and underscores the persistence of algorithms in our daily lives, despite a lack of general understanding. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are so heavily integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Computer science and similar higher-education programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and the infrastructure of digital platforms enables the public to identify ethical breaches in these systems' functionality, and has sparked activism for ethics in technology: for example, [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method to gather a large quantity of data for analysis; however, this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, a Terms of Service document on a website may not have the force of law, but some scholarly institutions posit that Internet researchers must not violate the terms of Internet platforms, making the results of this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains], and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. This method is still susceptible to CFAA violations, presenting danger to the researchers conducting the study.<br />
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, but employs real humans instead of computer programs imitating users. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine positive aspects of the four previously mentioned designs: with a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random, representative sample to investigate with. An example is [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], which can supply a large enough group of testers to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the USCAB declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science": recommending airlines based on private criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and DOJ.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The United States Computer Fraud and Abuse Act attaches sentences to digital behaviors deemed illegal or unlawful and provides clearly stated repercussions for committing such an offense. <br />
<br />
The Act prohibits "intentionally accessing a computer without authorization or in excess of authorization" but fails to define what "without authorization" means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse, prompting proposed changes to the legislation<ref name="Whitehouse">Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington Law Review, 84(6), 1437. George Washington Law Review.</ref>.<br />
{| class="wikitable"<br />
|-<br />
! Crime/ Offense !! Years In Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law outlines the general regulation of photography in the U.S., resulting from internal security implications and concern for the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property rights are established by the owner of the property; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photography from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as it does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm to a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace, leaving only general guidelines and laws for digital interactions. As the infrastructure continues to grow, new ways emerge for these systems to be leveraged. As a result, algorithmic audits have become the most practical way to assess the ethical impact that digital platforms and providers have on their consumers and users. These audits provide direct ways to check digital platforms, and protections for auditors have developed alongside them. However, these methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Researchers attempting to audit a platform or provider risk being targeted by large tech companies under the Computer Fraud and Abuse Act (CFAA). Protection has come through the courts' interpretation of that statute rather than new legislation, most notably the 2018 ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information on housing discrimination. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with the digital platform directly. This can be risky because technology companies' privacy policies restrict how users can engage with the platform. Researchers who break these policies can be flagged and banned, which can inhibit their ability to audit large tech companies; detection can also lead to lawsuits and further complications when the research is published. Because these companies handle vast amounts of data, they can identify a user from an access point or device metadata, making many of the audit types hard even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=101018
Algorithmic Audits
2021-04-08T18:14:34Z
<p>Nfigue: Changed introduction definition to be more explanatory</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
An algorithmic audit is a process involving the collection and analysis of an [https://en.wikipedia.org/wiki/Algorithm algorithm] in some context to determine if its behavior is negatively affecting the interests or rights of any person it influences<ref name="brown> Shea Brown, Jovana Davidovic, & Ali Hasan. (2021). The algorithm audit: Scoring the algorithms that score us. Big data & society, 8. SAGE Publishing.</ref>. <br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning it can function relatively unchecked, leaving people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A Pew Research Center survey conducted in early 2021 found that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are deeply integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to place airlines that paid higher in its recommendations than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
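The comparison at the heart of a noninvasive user audit can be sketched in a few lines. Everything below is hypothetical: the donated records, group labels, and outcome of interest are invented purely for illustration.

```python
# Sketch of a noninvasive user audit: volunteers donate the results an
# algorithm showed them, and the auditor compares outcomes across groups
# without ever touching the platform itself.
from collections import defaultdict

# Each donated record: (user_group, whether a premium result was shown)
donated_records = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", False), ("group_b", False), ("group_b", True),
]

def exposure_rates(records):
    """Rate at which each user group was shown the outcome of interest."""
    shown = defaultdict(int)
    total = defaultdict(int)
    for group, was_shown in records:
        total[group] += 1
        shown[group] += was_shown  # bool counts as 0 or 1
    return {g: shown[g] / total[g] for g in total}

rates = exposure_rates(donated_records)
# A large gap between groups flags the algorithm for closer inspection.
disparity = max(rates.values()) - min(rates.values())
```

In practice the difficulty the text describes dominates: unless the donating volunteers form a random, representative sample, a gap like `disparity` cannot be attributed to the algorithm itself.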
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, using programs that mine APIs and perform [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method for gathering a large quantity of data to analyze; however, this audit still encounters the issue of securing a random sample. <br />
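The query-and-observe loop of a scraping audit might look like the following sketch. The platform call here is a stub standing in for a real API request or scraped results page; the query, provider names, and ranking pattern are all invented.

```python
# Sketch of a scraping audit: issue the same query repeatedly and tally
# which provider the platform ranks first across trials.
from collections import Counter

def query_platform(query, trial):
    # Stub: a real audit would fetch live results from the platform here.
    return ["provider_x", "provider_y"] if trial % 3 else ["provider_y", "provider_x"]

def audit_top_rank(query, trials=9):
    """Count how often each provider appears in the top-ranked position."""
    top = Counter()
    for t in range(trials):
        results = query_platform(query, t)
        top[results[0]] += 1
    return top

counts = audit_top_rank("flights to NYC")
```

A consistent skew in `counts` toward one provider would be the kind of pattern the audit is designed to surface, though it is exactly this automated querying that raises the CFAA concerns discussed below.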
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, although a website's Terms of Service may not have the force of law, some scholarly institutions posit that Internet researchers must not violate the terms of Internet platforms, making the results of this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs that impersonate users through false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe features of systems that are not public and penetrate groups that are difficult to pinpoint and reach. A large number of sock puppets is required to derive significant findings from the audit, and the method remains susceptible to CFAA violations, presenting legal danger to the researchers conducting the study.<br />
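The paired-profile idea behind sock puppets can be sketched as follows. Here `audited_algorithm` is a stand-in for the platform under study, and the attributes, zip codes, and scores are invented for illustration only.

```python
# Sketch of a sock puppet audit: create two fake profiles identical except
# for one attribute, submit both, and measure the difference in treatment.
def audited_algorithm(profile):
    # Stub standing in for the real platform, whose behavior is unknown.
    return 0.8 if profile["zip_code"] == "48104" else 0.5

def paired_test(base_profile, attribute, value_a, value_b):
    """Return the score gap when only `attribute` differs between puppets."""
    puppet_a = {**base_profile, attribute: value_a}
    puppet_b = {**base_profile, attribute: value_b}
    return audited_algorithm(puppet_a) - audited_algorithm(puppet_b)

gap = paired_test({"age": 30, "zip_code": ""}, "zip_code", "48104", "48201")
```

Because the two puppets differ in exactly one attribute, a nonzero `gap` (here repeated over many puppet pairs) isolates that attribute as the cause of differential treatment, which is why this design supports causal claims better than a user audit.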
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, employing real humans instead of a computer program imitating a user. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine positive aspects of the four previously mentioned designs, and with a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random, representative sample to investigate with. [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], for example, provides a pool of testers large enough to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
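The aggregation step of a crowdsourced audit might look like this sketch, where recruited participants each report what the platform showed them. The conditions, outcomes, and minimum sample size are assumptions made for illustration.

```python
# Sketch of crowdsourced-audit aggregation: pool participants' reports per
# experimental condition and only summarize conditions with enough data.
MIN_SAMPLE = 5  # assumed threshold before a condition's result is reported

def aggregate_reports(reports):
    """Average reported outcome per condition, with a sample-size guard."""
    summary = {}
    for condition, outcome in reports:
        summary.setdefault(condition, []).append(outcome)
    return {
        cond: sum(vals) / len(vals)
        for cond, vals in summary.items()
        if len(vals) >= MIN_SAMPLE
    }

# Ten well-sampled reports plus an undersampled condition that gets dropped.
reports = [("control", 1)] * 5 + [("treatment", 0)] * 5 + [("other", 1)] * 2
summary = aggregate_reports(reports)
```

The sample-size guard reflects the budget constraint the text mentions: each report costs money, so an auditor must recruit enough paid participants per condition before any finding is statistically meaningful.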
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science": recommending flights based on privatized criteria and capital incentives. Airlines noticed that American Airlines received priority placement in SABRE's search results, which led to an investigation by the USCAB and the DOJ.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The United States Computer Fraud and Abuse Act attaches sentences to digital behaviors deemed illegal or unlawful and provides clearly stated repercussions for committing such an offense. <br />
<br />
The CFAA "prohibits intentionally accessing a computer without authorization or in excess of authorization," but the statute fails to define what "without authorization" means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse, prompting potential changes to the legislation<ref name="Whitehouse"> Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington law review, 84(6), 1437. George Washington Law Review.</ref>.<br />
{| class="wikitable"<br />
|-<br />
! Crime / Offense !! Maximum Years in Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law, put in place by U.S. legislation, outlines the general regulation of photography in the U.S. It resulted from potential internal security implications and concern for the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property is controlled by its owner; it is illegal to refuse to abide by the owner's requests concerning photography on the property.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photography from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Because technology companies operate at a scale that regulation cannot keep pace with, digital interactions are governed only by general guidelines and laws. As the infrastructure continues to grow, new ways to leverage these systems emerge. As a result, algorithm audits have become the most efficient way to assess the ethical impact that digital platforms and providers have on their consumers and users. These audits provide direct methods for checking digital platforms and have prompted protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Researchers attempting to audit a platform or provider can be targeted by large tech companies, but a federal court's interpretation of the CFAA now offers such researchers some protection from prosecution. This protection stems from the 2018 ruling in ''Sandvig et al. v. Sessions'', brought by four university researchers who planned to scrape websites to study discrimination in the housing market. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with the digital platform directly. This can be risky because technology companies' privacy policies restrict how users can engage with the platform. Researchers who break these policies can be flagged and banned, which can inhibit their ability to audit large tech companies; detection can also lead to lawsuits and further complications when the research is published. Because these companies handle vast amounts of data, they can identify a user from an access point or device metadata, making many of the audit types hard even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100939
Algorithmic Audits
2021-04-08T16:55:18Z
<p>Nfigue: Small changes to CFAA section, added citation</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning it can function relatively unchecked, leaving people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A Pew Research Center survey conducted in early 2021 found that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are deeply integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to place airlines that paid higher in its recommendations than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, using programs that mine APIs and perform [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method for gathering a large quantity of data to analyze; however, this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, although a website's Terms of Service may not have the force of law, some scholarly institutions posit that Internet researchers must not violate the terms of Internet platforms, making the results of this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs that impersonate users through false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe features of systems that are not public and penetrate groups that are difficult to pinpoint and reach. A large number of sock puppets is required to derive significant findings from the audit, and the method remains susceptible to CFAA violations, presenting legal danger to the researchers conducting the study.<br />
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit, employing real humans instead of a computer program imitating a user. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine positive aspects of the four previously mentioned designs, and with a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random, representative sample to investigate with. [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], for example, provides a pool of testers large enough to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science": recommending flights based on privatized criteria and capital incentives. Airlines noticed that American Airlines received priority placement in SABRE's search results, which led to an investigation by the USCAB and the DOJ.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The United States Computer Fraud and Abuse Act attaches sentences to digital behaviors deemed illegal or unlawful and provides clearly stated repercussions for committing such an offense. <br />
<br />
The Act states: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization" but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse, prompting potential changes to the legislation <ref name="Whitehouse"> Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington law review, 84(6), 1437. George Washington Law Review.</ref> .<br />
{| class="wikitable"<br />
|-<br />
! Crime/ Offense !! Years In Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
A law put in place by U.S. Legislation. The Photo Sharing Law outlines the general regulation for photography in the U.S. This resulted from the potential security implications internally and the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property is protected and established by the owner of the property, it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
#Photographing private property from within the public domain is not illegal.<br />
# taking pictures of public places objects and structures are legal unless explicitly prohibited.<br />
# Outer space requires a license to be issued by the National Oceanic and Atmospheric Administration in advance.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as it doesn't hinder the operations of law enforcement, medical, emergency, or security personnel by filming.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace. As a result, general guidelines and laws for digital interactions. However, as the infrastructure continues to grow, there are new ways for the system to be leveraged in new ways. As a result, Algorithm Audits have become the most efficient way to assess digital platforms and providers' ethical impact on their consumers and users. These Audits have formulated direct ways to check digital platforms and create protections for auditors. However, these methods are difficult to employ and fall short in many areas of regulation. <ref>O'neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by these large tech companies, legislation has been put in place through the CFAA and U.S. Department of Justice to protect the researchers from lawsuits. This resulted from the 2018 court ruling in SANDVIG et al. V. SESSIONS, where four university researchers attempted to scrape information on discrimination in the housing department. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of Sadvig Sessions ruling, sourced from Google]]<br />
===== SANDVIG et al V. SESSIONS<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
If attempting to do a code, scraping, or sock puppet audit requires researchers to engage with the digital platform directly. This can be dangerous because technology companies have outlined in their privacy policies restrictions on how users can engage on the forum. Researchers can be flagged for breaking these policies and banned from the platform. This can inhibit researchers' abilities to audit large developed tech companies if they're detected and can lead to lawsuits and further complications when the research is published. These large companies deal with data and can identify a user based on their access point or device metadata, and this can make it hard even to attempt many of the audit types.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100917
Algorithmic Audits
2021-04-08T16:25:13Z
<p>Nfigue: a bracket</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people do not understand, meaning they can function relatively unchecked and leave people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A survey conducted by the Pew Research Center in early 2021 found that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/.</ref>.<br />
<br />
This statistic represents billions of people worldwide and underscores how pervasive algorithms are in daily life, despite a lack of general understanding. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the ways they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>.<br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features is integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms, which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>.<br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into daily life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to place airlines that paid higher in its recommendations than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
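With source access in hand, part of a code audit can be mechanized. The sketch below is purely illustrative: the scoring function, the list of protected attributes, and the helper are all invented for this example, and a real audit would examine far more than variable names.

```python
import ast
import textwrap

# Hypothetical list of attributes an auditor considers sensitive.
PROTECTED = {"race", "gender", "age", "zip_code"}

def flag_protected_names(source: str) -> set:
    """Return protected attribute names referenced anywhere in the code."""
    tree = ast.parse(textwrap.dedent(source))
    names = {n.id for n in ast.walk(tree)
             if isinstance(n, ast.Name) and n.id in PROTECTED}
    attrs = {n.attr for n in ast.walk(tree)
             if isinstance(n, ast.Attribute) and n.attr in PROTECTED}
    return names | attrs

# A toy scoring function under audit (invented for the example).
algorithm_source = """
def credit_score(applicant):
    base = applicant.income / 1000
    if applicant.zip_code in HIGH_RISK_ZIPS:   # suspicious signal
        base -= 20
    return base
"""

print(flag_protected_names(algorithm_source))  # {'zip_code'}
```

A flag like this only surfaces candidates for human review; it cannot decide on its own whether a signal is used unethically.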
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
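The analysis step of a noninvasive user audit can be sketched as follows, assuming volunteers donate records of the queries they issued and the outcomes the platform showed them. All records, field names, and prices below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical donated records: each volunteer reports their query, a
# self-described group, and the price the platform displayed to them.
donated = [
    {"query": "hotel", "group": "A", "price": 120},
    {"query": "hotel", "group": "A", "price": 118},
    {"query": "hotel", "group": "B", "price": 131},
    {"query": "hotel", "group": "B", "price": 129},
]

def mean_outcome_by_group(records, query):
    """Average the reported outcome for one query, split by group."""
    buckets = defaultdict(list)
    for r in records:
        if r["query"] == query:
            buckets[r["group"]].append(r["price"])
    return {g: sum(v) / len(v) for g, v in buckets.items()}

print(mean_outcome_by_group(donated, "hotel"))  # {'A': 119.0, 'B': 130.0}
```

A gap between groups in such a summary is only suggestive: because the sample is self-selected rather than random, it cannot by itself establish causality.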
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and then observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method for gathering a large quantity of data to analyze; however, this audit still encounters the issue of securing a random sample.<br />
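The query loop at the heart of a scraping audit can be sketched as below. Here `fetch_results` is a stand-in for a real HTTP call against a platform's API; the queries and carrier codes are invented, and mocking the call keeps the sketch self-contained and avoids hammering a live service.

```python
# Mocked platform response; a real audit would issue HTTP requests here.
def fetch_results(query: str) -> list:
    canned = {
        "flights detroit": ["AA 210", "DL 454", "UA 101"],
        "flights chicago": ["AA 330", "UA 220", "DL 512"],
    }
    return canned[query]

def scrape_audit(queries):
    """Log which carrier each query ranks first; the auditor later tests
    whether one carrier is over-represented in the top slot."""
    return [{"query": q, "top": fetch_results(q)[0].split()[0]} for q in queries]

log = scrape_audit(["flights detroit", "flights chicago"])
top_counts = {}
for entry in log:
    top_counts[entry["top"]] = top_counts.get(entry["top"], 0) + 1
print(top_counts)  # {'AA': 2}
```

Rate-limiting the loop and recording timestamps alongside results are standard practice, both to reduce load on the platform and to make the audit reproducible.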
<br />
Another difficulty with this method is the risk of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law that has been criticized as too vague because it criminalizes any unauthorized access to a computer, and consent to perform a scraping audit is difficult to obtain from a platform operator.<br />
<br />
Similarly, a website's Terms of Service document may not have the force of law, but some scholarly institutions posit that Internet researchers must not violate the terms of the platforms they study, making the results of this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe non-public features of systems and penetrate groups that are difficult to identify and access. A large number of sock puppets is required to derive significant findings from the audit. This method remains susceptible to CFAA violations, presenting legal risk to the researchers conducting the study.<br />
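The core design can be sketched as paired synthetic accounts that are identical except for one attribute, so any difference in treatment can be attributed to that attribute. Everything below is invented: `platform_offer` is a deliberately biased mock standing in for real platform interaction.

```python
def make_puppets(n_per_group, groups=("A", "B")):
    """Create identical synthetic accounts differing only in one attribute."""
    return [{"id": f"{g}{i}", "group": g} for g in groups for i in range(n_per_group)]

def platform_offer(puppet):
    # Mocked, deliberately biased platform used for demonstration only.
    return 100 if puppet["group"] == "A" else 90

puppets = make_puppets(3)
by_group = {"A": [], "B": []}
for p in puppets:
    by_group[p["group"]].append(platform_offer(p))

# Average treatment gap between the two otherwise-identical groups.
gap = sum(by_group["A"]) / len(by_group["A"]) - sum(by_group["B"]) / len(by_group["B"])
print(gap)  # 10.0
```

In practice many more puppets (and noisier responses) are needed before a gap like this becomes statistically meaningful.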
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit but employs real humans instead of computer programs imitating users. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine positive aspects of the four previously mentioned designs: with a large enough test sample and semi-automated tactics, it becomes more feasible to generate a random, representative sample to investigate. An example is [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], which can recruit a large enough group of testers to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
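Once crowdworkers' reports are collected, deciding whether two groups were treated differently is a standard proportion comparison. The sketch below uses a simple two-proportion z statistic; the counts are invented for illustration, and the 1.96 threshold is the usual 5% two-sided significance level.

```python
from math import sqrt

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """z statistic for the difference between two observed proportions."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented example: 70 of 100 workers in group A saw a given ad,
# versus 50 of 100 workers in group B.
z = two_proportion_z(hits_a=70, n_a=100, hits_b=50, n_b=100)
print(round(z, 2))  # |z| > 1.96 suggests a statistically significant gap
```

This also makes the budget trade-off concrete: halving the number of paid participants roughly inflates the standard error by √2, weakening the audit's power to detect a real gap.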
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information rule in the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science": ranking flights according to private criteria and financial incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and the Department of Justice.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent pieces of active legislation governing the internet and digital communications. It also protects service providers from legal liability for removing content deemed obscene or offensive, even constitutionally protected speech, as long as the removal is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The United States [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act Computer Fraud and Abuse Act (CFAA)] attaches sentences to digital behaviors deemed illegal or unlawful and provides clearly stated repercussions for committing such an offense.<br />
<br />
The Act "prohibits intentionally accessing a computer without authorization or in excess of authorization" but fails to define what 'without authorization' means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse, prompting potential changes to the legislation<ref name="Whitehouse">Whitehouse, Sheldon. (2016). Hacking into the Computer Fraud and Abuse Act: the CFAA at 30. The George Washington Law Review, 84(6), 1437.</ref>.<br />
{| class="wikitable"<br />
|-<br />
! Crime/Offense !! Years in Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
U.S. law outlines general regulations for photography, motivated by internal security concerns and the privacy of citizens. The main regulations are:<br />
# Private property rights are established by the owner of the property, and it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photographing the Earth from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace, so only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, the system can be leveraged in new ways. As a result, algorithm audits have become the most efficient way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits have formulated direct ways to check digital platforms and have created protections for auditors. However, these methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections have been established through court interpretation of the CFAA and the U.S. Department of Justice to shield researchers from lawsuits. These protections stem from the 2018 ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information to study discrimination in the housing market.<br />
[[File:SandvigSessions.jpg|thumbnail| Image of Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On Mar. 27, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Attempting a code, scraping, or sock puppet audit requires researchers to engage with the digital platform directly. This can be risky because technology companies outline in their privacy policies restrictions on how users may engage with the platform. Researchers can be flagged for breaking these policies and banned from the platform. This can inhibit researchers' ability to audit large, established tech companies if they are detected, and it can lead to lawsuits and further complications when the research is published. These companies specialize in data and can identify a user from their access point or device metadata, which makes many of the audit types hard even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100911
Algorithmic Audits
2021-04-08T16:08:46Z
<p>Nfigue: Rewrote section for understanding</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning it can function relatively unchecked and leaving people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the PEW research center in February of 2021 showed that 85% of adults in the U.S utilize the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out their regulations and laws to keep the Internet safe for their citizens. However, the these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into humanity, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment, proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness about algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality. This awareness has sparked activism for ethics in technology. For example, [https://en.wikipedia.org/wiki/J Joy Buolamwini] identifying the bias in facial recognition technologies or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging their recommendation algorithm to place airlines who paid, higher on the recommendation than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, using programs that perform API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective way to gather a large quantity of data for analysis; however, this audit still encounters the issue of securing a random sample. <br />
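In code, the core of a scraping audit is a loop that replays each query several times and logs the returned results for later comparison. The sketch below is hypothetical: `fetch` stands in for whatever API client or page scraper a study would actually use, and is stubbed here with fabricated results.<br />

```python
from typing import Callable

def scraping_audit(queries: list[str],
                   fetch: Callable[[str], list[str]],
                   repeats: int = 3) -> dict[str, list[list[str]]]:
    """Issue each query `repeats` times and record every result list,
    so later analysis can look for systematic differences."""
    log: dict[str, list[list[str]]] = {}
    for query in queries:
        log[query] = [fetch(query) for _ in range(repeats)]
    return log

# Stand-in for a real API client or page scraper (hypothetical)
def fake_fetch(query: str) -> list[str]:
    return [f"{query}-result-{i}" for i in range(2)]

audit_log = scraping_audit(["nurse", "engineer"], fake_fetch, repeats=2)
```

In practice the loop would also rate-limit its requests, both for politeness and because, as noted below, scraping without permission carries legal risk.<br />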
<br />
Another difficulty with this method is the risk of prosecution under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law that has been criticized as too vague because it criminalizes any unauthorized access to a computer, and consent to perform a scraping audit is difficult to obtain from a platform operator.<br />
<br />
Similarly, a website's Terms of Service may not have the force of law, but some scholarly institutions hold that Internet researchers must not violate a platform's Terms, making the results of this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
A sock puppet audit uses computer programs that impersonate users through false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe features of systems that are not public and penetrate groups that are otherwise difficult to identify and access. A large number of sock puppets is required to derive significant findings from the audit. The method is still susceptible to CFAA violations, presenting legal risk to the researchers conducting the study.<br />
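The paired-testing logic behind sock puppets can be sketched as follows: puppet profiles are generated to be identical except for one attribute, so any difference in how the platform treats them can be attributed to that attribute. The profile fields below are illustrative and not drawn from any real platform.<br />

```python
import copy

def make_sock_puppets(base_profile: dict, attribute: str,
                      values: list) -> list[dict]:
    """Build matched puppet profiles that differ only in `attribute`,
    following the classic paired-testing design."""
    puppets = []
    for value in values:
        profile = copy.deepcopy(base_profile)  # keep all other fields identical
        profile[attribute] = value
        puppets.append(profile)
    return puppets

# Hypothetical base profile; only the varied attribute differs between puppets
base = {"age": 30, "city": "Detroit", "interests": ["news", "sports"]}
puppets = make_sock_puppets(base, "gender", ["woman", "man"])
```

Many such matched pairs, run over time, are what give the audit statistical power.<br />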
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A crowdsourced audit is similar to a sock puppet audit but employs real humans instead of computer programs imitating users. This change circumvents the CFAA in most cases, as long as the platform being audited allows anyone to create an account. This form of audit can combine the strengths of the four previously mentioned designs: with a large enough pool of testers and semi-automated tactics, it becomes more feasible to generate a random, representative sample. [https://en.wikipedia.org/wiki/Amazon_Mechanical_Turk Amazon's Mechanical Turk], for example, can recruit a tester group large enough to produce significant results. The main drawback is that a crowdsourced audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have been developed to categorize and resolve common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. The provision resulted from a bias identified by airlines using SABRE's software, which practiced "screen science": recommending flights according to private criteria and financial incentives. The airlines noticed that American Airlines received priority in SABRE's search results, which led to an investigation by the USCAB and the Department of Justice.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent pieces of active legislation concerning the internet and digital communications, and one of its defining legal boundaries. The provision protects service providers from the legal implications of removing content deemed obscene or offensive, even constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United States CFAA] attaches criminal penalties to digital behaviors deemed unlawful and clearly states the repercussions for committing such offenses. <br />
<br />
One critique of the Act summarizes it as follows: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and uses against nearly every aspect of computer activity.”<br />
{| class="wikitable"<br />
|-<br />
! Crime/Offense !! Years in Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law outlines the general regulation of photography in the U.S. It was enacted in response to internal security implications and concerns over citizens' privacy. The law mainly outlines these regulations:<br />
# Private property is protected at the discretion of its owner; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within a public space is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photography from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace; only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, new ways emerge for these systems to be leveraged. As a result, algorithm audits have become the most efficient way to assess the ethical impact of digital platforms and providers on their consumers and users. These audits offer direct ways to check digital platforms and have prompted protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections from lawsuits have emerged through judicial interpretation of the CFAA and the position of the U.S. Department of Justice. A key step was the 2018 court ruling in ''Sandvig v. Sessions'', in which four university researchers sought to scrape information on discrimination in the housing market. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== SANDVIG et al V. SESSIONS<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On March 27, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with a digital platform directly. This can be risky because technology companies outline, in their privacy policies, restrictions on how users may engage with the platform. Researchers can be flagged for breaking these policies and banned from the platform, which can inhibit their ability to audit large tech companies and can lead to lawsuits and further complications when the research is published. Because these companies deal in data, they can identify a user from an access point or device metadata, making many of the audit types hard even to attempt.<br />
<br />
== References ==</div>
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems.<ref name="auditing_algorithms">Auditing Algorithms: Adding Accountability to Automated Authority. https://auditingalgorithms.science/?page_id=89</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning they can function relatively unchecked and leave people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the Pew Research Center in February 2021 showed that 85% of adults in the U.S. use the internet daily<ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/.</ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features is integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms, which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern for governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100904
Algorithmic Audits
2021-04-08T15:55:42Z
<p>Nfigue: Reworded sock puppet audit section and added links</p>
<hr />
<div><h1 style="font-size: 50; color: red; font-weight: 400;">Currently Being Edited</h1><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning it can function relatively unchecked and leaving people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the PEW research center in February of 2021 showed that 85% of adults in the U.S utilize the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches on online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are so heavily integrated into daily life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment, proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
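As a toy illustration only: with the source in hand, a reviewer can mechanically flag decision logic that references sensitive fields. The field names and the snippet below are hypothetical, and real code audits are far more involved.<br />

```python
# Toy sketch of one code-audit step: scan source code for references to
# sensitive identifiers. SENSITIVE and the snippet are illustrative.
import ast

SENSITIVE = {"race", "gender", "zip_code"}

def flag_sensitive_names(source):
    """Return the set of sensitive identifiers referenced in `source`."""
    tree = ast.parse(source)
    return {node.id for node in ast.walk(tree)
            if isinstance(node, ast.Name) and node.id in SENSITIVE}

snippet = "score = base * 0.9 if gender == 'f' else base"
print(flag_sensitive_names(snippet))  # {'gender'}
```

A flagged identifier is only a starting point; an auditor would still need to trace how the value influences the system's decisions.<br />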
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
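A minimal sketch of the analysis step, assuming volunteers self-report a group label and the rank a target result received; all field names and figures below are illustrative.<br />

```python
# Noninvasive user audit sketch: aggregate outcomes donated by end users,
# without ever querying the platform itself.
from collections import defaultdict

def average_rank_by_group(reports):
    """Each report is (user_group, rank_of_target_result). Returns the
    mean rank the target result received per self-reported group."""
    totals = defaultdict(lambda: [0, 0])  # group -> [rank sum, count]
    for group, rank in reports:
        totals[group][0] += rank
        totals[group][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}

# Illustrative donated data: a large gap between groups would prompt
# a closer look at the algorithm.
reports = [("A", 1), ("A", 2), ("B", 4), ("B", 6), ("B", 5)]
print(average_rank_by_group(reports))  # {'A': 1.5, 'B': 5.0}
```

Because the sample is self-selected rather than random, a gap like this suggests correlation only, which is exactly the causality limitation noted above.<br />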
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform, then observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method to gather a large quantity of data to be analyzed, however this audit still encounters the issue of securing a random sample. <br />
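The repeated-query loop at the heart of a scraping audit can be sketched as follows; the `fetch` function and result ids below are hypothetical stand-ins for a real API call or page scraper.<br />

```python
# Scraping audit sketch: issue many queries, record the ranked results,
# and tabulate how often each item appears at rank 1.
from collections import Counter

def audit_top_results(queries, fetch):
    """fetch(query) -> ordered list of result ids. Counts which result
    is ranked first across repeated queries."""
    top = Counter()
    for q in queries:
        results = fetch(q)
        if results:
            top[results[0]] += 1
    return top

# Simulated platform responses (illustrative): item "x" dominates rank 1.
fake_fetch = lambda q: ["x", "y"] if q != "special" else ["y", "x"]
print(audit_top_results(["a", "b", "special"], fake_fetch))
# Counter({'x': 2, 'y': 1})
```

A consistently skewed top-rank count across many queries is the kind of signal this audit looks for; a real scraper would also rate-limit its requests.<br />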
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similarly, a website's Terms of Service document may not have the force of law, but some scholarly institutions hold that Internet researchers must not violate the terms of the platforms they study, making the results of this form of research difficult to publish without proper permission.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]] <br />
A sock puppet audit uses computer programs to impersonate users through false accounts. This method is effective for investigating sensitive topics in difficult-to-access [https://en.wikipedia.org/wiki/Network_domain domains]: sock puppets can probe non-public features of systems and penetrate groups that are difficult to pinpoint and reach. A large number of sock puppets is required to derive significant findings from the audit. The method is still susceptible to CFAA violations, presenting legal risk to the researchers conducting the study.<br />
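The paired-profile design behind many sock puppet audits can be sketched as below; the pricing system, field names, and values are hypothetical stand-ins for an audited platform.<br />

```python
# Sock puppet audit sketch: run the audited system on synthetic profiles
# identical except for one attribute, so any difference in output can be
# attributed to that attribute.
def paired_test(system, base_profile, attribute, values):
    """Return the system's output for each variant of `attribute`."""
    outcomes = {}
    for v in values:
        profile = dict(base_profile, **{attribute: v})  # vary one field
        outcomes[v] = system(profile)
    return outcomes

# Illustrative biased system that prices by zip code.
system = lambda p: 100 if p["zip"] == "48104" else 120
print(paired_test(system, {"zip": "48104", "age": 30}, "zip",
                  ["48104", "48201"]))  # {'48104': 100, '48201': 120}
```

Holding everything else fixed is what lets the audit attribute the $20 disparity to the varied attribute rather than to chance.<br />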
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A sock puppet audit and a crowdsourced audit only differ in one respect: in the latter design, the tester is a human. While it seems absurd that it would be illegal to use a computer program to act as a user yet legal to hire a person to act like a computer program, indeed this appears to be the case. The CFAA restrictions on unauthorized use are unlikely to be triggered by a genuine human user interacting with an online platform, especially a platform that accepts new user accounts from anyone on the Internet. A great deal of effort goes into preventing automated use of free Internet platforms by malicious bots and computer programs, and these are expressly prohibited in the Terms of Service given by platform providers. Yet an actual human being who is paid to do something with an Internet platform is unlikely to trigger any of these prohibitions.<br />
<br />
An example of this is Amazon's Mechanical Turk, which can recruit a pool of testers large enough to produce significant results through semi-automated crowdsourcing. The main drawback is that this type of audit requires a large budget to pay the participants.<br />
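A back-of-the-envelope budget for such an audit might be estimated as below; the pay rates and the platform fee are illustrative assumptions, not Mechanical Turk's actual rates.<br />

```python
# Crowdsourced audit budget sketch: paying human testers is the method's
# main cost. All figures are illustrative.
def audit_budget(n_testers, tasks_per_tester, pay_per_task, platform_fee=0.20):
    """Total payout to testers plus the crowdsourcing platform's
    commission (fee rate is an assumed example)."""
    base = n_testers * tasks_per_tester * pay_per_task
    return round(base * (1 + platform_fee), 2)

print(audit_budget(500, 10, 0.25))  # 1500.0
```

Even at modest per-task rates, a sample large enough for statistical significance quickly runs into the thousands of dollars, which is the setback noted above.<br />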
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have been developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, known as "screen science": recommending flights based on private criteria and capital incentives. The airlines noticed that American Airlines received prioritization in the algorithm's search results, which led to an investigation by the USCAB and the DOJ.<br />
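The kind of hidden prioritization at issue can be sketched as a display routine that silently boosts a favored carrier; this is a hypothetical toy, not SABRE's actual algorithm.<br />

```python
# "Screen science" sketch: results are nominally sorted by price, but the
# favored carrier is quietly moved to the top of the display.
def biased_display(flights, favored):
    """Sort flights by price, then bubble the favored carrier upward
    (Python's sort is stable, so price order is kept within each group)."""
    by_price = sorted(flights, key=lambda f: f["price"])
    return sorted(by_price, key=lambda f: f["carrier"] != favored)

flights = [{"carrier": "AA", "price": 300}, {"carrier": "UA", "price": 250}]
print(biased_display(flights, "AA"))
# [{'carrier': 'AA', 'price': 300}, {'carrier': 'UA', 'price': 250}]
```

The cheaper flight is displaced from the top slot, which is precisely the behavior the disclosure rule was meant to expose.<br />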
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United States CFAA] attaches criminal penalties to digital behaviors deemed illegal or unlawful and clearly states the repercussions for committing such offenses. <br />
<br />
One legal critique of the Act summarizes: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and uses against nearly every aspect of computer activity.”<br />
{| class="wikitable"<br />
|-<br />
! Crime/ Offense !! Years In Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law, put in place by U.S. legislation, outlines the general regulation of photography in the U.S. It resulted from potential internal security implications and concerns for the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property is protected as established by the owner of the property; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photography from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm to a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace; as a result, only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, there are ever more ways for these systems to be leveraged. Algorithm audits have therefore become the most efficient way to assess the ethical impact of digital platforms and providers on their consumers and users. These audits provide direct ways to check digital platforms and have prompted protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Researchers attempting to audit a platform or provider can be targeted by large tech companies. Some protection from CFAA lawsuits resulted from the 2018 court ruling in Sandvig v. Sessions, in which four university researchers sought to scrape information to study discrimination on housing websites. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of Sandvig v. Sessions ruling, sourced from Google]]<br />
===== SANDVIG et al V. SESSIONS<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On Mar. 27, 2018, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with the digital platform directly. This can be risky because technology companies outline, in their privacy policies, restrictions on how users may engage with the platform. Researchers can be flagged for breaking these policies and banned from the platform, which can inhibit their ability to audit large tech companies if they are detected, and can lead to lawsuits and further complications when the research is published. Because these companies deal in data and can identify a user by access point or device metadata, even attempting many of the audit types can be difficult.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100901
Algorithmic Audits
2021-04-08T15:49:49Z
<p>Nfigue: /* Scraping Audit */</p>
<hr />
<div><h1 style="font-size: 50; color: red; font-weight: 400;">Currently Being Edited</h1><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning it can function relatively unchecked and leaving people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the PEW research center in February of 2021 showed that 85% of adults in the U.S utilize the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out their regulations and laws to keep the Internet safe for their citizens. However, the these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into humanity, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment, proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness about algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality. This awareness has sparked activism for ethics in technology. For example, [https://en.wikipedia.org/wiki/J Joy Buolamwini] identifying the bias in facial recognition technologies or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging their recommendation algorithm to place airlines who paid, higher on the recommendation than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform, then observing the results, employing programs that utilize API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method to gather a large quantity of data to be analyzed, however this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the likelihood of being prosecuted under the Computer Fraud and Abuse Act ([https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act CFAA]), a law which has been criticized as being too vague, as it criminalizes any unauthorized access to a computer. Consent is difficult to obtain from a platform operator in order to perform a scraping audit.<br />
<br />
Similaryly, a Terms of Service document on a Website may not have the force of law but some scholarly institutions pose that Internet researchers must not violate the Terms of Internet platforms, making the results from this form of research difficult to publish without proper permissions.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
Sock puppet audits essentially use computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access domains, and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. However, this method is still susceptible to CFAA violations requiring the audit's purpose to be strictly for research. <br />
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A sock puppet audit and a crowdsourced audit only differ in one respect: in the latter<br />
design, the tester is a human. While it seems absurd that it would be illegal to use a computer<br />
program to act as a user yet legal to hire a person to act like a computer program, indeed this<br />
appears to be the case. The CFAA restrictions on unauthorized use are unlikely to be triggered<br />
by a genuine human user interacting with an online platform, especially a platform that accepts<br />
new user accounts from anyone on the Internet. A great deal of effort goes into preventing<br />
automated use of free Internet platforms by malicious bots and computer programs, and these<br />
are expressly prohibited in the Terms of Service given by platform providers. Yet an actual<br />
human being who is paid to do something with an Internet platform is unlikely to trigger any of<br />
these prohibitions.<br />
<br />
An example of this is Amazon's Mechanical Turk which allows a large enough testers group to produce significant results through semi-automated crowdsourcing. The only setback is this audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984 USCAB declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information in the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which involved "screen science." and recommending airlines based on privatized criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's algorithms search results, which led to an investigation by the USCAB and DOJ.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United states CFAA] is an act that attaches sentences to digital behaviors deemed illegal or unlawful and has clearly stated the repercussions for committing such an offense. <br />
<br />
The Act states: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and uses against nearly every aspect of computer activity.”<br />
{| class="wikitable"<br />
|-<br />
! Crime/ Offense !! Years In Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
A law put in place by U.S. Legislation. The Photo Sharing Law outlines the general regulation for photography in the U.S. This resulted from the potential security implications internally and the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property is protected and established by the owner of the property, it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
#Photographing private property from within the public domain is not illegal.<br />
# taking pictures of public places objects and structures are legal unless explicitly prohibited.<br />
# Outer space requires a license to be issued by the National Oceanic and Atmospheric Administration in advance.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as it doesn't hinder the operations of law enforcement, medical, emergency, or security personnel by filming.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace. As a result, general guidelines and laws for digital interactions. However, as the infrastructure continues to grow, there are new ways for the system to be leveraged in new ways. As a result, Algorithm Audits have become the most efficient way to assess digital platforms and providers' ethical impact on their consumers and users. These Audits have formulated direct ways to check digital platforms and create protections for auditors. However, these methods are difficult to employ and fall short in many areas of regulation. <ref>O'neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections from lawsuits have developed through court interpretation of the CFAA and the position of the U.S. Department of Justice. These protections stem from the 2018 ruling in Sandvig et al. v. Sessions, in which four university researchers sought to scrape information on discrimination in the housing market. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Attempting a code, scraping, or sock puppet audit requires researchers to engage with the digital platform directly. This can be risky because technology companies outline, in their privacy policies, restrictions on how users may engage with the platform. Researchers who break these policies can be flagged and banned from the platform. Detection can inhibit researchers' ability to audit large tech companies and can lead to lawsuits and further complications when the research is published. Because these companies deal in data, they can identify a user from an access point or device metadata, which makes even attempting many of the audit types difficult.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100885
Algorithmic Audits
2021-04-08T15:13:23Z
<p>Nfigue: Reworded Scraping audit for understanding, added links</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people do not understand, meaning these systems can function relatively unchecked, leaving people with the impression that the algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the Pew Research Center in early 2021 showed that 85% of adults in the U.S. use the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
Extended worldwide, this statistic represents billions of people and emphasizes the persistence of algorithms in daily life, despite a lack of general understanding. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness about algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to place airlines who paid higher in its recommendations than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
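Once records are donated, the analysis happens entirely offline. The sketch below is a minimal, hypothetical illustration of that step (the field names and data are invented, not drawn from any real audit): given (group, outcome) pairs volunteered by end users, it compares outcome rates across two groups without ever querying the platform.

```python
# Hypothetical noninvasive user audit: analyze records donated by end users
# without interacting with the platform itself. Field names are invented.
donated_records = [
    {"group": "A", "shown_ad": True},
    {"group": "A", "shown_ad": True},
    {"group": "A", "shown_ad": False},
    {"group": "B", "shown_ad": False},
    {"group": "B", "shown_ad": False},
    {"group": "B", "shown_ad": True},
]

def outcome_rate(records, group):
    """Fraction of a group's donated records where the outcome occurred."""
    subset = [r for r in records if r["group"] == group]
    return sum(r["shown_ad"] for r in subset) / len(subset)

rate_a = outcome_rate(donated_records, "A")
rate_b = outcome_rate(donated_records, "B")
disparity = rate_a - rate_b  # a large gap hints at differential treatment
```

A real audit would of course need a far larger, representative sample before a gap like this could support any causal claim, which is exactly the sampling difficulty noted above.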
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
A scraping audit involves researchers issuing repeated queries to a platform and observing the results, using programs that perform API data mining and [https://en.wikipedia.org/wiki/Web_scraping webpage scraping]. This is an effective method for gathering a large quantity of data for analysis; however, this audit still encounters the issue of securing a random sample. <br />
<br />
Another difficulty with this method is the likelihood of being prosecuted under the [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act Computer Fraud and Abuse Act] (CFAA), a law that has been criticized as too vague because it criminalizes any unauthorized access to a computer. Consent to perform a scraping audit is difficult to obtain from a platform operator.<br />
<br />
Similarly, a Terms of Service document on a website may not have the force of law, but some scholarly associations hold that Internet researchers must not violate the Terms of Service of Internet platforms, making the results from this form of research difficult to publish without proper permissions.<br />
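The mechanics of a scraping audit can be sketched in a few lines. This is a hypothetical illustration only: the query, the result strings, and the injected `fetch` function are invented, and the transport is stubbed out precisely because real data collection would need the platform operator's consent, as the CFAA discussion above makes clear.

```python
# Minimal scraping-audit sketch: issue the same query repeatedly and tally
# which item the platform ranks first each time. The fetch function is
# injected so the audit logic runs without touching any real site.
import time
from collections import Counter

def scraping_audit(query, fetch, n_trials=5, delay=0.0):
    """Run `query` n_trials times via fetch(query); count top-ranked items."""
    first_ranked = Counter()
    for _ in range(n_trials):
        results = fetch(query)    # in a real audit: an API call or page scrape
        if results:
            first_ranked[results[0]] += 1
        time.sleep(delay)         # rate-limit politely between requests
    return first_ranked

# Stand-in for a real API call or scraper (hypothetical data):
def fake_fetch(query):
    return ["AA flight 101", "UA flight 202", "DL flight 303"]

tally = scraping_audit("DTW to LGA", fake_fetch, n_trials=3)
```

Repeating identical queries and recording how the top result varies (or fails to vary) is what lets researchers look for systematic ordering bias of the kind found in the SABRE case.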
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
Sock puppet audits essentially use computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access domains, and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. However, this method is still susceptible to CFAA violations requiring the audit's purpose to be strictly for research. <br />
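The classic sock-puppet design builds matched pairs of synthetic profiles that are identical except for the one attribute under test, so any difference in the platform's treatment can be attributed to that attribute. The sketch below is a hypothetical illustration of that pairing step; every field name and value is invented.

```python
# Sock-puppet pairing sketch: two profiles differing only in the attribute
# under test. All field names and values here are hypothetical.
import copy

BASE_PROFILE = {"age": 30, "location": "Detroit", "interests": ["travel"]}

def make_matched_pair(attribute, value_a, value_b):
    """Return two profiles identical except for `attribute`."""
    a = copy.deepcopy(BASE_PROFILE)  # deep copies so the pairs stay independent
    b = copy.deepcopy(BASE_PROFILE)
    a[attribute], b[attribute] = value_a, value_b
    return a, b

pairs = [make_matched_pair("gender", "female", "male") for _ in range(3)]
```

Many such pairs are needed before differences become statistically significant, which is why the method requires a large number of sock puppets.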
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A sock puppet audit and a crowdsourced audit differ in only one respect: in the latter design, the tester is a human. While it seems absurd that it would be illegal to use a computer program to act as a user yet legal to hire a person to act like a computer program, this indeed appears to be the case. The CFAA restrictions on unauthorized use are unlikely to be triggered by a genuine human user interacting with an online platform, especially a platform that accepts new user accounts from anyone on the Internet. A great deal of effort goes into preventing automated use of free Internet platforms by malicious bots and computer programs, and these are expressly prohibited in the Terms of Service given by platform providers. Yet an actual human being who is paid to do something with an Internet platform is unlikely to trigger any of these prohibitions.<br />
<br />
An example of this is Amazon's Mechanical Turk, which allows a large enough group of testers to produce significant results through semi-automated crowdsourcing. The main drawback is that this audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information rule in the Code of Federal Regulations]. This regulatory provision resulted from a bias, known as "screen science," identified by airlines using SABRE's software: flights were recommended based on privatized criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's algorithm's search results, which led to an investigation by the USCAB and the DOJ.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United States CFAA] is an act that attaches sentences to digital behaviors deemed illegal or unlawful and clearly states the repercussions for committing such offenses. <br />
<br />
Critics of the Act note: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and uses against nearly every aspect of computer activity.”<br />
{| class="wikitable"<br />
|-<br />
! Crime / Offense !! Years in Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law, put in place by U.S. legislation, outlines the general regulations for photography in the U.S. It resulted from internal security implications and from concerns for the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property protections are established by the owner of the property; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photography from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm to a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Because of the scale of technology companies, regulation cannot keep pace; as a result, only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, these systems can be leveraged in new ways. Algorithm audits have therefore become the most efficient way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits have formulated direct ways to check digital platforms and have created protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections from lawsuits have developed through court interpretation of the CFAA and the position of the U.S. Department of Justice. These protections stem from the 2018 ruling in Sandvig et al. v. Sessions, in which four university researchers sought to scrape information on discrimination in the housing market. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Attempting a code, scraping, or sock puppet audit requires researchers to engage with the digital platform directly. This can be risky because technology companies outline, in their privacy policies, restrictions on how users may engage with the platform. Researchers who break these policies can be flagged and banned from the platform. Detection can inhibit researchers' ability to audit large tech companies and can lead to lawsuits and further complications when the research is published. Because these companies deal in data, they can identify a user from an access point or device metadata, which makes even attempting many of the audit types difficult.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100882
Algorithmic Audits
2021-04-08T14:50:49Z
<p>Nfigue: Reworded user audit section for understanding, added link</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people do not understand, meaning these systems can function relatively unchecked, leaving people with the impression that the algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the Pew Research Center in early 2021 showed that 85% of adults in the U.S. use the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
Extended worldwide, this statistic represents billions of people and emphasizes the persistence of algorithms in daily life, despite a lack of general understanding. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into everyday life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and similar programs have seen an increase in undergraduate and graduate enrollment proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness about algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. Examples include [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identifying bias in facial recognition technologies, and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging its recommendation algorithm to place airlines who paid higher in its recommendations than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
A code audit is based on the concept of [https://en.wikipedia.org/wiki/Algorithmic_transparency Algorithm Transparency], where researchers acquire a copy of the algorithm and vet it for unethical behavior. This is uncommon because software is considered valuable intellectual property, and companies are reluctant to release the code, even to researchers. <br />
<br />
An issue with this method is that public interest in software can be malicious. On many platforms, the algorithm designers constantly battle those who try to take advantage of the software, so releasing code for review may cause more harm than good.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
A noninvasive user audit involves researchers requesting assorted input and output data from end users in order to analyze the workings of an algorithm. This form of audit carries the benefit of not interacting with a platform itself, ensuring that the researcher is not accused of tampering. The most challenging aspect of a user audit is generating a truly random and representative sample, making it difficult to identify [https://en.wikipedia.org/wiki/Causality causality].<br />
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
In scraping audits, researchers issue repeated queries to a platform, then observe and record the results, employing programs that utilize API data mining and webpage scraping. This is effective because it can gather a large quantity of data for researchers to test. However, it is not ideal: if the audit is not strictly for research purposes, one could serve 1-10 years in prison for a CFAA violation resulting from this auditing technique.<br />
Terms of Service also present a problem. While a Terms of Service document on a website may not have the force of law or even operate as a contract, some scholarly associations have taken the position that Internet researchers must not violate the Terms of Service agreements of Internet platforms, making the results from designs like this one potentially impossible to publish in some venues.<br />
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
Sock puppet audits essentially use computer programs to impersonate users by creating false accounts. This method effectively investigates sensitive topics in difficult-to-access domains, and sock puppets can investigate features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets are required to derive significant findings from the audit. However, this method is still susceptible to CFAA violations requiring the audit's purpose to be strictly for research. <br />
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A sock puppet audit and a crowdsourced audit differ in only one respect: in the latter design, the tester is a human. While it seems absurd that it would be illegal to use a computer program to act as a user yet legal to hire a person to act like a computer program, this indeed appears to be the case. The CFAA restrictions on unauthorized use are unlikely to be triggered by a genuine human user interacting with an online platform, especially a platform that accepts new user accounts from anyone on the Internet. A great deal of effort goes into preventing automated use of free Internet platforms by malicious bots and computer programs, and these are expressly prohibited in the Terms of Service given by platform providers. Yet an actual human being who is paid to do something with an Internet platform is unlikely to trigger any of these prohibitions.<br />
<br />
An example of this is Amazon's Mechanical Turk, which allows a large enough group of testers to produce significant results through semi-automated crowdsourcing. The main drawback is that this audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the USCAB declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulatory provision resulted from a bias identified by airlines using SABRE's software, which practiced "screen science": recommending airlines based on private criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and the DOJ.<br />
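One way an auditor can surface this kind of "screen science" is to compare where a carrier's flights are displayed against where their fares alone would place them. The sketch below is an illustrative check on invented data, not the CAB's actual methodology or SABRE's real output.<br />

```python
def position_bias(results, carrier):
    """Average displayed rank of a carrier's flights minus the average
    rank those flights would hold if the list were sorted by fare alone.
    A negative value means the carrier is shown better than its fares
    justify -- the signature of biased display."""
    by_fare = sorted(results, key=lambda r: r["fare"])
    fare_rank = {r["flight"]: i for i, r in enumerate(by_fare)}
    mine = [r for r in results if r["carrier"] == carrier]
    shown = sum(results.index(r) for r in mine) / len(mine)
    fair = sum(fare_rank[r["flight"]] for r in mine) / len(mine)
    return shown - fair

# Invented display order from a hypothetical reservation screen:
results = [
    {"flight": "AA100", "carrier": "AA", "fare": 320},
    {"flight": "AA200", "carrier": "AA", "fare": 310},
    {"flight": "UA300", "carrier": "UA", "fare": 250},
    {"flight": "DL400", "carrier": "DL", "fare": 240},
]
bias = position_bias(results, "AA")  # negative: AA listed above cheaper rivals
```

On these invented numbers the AA bias score is negative while the cheaper carriers score positive, which is the pattern the complaining airlines would have looked for.<br />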
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent pieces of active legislation concerning the internet and digital communications, and one of the CDA's defining provisions. It protects service providers from the legal implications of removing content deemed obscene or offensive, even constitutionally protected speech, as long as the removal is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United States CFAA] attaches criminal penalties to digital behaviors deemed illegal or unlawful and clearly states the repercussions for committing such an offense. <br />
<br />
One critique summarizes the Act as follows: "The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization, but fails to define what 'without authorization' means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and use against nearly every aspect of computer activity."<br />
{| class="wikitable"<br />
|-<br />
! Crime / Offense !! Years in Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law, put in place by U.S. legislation, outlines the general regulations for photography in the U.S. It resulted from potential internal security implications and concerns for the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property protections are established by the owner of the property; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photographing the Earth from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming doesn't hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace; as a result, only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, there are ever more ways for these systems to be leveraged. Algorithm audits have therefore become the most efficient way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits have formulated direct ways to check digital platforms and have created protections for auditors. However, these methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections from lawsuits have been established through court interpretation of the CFAA and the U.S. Department of Justice. These protections resulted from the 2018 court ruling in Sandvig et al. v. Sessions, in which four university researchers sought to scrape information on discrimination from housing and employment websites. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
"On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment.<br />
Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," <br />
U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with the digital platform directly. This can be risky because technology companies outline in their privacy policies restrictions on how users may engage with the platform, and researchers can be flagged for breaking these policies and banned. Detection can inhibit researchers' ability to audit large, established tech companies, and it can lead to lawsuits and further complications when the research is published. Because these companies deal in data, they can often identify a user from an access point or device metadata alone, which can make many of the audit types hard even to attempt.<br />
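The re-identification risk described above can be illustrated with a toy fingerprinting routine. The metadata fields and hashing scheme below are assumptions for the sketch, not any real platform's tracking method.<br />

```python
import hashlib

def fingerprint(metadata):
    """Hash a handful of metadata fields into a short, stable ID.
    Identical devices produce identical fingerprints, so a platform
    can recognize a returning visitor without any account login."""
    canonical = "|".join(f"{k}={metadata[k]}" for k in sorted(metadata))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visit_1 = {"user_agent": "Mozilla/5.0", "timezone": "UTC-5", "screen": "1920x1080"}
visit_2 = dict(visit_1)                      # same device, new session
visit_3 = dict(visit_1, screen="1366x768")   # a different device

same_device = fingerprint(visit_1) == fingerprint(visit_2)
```

Here the first two visits hash identically while the third does not, which is how a banned auditor's fresh account can still be linked back to the same device.<br />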
<br />
== References ==</div>
Nfigue
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100869
Algorithmic Audits
2021-04-08T14:05:22Z
<p>Nfigue: More edits and links in the intro and background sections</p>
<hr />
<div><h1 style="font-size: 50; color: red; font-weight: 400;">Currently Being Edited</h1><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of [https://en.wikipedia.org/wiki/Algorithm algorithmic] systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that a majority of people don't understand, meaning it can function relatively unchecked and leaving people with the impression that these algorithms are neutral (unbiased)<ref name="wagner">Wagner, B. (2016). Algorithmic regulation and the global default: Shifting norms in Internet technology. Etikk I Praksis - Nordic Journal of Applied Ethics, 10(1), 5-13. https://doi.org/10.5324/eip.v10i1.1961</ref>. A study conducted by the PEW research center in February of 2021 showed that 85% of adults in the U.S utilize the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This statistic represents billions of people worldwide and emphasizes algorithms' persistence in our daily lives, despite a lack of general understanding. By looking at online platforms as a single entity and analyzing the results they're producing, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid advancement observed in the [https://en.wikipedia.org/wiki/High_tech tech industry] has revolutionized the way humans work, interact, and communicate. A diverse array of features are integrated into [https://en.wikipedia.org/wiki/Smart_device smart] applications for maximum [https://en.wikipedia.org/wiki/Availability availability]. At the core of all these real-time digital services lie algorithms which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches in online platforms and published their findings. Countries worldwide have rolled out their regulations and laws to keep the Internet safe for their citizens. However, the these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into humanity, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Higher education computer science and other similar programs have seen an increase in undergraduate and graduate enrollment, proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011:https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness about algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality. This awareness has sparked activism for ethics in technology. For example, [https://en.wikipedia.org/wiki/J Joy Buolamwini] identifying the bias in facial recognition technologies or [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraging their recommendation algorithm to place airlines who paid, higher on the recommendation than their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
These audits use the idea of "Algorithm Transparency," where researchers acquire a copy of the algorithm and vet it for unethical platform behavior. This rarely happens because the program algorithm is considered valuable intellectual property, and tech companies are reluctant to release the code, even to researchers. <br />
<br />
A major problem is that public-interest disclosure of algorithms might itself produce serious negative consequences. On many platforms, the algorithm's designers constantly play a game of cat-and-mouse with those who would abuse or “game” the algorithm. These adversaries may themselves be criminals (such as spammers or hackers), and aiding them could conceivably cause greater harm than the unfair discrimination an audit would detect.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
In a noninvasive user audit, researchers approach users and ask them to share their search queries, results, and other data. This method has historically been unreliable: samples are small and not random, and because there is no manipulation or random assignment to conditions, the design is not experimental, making it difficult to infer causality from any particular result. For instance, a finding that a disadvantaged demographic tended to receive inferior search results could be the product of a variety of causes.<br />
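The inference problem can be made concrete with a minimal sketch: with user-donated data, an auditor can only compare summaries across groups and has no control over who volunteers. The group labels, quality scores, and function name below are hypothetical illustrations, not part of any published audit design.

```python
from collections import defaultdict
from statistics import mean

def compare_donated_results(donations):
    """Summarize user-donated audit data by demographic group.

    Each donation records the group a volunteer belongs to and a quality
    score the researchers assigned to the results that volunteer shared.
    Because volunteers self-select, a gap between groups is only
    suggestive: it cannot, by itself, show that the platform caused it.
    """
    by_group = defaultdict(list)
    for donation in donations:
        by_group[donation["group"]].append(donation["quality"])
    return {group: mean(scores) for group, scores in by_group.items()}

# Hypothetical donated data: result quality scored on a 1-5 scale.
donations = [
    {"group": "A", "quality": 4},
    {"group": "A", "quality": 2},
    {"group": "B", "quality": 1},
]
summary = compare_donated_results(donations)
```

Note that the sketch surfaces a difference in averages but says nothing about its cause, which is exactly the limitation described above.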
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
In scraping audits, researchers issue repeated queries to a platform and record the results, using programs that mine APIs or scrape web pages. This approach is effective because it can gather a large quantity of data for researchers to test. However, it carries legal risk: if the audit is not strictly for research purposes, a CFAA violation arising from this technique can carry a sentence of one to ten years in prison.<br />
Terms of service also present a problem. While a website's Terms of Service may not have the force of law or even operate as a contract, some scholarly associations have taken the position that Internet researchers must not violate the Terms of Service agreements of Internet platforms, making results from designs like this one potentially impossible to publish in some venues.<br />
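The mechanics of a scraping audit reduce to a loop that issues the same query repeatedly and records what the platform returns. The sketch below is a hypothetical illustration: `fetch` stands in for a real API client or page scraper, and the platform, query, and vendor names are invented. Running such a loop against a real platform raises the Terms of Service and CFAA issues discussed above.

```python
import time

def scraping_audit(query, fetch, trials, delay=0.0):
    """Issue the same query repeatedly and record each ranked result list.

    `fetch` is whatever callable actually queries the platform (an API
    client or a page scraper); injecting it keeps the audit logic
    separate from the collection mechanics.
    """
    observations = []
    for trial in range(trials):
        observations.append({"trial": trial, "ranking": fetch(query)})
        time.sleep(delay)  # pause between requests to respect rate limits
    return observations

def mean_rank(item, observations):
    """Average 1-based position of `item` across the recorded trials."""
    ranks = [obs["ranking"].index(item) + 1
             for obs in observations if item in obs["ranking"]]
    return sum(ranks) / len(ranks) if ranks else None

# Stand-in platform that always ranks its paying vendor first.
fake_platform = lambda query: ["sponsored-vendor", "vendor-a", "vendor-b"]
observations = scraping_audit("flights", fake_platform, trials=5)
```

A consistently favorable average rank for one vendor across many trials is the kind of pattern this design is meant to surface.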
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
Sock puppet audits use computer programs to impersonate users through false accounts. The method is effective for investigating sensitive topics in difficult-to-access domains: sock puppets can probe features of systems that are not public and penetrate groups that are difficult to identify and reach. A large number of sock puppets is required to derive significant findings, and the method remains susceptible to CFAA violations unless the audit's purpose is strictly research. <br />
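The core of the design can be sketched without creating any real accounts: each scripted persona issues the same query and the auditor tallies what each one is shown. The `platform` callable, persona labels, and offers below are hypothetical stand-ins for a logged-in session under a fabricated account.

```python
def sock_puppet_audit(personas, query, platform, trials):
    """Tally which results each scripted persona is shown for one query.

    `platform(persona, query)` stands in for a session under a fabricated
    account; here it is any callable returning a result list, so the
    sketch runs entirely offline.
    """
    exposure = {name: {} for name in personas}
    for name in personas:
        for _ in range(trials):
            for item in platform(name, query):
                exposure[name][item] = exposure[name].get(item, 0) + 1
    return exposure

# Simulated platform that discriminates on the persona's stated profile.
def biased_platform(persona, query):
    return ["premium-offer"] if persona == "high-income" else ["basic-offer"]

exposure = sock_puppet_audit(["high-income", "low-income"], "loans",
                             biased_platform, trials=3)
```

Diverging exposure tallies between otherwise identical personas are the signal a sock puppet audit looks for.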
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A sock puppet audit and a crowdsourced audit differ in only one respect: in the latter design, the tester is a human. While it seems absurd that it would be illegal to use a computer program to act as a user yet legal to hire a person to act like a computer program, this does indeed appear to be the case. The CFAA's restrictions on unauthorized use are unlikely to be triggered by a genuine human user interacting with an online platform, especially a platform that accepts new user accounts from anyone on the Internet. A great deal of effort goes into preventing automated use of free Internet platforms by malicious bots and computer programs, and such use is expressly prohibited in the Terms of Service given by platform providers. Yet an actual human being who is paid to do something with an Internet platform is unlikely to trigger any of these prohibitions.<br />
<br />
An example is Amazon's Mechanical Turk, which can recruit a group of testers large enough to produce significant results through semi-automated crowdsourcing. The main drawback is that this audit requires a large budget to pay the participants.<br />
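The semi-automated part of a crowdsourced audit is the aggregation of worker reports. As a hypothetical sketch (the group labels, listing names, and threshold are invented), reports from paid testers can be pooled per group, with small groups flagged rather than summarized, since a small crowd cannot support significant findings:

```python
from collections import Counter

def crowd_audit_summary(submissions, min_n):
    """Aggregate the top result reported by each paid human tester.

    `submissions` is a list of (tester_group, top_result) pairs. Groups
    with fewer than `min_n` submissions are flagged instead of
    summarized.
    """
    by_group = {}
    for group, top in submissions:
        by_group.setdefault(group, []).append(top)
    return {
        group: (Counter(tops).most_common(1)[0][0]
                if len(tops) >= min_n else "insufficient sample")
        for group, tops in by_group.items()
    }

# Hypothetical worker reports, e.g. gathered through a paid microtask.
reports = [("urban", "listing-x"), ("urban", "listing-x"),
           ("urban", "listing-y"), ("rural", "listing-z")]
summary = crowd_audit_summary(reports, min_n=2)
```

The `min_n` check is where the budget constraint bites: each additional submission in an under-sampled group is another participant to pay.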
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have been developed to categorize and resolve common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984 the U.S. Civil Aeronautics Board (CAB) ruled that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be disclosed to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. The provision responded to a bias that airlines identified in SABRE's software, which practiced "screen science": recommending airlines according to private criteria and financial incentives. Airlines noticed that American Airlines received priority in SABRE's search results, which led to an investigation by the CAB and the Department of Justice.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent pieces of active legislation concerning the internet and digital communications, and it defines one of the internet's key legal boundaries. The provision protects service providers from legal liability for removing content deemed obscene or offensive, even constitutionally protected speech, as long as the removal is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United States CFAA] attaches criminal penalties to digital behaviors deemed illegal or unlawful and clearly states the repercussions for committing such offenses. <br />
<br />
One critique summarizes the statute's weakness: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and use against nearly every aspect of computer activity.”<br />
{| class="wikitable"<br />
|-<br />
! Crime/ Offense !! Years In Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law outlines the general regulation of photography in the U.S., a response both to internal security implications and to the privacy of citizens. The law mainly sets out these regulations:<br />
# Private property is protected at the discretion of the property's owner; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photography from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as the filming does not hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm to a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Because of the scale of technology companies, regulation cannot keep pace; only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, new ways to leverage these systems emerge. As a result, algorithm audits have become the most efficient way to assess digital platforms and providers' ethical impact on their consumers and users. These audits provide direct ways to check digital platforms and have prompted protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation.<ref>O'Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by large tech companies, protections against prosecution under the CFAA have been established through the courts and the U.S. Department of Justice. A key development was the 2018 ruling in SANDVIG et al. v. SESSIONS, in which four university researchers sought to scrape information to study discrimination in the housing market. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== SANDVIG et al V. SESSIONS<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On March 27, the United States District Court for the District of Columbia ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits all require researchers to engage with the digital platform directly. This can be risky because technology companies' privacy policies restrict how users may engage with the platform. Researchers can be flagged for breaking these policies and banned from the platform, which inhibits their ability to audit large, established tech companies once detected, and can lead to lawsuits and further complications when the research is published. Because these companies deal in data and can identify a user from an access point or device metadata, even attempting many of the audit types can be difficult.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100739
Algorithmic Audits
2021-04-06T23:07:12Z
<p>Nfigue: edits for understanding</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of algorithmic systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
The internet runs on algorithms that the majority of people don't understand, meaning it can function relatively unchecked, unlike other widespread modern commodities (cars, housing). A study conducted by the Pew Research Center in February 2021 showed that 85% of adults in the U.S. use the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'Almost Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This behavior represents billions of people worldwide and underscores the persistence of algorithms in our daily lives, despite a lack of general understanding of them. By treating an online platform as a single entity and analyzing the results it produces, more can be learned about algorithms and the way they affect society<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. <br />
<br />
== Background ==<br />
The rapid progress of the tech industry has revolutionized the way humans work, interact, and communicate. A diverse array of features is integrated into "smart" applications and available instantaneously. At the core of all these real-time digital services lie algorithms, which provide essential functions like sorting, segmentation, personalized recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches on online platforms and published their findings. Countries worldwide have rolled out regulations and laws to keep the Internet safe for their citizens. However, these efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are deeply integrated into human life, ethical issues arise when algorithms are opaque to public scrutiny. Since the 2000s, the regulation of algorithms has become a concern for governments worldwide due to data and privacy implications, though legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=Algorithmic_Audits&diff=100738
Algorithmic Audits
2021-04-06T22:18:41Z
<p>Nfigue: Reformatted some references</p>
<hr />
<div><br />
<br />
<br />
[[File:Internet-law.jpg|thumbnail| visual of digital law on computer screen, image sourced from Google]]<br />
<br />
Auditing Algorithms: Adding Accountability to Automated Authority is a group of events designed to produce a white paper that will help define and develop the emerging research community for "algorithm auditing." Algorithmic Auditing is a research design that has shown promise in diagnosing the unwanted consequences of algorithmic systems. <ref name="auditing_algorithms">https://auditingalgorithms.science/?page_id=89#:~:text=Auditing%20Algorithms%3A%20Adding%20Accountability%20to,in%20diagnosing%20the%20unwanted%20consequences</ref><br />
[[File:Audits.jpg|thumbnail| Visualization of Algorithmic Audit, Image sourced from Google]]<br />
<br />
Since developers established the internet, it has run on algorithms that most people don't understand; therefore, it can function relatively unchecked. A study conducted by the PEW research center in February of 2021 showed that 85% of adults in the U.S use the internet daily <ref>Perrin, Andrew, and Sara Atske. About Three-in-Ten U.S. Adults Say They Are 'ALMOST Constantly' Online. 26 Mar. 2021, www.pewresearch.org/fact-tank/2021/03/26/about-three-in-ten-u-s-adults-say-they-are-almost-constantly-online/. </ref>. <br />
<br />
This behavior represents billions of people worldwide and emphasizes the pervasiveness of algorithms in our daily lives, despite the tremendous gap between the usage and the understanding of algorithms<ref name="sandvig1">Sandvig, Christian, et al. "An algorithm audit." Data and Discrimination: Collected Essays. Washington, DC: New America Foundation (2014): 6-10.</ref>. By treating an online platform as a single entity and analyzing the results it produces, we can learn more about algorithms and how they affect our society.<br />
<br />
== Background ==<br />
Rapid progress in the tech industry has revolutionized the way humans work, interact, and communicate. A diverse array of practices is integrated into "smart" object applications and available instantaneously. At the core of all these real-time digital services lie algorithms that provide essential functions like sorting, segmentation, personalization, recommendations, and information management<ref name="kotras">Kotras, Baptiste. "Mass personalization: Predictive marketing algorithms and the reshaping of consumer knowledge." Big Data & Society 7.2 (2020): 2053951720951581.</ref>. <br />
<br />
In the last decade, researchers have identified numerous ethical breaches by online platforms and published their findings. Countries worldwide have rolled out regulations and laws over the years to keep the Internet safe for their citizens. However, these government efforts do little to limit the international liberties afforded online, including access, anonymity, and privacy. Because technology and algorithms are heavily integrated into daily life, ethical issues arise when algorithms remain opaque to public scrutiny and understanding. Since the 2000s, the regulation of algorithms has become a concern to governments worldwide due to its data and privacy implications, and legislation has made progress in essential areas<ref name="goldsmith">Goldsmith, J. (2007), "Who Controls the Internet? Illusions of a Borderless World", Strategic Direction, Vol. 23 No. 11. https://doi.org/10.1108/sd.2007.05623kae.001</ref>.<br />
<br />
Additionally, in higher education, computer science and similar degrees have seen an increase in undergraduate and graduate enrollment, proportional to the tech industry's growth<ref name="usdoe">U.S. Department of Education, National Center for Education Statistics, Higher Education General Information Survey (HEGIS), "Degrees and Other Formal Awards Conferred" surveys, 1970-71 through 1985-86; Integrated Postsecondary Education Data System (IPEDS), "Completions Survey" (IPEDS-C:87-99); and IPEDS Fall 2000 through Fall 2011: https://nces.ed.gov/programs/digest/d12/tables/dt12_349.asp</ref>. Awareness of algorithms and digital platforms' infrastructure enables the public to identify ethical breaches in these systems' functionality, and this awareness has sparked activism for ethics in technology. For example, [https://en.wikipedia.org/wiki/Joy_Buolamwini Joy Buolamwini] identified the bias in facial recognition technologies, and [https://casetext.com/case/american-airlines-inc-v-sabre SABRE] leveraged its recommendation algorithm to rank airlines that paid above their competitors<ref name="copeland">Copeland, D.G., Mason, R.O., and McKenney, J.L. (1995). Sabre: The Development of Information-Based Competence and Execution of Information-Based Competition. IEEE Annals of the History of Computing, vol. 17, no. 3: 30-57.</ref>.<br />
<br />
== Types of Audits ==<br />
<br />
=== Code Audit<ref name="sandvig">Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." Data and discrimination: converting critical concerns into productive inquiry 22 (2014): 4349-4357. pg. 9-15 </ref> ===<br />
[[File: Code Audit.png|thumbnail|Visual of code audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 9]]<br />
These audits rely on the idea of "algorithm transparency": researchers acquire a copy of the algorithm and vet it for unethical platform behavior. This rarely happens in practice because the algorithm is considered valuable intellectual property, and tech companies are reluctant to release their code, even to researchers. <br />
<br />
A major problem is that the public interest disclosure of just algorithms might be likely to produce serious negative consequences. On many platforms, the algorithm designers constantly operate a game of cat-and-mouse with those who would abuse or "game" their algorithm. These adversaries may themselves be criminals (such as spammers or hackers), and aiding them could conceivably be a greater harm than detecting unfair discrimination in the platform itself.<br />
<br />
=== Noninvasive User Audit <ref name="sandvig"/>===<br />
[[File: NonInvasive User Audit.png|thumbnail|Visualization of a noninvasive user audit from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 11]]<br />
An audit where researchers approach users and ask them to share their search queries, results, and other data. However, this method is historically unreliable and draws on a small, non-random sample, which makes it difficult to infer causality. Because the audit loses the benefit of manipulation or random assignment to conditions, it is not an experimental design, and it may be quite difficult to infer causality from any particular result. For instance, a finding that a disadvantaged demographic tended to receive inferior search results might be the result of a variety of causes.<br />
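The comparison a noninvasive user audit enables can be sketched as below: averaging a result-quality score across self-reported user groups. This is an illustrative sketch, not a tool from the audit literature; the function and names are hypothetical, and any gap it surfaces is correlational, not causal, for the reasons above.<br />

```python
from collections import defaultdict

def group_disparity(donated_results, quality_fn):
    """Average a result-quality score per self-reported user group.

    donated_results: (group, result) pairs volunteered by users.
    quality_fn: maps one result to a numeric quality score.
    Because donors self-select, a gap between groups suggests --
    but cannot prove -- discriminatory treatment.
    """
    scores = defaultdict(list)
    for group, result in donated_results:
        scores[group].append(quality_fn(result))
    # Mean score per group; large gaps invite further investigation.
    return {g: sum(v) / len(v) for g, v in scores.items()}
```

In practice, a researcher would pair such a summary with demographic controls before drawing any conclusion.<br />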
<br />
=== Scraping Audit <ref name="sandvig"/>===<br />
[[File: Scraping Audit.png|thumbnail|Scraping audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 12]]<br />
In scraping audits, researchers issue repeated queries to a platform, then observe and record the results, employing programs that perform API data mining and webpage scraping. This is effective because it can gather a large quantity of data for researchers to test. However, it carries legal risk: if the audit isn't strictly for research purposes, this technique could expose the auditor to a CFAA violation carrying 1-10 years in prison.<br />
Terms of service also present a problem. While a terms-of-service document on a website may not have the force of law or even operate as a contract, some scholarly associations have taken the position that Internet researchers must not violate the terms of service of Internet platforms, making many of the results from designs like this one potentially impossible to publish in some venues.<br />
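A scraping audit's data-collection loop might look like the following minimal sketch. Here `query_fn` is a hypothetical stand-in for an API call or page scrape, and `delay_s` spaces out requests; whether such collection is permissible depends on the platform's terms of service and the legal questions discussed above.<br />

```python
import time

def scraping_audit(query_fn, queries, repeats=3, delay_s=1.0):
    """Issue each query several times and log what the platform returns.

    query_fn(query) stands in for a real API call or page scrape;
    delay_s throttles requests so collection stays gentle on the
    platform. Returns {query: [results of each repetition]}.
    """
    log = {}
    for query in queries:
        log[query] = []
        for _ in range(repeats):
            log[query].append(query_fn(query))
            time.sleep(delay_s)
    return log
```

Repeating each query lets the auditor separate stable ranking behavior from ordinary churn in the platform's results.<br />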
<br />
=== Sock Puppet Audit <ref name="sandvig"/>===<br />
[[File: Sock Puppet Audit.png|thumbnail|Sock Puppet Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 13]]<br />
Sock puppet audits use computer programs to impersonate users through false accounts. This method is effective for investigating sensitive topics in difficult-to-access domains: sock puppets can probe features of systems that are not public and penetrate groups that are difficult to pinpoint and acknowledge. A large number of sock puppets is required to derive significant findings from the audit. However, this method is still susceptible to CFAA violations, so the audit's purpose must be strictly research. <br />
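The core comparison in a sock puppet audit, stripped of the account-creation machinery, can be sketched as follows: issue one query under several simulated profiles and measure how much the result sets diverge. `platform_fn` is a hypothetical stand-in for the audited platform, and Jaccard overlap is only one of several reasonable divergence measures.<br />

```python
def jaccard(a, b):
    """Overlap of two result sets: 1.0 means identical, 0.0 disjoint."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

def sock_puppet_audit(platform_fn, profiles, query):
    """Send one query from each simulated profile and compare results.

    platform_fn(profile, query) stands in for the audited platform.
    Low pairwise overlap flags profile-dependent treatment that may
    warrant closer inspection.
    """
    results = {p: platform_fn(p, query) for p in profiles}
    overlaps = {}
    for i, p1 in enumerate(profiles):
        for p2 in profiles[i + 1:]:
            overlaps[(p1, p2)] = jaccard(results[p1], results[p2])
    return results, overlaps
```

A real audit would repeat this over many queries and profile pairs before claiming statistically significant divergence.<br />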
<br />
=== Collaborative or Crowdsourced Audit <ref name="sandvig"/>===<br />
[[File: Crowdsourcing_Collaboration_Audit.png|thumbnail|Collaborative or Crowdsourced Audit visual from Sandvig, Christian, et al. "Auditing algorithms: Research methods for detecting discrimination on internet platforms." pg 15]]<br />
A sock puppet audit and a crowdsourced audit only differ in one respect: in the latter design, the tester is a human. While it seems absurd that it would be illegal to use a computer program to act as a user yet legal to hire a person to act like a computer program, indeed this appears to be the case. The CFAA restrictions on unauthorized use are unlikely to be triggered by a genuine human user interacting with an online platform, especially a platform that accepts new user accounts from anyone on the Internet. A great deal of effort goes into preventing automated use of free Internet platforms by malicious bots and computer programs, and these are expressly prohibited in the Terms of Service given by platform providers. Yet an actual human being who is paid to do something with an Internet platform is unlikely to trigger any of these prohibitions.<br />
<br />
An example of this is Amazon's Mechanical Turk, which can recruit a large enough group of testers to produce significant results through semi-automated crowdsourcing. The main drawback is that this audit requires a large budget to pay the participants.<br />
<br />
== Legislation ==<br />
[[File:Cfaa.jpg|thumbnail| Image of CFAA text, sourced from Google]]<br />
Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have developed to categorize and respond to common disputes.<ref>upcounsel (2020). Internet Law: Everything You Need to Know.</ref><br />
<br />
=== Display of Information<ref>Code, U. S. US Code of Federal Regulations.</ref> ===<br />
In 1984, the U.S. Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by [https://en.wikipedia.org/wiki/Sabre_(computer_system) SABRE] must be known to participating airlines under the [https://www.govinfo.gov/content/pkg/CFR-2012-title14-vol4/pdf/CFR-2012-title14-vol4-sec255-4.pdf Display of Information provision of the Code of Federal Regulations]. This regulation resulted from a bias identified by airlines using SABRE's software, which involved "screen science": recommending airlines based on privatized criteria and capital incentives. The airlines noticed that American Airlines received prioritization in SABRE's search results, which led to an investigation by the USCAB and DOJ.<br />
<br />
=== The Communications Decency Act<ref>Ardia, D. S. (2009). Free speech savior or shield for scoundrels: an empirical study of intermediary immunity under Section 230 of the Communications Decency Act. Loy. LAL Rev., 43, 373.</ref> ===<br />
[https://en.wikipedia.org/wiki/Section_230 Section 230] of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active legislative acts concerning the internet and digital communications and one of its defining limitations. This provision protects service providers from the legal implications of removing content deemed obscene or offensive, even of constitutionally protected speech, as long as it is done in good faith.<br />
<br />
=== US Computer Fraud and Abuse Act<ref>Griffith, D. S. (1990). The computer fraud and abuse act of 1986: a measured response to a growing problem. Vand. L. Rev., 43, 453.</ref> ===<br />
The [https://en.wikipedia.org/wiki/Computer_Fraud_and_Abuse_Act United States CFAA] is an act that attaches sentences to digital behaviors deemed illegal or unlawful and clearly states the repercussions for committing such offenses. <br />
<br />
One legal critique summarizes the Act as follows: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and uses against nearly every aspect of computer activity.”<br />
{| class="wikitable"<br />
|-<br />
! Crime / Offense !! Years in Prison<br />
|-<br />
| Obtaining National Security Information || 10<br />
|-<br />
| Accessing a Computer and Obtaining Information || 1-5<br />
|-<br />
| Trespassing in a Government Computer || 1<br />
|-<br />
| Accessing a Computer to Defraud and Obtain Value || 5<br />
|-<br />
| Intentionally Damaging by Knowing Transmission || 1-10<br />
|-<br />
| Recklessly Damaging by Intentional Access || 1-5<br />
|-<br />
| Negligently Causing Damage and Loss by Intentional Access || 1<br />
|-<br />
| Trafficking in Passwords || 1<br />
|-<br />
| Extortion Involving Computers || 5<br />
|-<br />
| Attempt and Conspiracy to Commit such an Offense || 10<br />
|}<br />
<br />
=== Photo Sharing Law ===<br />
The Photo Sharing Law, put in place by U.S. legislation, outlines the general regulation of photography in the U.S. It resulted from potential internal security implications and the privacy of citizens. The law mainly outlines these regulations:<br />
# Private property rights are established by the property's owner; it is illegal to refuse to abide by the owner's requests concerning photographing activity.<br />
# Photographing private property from within the public domain is not illegal.<br />
# Taking pictures of public places, objects, and structures is legal unless explicitly prohibited.<br />
# Photographing from outer space requires a license issued in advance by the National Oceanic and Atmospheric Administration.<br />
# Commercial photography will likely require a permit and proof of insurance.<br />
# Photographing accident scenes and law enforcement activities is legal, as long as filming doesn't hinder the operations of law enforcement, medical, emergency, or security personnel.<br />
# Any filming with the intent of doing unlawful harm against a subject may be a violation of the law in itself.<br />
<br />
<br />
== Ethical Implications ==<br />
Due to the scale of technology companies, regulation cannot keep pace; as a result, only general guidelines and laws govern digital interactions. As the infrastructure continues to grow, there are new ways for these systems to be leveraged. Algorithm audits have therefore become the most efficient way to assess digital platforms' and providers' ethical impact on their consumers and users. These audits offer direct ways to check digital platforms and create protections for auditors. However, the methods are difficult to employ and fall short in many areas of regulation. <ref>O'Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.</ref><br />
<br />
=== Research Protections ===<br />
Because researchers attempting to audit a platform or provider can be targeted by these large tech companies, protections have emerged through court interpretation of the CFAA and the U.S. Department of Justice to shield researchers from lawsuits. These protections stem from the 2018 court ruling in Sandvig et al. v. Sessions, in which four university researchers sought to scrape information to study discrimination on housing websites. <br />
[[File:SandvigSessions.jpg|thumbnail| Image of the Sandvig v. Sessions ruling, sourced from Google]]<br />
===== Sandvig et al. v. Sessions<ref>Lee, A. (2019). Online Research and Competition under the CFAA. Available at SSRN 3259701.</ref> =====<br />
On Mar. 27, the United States District Court of D.C. ruled that such actions should not be viewed as criminal under the statute, though it declined to weigh in on whether the professors' conduct would be protected under the First Amendment. "Without reaching this constitutional question, the Court concludes that the CFAA does not criminalize mere terms-of-service violations on consumer websites and, thus, that plaintiffs' proposed research plans are not criminal under the CFAA," U.S. District Judge John Bates wrote in his opinion.<ref>https://cases.justia.com/federal/district-courts/district-of-columbia/dcdce/1:2016cv01368/180080/24/0.pdf?ts=1522488415</ref><br />
<br />
=== Privacy Policies & Anonymity ===<br />
<br />
Code, scraping, and sock puppet audits require researchers to engage with the digital platform directly. This can be dangerous because technology companies outline in their privacy policies restrictions on how users may engage with the platform. Researchers can be flagged for breaking these policies and banned from the platform. Detection can inhibit researchers' ability to audit large, established tech companies and can lead to lawsuits and further complications when the research is published. These companies deal in data and can identify a user from their access point or device metadata, which can make many of the audit types difficult even to attempt.<br />
<br />
== References ==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=MuslimPro&diff=100139
MuslimPro
2021-04-02T11:17:06Z
<p>Nfigue: /* Islamic Apps */ a word</p>
<hr />
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application (app)] designed to assist Muslims with their religious activities, such as prayer and fasting. It is the most popular Islamic religious application available on both [https://en.wikipedia.org/wiki/Android_(operating_system) Android] and [https://en.wikipedia.org/wiki/IOS iOS], providing services to over 100 million users<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors.</ref>. Muslim Pro was created by the founder and former CEO of Bitsmedia, Erwan Mace<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims.</ref>, a French national and former Google employee in Southeast Asia.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Mace in 2010. It is developed and supported by a Singapore-based company known as Bitsmedia. The initial release consisted of prayer time reminders and assistance with reading the [https://en.wikipedia.org/wiki/Quran Holy Quran]. The app has since evolved to provide more features, such as finding nearby [https://en.wikipedia.org/wiki/Halal Halal] food. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions.</ref> As of 2019, Muslim Pro had accumulated over 100 million unique users around the world.<br />
<br />
Muslim Pro and similar religious apps were born from the need to keep religious content accessible for people who don't have time to worship in a traditional sense<ref name="rinker"/>. Although the infusion of technology is looked down on by some members of Islam<ref name="rinker"/>, the power of the internet is being used to keep Islamic practice relevant amid technological innovation<ref name="dailytime">Five most innovative islamic apps of 2018. (2018, Jun 26). Daily Times. Retrieved from https://proxy.lib.umich.edu/login?url=https://www-proquest-com.proxy.lib.umich.edu/newspapers/five-most-innovative-islamic-apps-2018/docview/2058841830/se-2?accountid=14667</ref>. Smartphone apps have filled the place where personal interaction historically reigned, causing some controversy between young and old members of religious communities. <br />
<br />
Muslim Pro became the subject of a privacy controversy due to allegations that it sent information about its users to data brokers associated with the United States military.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions.</ref><br />
<br />
==Functionality==<br />
Muslim Pro is only available on devices running the iOS or Android operating systems. The app is designed to help Muslims observe their religious practices and accommodates users based on their location. The app provides reminders about prayer times, a compass to help orient users in prayer towards the [https://en.wikipedia.org/wiki/Kaaba Ka’ba], and a digital version of the Holy Quran available in 40 different languages.<ref>Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry).</ref> The app utilizes location services to help Muslims find local halal food and the nearest mosques in their area. Muslim Pro provides assistance during the holy month of Ramadan, helping users keep track of their fasting based on their location. The app can also help Muslims plan their pilgrimage (or Hajj) and allows users to connect with other Muslims by sharing posts related to Islam on the application timeline.<ref name="features">“Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features.</ref><br />
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro utilizes ads to help cover its expenses and also offers a subscription model that allows users to avoid ads by paying a monthly fee. Subscribers receive benefits such as additional voices for Qur’an and Adhan readings, as well as access to new color schemes.<ref name="features">“Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features.</ref><br />
<br />
Muslim Pro relies on knowing the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] for most functions to work correctly. When a user opens the application, the service asks for permission to access the device's GPS. According to Muslim Pro's privacy policy, the application processes, stores, and analyzes data sent by users of the application, including personal details, location, and device-specific information<ref>Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref>.<br />
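One location-dependent feature, the qibla compass, reduces to a standard geodesy calculation: the initial great-circle bearing from the user's coordinates to the Kaaba. The sketch below illustrates that calculation using the conventional forward-azimuth formula; it is an illustration, not Muslim Pro's actual implementation.<br />

```python
from math import atan2, cos, degrees, radians, sin

KAABA_LAT, KAABA_LON = 21.4225, 39.8262  # coordinates of the Kaaba, Mecca

def qibla_bearing(lat, lon):
    """Initial great-circle bearing, in degrees clockwise from true
    north, from a user's location toward the Kaaba -- the value a
    compass-style feature would display."""
    p1, p2 = radians(lat), radians(KAABA_LAT)
    dlon = radians(KAABA_LON - lon)
    # Standard forward-azimuth formula on a spherical Earth model.
    x = sin(dlon) * cos(p2)
    y = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlon)
    return degrees(atan2(x, y)) % 360.0
```

The app would combine such a bearing with the phone's magnetometer heading to point the on-screen compass.<br />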
<br />
===World Influence===<br />
Bunt describes the positive ways in which apps assisting Muslim religious practice secure the future of the Islamic religion through modern channels. He comments on the ability to have the Qur'an easily available at any moment and its effect on keeping the religion relevant for the future. A strong point in his book is a specific example detailing how marginalized and minority Muslims gain a stronger voice in discussions interpreting their faith. He mentions [https://en.wikipedia.org/wiki/Twitter Twitter] and its prevalent use by religious leaders to reach a wider and more engaged audience through the platform<ref name="bunt">Bunt, G.R. (2018). Hashtag Islam: How Cyber-Islamic Environments Are Transforming Religious Authority. Chapel Hill: The University of North Carolina Press. Retrieved from https://muse-jhu-edu.proxy.lib.umich.edu/book/61421</ref>.<br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a category of apps, found across many religions, that help users uphold their religious commitments<ref name="rinker">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4). Retrieved from https://www.religjournal.com/pdf/ijrr12004.pdf</ref>, reports a study performed by Hughes. The research found that the majority of users of any faith-related mobile application use the software as a way to stay connected to their faith even without a direct community to be a part of. Users also described how they do not have time to attend religious services and instead remain involved on their own time whenever possible. Hughes describes how the usage of religious apps is transforming religion from an institutionalized practice into an everyday activity, keeping a younger generation engaged in practicing faith. The research concludes with a statement describing how apps disrupt the traditional importance of communal faith while offering many benefits for any generation of faith.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an exposé by Vice Motherboard identified Muslim Pro as one of many applications whose location data has been accessed by the United States military<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref>. The article details the U.S. government's purchasing of Muslim Pro user location data through [https://en.wikipedia.org/wiki/X-Mode_social X-Mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information, to assist counterterrorism activities. Another point of controversy lies in Muslim Pro not stating the true destination of user data in its privacy policy.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors.</ref><br />
<br />
The exposé also mentions other companies and applications with similar target audiences that indirectly sell data to the United States government. Dating applications such as Muslim Mingle and Iran Social have been found to sell information to data brokers such as X-Mode.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref><br />
<br />
Due to the history of the United States military using location data to assist its controversial drone strikes and counterterrorism activities, Bitsmedia has received heavy criticism and condemnation from the Muslim community, the ACLU, and many data rights activists.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors.</ref><br />
<br />
In response to the allegations, Muslim Pro has released a statement denying any involvement in selling user data to the U.S. government or to third-party data brokers such as X-Mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro.</ref><br />
<br />
===Islamic Apps===<br />
<br />
[[File:Muslimproplaystore.PNG|350px|thumbnail|right|Listing for Muslim Pro on the Google Play Store]]<br />
<br />
Muslim Pro exists among roughly 300 other Islam-centered religious apps on the [https://en.wikipedia.org/wiki/Google_Play Google Play Store].<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed states that a major issue with these apps is a lack of discoverability, because the app stores offer no proper categories in which to place religious applications. These missing identifiers make it difficult for users to discover and utilize the apps compared to other apps on mobile platforms.<br />
<br />
The research also describes another issue: the lack of verification when developing and publishing apps containing Islamic content to the Play Store. Hameed discovered that 90% of surveyed users did not know how to check that the content they were reading was authentic, and that many Islamic apps provided false information. Hameed states that this can mislead users into spreading false information, and that future work is required to construct a content review framework ensuring accurate information is presented to those who rely on the apps for worship.<br />
<br />
==References==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=MuslimPro&diff=100102
MuslimPro
2021-04-02T10:12:04Z
<p>Nfigue: /* Islamic Apps */ Added image to show the lack of fitting categories for religious apps</p>
<hr />
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application (app)] designed to assist Muslims with their religious activities, such as prayer and fasting. It is the most popular Islamic religious application available on both [https://en.wikipedia.org/wiki/Android_(operating_system) Android] and [https://en.wikipedia.org/wiki/IOS iOS], providing services to over 100 million users<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors.</ref>. Muslim Pro was created by the founder and former CEO of Bitsmedia, Erwan Mace<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims.</ref>, a French national and former Google employee in Southeast Asia.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Mace in 2010. It is developed and supported by a Singapore-based company known as Bitsmedia. The initial release consisted of prayer time reminders and assistance with reading the [https://en.wikipedia.org/wiki/Quran Holy Quran]. The app has since evolved to provide more features, such as finding nearby [https://en.wikipedia.org/wiki/Halal Halal] food. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions.</ref> As of 2019, Muslim Pro had accumulated over 100 million unique users around the world.<br />
<br />
Muslim Pro and similar religious apps were born from the need to keep religious content accessible to people who do not have time to worship in a traditional sense.<ref name="rinker"/> Although some members of the Islamic community look down on this infusion of technology,<ref name="rinker"/> the internet is being used to keep Muslim practice relevant amid technological innovation.<ref name="dailytime">Five most innovative islamic apps of 2018. (2018, Jun 26). Daily Times. Retrieved from https://proxy.lib.umich.edu/login?url=https://www-proquest-com.proxy.lib.umich.edu/newspapers/five-most-innovative-islamic-apps-2018/docview/2058841830/se-2?accountid=14667</ref> Smartphone apps have filled the place where personal interaction historically reigned, causing some controversy between younger and older members of religious communities.<br />
<br />
Muslim Pro became the subject of a privacy controversy over allegations that it sent information about its users to data brokers associated with the United States military.<ref name="guardian"/><br />
<br />
==Functionality==<br />
Muslim Pro is available only on devices running the iOS or Android operating systems. The app is designed to help Muslims observe their religious practices and accommodates users based on their location. It provides reminders about prayer times, a compass to orient prayers toward the [https://en.wikipedia.org/wiki/Kaaba Ka’ba], and a digital version of the Holy Quran available in 40 languages.<ref>Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry).</ref> The app uses location services to help Muslims find nearby halal food and the closest mosques. During the holy month of Ramadan, it helps users keep track of their fasting based on their location. The app can also help Muslims plan their pilgrimage (or Hajj) and lets users connect with other Muslims by sharing posts related to Islam on the application timeline.<ref name="features">“Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features.</ref><br />
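A qibla compass of this kind is typically derived from the user's coordinates using the standard initial great-circle bearing toward the Kaaba (approximately 21.4225° N, 39.8262° E). The sketch below illustrates that general technique only; it is not Muslim Pro's actual implementation.

```python
import math

# Approximate coordinates of the Kaaba in Mecca.
KAABA_LAT, KAABA_LON = 21.4225, 39.8262

def qibla_bearing(lat: float, lon: float) -> float:
    """Initial great-circle bearing, in degrees clockwise from true
    north, from (lat, lon) toward the Kaaba."""
    phi1, phi2 = math.radians(lat), math.radians(KAABA_LAT)
    dlon = math.radians(KAABA_LON - lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

# A point due north of the Kaaba faces due south (180 degrees).
print(round(qibla_bearing(30.0, 39.8262)))  # 180
```

An app would feed the device's GPS fix into such a function and rotate the compass needle by the difference between this bearing and the device's magnetic heading.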
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro is ad-supported and also offers a monthly subscription that removes ads. Subscribers receive benefits such as additional voices for Qur’an and Adhan readings, as well as access to new color schemes.<ref name="features"/><br />
<br />
Muslim Pro relies on the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] data for most functions to work correctly. When a user opens the application, the service asks for permission to access the device's GPS. According to Muslim Pro's privacy policy, the application processes, stores, and analyzes data sent by users of the application, including personal details, location, and device-specific information.<ref>Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref><br />
<br />
===World Influence===<br />
Bunt describes positive ways in which apps that assist Islamic religious practice secure the future of the religion through modern channels. He comments on the value of having the Qur'an easily available at any moment and its effect on keeping the religion relevant for the future. A strong point of the book is a specific example detailing how marginalized and minority Muslims gain a stronger voice in discussions interpreting their faith. Bunt also notes [https://en.wikipedia.org/wiki/Twitter Twitter] and its prevalent use by religious leaders to reach a wider and more engaged audience through the platform.<ref name="bunt">Bunt, G.R. (2018). Hashtag Islam: How Cyber-Islamic Environments Are Transforming Religious Authority. Chapel Hill: The University of North Carolina Press. Retrieved from https://muse-jhu-edu.proxy.lib.umich.edu/book/61421</ref><br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a category of apps, spanning many religions, that help users uphold their religious commitments.<ref name="rinker">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4). Retrieved from https://www.religjournal.com/pdf/ijrr12004.pdf</ref> A study performed by Hughes found that the majority of users of faith-related mobile applications use the software as a way to stay connected to their faith even without a direct community to be part of. Users also described not having time to attend religious services, instead remaining involved on their own time whenever possible. Hughes describes how religious apps, by transforming religion from an institutionalized practice into an everyday activity, are keeping a younger generation engaged in practicing faith. The research concludes that apps disrupt the traditional importance of communal faith while offering many benefits to practitioners of any generation.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an exposé by Vice Motherboard identified Muslim Pro as one of many applications whose location data has been accessed by the United States military.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref> The article details the U.S. government's purchase of Muslim Pro user location data through [https://en.wikipedia.org/wiki/X-Mode_social X-Mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information, to assist counterterrorism activities. A further point of controversy is that Muslim Pro's privacy policy did not state the true destination of user data.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors.</ref><br />
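Reporting on the location-data supply chain describes brokers aggregating per-app location "pings" keyed to a device's resettable advertising ID, which lets a buyer reconstruct an individual device's movement history even without a name attached. The record layout below is a hypothetical illustration of that general mechanism, not X-Mode's actual schema:

```python
from collections import defaultdict

# Hypothetical location pings as a broker might receive them:
# each is tied to an advertising ID rather than a name.
pings = [
    {"ad_id": "aaaa-1111", "lat": 1.3521, "lon": 103.8198, "ts": 1605000000},
    {"ad_id": "bbbb-2222", "lat": 40.7128, "lon": -74.0060, "ts": 1605000300},
    {"ad_id": "aaaa-1111", "lat": 1.3644, "lon": 103.9915, "ts": 1605003600},
]

# Grouping pings by advertising ID in timestamp order yields a
# per-device movement trace -- the core privacy concern with
# selling such data downstream.
traces = defaultdict(list)
for p in sorted(pings, key=lambda p: p["ts"]):
    traces[p["ad_id"]].append((p["lat"], p["lon"]))

print(len(traces["aaaa-1111"]))  # 2
```

Because repeated pings from one ID cluster around a home, a workplace, or a mosque, such traces are frequently re-identifiable, which is why the resale of this data drew the criticism described below.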
<br />
The exposé also mentions other companies and applications with similar target audiences that indirectly sold data to the United States government. Dating applications such as Muslim Mingle and Iran Social have been found to sell information to data brokers such as X-Mode.<ref name="vice"/><br />
<br />
Given the United States military's history of using location data to assist its controversial drone strikes and counterterrorism activities, Bitsmedia has received heavy criticism and condemnation from the Muslim community, the ACLU, and many data rights activists.<ref name="latimes"/><br />
<br />
In response to the allegations, Muslim Pro released a statement denying any involvement in selling user data to the U.S. government or to third-party data brokers such as X-Mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro.</ref><br />
<br />
===Islamic Apps===<br />
<br />
[[File:Muslimproplaystore.PNG|350px|thumbnail|right|Listing for Muslim Pro on the Google Play Store]]<br />
<br />
Muslim Pro exists among roughly 300 other Islam-centered religious apps on the [https://en.wikipedia.org/wiki/Google_Play Google Play Store].<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed states that a major issue with these apps is poor discoverability: the store lacks appropriate categories for religious apps, and these missing identifiers make it difficult for users to find and use such programs compared with other apps on the Google platform.<br />
<br />
The research also describes a lack of verification when apps containing Islamic content are developed and published to the Play Store. Hameed found that 90% of those surveyed did not know how to check whether the content they were reading was authentic, and that many Islamic apps provided false information. Hameed states that this can lead users to spread misinformation, and that future work on a content review framework is needed to ensure accurate information for those who rely on these apps for worship.<br />
<br />
==References==</div>
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application] (app) designed for purpose of assisting Muslims with their religious activities such as prayer and fasting. It is the most popular Islamic religious application available on both Android and iOS, providing services to over 100 million users<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref>. Muslim Pro was created in 2010 by the founder and former CEO of Bitsmedia, Erwan Mace<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims. </ref>, a French national and former employee of Southeast Asia Google.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Erwan Mace in 2010. It is developed and supported by a Singapore based company known as Bitsmedia. The initial release consisted of prayer time reminders and assistance with reading the Holy Quran. The app has since evolved to provide more features to the users. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet. <ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref> As of 2019, the application has accumulated over 100 million unique users around the world.<br />
<br />
Muslim Pro and similar religious apps were born from the need to keep religious content accessible for people who don't have time to worship in a traditional sense <ref name="rinker"/>. Although the infusion with technology is looked down on by some members of Islam<ref name="rinker"/>, the power of the internet is being used to ensure that Muslims stay relevant given technological innovation <ref name="dailytime">Five most innovative islamic apps of 2018. (2018, Jun 26). Daily Times Retrieved from https://proxy.lib.umich.edu/login?url=https://www-proquest-com.proxy.lib.umich.edu/newspapers/five-most-innovative-islamic-apps-2018/docview/2058841830/se-2?accountid=14667</ref>. Smartphone apps have filled the place where personal interaction historically reigned, causing some controversy between young and old member of religious communities. <br />
<br />
Muslim pro was the subject of a privacy controversy due to allegations of sending information about their users to data brokers associated with the United States military.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref><br />
<br />
==Functionality==<br />
Muslim Pro is only available on devices running the iOS or Android operating systems. The app is designed to help Muslims observe their religious practices and accommodate users based on their location. The app provides reminders to users about prayer times, a compass to help orient your position in prayers towards the Ka’ba, and a digital version of the Holy Quran available in 40 different languages.<ref> Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry). </ref> The app also utilizes location services to help Muslims find local halal food in their area. It also helps users find the nearest mosques in their local vicinity. Muslim Pro also provides assistance within the holy month of Ramadan helping users keep track of their fasting based on their location. The app can help Muslims plan out their pilgrimage (or Hajj) and allows for connecting with other Muslims on the app by sharing posts related to Islam on the application timeline. <ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro utilizes ads to help pay for their expenses and also provides the option of a subscription model which allows users to avoid ads by paying a monthly fee. Subscribers receive benefits such as additional voices for Qur’an and Adhan readings, as well as access to new color schemes. <ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
Muslim Pro relies on knowing the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] data for most functions to work correctly. When a user opens the application, the service will ask for permission to access the devices GPS. According to the privacy policy of Muslim Pro, the application processes, stores, and analyzes data sent from the users using the application including personal details, location, and device specific information <ref> Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref>.<br />
<br />
===World Influence===<br />
Bunt describes the positive ways in which Muslim religion assisting apps secure the future of the Islamic religion through modern channels. They comment on the ability of having the Qur'an available easily at any moment and it's effect on keeping the religion relevant for the future. A strong point in their book is a specific example detailing how marginalized and minority Muslim persons have a stronger voice in any discussion interpreting their faith. They mention [https://en.wikipedia.org/wiki/Twitter Twitter] and the prevalence of use by religious leaders to reach a wider and more engaged audience through the platform<ref name="bunt">Bunt, G.R. (2018). Hashtag Islam: How Cyber-Islamic Environments Are Transforming Religious Authority. Chapel Hill: The University of North Carolina Press. Retrieved from https://muse-jhu-edu.proxy.lib.umich.edu/book/61421</ref>.<br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a category along with similar apps for many other religions which help users uphold their religious commitments<ref name="rinker">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4). Retrieved from https://www.religjournal.com/pdf/ijrr12004.pdf</ref>, reports a study performed by Hughes. The research found the majority of users of any faith-related mobile application use the software as a way to stay connected to their faith even without a direct community to be a part of. Users also described how they do not have time to attend religious services and instead remain involved on their own time whenever possible. Hughes describes how the usage of religious apps are transforming religion from an institutionalized practice to an everyday activity is keeping a younger generation engaged in practicing faith. The research concludes with a statement describing how apps disrupt the traditional importance of communal faith, while offering many benefits for any generation of faith.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an expose by Vice Motherboard identified Muslim Pro as one of the many applications that have had their location data accessed by the United States military<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x. </ref>. The article provides details concerning the U.S. Governments purchasing of Muslim Pro user location data through [https://en.wikipedia.org/wiki/X-Mode_social X-mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information, to assist counter terrorism activities. Another point of controversy lies in Muslim Pro not stating the true destination of user data in their privacy policy.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
The exposé also names other companies and applications with similar target audiences that were complicit in indirectly selling data to the United States government. Dating applications such as Muslim Mingle and Iran Social were found to sell information to data brokers such as X-Mode.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref><br />
<br />
Because the United States military has a history of using location data to assist its controversial drone strikes and counterterrorism activities, Bitsmedia has received heavy criticism and condemnation from the Muslim community, the ACLU, and many data-rights activists.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
In response to the allegations, Muslim Pro released a statement denying any involvement in selling user data to the U.S. government or to any third-party data brokers such as X-Mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro. </ref><br />
<br />
===Islamic Apps===<br />
Muslim Pro exists among roughly 300 other Islam-centered religious apps on the [https://en.wikipedia.org/wiki/Google_Play Google Play Store].<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed identifies a major accessibility problem with these apps: the store offers no proper categories in which to place them. These missing identifiers make it difficult for users to discover and use such programs compared with other apps on the Google platform.<br />
<br />
The research also describes a lack of verification when apps containing Islamic content are developed and published to the Play Store. Hameed found that 90% of those surveyed did not know how to verify the authenticity of the content they were reading, and that many Islamic apps provided false information. Hameed argues that this can lead users to spread misinformation, and that a content-review framework will need to be constructed to ensure accurate information reaches those who rely on the apps for worship.<br />
<br />
==References==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=MuslimPro&diff=100028
MuslimPro
2021-04-01T23:34:45Z
<p>Nfigue: Added World Influence section</p>
<hr />
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application] (app) designed to assist Muslims with religious activities such as prayer and fasting. It is the most popular Islamic religious application on both Android and iOS, serving over 100 million users<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref>. Muslim Pro was created in 2010 by Erwan Mace<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims. </ref>, the founder and former CEO of Bitsmedia, a French national and former Google employee in Southeast Asia.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Erwan Mace in 2010 and is developed and supported by Bitsmedia, a Singapore-based company. The initial release consisted of prayer-time reminders and assistance with reading the Holy Quran, and the app has since evolved to provide many more features. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref> As of 2019, the application had accumulated over 100 million unique users around the world.<br />
<br />
Muslim Pro and similar religious apps were born from the need to keep religious content accessible to people who do not have time to worship in a traditional sense<ref name="rinker"/>. Although some members of the Islamic community look down on this infusion of technology<ref name="rinker"/>, the internet is being used to keep Muslim religious practice relevant amid technological innovation<ref name="dailytime">Five most innovative islamic apps of 2018. (2018, Jun 26). Daily Times Retrieved from https://proxy.lib.umich.edu/login?url=https://www-proquest-com.proxy.lib.umich.edu/newspapers/five-most-innovative-islamic-apps-2018/docview/2058841830/se-2?accountid=14667</ref>. Smartphone apps have filled a space where personal interaction historically reigned, causing some controversy between younger and older members of religious communities.<br />
<br />
Muslim Pro was the subject of a privacy controversy over allegations that it sent information about its users to data brokers associated with the United States military.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref><br />
<br />
==Functionality==<br />
Muslim Pro is available only on devices running the [https://en.wikipedia.org/wiki/IOS iOS] or [https://en.wikipedia.org/wiki/Android_(operating_system) Android] operating systems. The app is designed to help Muslims observe their religious practices and accommodates users based on their location. It provides prayer-time reminders, a compass that helps users orient themselves toward the Ka’ba during prayer, and a digital version of the Holy Quran available in 40 languages.<ref> Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry). </ref> The app also uses location services to help Muslims find halal food and the nearest mosques in their area. During the holy month of Ramadan, it helps users keep track of their fasting based on their location. The app can also help Muslims plan their pilgrimage (Hajj) and lets users connect with other Muslims by sharing posts related to Islam on the application timeline.<ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
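The Qibla-compass feature described above rests on a standard geodesy computation: the initial great-circle bearing from the user's coordinates toward the Ka'ba in Mecca (approximately 21.4225° N, 39.8262° E). Muslim Pro's actual implementation is not public; the following is only a minimal sketch of that well-known formula.

```python
import math

# Approximate, widely published coordinates of the Ka'ba in Mecca.
KAABA_LAT, KAABA_LON = 21.4225, 39.8262

def qibla_bearing(lat: float, lon: float) -> float:
    """Initial great-circle bearing, in degrees clockwise from true north,
    from (lat, lon) toward the Ka'ba."""
    phi1 = math.radians(lat)
    phi2 = math.radians(KAABA_LAT)
    dlon = math.radians(KAABA_LON - lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    # atan2 yields (-180, 180]; the modulo normalizes to [0, 360).
    return math.degrees(math.atan2(x, y)) % 360.0
```

A compass display then only has to compare this bearing against the heading reported by the device's magnetometer.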
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro displays ads to cover its expenses and also offers a subscription that lets users avoid ads by paying a monthly fee. Subscribers receive benefits such as additional voices for Qur’an and Adhan recitations, as well as access to new color schemes.<ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
Muslim Pro relies on the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] data for most functions to work correctly. When a user opens the application, the service asks for permission to access the device’s GPS. According to Muslim Pro’s privacy policy, the application processes, stores, and analyzes data sent by users of the application, including personal details, location, and device-specific information.<ref> Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref><br />
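Location-dependent features such as nearest-mosque or halal-food lookup illustrate why GPS access is central to the app: they reduce to a distance computation between the user's coordinates and a list of known places. The sketch below uses the standard haversine great-circle distance with made-up example data; it is an illustration of the general technique, not Muslim Pro's actual code.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points,
    assuming a spherical Earth of mean radius 6371 km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def nearest_place(user_lat, user_lon, places):
    """Return the (name, lat, lon) entry closest to the user's position."""
    return min(places, key=lambda p: haversine_km(user_lat, user_lon, p[1], p[2]))

# Hypothetical example data: one mosque near Singapore, one near Kuala Lumpur.
mosques = [("Mosque A", 1.35, 103.82), ("Mosque B", 3.14, 101.69)]
```

A real implementation would query a spatial index or a places API rather than scan a flat list, but the underlying distance comparison is the same.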
<br />
===World Influence===<br />
Bunt describes the positive ways in which apps that assist Muslim worship secure the future of Islam through modern channels. He comments on the value of having the Qur'an easily available at any moment and its effect on keeping the religion relevant for the future. A strong point of the book is a specific example detailing how marginalized and minority Muslims gain a stronger voice in discussions interpreting their faith. Bunt also notes the prevalence of religious leaders using [https://en.wikipedia.org/wiki/Twitter Twitter] to reach a wider and more engaged audience<ref name="bunt">Bunt, G.R. (2018). Hashtag Islam: How Cyber-Islamic Environments Are Transforming Religious Authority. Chapel Hill: The University of North Carolina Press. Retrieved from https://muse-jhu-edu.proxy.lib.umich.edu/book/61421</ref>.<br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a broader category of apps, spanning many religions, that help users uphold their religious commitments<ref name="rinker">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4). Retrieved from https://www.religjournal.com/pdf/ijrr12004.pdf</ref>, according to a study by Rinker. The study found that the majority of users of faith-related mobile applications use the software to stay connected to their faith even without a direct community to be part of. Users also described not having time to attend religious services, instead remaining involved on their own time whenever possible. Rinker describes how religious apps are transforming religion from an institutionalized practice into an everyday activity, which is keeping a younger generation engaged in practicing their faith. The study concludes that such apps disrupt the traditional importance of communal faith while offering benefits to the faithful of every generation.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an exposé by Vice's Motherboard identified Muslim Pro as one of many applications whose users' location data has been accessed by the United States military<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x. </ref>. The article details the U.S. government's purchase of Muslim Pro user location data, in support of counterterrorism activities, through [https://en.wikipedia.org/wiki/X-Mode_social X-Mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information. A further point of controversy is that Muslim Pro's privacy policy did not state the true destination of user data.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
The exposé also names other companies and applications with similar target audiences that were complicit in indirectly selling data to the United States government. Dating applications such as Muslim Mingle and Iran Social were found to sell information to data brokers such as X-Mode.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref><br />
<br />
Because the United States military has a history of using location data to assist its controversial drone strikes and counterterrorism activities, Bitsmedia has received heavy criticism and condemnation from the Muslim community, the ACLU, and many data-rights activists.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
In response to the allegations, Muslim Pro released a statement denying any involvement in selling user data to the U.S. government or to any third-party data brokers such as X-Mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro. </ref><br />
<br />
===Islamic Apps===<br />
Muslim Pro exists among roughly 300 other Islam-centered religious apps on the [https://en.wikipedia.org/wiki/Google_Play Google Play Store].<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed identifies a major accessibility problem with these apps: the store offers no proper categories in which to place them. These missing identifiers make it difficult for users to discover and use such programs compared with other apps on the Google platform.<br />
<br />
The research also describes a lack of verification when apps containing Islamic content are developed and published to the Play Store. Hameed found that 90% of those surveyed did not know how to verify the authenticity of the content they were reading, and that many Islamic apps provided false information. Hameed argues that this can lead users to spread misinformation, and that a content-review framework will need to be constructed to ensure accurate information reaches those who rely on the apps for worship.<br />
<br />
==References==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=MuslimPro&diff=100015
MuslimPro
2021-04-01T22:39:57Z
<p>Nfigue: added link</p>
<hr />
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application] (app) designed to assist Muslims with religious activities such as prayer and fasting. It is the most popular Islamic religious application on both Android and iOS, serving over 100 million users<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref>. Muslim Pro was created in 2010 by Erwan Mace<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims. </ref>, the founder and former CEO of Bitsmedia, a French national and former Google employee in Southeast Asia.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Erwan Mace in 2010 and is developed and supported by Bitsmedia, a Singapore-based company. The initial release consisted of prayer-time reminders and assistance with reading the Holy Quran, and the app has since evolved to provide many more features. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref> As of 2019, the application had accumulated over 100 million unique users around the world.<br />
<br />
Muslim Pro was the subject of a privacy controversy over allegations that it sent information about its users to data brokers associated with the United States military.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref><br />
<br />
==Functionality==<br />
Muslim Pro is available only on devices running the [https://en.wikipedia.org/wiki/IOS iOS] or [https://en.wikipedia.org/wiki/Android_(operating_system) Android] operating systems. The app is designed to help Muslims observe their religious practices and accommodates users based on their location. It provides prayer-time reminders, a compass that helps users orient themselves toward the Ka’ba during prayer, and a digital version of the Holy Quran available in 40 languages.<ref> Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry). </ref> The app also uses location services to help Muslims find halal food and the nearest mosques in their area. During the holy month of Ramadan, it helps users keep track of their fasting based on their location. The app can also help Muslims plan their pilgrimage (Hajj) and lets users connect with other Muslims by sharing posts related to Islam on the application timeline.<ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro displays ads to cover its expenses and also offers a subscription that lets users avoid ads by paying a monthly fee. Subscribers receive benefits such as additional voices for Qur’an and Adhan recitations, as well as access to new color schemes.<ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
Muslim Pro relies on the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] data for most functions to work correctly. When a user opens the application, the service asks for permission to access the device’s GPS. According to Muslim Pro’s privacy policy, the application processes, stores, and analyzes data sent by users of the application, including personal details, location, and device-specific information.<ref> Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref><br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a category along with similar apps for many other religions which help users uphold their religious commitments<ref name="hughes">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4)</ref>, reports a study performed by Hughes. The research found the majority of users of any faith-related mobile application use the software as a way to stay connected to their faith even without a direct community to be a part of. Users also described how they do not have time to attend religious services and instead remain involved on their own time whenever possible. Hughes describes how the usage of religious apps are transforming religion from an institutionalized practice to an everyday activity is keeping a younger generation engaged in practicing faith. The research concludes with a statement describing how apps disrupt the traditional importance of communal faith, while offering many benefits for any generation of faith.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an expose by Vice Motherboard identified Muslim Pro as one of the many applications that have had their location data accessed by the United States military<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x. </ref>. The article provides details concerning the U.S. Governments purchasing of Muslim Pro user location data through [https://en.wikipedia.org/wiki/X-Mode_social X-mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information, to assist counter terrorism activities. Another point of controversy lies in Muslim Pro not stating the true destination of user data in their privacy policy.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
The expose also mentions other companies and applications with similar target audiences that are complicit in indirectly selling data to the United States government. Dating applications such as Muslim Mingle and Iran Social have been found to sell information to data brokers such as X-mode. <ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref><br />
<br />
Due to the history of the United States military using location data to assist their controversial drone strikes and counter terrorism activities, Bitsmedia have received heavy criticism and condemnation from the Muslim community, the ACLU, and many data rights activists concerning their actions. <ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
In response to the allegations, Muslim Pro have released a statement denying any involvement in selling user data to the U.S. government or any third party data brokers such as X-mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro. </ref><br />
<br />
===Islamic Apps===<br />
Muslim Pro exists among around 300 other Islam centered religious apps on the [https://en.wikipedia.org/wiki/Google_Play Google Play Store].<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed states that a major issue regarding these apps is the lack of accessibility for users because proper categories do not exist in which to place these religious apps into. They say that these missing identifiers make it difficult for users to properly discover and utilize these programs compared to other apps on the Google platform.<br />
<br />
The research also describes another issue related to the lack of verification present when developing and publishing apps containing Islamic content to the Play Store. They discovered 90% of those surveyed reported they did not know how to inspect the content they were reading to be sure it was authentic as well as finding that many Islamic apps provided false information. Hameed states that this can mislead users to spread false information and will require work in the future to construct a content review framework to ensure accurate information is portrayed to those who rely on the apps for worship.<br />
<br />
==References==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=MuslimPro&diff=100012
MuslimPro
2021-04-01T22:22:00Z
<p>Nfigue: /* Islamic Apps */ reworded</p>
<hr />
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application] (app) designed to assist Muslims with religious activities such as prayer and fasting. It is the most popular Islamic religious application available on both Android and iOS, serving over 100 million users.<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref> Muslim Pro was created in 2010 by Erwan Mace, the founder and former CEO of Bitsmedia,<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims. </ref> a French national who previously worked for Google in Southeast Asia.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Erwan Mace in 2010 and is developed and supported by Bitsmedia, a Singapore-based company. The initial release offered prayer time reminders and assistance with reading the Holy Quran; the app has since expanded to offer many more features. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref> As of 2019, the application had accumulated over 100 million unique users around the world.<br />
<br />
<br />
Muslim Pro was the subject of a privacy controversy following allegations that it sent information about its users to data brokers associated with the United States military.<ref name="guardian" /><br />
<br />
==Functionality==<br />
Muslim Pro is only available on devices running the [https://en.wikipedia.org/wiki/IOS iOS] or [https://en.wikipedia.org/wiki/Android_(operating_system) Android] operating systems. The app is designed to help Muslims observe their religious practices, accommodating users based on their location. It provides reminders about prayer times, a compass to orient the user toward the Ka’ba during prayer, and a digital version of the Holy Quran available in 40 languages.<ref> Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry). </ref> The app also uses location services to help users find nearby halal food and mosques. During the holy month of Ramadan, it helps users keep track of their fasting based on their location. The app can also help Muslims plan their pilgrimage (Hajj) and lets users connect with other Muslims by sharing posts related to Islam on the application timeline.<ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
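Muslim Pro's own implementation is not public, but the two location-based features above rest on standard spherical geometry: the Qibla is the initial great-circle bearing from the user's position to the Ka’ba (whose widely published coordinates are roughly 21.4225° N, 39.8262° E), and a nearest-mosque lookup can rank candidates by haversine distance. A minimal sketch, with an illustrative (not real) mosque list:

```python
from math import radians, degrees, sin, cos, atan2, sqrt

# Widely published coordinates of the Ka'ba in Mecca.
KAABA_LAT, KAABA_LON = 21.4225, 39.8262

def qibla_bearing(lat: float, lon: float) -> float:
    """Initial great-circle bearing from (lat, lon) to the Ka'ba,
    in degrees clockwise from true north."""
    phi_u, phi_k = radians(lat), radians(KAABA_LAT)
    d_lon = radians(KAABA_LON - lon)
    x = sin(d_lon) * cos(phi_k)
    y = cos(phi_u) * sin(phi_k) - sin(phi_u) * cos(phi_k) * cos(d_lon)
    return degrees(atan2(x, y)) % 360.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 6371.0 * 2 * atan2(sqrt(a), sqrt(1 - a))

def nearest(user, places):
    """Return the (name, lat, lon) entry closest to the user."""
    return min(places, key=lambda p: haversine_km(*user, p[1], p[2]))

# From New York City the Qibla points roughly northeast.
print(round(qibla_bearing(40.7128, -74.0060), 1))
```

From New York City, for example, this yields a bearing of roughly 58°, pointing northeast, consistent with published Qibla directions for North America.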
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro is ad-supported and also offers a subscription that removes ads for a monthly fee. Subscribers receive benefits such as additional voices for Qur’an and Adhan readings, as well as access to new color schemes.<ref name="features" /><br />
<br />
Muslim Pro relies on the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] data for most functions to work correctly. When a user opens the application, the service asks for permission to access the device’s GPS. According to Muslim Pro’s privacy policy, the application processes, stores, and analyzes data sent by users of the application, including personal details, location, and device-specific information.<ref> Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref><br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a category of apps, spanning many religions, that help users uphold their religious commitments, according to a study by Rinker.<ref name="hughes">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4)</ref> The study found that the majority of users of faith-related mobile applications use the software to stay connected to their faith even without a direct community to be part of. Users also described not having time to attend religious services, instead remaining involved in their faith on their own time whenever possible. Rinker describes how religious apps are transforming religion from an institutionalized practice into an everyday activity, which is keeping a younger generation engaged in practicing faith. The study concludes that while such apps disrupt the traditional importance of communal faith, they offer many benefits to the faithful of any generation.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an exposé by Vice Motherboard identified Muslim Pro as one of many applications whose location data had been accessed by the United States military.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x. </ref> The article details the U.S. government’s purchase of Muslim Pro user location data, to assist counterterrorism activities, through [https://en.wikipedia.org/wiki/X-Mode_social X-mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information. A further point of controversy is that Muslim Pro’s privacy policy did not state the true destination of user data.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
The exposé also names other companies and applications with similar target audiences that were complicit in indirectly selling data to the United States government; dating applications such as Muslim Mingle and Iran Social were found to sell information to data brokers such as X-mode.<ref name="vice" /><br />
<br />
Because the United States military has a history of using location data to assist its controversial drone strikes and counterterrorism activities, Bitsmedia received heavy criticism and condemnation from the Muslim community, the ACLU, and many data rights activists.<ref name="latimes" /><br />
<br />
In response to the allegations, Muslim Pro released a statement denying any involvement in selling user data to the U.S. government or to any third-party data brokers such as X-mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro. </ref><br />
<br />
===Islamic Apps===<br />
Muslim Pro exists among roughly 300 other Islam-centered religious apps on the [https://en.wikipedia.org/wiki/Google_Play Google Play Store].<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed identifies a major accessibility problem with these apps: app stores lack proper categories in which to place them. These missing identifiers make it difficult for users to discover and use these programs compared with other apps on the Google platform.<br />
<br />
The research also describes a lack of verification when developing and publishing apps containing Islamic content to the Play Store. Ninety percent of those surveyed reported that they did not know how to check whether the content they were reading was authentic, and many Islamic apps were found to provide false information. Hameed states that this can mislead users into spreading false information, and that future work is needed to construct a content review framework ensuring accurate information for those who rely on the apps for worship.<br />
<br />
==References==</div>
Nfigue
http://si410wiki.sites.uofmhosting.net/index.php?title=MuslimPro&diff=100010
MuslimPro
2021-04-01T22:19:11Z
<p>Nfigue: Expanded Islamic Apps section</p>
<hr />
<div>Muslim Pro is a [https://en.wikipedia.org/wiki/Mobile_app mobile application] (app) designed for purpose of assisting Muslims with their religious activities such as prayer and fasting. It is the most popular Islamic religious application available on both Android and iOS, providing services to over 100 million users<ref>“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref>. Muslim Pro was created in 2010 by the founder and former CEO of Bitsmedia, Erwan Mace<ref name="edge-markets">Raj, Adeline Paul. “Tech: Growing the World's Most Popular Lifestyle App for Muslims.” The Edge Markets, 20 June 2018, www.theedgemarkets.com/article/tech-growing-worlds-most-popular-lifestyle-app-muslims. </ref>, a French national and former employee of Southeast Asia Google.<br />
<br />
[[File:Muslimpro.jpeg|150px|thumb|right|Muslim Pro icon on iOS device. <ref> Ali Baker. Muslim Pro Icon. Mar 11 2020 </ref>]]<br />
<br />
<br />
==Background==<br />
<br />
[[File:Erwan mace.jpg|100px|thumb|right|Photo of Erwan Mace. <ref>“Erwan Mace.” App Masters, 25 Mar. 2015, appmasters.com/bitsmedia-erwan-mace/bitsmedia-erwan-mace-2/.</ref>]]<br />
<br />
Muslim Pro was founded in Singapore by Erwan Mace in 2010. It is developed and supported by a Singapore based company known as Bitsmedia. The initial release consisted of prayer time reminders and assistance with reading the Holy Quran. The app has since evolved to provide more features to the users. In 2019, Erwan Mace stepped down as Bitsmedia CEO and was replaced by Louis-Bernard Carcouet. <ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref> As of 2019, the application has accumulated over 100 million unique users around the world <br />
<br />
<br />
Muslim pro was the subject of a privacy controversy due to allegations of sending information about their users to data brokers associated with the United States military.<ref name="guardian">“ACLU Files Request over Data US Collected via Muslim App Used by Millions.” The Guardian, Guardian News and Media, 3 Dec. 2020, www.theguardian.com/us-news/2020/dec/03/aclu-seeks-release-records-data-us-collected-via-muslim-app-used-millions. </ref><br />
<br />
==Functionality==<br />
Muslim Pro is only available on devices running the [https://en.wikipedia.org/wiki/IOS iOS] or [https://en.wikipedia.org/wiki/Android_(operating_system) Android] operating systems. The app is designed to help Muslims observe their religious practices and accommodate users based on their location. The app provides reminders to users about prayer times, a compass to help orient your position in prayers towards the Ka’ba, and a digital version of the Holy Quran available in 40 different languages.<ref> Pro, Muslim. Big 10 for Muslim Pro - 10 Million Global Downloads for Mobile App, 30 June 2018, www.prnewswire.com/news-releases/big-10-for-muslim-pro---10-million-global-downloads-for-mobile-app-255471731.html#:~:text=Muslim%20Pro%20is%20an%20Islamic,and%20BlackBerry%20World%20(BlackBerry). </ref> The app also utilizes location services to help Muslims find local halal food in their area. It also helps users find the nearest mosques in their local vicinity. Muslim Pro also provides assistance within the holy month of Ramadan helping users keep track of their fasting based on their location. The app can help Muslims plan out their pilgrimage (or Hajj) and allows for connecting with other Muslims on the app by sharing posts related to Islam on the application timeline. <ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
[[File:Location muslim.PNG|100px|thumb|left|Screenshot of Muslim Pro asking for user Location <ref> Ali Baker. Muslim Pro Location Screenshot. Mar 11 2020 </ref>]]<br />
Muslim Pro utilizes ads to help pay for their expenses and also provides the option of a subscription model which allows users to avoid ads by paying a monthly fee. Subscribers receive benefits such as additional voices for Qur’an and Adhan readings, as well as access to new color schemes. <ref name="features"> “Muslim Pro for IPhone and Android.” Www.muslimpro.com, www.muslimpro.com/features. </ref><br />
<br />
Muslim Pro relies on knowing the user’s [https://en.wikipedia.org/wiki/Geopositioning geolocation] data for most functions to work correctly. When a user opens the application, the service will ask for permission to access the devices GPS. According to the privacy policy of Muslim Pro, the application processes, stores, and analyzes data sent from the users using the application including personal details, location, and device specific information. <ref> Pro, Muslim. “Privacy Policy.” Muslim Pro - Help Center, 25 Mar. 2021, support.muslimpro.com/hc/en-us/articles/203485970-Privacy-Policy.</ref><br />
<br />
==Religious Apps==<br />
Muslim Pro falls into a category along with similar apps for many other religions which help users uphold their religious commitments<ref name="hughes">Rinker, C. H. (2016). Religious Apps for Smartphones and Tablets: Transforming Religious Authority and the Nature of Religion. Interdisciplinary Journal of Research on Religion, 12(4)</ref>, reports a study performed by Hughes. The research found the majority of users of any faith-related mobile application use the software as a way to stay connected to their faith even without a direct community to be a part of. Users also described how they do not have time to attend religious services and instead remain involved on their own time whenever possible. Hughes describes how the usage of religious apps are transforming religion from an institutionalized practice to an everyday activity is keeping a younger generation engaged in practicing faith. The research concludes with a statement describing how apps disrupt the traditional importance of communal faith, while offering many benefits for any generation of faith.<br />
<br />
==Ethical Concerns==<br />
===Data Selling Controversy===<br />
[[File:Vicemedia.png|thumb|right|Logo of Vice Media <ref> “Vice Media.” Vice, www.vice.com/en. </ref>]]<br />
In 2020, an exposé by Vice Motherboard identified Muslim Pro as one of many applications whose location data had been accessed by the United States military.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x. </ref> The article details the U.S. government’s purchase of Muslim Pro user location data through [https://en.wikipedia.org/wiki/X-Mode_social X-Mode], a [https://en.wikipedia.org/wiki/Information_broker data broker] focused on location information, to assist counterterrorism activities. A further point of controversy is that Muslim Pro’s privacy policy did not disclose the true destination of user data.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
The exposé also names other companies and applications with similar target audiences that indirectly sold data to the United States government. Dating applications such as Muslim Mingle and Iran Social were found to sell information to data brokers such as X-Mode.<ref name="vice">“How the U.S. Military Buys Location Data from Ordinary Apps.” VICE, www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.</ref><br />
<br />
Because the United States military has a history of using location data to support controversial drone strikes and counterterrorism operations, Bitsmedia has received heavy criticism and condemnation from the Muslim community, the ACLU, and many data rights activists.<ref name="latimes">“Muslims Reel over a Prayer App That Sold User Data: 'A Betrayal from within Our Own Community'.” Los Angeles Times, Los Angeles Times, 23 Nov. 2020, www.latimes.com/business/technology/story/2020-11-23/muslim-pro-data-location-sales-military-contractors. </ref><br />
<br />
In response to the allegations, Muslim Pro released a statement denying any involvement in selling user data to the U.S. government or to third-party data brokers such as X-Mode.<ref>Pro, Muslim. “Statement from Muslim Pro.” Muslim Pro - Help Center, 17 Dec. 2020, support.muslimpro.com/hc/en-us/articles/360052648551-Statement-from-Muslim-Pro. </ref><br />
<br />
===Islamic Apps===<br />
Muslim Pro exists among roughly 300 other Islam-centered religious apps on the Google Play Store.<ref name="hameed">Hameed, A. (2019). Survey, Analysis and Issues of Islamic Android Apps. Elkawnie Journal of Islamic Science and Technology, 5(1). Retrieved from https://jurnal.ar-raniry.ac.id/index.php/elkawnie/article/view/4541/pdf</ref> Hameed states that a major issue with these apps is poor discoverability, stemming from the lack of dedicated store categories for religious apps. These missing identifiers make it difficult for users to find and use these programs compared with other apps on the Google platform.<br />
<br />
The research also describes a second issue: the lack of verification when developing and publishing apps containing Islamic content to the Play Store. The study found that 90% of those surveyed did not know how to verify the authenticity of the content they were reading, and that many Islamic apps provided false information. Hameed states that this can lead users to spread misinformation, and calls for a content review framework to ensure accurate information is presented to those who rely on these apps for worship. <br />
<br />
== References ==</div>
Nfigue