Algorithmic Audits

From SI410
Revision as of 14:54, 12 March 2021 by Dakdlew


Background

All kinds of systems have been transformed into "smart" objects. At the core of these real-time digital services lie algorithms that provide essential functions such as sorting, segmentation, personalization, recommendation, and information management. Because technology and algorithms are so deeply integrated into daily life, ethical issues arise when those algorithms remain opaque to public scrutiny and understanding. Since the 2000s, the regulation of algorithms has become a concern for governments worldwide because of its data and privacy implications, and legislation has made progress in essential areas.

Additionally, computer science and similar degree programs have seen an increase in undergraduates, proportional to the tech industry's growth. Awareness of the infrastructure behind algorithms and digital platforms enables the public to identify ethical breaches in how these systems function, and this awareness has sparked activism for ethics in technology. For example, Joy Buolamwini identified bias in facial recognition technologies.

Legislation

Since the internet became a public commodity, the value of information has increased significantly. The US justice system has seen numerous cases involving digital privacy, ownership, plagiarism, hacking, fraud, and more. In response, laws have been developed to categorize and respond to common disputes.

Display of Information

In 1984, the US Civil Aeronautics Board (USCAB) declared that the airline sorting algorithm created by SABRE must be disclosed to participating airlines under the Display of Information provision in the Code of Federal Regulations. This regulatory provision resulted from a bias identified by airlines using SABRE's software, a practice known as "screen science": ranking and recommending airlines based on private criteria and capital incentives. The airlines noticed that American Airlines received priority placement in SABRE's search results, which led to an investigation by the USCAB and the Department of Justice.
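The "screen science" at issue can be illustrated with a toy example. The carriers, prices, and boost value below are invented for illustration; the sketch only shows how a display algorithm that claims to sort by price can quietly favor one carrier.

```python
# Toy illustration of "screen science": a display algorithm that claims to
# sort flights by price but applies a hidden boost to a favored carrier.
# All data here is hypothetical.

flights = [
    {"carrier": "AA", "price": 320},
    {"carrier": "UA", "price": 300},
    {"carrier": "DL", "price": 310},
]

def neutral_rank(flights):
    """Sort purely by price, lowest first."""
    return sorted(flights, key=lambda f: f["price"])

def biased_rank(flights, favored="AA", boost=50):
    """Sort by price minus a hidden discount for the favored carrier."""
    return sorted(
        flights,
        key=lambda f: f["price"] - (boost if f["carrier"] == favored else 0),
    )

print([f["carrier"] for f in neutral_rank(flights)])  # ['UA', 'DL', 'AA']
print([f["carrier"] for f in biased_rank(flights)])   # ['AA', 'UA', 'DL']
```

A rider on the platform sees only the final ordering, which is why disclosure requirements like the 1984 provision matter: the bias lives in the scoring function, not in the displayed prices.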

The Communications Decency Act

Section 230 of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." This section is one of the most potent active pieces of legislation concerning the internet and digital communications, and one of its defining limitations. The provision protects service providers from the legal implications of removing content deemed obscene or offensive, even constitutionally protected speech, as long as the removal is done in good faith.

US Computer Fraud and Abuse Act

The United States CFAA attaches criminal sentences to digital behaviors deemed illegal or unlawful and clearly states the repercussions for committing such offenses.

The Act states: “The CFAA prohibits intentionally accessing a computer without authorization or in excess of authorization but fails to define what ‘without authorization’ means. With harsh penalty schemes and malleable provisions, it has become a tool ripe for abuse and uses against nearly every aspect of computer activity.”

Crime / Offense                                              Years in Prison
Obtaining National Security Information                      10
Accessing a Computer and Obtaining Information               1-5
Trespassing in a Government Computer                         1
Accessing a Computer to Defraud and Obtain Value             5
Intentionally Damaging by Knowing Transmission               1-10
Recklessly Damaging by Intentional Access                    1-5
Negligently Causing Damage and Loss by Intentional Access    1
Trafficking in Passwords                                     1
Extortion Involving Computers                                5
Attempt and Conspiracy to Commit such an Offense             10

Photo Sharing Law

Types of Audits

Code Audit

A code audit applies the idea of algorithmic transparency: researchers acquire a copy of the algorithm and vet it for unethical platform behavior. This is unlikely to happen in practice because the program is considered valuable intellectual property.

Noninvasive User Audit

Researchers approach users and obtain consent to share their search queries and results with the researchers. This method is unreliable: the samples are small and non-random, making it difficult to infer causality.

Scraping Audit

Researchers issue repeated queries to a platform and observe the results, often using programs that mine data through the platform's API. This method is effective because it can gather a large quantity of testable data, but it carries legal risk: scraping can violate the CFAA, with penalties of one to ten years in prison.
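A minimal sketch of the scraping-audit loop follows. The `query_platform` function is a mocked stand-in for a real API call (which would use an HTTP library and must respect the platform's terms of service, given the CFAA risks noted above); the canned responses are invented for illustration.

```python
import collections

def query_platform(term):
    # Mocked response. A real scraping audit would issue an HTTP request
    # here; the data below is hypothetical.
    canned = {
        "flights": ["AA", "AA", "UA"],
        "hotels": ["H1", "H2", "H3"],
    }
    return canned.get(term, [])

def audit(terms, runs=3):
    """Issue each query repeatedly and tally which result appears first."""
    first_hits = collections.Counter()
    for term in terms:
        for _ in range(runs):
            results = query_platform(term)
            if results:
                first_hits[results[0]] += 1
    return first_hits

tally = audit(["flights", "hotels"])
print(tally)  # which items occupy the top slot, and how often
```

Repeating each query lets the auditor separate stable ranking behavior from noise or personalization effects, which is the core advantage of this method over one-off user reports.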

Sock Puppet Audit

Researchers use computer programs to impersonate users by creating false accounts. Sock puppets allow effective investigation of sensitive topics in difficult-to-access domains: they can probe features of systems that are not public and penetrate groups that are otherwise hard to reach. The method is still susceptible to CFAA violations, and bias is hard to detect without a large number of sock puppets testing in parallel.
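The comparison at the heart of a sock-puppet audit can be sketched as follows. The recommender, persona names, and responses are all invented for illustration; real sock-puppet accounts risk CFAA and terms-of-service violations, as noted above.

```python
def recommend(persona):
    # Mock of a personalization system that treats personas differently.
    # A real audit would log in as each false account and record results.
    if persona["location"] == "urban":
        return ["premium_offer", "standard_offer"]
    return ["standard_offer"]

def compare_personas(personas):
    """Return results keyed by persona name so differences are visible."""
    return {p["name"]: recommend(p) for p in personas}

personas = [
    {"name": "puppet_urban", "location": "urban"},
    {"name": "puppet_rural", "location": "rural"},
]
results = compare_personas(personas)
# Differing result lists across otherwise-identical personas is the
# signal a sock-puppet audit looks for.
assert results["puppet_urban"] != results["puppet_rural"]
```

Because only one attribute varies between the personas, any difference in output can be attributed to that attribute, which is why sock puppets support causal claims that noninvasive user audits cannot.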

Collaborative or Crowdsourced Audit

This approach is similar to the sock puppet audit but uses real people instead of computer programs, which circumvents CFAA and terms-of-service violations. Platforms such as Amazon's Mechanical Turk allow a large enough group of testers to produce significant results through semi-automated crowdsourcing, but the approach requires a large budget to pay the participants.
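The aggregation step of a crowdsourced audit might look like the sketch below. The worker IDs, queries, and results are invented; in practice each submission would come from a paid participant (e.g. a Mechanical Turk worker) who ran the same search and reported what they saw.

```python
import collections

# Hypothetical submissions from crowdworkers who each ran the query
# "jobs" on the audited platform and reported their top result.
submissions = [
    {"worker": "w1", "query": "jobs", "top_result": "site_a"},
    {"worker": "w2", "query": "jobs", "top_result": "site_a"},
    {"worker": "w3", "query": "jobs", "top_result": "site_b"},
]

def aggregate(submissions):
    """Count how often each result tops the list, per query."""
    counts = collections.defaultdict(collections.Counter)
    for s in submissions:
        counts[s["query"]][s["top_result"]] += 1
    return counts

counts = aggregate(submissions)
print(dict(counts["jobs"]))  # {'site_a': 2, 'site_b': 1}
```

Since each data point is gathered by a real person using their own account, the collection step stays within the platform's terms of service; the budget goes into recruiting enough workers for the tallies to be statistically meaningful.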

Ethical Implications

Court Rulings

Sandvig et al. v. Sessions (2018)

See Also