Value Sensitive Design

From SI410

Value Sensitive Design, or VSD, is a theoretically grounded approach to design that accounts for human values in a principled and comprehensive manner[1]. The term stems from the interest of designing technology and information systems that support enduring human values. Value Sensitive Design is applicable to many different types of design, including the design of online and offline information architectures.


This concept was developed by Batya Friedman and Peter Kahn at the University of Washington Information School in the late 1980s.[2] Friedman later founded the Value Sensitive Design (VSD) Research Group at the University of Washington in 1999, bringing together faculty, students, and stakeholders who wanted to design and develop new technologies created with human-centered values in mind[3].

Mechanics of Value Sensitive Design

Conceptual Investigations

A conceptual investigation involves conducting research on how stakeholders are affected by the design of a product, both directly and indirectly. A conceptual analysis of the value itself is completed as well, and is used to determine if a value is moral or non-moral. This means looking at what kind of values should be implemented in the design, as well as how certain values themselves should be used within the information system[1].

Empirical Investigations

Empirical Investigations are used to evaluate the success of a particular design. Many researchers who conduct empirical investigations use the same methods used in social science research, including surveys, raw observations, interviews, literature reviews, and experimental manipulations. User behavior and human physiology are also studied heavily[1].

For example, with UrbanSim, an urban development simulation (described below), the empirical investigation includes seeing whether the design has captured the range of possible values held by stakeholders, or whether the designers should seek to include additional values. This might mean assessing the importance of neighborhood stability; if it is a major factor, then the simulation should account for it[4].

Technical Investigations

Technical Investigations involve researching how already existing technologies either support or hinder human values[1]. After this research is conducted, the design of a new system can begin to support the chosen value identified in the initial conceptual investigation. Researchers keep in mind the improvements that could be made to the already existing technology and implement this into the design of the new system.

With the same UrbanSim simulation, the technical investigation will involve devising ways to consider neighborhood stability in the simulation. Stakeholders should be allowed to evaluate different scenarios based on this value. Therefore, the technical investigation should tell the designers whether they have achieved this goal and successfully implemented the value correctly within the system.[5]



Pluralism

Pluralism is a condition or system in which two or more states, groups, principles, or sources of authority coexist[6]. Casey Fiesler describes pluralist design as designing to resist a single dominant point of view, attending to issues of accessibility, diversity, and inclusivity[7]. A key challenge in design is deciding "whose values" apply, especially from a pluralist mindset, since different communities can hold competing values.[7] At the same time, pluralism allows the designer to avoid thinking of the user as one-dimensional.

Virtue Ethics

The roots of virtue ethics can be traced back to Aristotle and Plato, and the tradition was taken up by many later thinkers, including Friedrich Nietzsche. It is an ethical framework that emphasizes virtuous character and behavior, as opposed to deontological or consequentialist ethics.[8] It is often considered a practical ethical system with far-reaching applications in the real world, and many philosophers today argue that it is compatible with a pluralist society. Aristotle argues that an individual becomes virtuous by performing virtuous acts.[9] The philosopher Shannon Vallor argues that while this is true, individuals also require an environment conducive to such behavior; the responsibility for creating a virtuous, productive environment therefore lies with the system and the designers of that system.[10] Value sensitive design aims to do just this.

Examples of Value Sensitive Design

Manor House Motel

Gerald Foos is the former owner of the Manor House Motel. For 30 years he engaged in voyeurism, surveilling guests of the motel and recording everything he saw and heard. He deliberately manipulated the design of the motel so that he could continue this voyeurism without getting caught: he cut 4 x 6 holes in the ceiling of each room and covered them with aluminum so they appeared to be ventilation vents, while in reality he sat in the attic and recorded his observations of each room[11]. Foos's situation relates to value sensitive design because he designed this motel "surveillance" system to support his prized value, voyeurism, carefully manipulating the affordances of the space to avoid detection. This example of value sensitive design is controversial because of the debate over whether voyeurism is truly a moral value.


Reddit

The theoretical framework used in the design of the popular social media site Reddit is an example of a design choice that directly influences the culture and climate of activity on the site. The design of the Reddit platform is linked to karma, a point system that represents how much Redditors value a particular account's contributions. When a user posts or comments on another post, the contribution is accompanied by a total point score consisting of a certain number of upvotes and downvotes. What one considers worthy of an upvote or downvote is a matter of opinion.
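The point system described above can be sketched in a few lines. This is an illustrative simplification, not Reddit's actual scoring code (Reddit also fuzzes and ranks scores): each contribution's score is its upvotes minus its downvotes, and an account's karma aggregates the scores of its contributions.

```python
# Illustrative karma-style point system (hypothetical, not Reddit's code).

class Contribution:
    """A post or comment with community votes attached."""
    def __init__(self, author, upvotes=0, downvotes=0):
        self.author = author
        self.upvotes = upvotes
        self.downvotes = downvotes

    def score(self):
        # A contribution's visible score: upvotes minus downvotes.
        return self.upvotes - self.downvotes

def karma(account, contributions):
    """Total points the community has awarded this account's contributions."""
    return sum(c.score() for c in contributions if c.author == account)

posts = [Contribution("alice", upvotes=10, downvotes=3),
         Contribution("alice", upvotes=2, downvotes=4),
         Contribution("bob", upvotes=5, downvotes=1)]
print(karma("alice", posts))  # 5
```

The design choice worth noticing is that the system only tallies opinions; what earns an upvote or a downvote is left entirely to the community's values.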


UrbanSim

UrbanSim is a simulation package whose purpose is to assist stakeholders in making decisions about city planning by providing a way to analyze development patterns in urban areas. It can model various aspects of development such as transportation usage, resource consumption, environmental impact, and overall growth. Built into the simulation is a distinction between "explicitly supported values" and "stakeholder values." A value explicitly supported by the simulation is the ability for each stakeholder to assess scenarios in ways that are important to that specific stakeholder. Other values incorporated into the design include transparency, accountability, support for the democratic process, and freedom from bias[12]. These values are important so that the system does not privilege one mode of transportation or environmental measure over another. Since different stakeholders may hold different, even conflicting values, it is important that the infrastructure of the simulation does not choose or weigh those values for the user[13].
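The explicitly supported value described above can be sketched as follows. This is a hypothetical illustration, not UrbanSim's actual API: the simulation reports neutral indicators, and each stakeholder selects which indicators matter to them, so the infrastructure itself never weighs one value over another.

```python
# Hypothetical sketch of stakeholder-selected indicators (indicator names
# and numbers are invented for illustration).

scenario = {"transit_trips": 120_000, "car_trips": 300_000,
            "open_space_acres": 450, "new_housing_units": 2_000}

def evaluate(scenario, indicators):
    """Report only the indicators a given stakeholder cares about.

    The function applies no weighting of its own; choosing which
    indicators to inspect is left entirely to the stakeholder.
    """
    return {name: scenario[name] for name in indicators}

# Different stakeholders assess the same scenario by their own values:
environmentalist_view = evaluate(scenario, ["open_space_acres", "transit_trips"])
developer_view = evaluate(scenario, ["new_housing_units", "car_trips"])
```

The design point is that conflicting values can coexist over one shared model because the selection of indicators, not the model, carries the value judgment.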

Ethical Implications

The VSD Process

In the Value Sensitive Design process, much depends on the selection of which moral values should or should not be used in the creation of a particular design. Designers are responsible for the values they implement into the technology they are designing, and a certain amount of moral agency is required for values to be chosen: one must decide which values are worth including in the design and which are not. There is no certainty that a given non-moral value can be distinguished from a moral one; this requires judgment and room for discussion and disagreement[14].

If we want our information systems and new technologies to be fair, safe, and just, then we must ensure that our good intentions are heard and implemented by the designers of these technologies. Van Der Hoven argues that, to determine whether a technology is practicing value sensitive design, the properties of the technology must demonstrate these morally desirable affordances and must be able to motivate political choices and justify investments from this value perspective[15]. It must be "verified" that the designed system does indeed embody the chosen value. However, some argue that technology itself cannot fully embody a human value: the same technology may embody different values in different cultural contexts, while different technologies with the same functions may embody totally different values[14].

Deciding how to use a chosen value to shape a particular design can be taken in many different directions, requiring a blend of social sciences, engineering, and design. It is a challenge to translate a chosen value into tangible design requirements, the steps that must be taken to achieve the desired features of the designed system[14]. When making choices between conflicting values, there is also a chance that the two values being compared are incommensurable; deliberation and reasoning about the disputed value(s) are required before a decision is made[14].

It is also important for design to accommodate accessibility. Accessibility within design[16] refers to users' ability to actually use the products being designed, which can be difficult to account for with regard to disabled individuals. Blindness, deafness, and other disabilities must be considered within design so as not to alienate a subsection of the user population. Examples of accessible design include wheelchair ramps, screen readers, and subtitles.

Due to the nature of value sensitive design, it is easy to see how things could go ethically wrong. As a rapidly developing concept, VSD does not yet have a fully constructed formula to ensure that implemented values are justified, and it is difficult to tell where to draw the line in value decision-making. Because of this, "VSD practitioners risk attending to an unprincipled or unbounded set of values."[17] To alleviate some of the conflict that can occur, it is critical that designers share with VSD users any values that may be embedded in their systems; this gives users a clearer basis for choosing whether or not to utilize the design.[17] Naomi Jacobs et al. propose three methods with the potential to better regulate value sensitive design: top-down, in which basic rules are applied to specific instances; bottom-up, which applies "moral certitude" from specific cases; and a mid-level approach, which aims to create more general guidelines for deciding whether values breach ethical bounds.[17]


Designing for Privacy

Digital technology has marked a shift in the protection of one's privacy from physical barriers to online architectural barriers. There is a distinct difference between designing to protect one's privacy in private and protecting one's privacy in public. Within the public infosphere, certain regulations are in place to protect someone's personal information from being used or exploited[18][19]. The concern with privacy relates to preserving one's reputation, keeping opportunities available, and a general conception of a good life. Designing for privacy means creating information systems and technology that embody the notion of privacy: designs that make an effort to protect people's private information and allow others to access information about a person only at that person's discretion.

When one steps into the public infosphere, although one may not be protected legally, one is protected by the friction created within the informational architecture, which places a high cost on gathering or using facts about the individual[18]. The architectural code in digital technologies has the ability to make more online and offline behavior monitorable and searchable[18]. Larry Lessig argues that we live in an ongoing technological revolution in which information is being processed by architecture in ways that make people both more monitorable and more searchable. Examples exist in everyday life, such as the internet, email, voice, and cookies. One example of designing for privacy is Facebook's privacy features: when an individual posts a photo, Facebook allows that person to choose who is allowed to see it.
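The per-photo audience control described above can be sketched as a simple access check. This is a hypothetical illustration in the spirit of Facebook's photo privacy settings, not its actual implementation: the poster chooses who may view each photo, and the system enforces that choice on every access.

```python
# Hypothetical sketch of per-item audience controls (names are invented).

class Photo:
    def __init__(self, owner, audience):
        self.owner = owner
        self.audience = set(audience)  # accounts the owner permits to view

    def visible_to(self, viewer):
        # The owner always sees their own photo; everyone else must be
        # in the audience the owner chose at posting time.
        return viewer == self.owner or viewer in self.audience

photo = Photo(owner="ana", audience={"ben", "cho"})
print(photo.visible_to("ben"))   # True
print(photo.visible_to("dana"))  # False
```

The value of privacy is embodied here as a default: access is denied unless the person the information is about has granted it.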


Bias in Computer Systems

Dorian Warner, who was interviewed by ProPublica as part of its TSA discrimination piece, says that she is often searched at the airport due to her thick hair.

A biased computer system is one that unfairly discriminates against certain groups or individuals[20]. If a system denies an opportunity or service to, or assigns an unwanted outcome to, an individual or group on unreasonable grounds, then the system can be considered to have been designed with bias. Bias can arise from three different origins. Preexisting biases, whether individual or societal, arise from values and attitudes harbored prior to the design of the system[20]. An example would be computer games intended to look more attractive to young boys than to young girls; here the bias in the system design stems from broader societal attitudes toward gender norms.

The second type of bias is technical bias, which results from technical constraints. The code and wireframing systems used to create a computer system are not inherently linked to bias, but they have the potential to create bias when used to design systems[21]. One case of technical bias occurred in a 2015 incident in which soap dispensers at an Atlanta Marriott hotel worked on white people's hands but not on black people's hands, because the infrared reflection technology did not register darker skin tones[22]. Technical bias is not limited to physical designs but also extends to algorithms. Similar to the soap dispenser, algorithms can show bias against minorities depending on what has been searched. In his article Corrupt Personalization[23], Christian Sandvig notes that "algorithms are dangerous" and that, because of the bias in algorithms, "your attention is drawn to interests that are not your own." Minorities may be unable to relate to search results (say, on a given health-related illness) and are pigeonholed into looking at content that does not represent them or people like them who have had these health problems. Despite all the data collected on a particular user, search engines can still fail to personalize results.
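The soap-dispenser failure can be modeled as a fixed detection threshold. The numbers below are invented for illustration; the point is how a cutoff calibrated only on high-reflectance skin produces technical bias without any biased intent in the code itself.

```python
# Illustrative model of threshold-based technical bias (all values are
# hypothetical, chosen only to demonstrate the failure mode).

THRESHOLD = 0.5  # assumed calibration cutoff for reflected IR intensity

def hand_detected(reflectance):
    """Return True if the sensor registers a hand.

    `reflectance` is the fraction of emitted infrared light bounced
    back to the sensor (0.0-1.0). Darker skin reflects less IR, so a
    threshold tuned on light skin silently fails for darker skin.
    """
    return reflectance > THRESHOLD

# The same hardware behaves differently across users:
print(hand_detected(0.8))  # high reflectance -> True
print(hand_detected(0.3))  # low reflectance -> False
```

A value-sensitive remedy is empirical: test the calibration against the full range of users before fixing the threshold, rather than after deployment.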

The third type of bias apparent in value sensitive design is emergent bias, which arises when a system is used within a social context different from the one the designer intended, so that the values held by stakeholders are not supported in the way the designer meant. A case of emergent bias was described in an April 2019 report that found the Transportation Security Administration's full-body airport scanners to be discriminatory against women of color as well as minorities who wear head garments. The TSA's millimeter-wave scanners can detect non-metallic objects but cannot determine what such objects are. As a result, the thick hair of women of color, as well as headscarves and turbans, frequently registers on the machines and subjects the wearers to disproportionately high rates of full-body patdowns at airport security[24].


References

  1. Himma, Kenneth Einar; Tavani, Herman T., eds. (2008). The Handbook of Information and Computer Ethics. John Wiley & Sons Inc. ISBN 978-0-471-79959-7. Retrieved 8 July 2016.
  2. Friedman, Batya, Peter H. Kahn, and Alan Borning. "Value Sensitive Design and Information Systems." The Handbook of Information and Computer Ethics (2008): 69-101.
  3. "Value Sensitive Design." Wikipedia, Wikimedia Foundation, 10 Feb. 2019.
  4. Waddell, P. (in press). "UrbanSim: Modeling Urban Development for Land Use, Transportation, and Environmental Planning." Journal of the American Planning Association.
  5. Waddell, P. (in press). "UrbanSim: Modeling Urban Development for Land Use, Transportation, and Environmental Planning." Journal of the American Planning Association.
  6. "Pluralism | Definition of Pluralism in English by Oxford Dictionaries." Oxford Dictionaries.
  7. Fiesler, Casey. "An Archive of Their Own: A Case Study of Feminist HCI and Values in Design (CHI 2016)." 16 June 2016.
  8. Hursthouse, Rosalind and Pettigrove, Glen. "Virtue Ethics." The Stanford Encyclopedia of Philosophy, 2018.
  9. "Virtue Ethics." Internet Encyclopedia of Philosophy.
  10. Vallor, Shannon. "Social Networking Technology and the Virtues." Springer Science+Business Media B.V., 1 August 2009.
  11. Talese, Gay. "The Voyeur's Motel." The New Yorker, 26 Feb. 2019.
  12. Friedman, Batya, and Alan Borning. "Value Sensitive Design as a Pattern: Examples from Informed Consent in Web Browsers and from Urban Simulation." Information School and Department of Computer Science & Engineering, University of Washington, 2009, pp. 1-4.
  13. Waddell, P. (in press). "UrbanSim: Modeling Urban Development for Land Use, Transportation, and Environmental Planning." Journal of the American Planning Association.
  14. Poel, Ibo van de (2010). "Value-Sensitive Design: Four Challenges."
  15. Van Der Hoven, Jeroen. "Moral Methodology and Information Technology." The Handbook of Information and Computer Ethics, Ch. 3.
  16. "What Is Accessibility?" The Interaction Design Foundation.
  17. Jacobs, Naomi. "Why Value Sensitive Design Needs Ethical Commitments." Ethics and Information Technology, 2018.
  18. Lessig, Larry. "Code Is Law" (1999).
  19. Doyle, Tony. "Privacy and Perfect Voyeurism." 27 May 2009.
  20. Friedman, Batya and Nissenbaum, Helen. "Bias in Computer Systems." July 1996.
  21. Brey, Philip. "Values in Technology and Disclosive Computer Ethics."
  22. Plenke, Max. "The Reason This 'Racist Soap Dispenser' Doesn't Work on Black Skin."
  23. Sandvig, Christian. "Corrupt Personalization." Social Media Collective, 26 June 2014.
  24. Medina, Brenda. "TSA Agents Say They're Not Discriminating Against Black Women, But Their Body Scanners Might Be." ProPublica.