Sensory Substitution Technology


Introduction

Sensory substitution technology is a growing field of research that aims to compensate for the loss or impairment of one sense by sending heightened signals to another.[1] This can take the form of devices that convert visual signals into auditory cues or auditory signals into tactile patterns. A variety of ethical concerns have arisen with the widespread use of these technologies, including concerns about accessibility, data privacy, effectiveness, and regulation.

History

The field of sensory substitution can be traced back to American neuroscientist and University of Wisconsin professor Paul Bach-y-Rita, who viewed brain plasticity as the basis for sensory substitution. Bach-y-Rita argued that the mechanisms of brain plasticity, including neurochemical, endplate, receptor, and neuronal structural changes, are what make sensory substitution possible.[2] It is through a “human-machine interface”[3] that the brain is able to use information from an artificial receptor to substitute for the information lost from a non-functional sensory system.

Despite Bach-y-Rita’s establishment of the field, the history of sensory substitution goes back a little further. In 1897, Kazimierz Noiszewski invented the first sensory substitution device, the Elektroftalm, which used a light-sensitive selenium cell to express brightness as auditory information, allowing blind people to distinguish between light and dark spaces.[4] Early examples of sensory substitution devices (SSDs) include braille and canes, both of which provide tactile information to compensate for vision impairment. Advances in technology have increased the capabilities of sensory substitution devices, improving their quality and suitability for daily use.[5]

Substitution Technologies

Visual Substitution

Visual substitution technology is typically created to help individuals with visual impairments absorb and comprehend visual signals. This compensation assists with a variety of vision-demanding tasks, such as reading text, interpreting visual media like photos and videos, and navigating public spaces. Modern technology has allowed visual substitution devices to take on a variety of forms, ranging from digital to wearable.

Visual to Auditory
While many early forms of visual substitution technology, like canes, were created to aid spatial awareness via tactile feedback, the advent of digital technologies such as digital cameras and machine learning software has enabled the conversion of visual cues into auditory signals.[6]

Many visual SSDs take the form of software. Websites that display visual media are encouraged to include alternative (alt) text describing the contents of an image, which screen reader software can then read aloud to users.[7] Alt text should be as descriptive as possible, conveying not only the content of an image but also its significance within the surrounding argument. Web browser extensions are a common way to help visually impaired people navigate web pages, but they are often inconsistent and incompatible with more complex website layouts.[8] Companies like Accessibe, UserWay, and Audio Eye use artificial intelligence to improve the digital accessibility of company websites. Despite commercial success, many of these companies have received backlash from users, who report that the tools are ineffective and can make web pages harder to navigate. Some navigation aids take the form of mobile apps, like NavCog, which helps visually impaired individuals navigate large indoor public spaces such as universities, airports, and hospitals.[9]
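As a minimal illustration of the alt-text mechanism described above, the following Python sketch walks a hypothetical HTML fragment and collects what a screen reader might announce for each image. It relies only on the standard library; the sample markup and class name are invented for the example.

    # Minimal sketch of what a screen reader might announce for images on a page,
    # using only Python's standard library. The sample HTML is hypothetical.
    from html.parser import HTMLParser

    class AltTextExtractor(HTMLParser):
        """Collects the text a screen reader could announce for each image."""

        def __init__(self):
            super().__init__()
            self.announcements = []

        def handle_starttag(self, tag, attrs):
            if tag != "img":
                return
            alt = (dict(attrs).get("alt") or "").strip()
            # Images without descriptive alt text are announced generically,
            # the failure mode that accessibility guidelines warn against.
            self.announcements.append(f"Image: {alt}" if alt else "Unlabeled image")

    sample_html = """
    <p>Quarterly results</p>
    <img src="chart.png" alt="Bar chart showing revenue rising 12% from Q1 to Q2">
    <img src="logo.png">
    """

    parser = AltTextExtractor()
    parser.feed(sample_html)
    for announcement in parser.announcements:
        print(announcement)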

Other visual SSDs that convert visual signals into auditory cues take physical, wearable forms, most commonly smart glasses. One of the most widely investigated vision-to-audio SSDs is the vOICe, which scans the visual environment and converts a 2D grayscale image into a “soundscape” that is played back to the user.[10] Another line of wireless smart glasses currently on the market, Pivothead, uses a built-in camera to convey auditory information to the user.[11] In 2017, Microsoft released an artificial intelligence driven app called “Seeing AI”, designed to be compatible with Pivothead SMART glasses; the combination of the two technologies allows visually impaired individuals to read text from menus and recognize denominations of dollar bills.[12] Another line of smart glasses, Aira, connects the user to a human agent who assists with a specified task such as navigation, reading, or more complex activities like music lessons.[13] While human connection and conversation benefit the visually impaired user, Aira Tech Corporation has also created a feedback loop in which the hours of human assistance are fed into machine learning algorithms intended to streamline the assistance process. A more distinctive approach to smart glasses for the visually impaired is EyeMusic, which uses musical notes on a pentatonic scale to convey both shape and color information. These notes are generated from natural instruments with the aim of presenting visual information to the impaired user in a digestible, pleasant format.[14]
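The column-scanning idea behind vision-to-audio devices such as the vOICe can be illustrated with a simplified Python sketch: each image column becomes a slice of time, vertical position sets pitch, and brightness sets loudness. The sample rate, frequency range, and synthetic scene below are illustrative assumptions and do not reflect the product’s actual signal processing.

    # Simplified vOICe-style image-to-sound mapping using numpy only.
    # The duration and frequency range are illustrative assumptions.
    import numpy as np

    SAMPLE_RATE = 22050

    def image_to_soundscape(image, duration_s=1.0, f_low=500.0, f_high=5000.0):
        """Scan a 2D grayscale image (values in [0, 1]) from left to right.

        Each column becomes a short time slice; each row maps to a sine tone
        whose pitch rises toward the top of the image and whose loudness
        follows pixel brightness.
        """
        n_rows, n_cols = image.shape
        samples_per_col = int(SAMPLE_RATE * duration_s / n_cols)
        t = np.arange(samples_per_col) / SAMPLE_RATE
        freqs = np.linspace(f_high, f_low, n_rows)  # top rows -> higher pitch
        slices = []
        for col in range(n_cols):
            tones = np.sin(2 * np.pi * np.outer(freqs, t))       # (n_rows, samples)
            weighted = (image[:, col][:, None] * tones).sum(axis=0)
            slices.append(weighted / n_rows)                      # keep amplitude in [-1, 1]
        return np.concatenate(slices)

    # Tiny synthetic "scene": a bright diagonal line on a dark background.
    img = np.zeros((16, 16))
    np.fill_diagonal(img, 1.0)
    waveform = image_to_soundscape(img)
    print(waveform.shape)  # (22048,) -- ready to write to a WAV file or play back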



Visual to Tactile
Progressing beyond more primitive aids like canes and braille, modern technology has paved the way for a new generation of SSDs that convert visual information into tactile patterns. One of the first examples was the Tactile Vision Substitution System (TVSS) created by Paul Bach-y-Rita, which transformed visual information into physical stimuli delivered to tactile receptors on the backs of his blind subjects.[15] The BrainPort, developed in 1998, is a Brain Computer Interface (BCI) that converts a camera’s video feed into dynamic patterns of electrical stimulation on the surface of the tongue. Its developers seek to deliver sensory information critical for human behavior while decreasing the risk of sensory overload by providing a supplemental channel for information flow to the brain.[16]
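A rough sketch of the kind of downsampling step a BrainPort-style device might perform is shown below: a camera frame is reduced to a small grid of stimulation intensities suitable for a tongue electrode array. The 20x20 grid and the linear brightness-to-intensity mapping are assumptions for illustration, not the documented BrainPort pipeline.

    # Rough sketch of the downsampling a BrainPort-style device might perform:
    # reduce a camera frame to a small grid of stimulation intensities for a
    # tongue electrode array. Grid size and scaling are illustrative assumptions.
    import numpy as np

    def frame_to_electrode_pattern(frame, grid=(20, 20), max_level=255):
        """Average-pool a grayscale frame (2D array, 0-255) down to `grid`
        and scale each cell to an integer stimulation level 0..max_level."""
        rows, cols = grid
        h, w = frame.shape
        pattern = np.zeros(grid)
        for r in range(rows):
            for c in range(cols):
                block = frame[r * h // rows:(r + 1) * h // rows,
                              c * w // cols:(c + 1) * w // cols]
                pattern[r, c] = block.mean()
        scale = frame.max() or 1                        # avoid dividing by zero
        return np.rint(pattern / scale * max_level).astype(int)

    frame = np.random.randint(0, 256, size=(480, 640))  # stand-in camera frame
    print(frame_to_electrode_pattern(frame).shape)       # (20, 20)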


Auditory Substitution

Auditory substitution technologies, another growing area of sensory substitution, aim to replace missing auditory information for deaf and hard of hearing (DHH) individuals with heightened signals in other sensory channels such as sight and touch. While difficult to integrate into everyday conversation, auditory SSDs can be vital in helping hearing-impaired individuals detect important sounds such as alarms and sirens.

Auditory to Visual
Several auditory-to-visual substitution tools aid individuals in business or academic settings. LegionScribe is a crowd-powered, real-time captioning service that aims to circumvent the need for a trained stenographer while maintaining the reliability that comes with a human transcriber. With less than five seconds of latency, LegionScribe merges the input of multiple untrained participants and volunteers into a collective output, producing captions that are often more accurate and complete than what any one worker could have generated alone.[17]
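The core idea of merging overlapping partial captions can be illustrated with a much-simplified Python sketch. The real system aligns word sequences contributed by multiple workers; the per-second majority vote below is only a stand-in showing how partial inputs can cover one another's gaps.

    # Much-simplified sketch of merging partial captions from several non-expert
    # captionists into one stream. The real system aligns word sequences; the
    # per-second majority vote here only illustrates how overlapping partial
    # inputs can cover one another's gaps.
    from collections import Counter, defaultdict

    # Each worker produces (second, word) pairs; missing seconds are words they dropped.
    worker_streams = [
        [(0, "sensory"), (1, "substitution"), (3, "devices")],
        [(0, "sensory"), (2, "assistive"), (3, "devices")],
        [(1, "substitution"), (2, "assistive"), (3, "device")],
    ]

    def merge_captions(streams):
        votes = defaultdict(Counter)
        for stream in streams:
            for second, word in stream:
                votes[second][word] += 1
        # For each second, keep the word most workers agreed on.
        return [votes[s].most_common(1)[0][0] for s in sorted(votes)]

    print(" ".join(merge_captions(worker_streams)))
    # -> sensory substitution assistive devices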

Auditory to Tactile
Taking wearable forms, numerous SSDs have arisen in the last decade to provide DHH individuals with tactile signals. The Tongue Display Unit (TDU) is a 144-channel programmable pulse generator that delivers electrotactile stimulation to the tongue through a matrix of surface electrodes to represent auditory frequencies.[18] Neuroscientist David Eagleman and his colleagues at Neosensory have developed a wearable device called “the Buzz”, which translates ambient sounds like alarms and sirens into distinct patterns of vibration delivered through a wrist-worn device.[19]
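The sound-to-vibration mapping behind devices like the Buzz can be illustrated with a short Python sketch that splits an audio frame into frequency bands and drives one vibration motor per band. The four band edges and the intensity normalization are assumptions for illustration, not Neosensory’s published algorithm.

    # Illustrative sound-to-vibration mapping: split an audio frame into
    # frequency bands and drive one vibration motor per band. Band edges and
    # normalization are assumptions, not Neosensory's published algorithm.
    import numpy as np

    SAMPLE_RATE = 16000
    BAND_EDGES_HZ = [0, 300, 1000, 3000, 8000]  # assumed layout for four motors

    def audio_frame_to_motor_levels(frame):
        """Map one short audio frame to per-motor intensities in [0, 1]."""
        spectrum = np.abs(np.fft.rfft(frame))
        freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
        levels = np.array([
            spectrum[(freqs >= lo) & (freqs < hi)].sum()
            for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:])
        ])
        peak = levels.max()
        return levels / peak if peak > 0 else levels  # normalized motor drive

    # A 100 ms frame containing a 2 kHz "alarm" tone should light up the third motor.
    t = np.arange(int(SAMPLE_RATE * 0.1)) / SAMPLE_RATE
    alarm = np.sin(2 * np.pi * 2000 * t)
    print(np.round(audio_frame_to_motor_levels(alarm), 3))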


Ethical Concerns

As with any new field of technology, researchers and philosophers have considered SSDs’ implications for society and for the lives of those with disabilities. SSDs play an intimate but delicate role in the lives of disabled people, often serving as an intermediary between the individual and the physical world they interact with. The rapid growth of sensory substitution, paired with exponential advances in technologies like artificial intelligence, demands new ethical consideration at the intersection of accessibility, ableism, privacy, and regulation.

Accessibility

The complex technical nature of many sensory substitution devices often results in high prices and difficult access for those who need the technology, especially individuals from low-income or rural areas. This issue is aggravated by the fact that the cost of SSDs is not reimbursed through health insurance in the U.S.[20] Devices like the BrainPort, which retails for $10,000, are difficult purchases for individuals with disabilities, especially since people with disabilities have much lower rates of employment in the United States.[21] Inadequate access to assistive technologies has been found to worsen the health inequalities already faced by individuals with intellectual or physical disabilities, who are more likely to experience health inequality due to economic, social, or environmental factors.[22]

Ableism Reinforcement

Some argue that the widespread development and encouraged use of SSDs reinforce ableist beliefs. On this view, the presence of technologies designed to aid disabled individuals reinforces societal prejudices against people with disabilities by implying that they are “incomplete” and require compensation from SSDs to become “fixed” or “normal”.[23] This can further marginalize individuals and exacerbate the divide between those who require the assistance of SSDs and those who do not. Further, the cultural acceptance of sensory enhancement may form new societal subcultures, potentially creating a divide between those who are enhanced and those who remain ‘subnormal’,[24] which, given the high price point of many technologies, can exacerbate existing socioeconomic inequalities. There is a movement for SSDs to be designed more inclusively, reflecting a desire for products engineered to fulfill the needs of all users, regardless of any impairments; this approach is known as “inclusive design”.

Data Privacy

Skeptics of sensory substitution technology raise a variety of data privacy concerns. Many of these concerns originate in skepticism about the broader Internet of Things (IoT), which connects devices, places, and humans via smart sensors and widespread connectivity. The pervasive interconnection afforded by the IoT facilitates the mass collection of personal data, much of which pertains to health and activity.[25] Sensory substitution technologies in particular track, store, and analyze sensitive data about the user’s sensory activity, physical location, and health. Much of this data is collected without the user’s explicit consent and carries risks of unauthorized access.[26] There is growing unease about data collection and cloud storage with wearable technology within the scope of the IoT, with critics noting that keeping health data centralized is becoming more difficult.[27] Further, trends of surveillance capitalism commodify personal data, paving the way for predictive analytics that help businesses with personalized marketing and consumer exploitation.[28] Additionally, the input and output data collected by SSDs in use constitute rich training data for machine learning algorithms, and many companies have developed feedback loops to improve their models and develop more effective AI solutions.[29] While undoubtedly useful for the companies developing these technologies, much of this data is collected without consent and, in the eyes of many, positions users less as beneficiaries and more as test subjects.

Limited Effectiveness

Many consider the practical limitations of sensory substitution devices to raise ethical concerns of their own. It is inherently difficult to compensate for the loss of a sense, and increased information pushed into other sensory channels risks sensory overload for the user.[30] Further, limits on attentional capacity make it imperative that these devices focus on the information necessary to accomplish the desired outcome. Critics argue that there is often a discrepancy between the functional promises of sensory substitution companies and the actual capabilities of the devices on the market.[31] Researchers have also criticized the steep learning curve associated with many of these devices, which makes them impractical financial decisions for many individuals.[32] For visual-to-tactile devices like the BrainPort, researchers have found that minimal training of 300 trials does not prepare users sufficiently for successful object recognition.[33] It has also been found that some visual-to-auditory devices, like the vOICe, can produce unpleasant noises in more complex settings, creating a frustrating learning period for the user.[34]

Further, there is disillusionment with the asymmetric focus on ambitious forms of sensory substitution that allow visually impaired individuals to perform complex tasks in physical space, as opposed to fully functional, well-integrated technology that aids blind users with even the simplest tasks, like navigating webpages. According to the 2021 State of Accessibility Report (SOAR), more than 90% of the world’s websites do not meet the minimum accessibility requirements established by the standards of the World Wide Web Consortium (W3C).[35]

Legislation

A variety of legislation has been passed to address concerns regarding accessibility, rehabilitation, and assistive technology. On July 26, 1990, President George H.W. Bush signed the Americans with Disabilities Act (ADA) into law, marking the arrival of a comprehensive civil rights law prohibiting discrimination on the basis of disability.[36] Title III of the act specifically prohibits such discrimination in places of public accommodation like restaurants, schools, movie theaters, day care facilities, and businesses generally open to the public. This law set the foundation for the accessibility mandates and regulations that are common today. On February 8, 1996, Section 255 of the Telecommunications Act was signed into law, requiring telecommunications products and services to be accessible to people with disabilities. This includes all wired and wireless communication devices such as telephones and computers. The act specifies that this access be “readily achievable”, meaning easily attainable without added difficulty or expense.[37] In 1998, Congress amended the Rehabilitation Act of 1973 to mandate that federal agencies make their electronic and information technology accessible to people with disabilities.[38] This rule was enacted in response to growing technological innovation and has been amended and updated since. The Assistive Technology Act of 1998, signed on October 9 of that year, established a system of funding for increased accessibility, including continuity grants for assistive technology for individuals with disabilities and for states that had received less than ten years of funding under the Technology-Related Assistance for Individuals with Disabilities Act of 1988. It also supported grants for small businesses to work with state-funded entities on the development and implementation of assistive technologies.[39] On October 8, 2010, President Barack Obama signed the Twenty-First Century Communications and Video Accessibility Act (CVAA) into law, updating federal communications law to increase access to modern communications and bringing older laws up to date with 21st-century technologies, including digital, broadband, and mobile innovations.[40] This included mandating closed captioning options for programmed media and accessibility for electronic messaging.

Sources

  1. https://www.sciencedirect.com/science/article/pii/S1364661303002900?casa_token=0bErxwzBRNAAAAAA:Hztl0AVdfvAxAxeQI_JKb-Vr_dkpEGS4IIBjOdPwdbgjnrmOgZjdMWBLpDv6C_vuRYFZe-s
  2. https://journals.sagepub.com/doi/abs/10.1177/136140968700100202
  3. https://www.sciencedirect.com/science/article/pii/S1364661303002900?casa_token=0bErxwzBRNAAAAAA:Hztl0AVdfvAxAxeQI_JKb-Vr_dkpEGS4IIBjOdPwdbgjnrmOgZjdMWBLpDv6C_vuRYFZe-s
  4. http://www.medien.ifi.lmu.de/pubdb/publications/pub/schmidmaier2011mias/schmidmaier2011mias.pdf
  5. http://www.medien.ifi.lmu.de/pubdb/publications/pub/schmidmaier2011mias/schmidmaier2011mias.pdf
  6. https://ieeexplore.ieee.org/abstract/document/6961368
  7. https://accessibility.huit.harvard.edu/describe-content-images
  8. https://link.springer.com/chapter/10.1007/978-3-319-58703-5_24
  9. https://www.cs.cmu.edu/~NavCog/navcog.html
  10. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6033394/
  11. https://www.tandfonline.com/doi/full/10.1080/08820538.2022.2152703?src=
  12. https://interestingengineering.com/innovation/microsofts-new-app-seeing-ai-describes-world-blind
  13. https://tvst.arvojournals.org/article.aspx?articleid=2763503
  14. https://content.iospress.com/articles/restorative-neurology-and-neuroscience/rnn130338
  15. https://www.cs.utexas.edu/users/kuipers/readings/Bach-y-Rita-ijn-83.pdf
  16. https://www.worldscientific.com/doi/abs/10.1142/S0219635205000914
  17. https://dl.acm.org/doi/pdf/10.1145/2661334.2661352
  18. https://pubmed.ncbi.nlm.nih.gov/28748231/
  19. https://mags.acm.org/communications/january_2018/?folio=15&&pg=19#pg19
  20. https://cacm.acm.org/magazines/2018/1/223884-feeling-sounds-hearing-sights/abstract
  21. https://www.bls.gov/news.release/pdf/disabl.pdf
  22. https://onlinelibrary.wiley.com/doi/full/10.1111/jir.12532
  23. https://cognitiveresearchjournal.springeropen.com/articles/10.1186/s41235-020-00240-7
  24. https://link.springer.com/referenceworkentry/10.1007/978-94-007-4707-4_46
  25. https://www.sciencedirect.com/science/article/pii/S2405896316325174
  26. https://link.springer.com/referenceworkentry/10.1007/978-94-007-4707-4_46
  27. https://deliverypdf.ssrn.com/delivery.php?ID=644113004017114121118014083105092097029078004022011049090083082098124116027003100126119126029119053124101021115088117105001070122047012042084064076125071094113007092007079067090091086095095096012018114090112082073026000113086021073003101102111096083098&EXT=pdf&INDEX=TRUE
  28. https://brooklynworks.brooklaw.edu/bjcfcl/vol15/iss2/8
  29. https://ieeexplore.ieee.org/document/8448514
  30. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5044782/
  31. https://academic.oup.com/british-academy-scholarship-online/book/31069/chapter-abstract/264066864?redirectedFrom=fulltext
  32. https://brill.com/view/journals/msr/29/8/article-p773_5.xml
  33. https://brill.com/view/journals/msr/29/8/article-p773_5.xml
  34. https://link.springer.com/article/10.1007/s12193-021-00376-w
  35. https://pages.diamond.la/hubfs/SOAR%20/SOAR%202021/FINAL%20DOCUMENTS/2021%20Diamond%20State%20of%20Accessibility%20Report.pdf?utm_campaign=SOAR%202021&utm_medium=email&_hsmi=192554077&_hsenc=p2ANqtz-9eUfm2x_IqUaoFCq9hd-yqpJzFeU_KtrJXcAiS-Oe3HIvLRhCCzMwK3S7UMoqhGdLyaS-ZQLpswJ3Rdnho4Y6kxOJUz4UtJMIZ5E-uZl_XNj-MrCs&utm_content=192554077&utm_source=hs_automation
  36. https://www.ada.gov/law-and-regs/title-iii-regulations/#:~:text=Title%20III%2C%20which%20this%20rule,%2C%20schools%2C%20day%20care%20facilities%2C
  37. https://www.access-board.gov/ict/guide/2555_guide.md.html#:~:text=Section%20255%20of%20the%20Communications,without%20much%20difficulty%20or%20expense.
  38. https://www.section508.gov/manage/laws-and-policies/#:~:text=Section%20255%20of%20the%20Communications%20Act%20%2D%20Requires%20telecommunications%20products%20and,field%20for%20individuals%20with%20disabilities
  39. https://www.congress.gov/bill/105th-congress/senate-bill/2432
  40. https://www.fcc.gov/consumers/guides/21st-century-communications-and-video-accessibility-act-cvaa