Sensory Substitution Technology

From SI410
Revision as of 22:03, 27 January 2023 by Abdewan (Talk | contribs)


Introduction

Sensory substitution technology is a growing field of research that aims to compensate for the loss or impairment of one sense by delivering enhanced signals through another. This could take the form of devices converting visual signals to auditory cues or auditory signals to tactile patterns. A variety of ethical concerns have arisen with the widespread use of these technologies, including concerns relating to accessibility, data privacy, effectiveness, and regulation.

History

The field of sensory substitution can be traced back to American neuroscientist and University of Wisconsin professor Paul Bach-y-Rita, who viewed brain plasticity as the basis for sensory substitution. Bach-y-Rita cites the mechanisms of brain plasticity, including neurochemical, endplate, receptor, and neuronal structural changes, as what make sensory substitution possible.[1] Early examples of sensory substitution devices (SSDs) include braille and white canes, both of which provide tactile information to compensate for vision impairment. The advancement of technology has increased the capabilities of sensory substitution devices, allowing for improved quality and suitability for daily use.[2]

Substitution Technologies

Visual Substitution

Visual substitution technology is often created to help individuals with visual impairments absorb and comprehend visual information. This compensation assists with a variety of vision-demanding tasks such as reading text, comprehending visual media like photos and videos, and navigating public spaces. Modern technology has allowed visual substitution devices to take on a variety of forms, ranging from digital to wearable.

Visual to Auditory
While many early forms of visual substitution technology, like white canes, were created to aid spatial awareness via tactile feedback, the advent of digital technologies like digital cameras and machine learning software has enabled the conversion of visual cues into auditory signals.[3]
Many visual SSDs take software form. Websites that display visual media are encouraged to provide alternative (alt) text describing the contents of each image, which screen reader software can then read aloud to users.[4] Alt text should be as descriptive as possible, conveying not only the content of an image but also its significance within the surrounding context. Web browser extensions are a common way to help visually impaired people navigate web pages, but they are often inconsistent and incompatible with more complex website layouts.[5] Companies like accessiBe, UserWay, and AudioEye use artificial intelligence to improve the digital accessibility of company websites. Despite commercial success, many of these companies have received backlash from users claiming inefficiency and increased difficulty navigating web pages (see Controversies below). Some navigation aids take the form of mobile apps, like the software NavCog, which helps visually impaired individuals navigate large indoor public spaces such as universities, airports, and hospitals.[6]
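The alt-text mechanism described above can be illustrated with a short sketch of what a screen reader does when it encounters images in a page: it announces each image's alt attribute, and has nothing useful to say when the attribute is missing. The example page and the "[unlabeled image]" fallback are illustrative assumptions, not the behavior of any particular screen reader.

```python
from html.parser import HTMLParser

class AltTextReader(HTMLParser):
    """Collects the text a screen reader would announce for each image."""
    def __init__(self):
        super().__init__()
        self.announcements = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            # Missing or empty alt text leaves screen-reader users
            # with no description of the image at all.
            self.announcements.append(alt if alt else "[unlabeled image]")

page = ('<img src="chart.png" alt="Bar chart of 2023 enrollment by department">'
        '<img src="logo.png">')
reader = AltTextReader()
reader.feed(page)
print(reader.announcements)
# → ['Bar chart of 2023 enrollment by department', '[unlabeled image]']
```

The second image shows why descriptive alt text matters: without it, the visual content of the page is simply invisible to the auditory channel.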
Other visual SSDs that convert visual signals into auditory cues take physical, wearable forms, most commonly smart glasses. One of the most widely investigated vision-to-audio SSDs is the vOICe, which scans the visual environment and converts the 2D grayscale image into a “soundscape” that is played audibly to the user.[7] Another line of wireless smart glasses currently on the market is Pivothead, which uses a built-in camera to convey auditory information to the user.[8] In 2017, Microsoft released an artificial-intelligence-driven app called “Seeing AI” designed to be compatible with Pivothead SMART glasses. The combination of these two technologies allows visually impaired individuals to read text from menus and recognize denominations of dollar bills.[9]
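The vOICe's image-to-soundscape conversion follows a column-scan principle: the image is read left to right, a pixel's vertical position sets a tone's pitch (higher in the frame means higher frequency), and its brightness sets the tone's loudness. The sketch below illustrates that mapping only; the frequency range and grid size are assumptions for illustration, not the device's actual parameters.

```python
def image_to_soundscape(image, f_low=500.0, f_high=5000.0):
    """image: 2D list of grayscale values in [0, 1], row 0 at the top.
    Returns, per column, a list of (frequency_hz, amplitude) tones
    that would be played in sequence as the scan moves left to right."""
    n_rows = len(image)
    columns = []
    for col in range(len(image[0])):
        tones = []
        for row in range(n_rows):
            brightness = image[row][col]
            if brightness > 0:  # silent where the image is dark
                # Top rows map to high frequencies, bottom rows to low.
                frac = 1 - row / (n_rows - 1)
                freq = f_low + frac * (f_high - f_low)
                tones.append((round(freq), brightness))
        columns.append(tones)
    return columns

# A bright diagonal line: the listener hears a tone sweeping down in pitch.
diagonal = [[1.0 if r == c else 0.0 for c in range(3)] for r in range(3)]
print(image_to_soundscape(diagonal))
# → [[(5000, 1.0)], [(2750, 1.0)], [(500, 1.0)]]
```

With training, users learn to decode these sweeping pitch patterns back into spatial layouts, which is what makes the "soundscape" usable as a substitute visual channel.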

Visual to Tactile
Progressing from earlier aids like white canes and braille, modern technology has paved the way for a new generation of SSDs that convert visual information into tactile patterns. One of the first examples was the Tactile Vision Substitution System (TVSS) created by Paul Bach-y-Rita, which transformed visual information into physical stimuli delivered to tactile receptors on the backs of his blind subjects.[10] The BrainPort, developed in 1998, is a brain–computer interface (BCI) that converts a camera’s video feed into dynamic patterns of electrical stimulation on the surface of the tongue. The device seeks to deliver sensory information critical for human behavior while decreasing the risk of sensory overload, providing a supplemental channel for information flow to the brain.[11]
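The core signal-processing step in a BrainPort-style device is resolution reduction: a camera frame has far more pixels than the electrode array has contacts, so the frame must be pooled down to the array's grid, with each cell's brightness becoming a stimulation intensity. The sketch below shows that pooling under illustrative assumptions (a tiny 2×2 grid and a 0–100 intensity scale; the real device uses a much denser array on the tongue).

```python
def frame_to_electrode_grid(frame, grid_rows=2, grid_cols=2, max_intensity=100):
    """frame: 2D list of grayscale values in [0, 1].
    Average-pools the frame down to grid_rows x grid_cols and scales
    each cell's mean brightness to an integer stimulation intensity."""
    rows, cols = len(frame), len(frame[0])
    rh, cw = rows // grid_rows, cols // grid_cols  # pooling block size
    grid = []
    for gr in range(grid_rows):
        grid_row = []
        for gc in range(grid_cols):
            block = [frame[r][c]
                     for r in range(gr * rh, (gr + 1) * rh)
                     for c in range(gc * cw, (gc + 1) * cw)]
            mean = sum(block) / len(block)
            grid_row.append(round(mean * max_intensity))
        grid.append(grid_row)
    return grid

# A bright patch top-left and a dimmer patch bottom-right
# become strong and weak stimulation on opposite electrodes.
frame = [[1.0, 1.0, 0.0, 0.0],
         [1.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 0.5, 0.5],
         [0.0, 0.0, 0.5, 0.5]]
print(frame_to_electrode_grid(frame))  # → [[100, 0], [0, 50]]
```

The coarse grid is the trade-off the article describes: enough spatial information to support behavior, delivered on a supplemental channel with low enough bandwidth to avoid sensory overload.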

Auditory Substitution

Auditory substitution technologies, another growing field of sensory substitution, aim to replace missing auditory information for deaf and hard of hearing (DHH) individuals with increased signals to other sensory channels like sight and touch. While difficult to integrate into everyday conversation, auditory SSDs can be vital in helping hearing-impaired individuals detect critical sounds like alarms and sirens.

Auditory to Visual
Several auditory-to-visual substitution tools aid individuals in business or academic settings. Legion Scribe is a crowd-powered, real-time captioning service that aims to circumvent the need for a trained stenographer while maintaining the reliability that comes with a human transcriber. With less than 5 seconds of latency, Legion Scribe merges the input of multiple untrained participants and volunteers into an accurate, collective output – producing results that are often more accurate and complete than what one worker could have generated alone.[12]
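The crowd-merging idea can be sketched simply: each untrained typist catches only fragments of the speech, so the fragments are timestamped, pooled, ordered, and de-duplicated into one fuller caption stream. The sort-and-drop-repeats alignment below is a deliberate simplification of the real merging algorithm, and the worker data is invented for illustration.

```python
def merge_captions(workers):
    """workers: list of caption streams, each a list of (time_sec, word).
    Pools all streams, orders words by time, and collapses words that
    more than one worker caught at the same moment."""
    pooled = sorted((t, w) for stream in workers for (t, w) in stream)
    merged, last_word = [], None
    for t, word in pooled:
        if word != last_word:  # drop duplicates caught by >1 worker
            merged.append(word)
            last_word = word
    return " ".join(merged)

# Each worker misses words the other one caught.
worker_a = [(0.0, "sensory"), (1.2, "devices"), (1.9, "help")]
worker_b = [(0.5, "substitution"), (1.2, "devices"), (2.4, "users")]
print(merge_captions([worker_a, worker_b]))
# → "sensory substitution devices help users"
```

The combined stream contains words neither worker transcribed alone, which is why the merged output can beat any single untrained captioner.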

Auditory to Tactile
Taking wearable forms, numerous SSDs have arisen in the last decade to provide DHH individuals with tactile signals. The Tongue Display Unit (TDU) is a 144-channel programmable pulse generator that delivers electrotactile stimulation to the tongue through a matrix of surface electrodes to represent auditory frequencies.[13] Neuroscientist David Eagleman and his colleagues at Neosensory have developed a wearable device called “The Buzz,” which translates ambient sounds like alarms and sirens into distinct patterns of vibrations administered through the wrist-worn device.[14]
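A Buzz-style sound-to-touch mapping can be sketched as follows: the audio spectrum is split into one band per vibration motor on the wristband, and each motor vibrates with a strength proportional to the energy in its band. The four-motor layout, the peak normalization, and the siren spectrum below are all assumptions for illustration, not Neosensory's published design.

```python
def band_energies_to_motors(energies, n_motors=4, max_level=255):
    """energies: per-frequency-band energy values (lowest band first).
    Groups them into n_motors bands and scales each band's peak energy
    to an integer motor drive level in 0..max_level."""
    per_motor = len(energies) // n_motors
    peak = max(energies) or 1  # avoid dividing by zero in silence
    levels = []
    for m in range(n_motors):
        band = energies[m * per_motor:(m + 1) * per_motor]
        levels.append(round(max(band) / peak * max_level))
    return levels

# A siren concentrates its energy in the upper bands, so mostly the
# high-frequency motors fire, giving the wearer a distinct pattern.
siren = [0.0, 0.1, 0.2, 0.1, 0.8, 1.0, 0.9, 0.7]
print(band_energies_to_motors(siren))  # → [26, 51, 255, 230]
```

Because different sound sources distribute energy across the spectrum differently, each produces its own vibration pattern on the wrist, which is what lets wearers learn to distinguish an alarm from a doorbell.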

Ethical Concerns

As with any new field of technology, researchers and philosophers have considered SSDs’ implications for society and the lives of those with disabilities. The rapid growth of the field of sensory substitution, paired with exponential advancements in technologies like artificial intelligence, demands new ethical considerations at the intersection of accessibility, ableism, privacy, and regulation.

Accessibility

The complex technical nature of many important sensory substitution devices often results in high prices and difficult access for those in need of such technology, especially individuals from low-income or rural areas. This issue is aggravated by the fact that the cost of SSDs is not reimbursed through health insurance in the U.S.[15] Devices like the BrainPort, which retails for $10,000, are difficult purchases for individuals with disabilities, especially since those with disabilities have much lower rates of employment in the United States.[16]

Ableism Reinforcement

Naturally, some argue that the widespread development and encouraged use of SSDs reinforces ableist beliefs. It is argued that the presence of technologies designed to aid disabled individuals reinforces societal prejudices against people with disabilities, asserting that they are “incomplete” and require compensation from SSDs to become “fixed” or “normal.”[17] This can further marginalize individuals and exacerbate the divide between those who require the assistance of SSDs and those who do not. There is a push for SSDs to be designed more inclusively, marking a desire for products that are invented and engineered to fulfill the needs of all users, regardless of any impairments. This approach is known as “inclusive design.”

Data Privacy

Limited Effectiveness

Lack of Regulation

Controversies

Legislation

Title III of the ADA

Section 508 of the Rehabilitation Act of 1973

21st Century Communications and Video Accessibility Act (CCVA) of 2010

Section 255 of the Communications Act

Assistive Technology Act of 1998

Sources

  1. https://journals.sagepub.com/doi/abs/10.1177/136140968700100202
  2. http://www.medien.ifi.lmu.de/pubdb/publications/pub/schmidmaier2011mias/schmidmaier2011mias.pdf
  3. https://ieeexplore.ieee.org/abstract/document/6961368
  4. https://accessibility.huit.harvard.edu/describe-content-images
  5. https://link.springer.com/chapter/10.1007/978-3-319-58703-5_24
  6. https://www.cs.cmu.edu/~NavCog/navcog.html
  7. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6033394/
  8. https://www.tandfonline.com/doi/full/10.1080/08820538.2022.2152703?src=
  9. https://interestingengineering.com/innovation/microsofts-new-app-seeing-ai-describes-world-blind
  10. https://www.cs.utexas.edu/users/kuipers/readings/Bach-y-Rita-ijn-83.pdf
  11. https://www.worldscientific.com/doi/abs/10.1142/S0219635205000914
  12. https://dl.acm.org/doi/pdf/10.1145/2661334.2661352
  13. https://pubmed.ncbi.nlm.nih.gov/28748231/
  14. https://mags.acm.org/communications/january_2018/?folio=15&&pg=19#pg19
  15. https://cacm.acm.org/magazines/2018/1/223884-feeling-sounds-hearing-sights/abstract
  16. https://www.bls.gov/news.release/pdf/disabl.pdf
  17. https://cognitiveresearchjournal.springeropen.com/articles/10.1186/s41235-020-00240-7