Neurotechnology
Revision as of 02:32, 10 February 2022

Neurotechnology is defined as any technology that “provides greater insight into brain or nervous system activity, or affects brain or nervous system function.” (ADD SOURCE) Neurotechnology is used in a variety of contexts including research, therapeutic processes, rehabilitation, and, more recently, gaming and self monitoring. Its potential to improve the treatment of mental and neurological disorders has made exploration into neurotechnology widespread. Researchers value its possible capabilities, from restoring motor function to facilitating brain-controlled augmented reality in recreational devices. Neurotechnology is a relatively new field (the market for neurotechnology products is projected to reach $13.3 billion in 2022)[1], and as it grows and changes, allowing the alteration of consciousness and thoughts, ethical concerns arise surrounding privacy, autonomy, liability, and security.

Background

Brain-Computer Interfaces (BCIs) connect a user's brain directly to a computer, enabling the user to act with their mind, without using their muscles and nerves. They do so by detecting and interpreting neural activity. [2] The most common measures used in neural devices to interpret brain patterns are the electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI), because both are already widely used in the medical field and are low risk. EEG tracks the electrical potential between two electrodes, and fMRI monitors changes in blood flow. [3] After many scientists had contributed to discoveries about EEG and electrical signals in the brain, Jacques Vidal published his 1973 paper “Toward Direct Brain-Computer Communication”, which detailed the brain's ability to communicate with external devices. [5] Noninvasive BCIs were later formally defined by Jonathan Wolpaw in the 1990s and began gaining traction among scientists looking to explore their potential capabilities. [4]
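As a rough illustration of how such signals are quantified (a minimal sketch on synthetic data, not drawn from the cited sources), the power of an EEG frequency band can be estimated from a periodogram:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Estimate power in a frequency band [low, high] Hz via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

# Synthetic one-second "EEG" trace: a 10 Hz alpha rhythm plus noise.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(fs)

alpha = band_power(eeg, fs, 8, 13)   # alpha band (8-13 Hz)
beta = band_power(eeg, fs, 13, 30)   # beta band (13-30 Hz)
print(alpha > beta)  # the simulated alpha rhythm dominates -> True
```

Real EEG pipelines add filtering, artifact rejection, and many channels, but band power of this kind is a common building block for the noninvasive BCIs described above.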


Types

BCIs are classified into three main types, which refer to how they interact with and interpret a user's brain waves.

Active

To utilize these devices, users “actively generate” brain signals to administer a specific command. (ADD SOURCE)
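A toy sketch of this active paradigm (hypothetical feature names and thresholds, not from the cited sources): the user deliberately modulates a measurable signal, such as alpha-band power, and the device maps the resulting feature to a command:

```python
def classify_command(alpha_power, beta_power, ratio=2.0):
    """Toy active-BCI rule: the user deliberately raises alpha power
    (e.g., by relaxing with eyes closed) to trigger a command."""
    if alpha_power > ratio * beta_power:
        return "SELECT"
    return "IDLE"

print(classify_command(alpha_power=5.0, beta_power=1.0))  # SELECT
print(classify_command(alpha_power=1.0, beta_power=1.0))  # IDLE
```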

Reactive

These devices are used by interpreting how a user's brain responds to “specific probe stimuli”. (ADD SOURCE) They control an application by interpreting the user's brain activity in response to the stimuli. (ADD SOURCE)

Passive

These types of BCIs read the user's cognitive activity, allowing them to recognize a user's emotions and mental state. (ADD SOURCE) They are not used for controlling devices or applications, but to improve the interaction between a user and a device by tracking the user's current state. (ADD SOURCE)


Current Applications

Medical Devices

Assistive Applications

These types of devices are employed to assist users who have lost function, by providing them with technological alternatives. Current forms of assistive applications include BCIs that enable users to communicate via “mental typing” and auditory assistance for those with neurodegenerative diseases that affect communication. Newer possibilities for assistive BCIs include the ability to control wheelchairs or similar devices. [6]
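The “mental typing” idea can be sketched with synthetic data (hypothetical numbers; real spellers average many EEG epochs per flashed item and use trained classifiers):

```python
import numpy as np

# Toy "mental typing" selection: average the brain's response to each
# flashed letter and pick the one with the strongest evoked response.
rng = np.random.default_rng(1)
letters = ["A", "B", "C", "D"]
target = 2  # the user attends to "C"

# 20 noisy response epochs per letter; the attended letter evokes a bump.
responses = rng.standard_normal((4, 20))
responses[target] += 3.0

averaged = responses.mean(axis=1)  # averaging suppresses the noise
print(letters[int(np.argmax(averaged))])  # the attended letter wins
```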

Rehabilitative Applications

These types of devices trigger neural plasticity, which over time allows the user's own neural pathways to recover control over a limb. They function by detecting and interpreting a user's attempt or thought of movement, then artificially moving the limb with a robotic aid or by stimulating the muscles. [7]

Civilian Devices

Direct to Consumer Products

Direct to Consumer Products (DCPs) are neural devices that can be purchased by a consumer without the involvement of a researcher or doctor. [8] A variety of neurotechnology devices are becoming available to the greater public, along with accompanying apps and accessories. DCP applications vary from gaming to self tracking medical data and improving “brain fitness and performance”. These devices usually rely on Bluetooth connections between the neural device and a home computer or machine. [9]

Emerging Applications

As further research and progress are made in neurotechnology, new applications are being explored: “biofeedback, sleep control, treatment of learning disorders, functional and stroke rehabilitation, and the use of brain signals as biomarkers for diagnosis of diseases or their progression” are some of the trending applications.[10] More recent commercial applications include neuromarketing and defense applications.[11] BCIs' ability to “induce cortical plasticity” has also led to the development of potential therapeutic applications: reducing seizures, treating ADD and ADHD, improving cognitive function in elderly patients, managing pain, and more.[12] These newer applications are being introduced to broader markets than traditional, medical BCIs.[13]


Ethical Implications

As neurotechnology is a rapidly growing field, gaining popularity and expanding beyond clinical usage, many ethical implications have arisen. Because “brain hacking” can occur at various phases of the BCI cycle, each phase raises its own distinct ethical concerns. [14]

Privacy

Historically, in the early stages of technological innovation, security risks are common because of a lack of stringent security measures integrated into the technologies and unprepared legal frameworks. Notoriously, "technology innovates faster than the regulatory system can adapt," and disruptive technological advancements can make current privacy and security norms obsolete. [15] Hackers of BCIs could gain access to sensitive or private information of the BCI user. As privacy and data protection are often valued by individuals, this type of hacking poses a threat to users of neurotechnology. [16] Private information could include financial, medical, and locational data. This information could potentially be abused not only by criminals, but by employers who use it to make hiring decisions, leading to potential hiring discrimination. Customary practices of setting privacy standards and choosing which data one is comfortable sharing will become obsolete when users aren't fully cognizant of what data is being extracted from them. [17] There are particular concerns regarding devices marketed towards the general public. Consumer devices do not have to adhere to the stringent regulations that many clinical devices do, which enables Direct to Consumer companies to share large amounts of their users' data with third parties without consent. [18]


Autonomy


If neurotechnology is hacked, an attacker could alter a user's decisions or behavior. [19] The threat of lost agency is potentially greater in clinical settings, in which patients with severe neurological disorders are vulnerable. [20] Similarly, a user may be coerced into performing involuntary actions. These threats to individual autonomy are worth noting because they create doubt about the willingness of the user to perform an action, which in turn raises liability issues: when an individual doesn't have total control over their actions, who is liable for those actions and their consequences? Some users have felt “unsure about the authenticity or authorship of their feelings and behaviors”. [21]


Hacking

As neurotechnology becomes more commonplace, the possibility of viewing and/or manipulating the cognitive information of a neurotech user emerges. [22] Neurocrime takes advantage of the neural device to access and alter cognitive information, similar to the hacking of a computer. [23] This “brain hacking” can occur at different stages of the BCI cycle. “Misusing neural devices for cybercriminal purposes may not only threaten the physical security of the users but also influence their behavior and alter their self-identification as persons.” [24] This type of neurocrime manifests in different forms, from gaining control over a user's prostheses to uncovering sensitive information by reading a user's brain signals.

Input Manipulation

A study by Ivan Martinovic at the University of Oxford showed that it was possible to extract banking information, addresses, and other sensitive information from the brain signals of BCI users. This type of hacking could facilitate fraud, password theft, identity theft, and more.[25]


Matrix of Domination

“The matrix of domination works to uphold the undue privilege of dominant groups while unfairly oppressing minoritized groups.” [26] When creating new neurotechnology, there is potential for creators to influence the data and functionalities of their devices. Thus, ethical issues arise concerning bias and equitable distribution.

“The unintended disclosure of information revealing cognitive deficits and neural signatures predictive of disorders (for example, depression or bipolar disorder), substance addiction or personality traits that the person wants to keep private can lead to discrimination and social isolation”. [27] Recently there has been apprehension regarding the use of functional brain imaging to predict future behaviors, such as whether an individual is acceptable to employ or prone to aggressive behavior. Margaret Eaton argues that using this technology to predict future behavior is both likely and dangerous.[28]

Distributive Justice

In a future in which this technology is available to consumers, not everyone would have equal access to the neurotechnology and BCIs that might enable self enhancement. [29] The allocation of improved health is always a social concern. [30] Further, in his book “The Ethics of Biomedical Enhancement”, Allen Buchanan argues that the public should not assume that the developers of neurotechnology will be inclined to distribute their products in a way that promotes justice, which will result in “unjust exclusion or domination”. [31] Thus, there are concerns that “those who already possess certain traits, attributes, and/or resources will likely and quickly get even more.”[32]


Normality

The ability to make enhancements and changes to an individual's brain raises ethical questions regarding society’s view of what is “normal” and “good”. [33] “How much can humans be enhanced without deforming or destroying aspects of the social or natural world on which life relies?” [34] Striving for all people to desire and display “normal” functioning requires a societal definition of what “normal” is and should be. People are discriminated against if they have differently functioning brains or bodies, and this is expected to extend into neurotechnology. Allowing people to drastically change their mental, sensory, and physical abilities will “likely change societal norms”. There may be pressure to use neurotechnology for self enhancement as progress in the field creates new types of discrimination. (ADD SOURCE)


Identity

“Identity encompasses the memories, experiences, relationships, and values that create one’s sense of self.” (ADD SOURCE) The ethical issue of maintaining a consistent identity when neurotechnology can alter a user's brain activity is under debate. Users of neural devices have reported feeling as if their movements and accomplishments are not their own, and that they feel artificial or robotic. Some users have felt unsure what is a result of themselves and what is a result of their neural device. [35] Research by Emily Postan, posted on the Social Science Research Network, suggests that information gleaned from monitored neural data that could be considered a “self descriptor” (such as an indication that a user is a light sleeper) can modify an individual's identity and sense of self, which may in turn alter their perceptions and actions. [36] Dr. Michael Decker found that altering brain function via brain stimulation could cause unintended changes in mental states integral to a user's personality, which can therefore affect their personal identity. (ADD SOURCE) A study conducted by Michael Frank found that brain stimulation through neural devices resulted in behavioral changes such as increased aggressiveness, quicker decision making in high-conflict conditions, and altered sexual behaviors. (ADD SOURCE) Marcello Ienca suggests implementing protection of a right to “psychological continuity” in the future to ensure continuity of an individual's thoughts, preferences, and choices.


Public Perception

There are concerns that the capabilities of BCIs and other forms of neurotechnology are being overestimated by the media and general public. A study by Eric Racine et al. found that popular media places an incorrect emphasis on the positives and on neurotechnology's supposed ability to reveal a person's true character: 67% of the articles they studied did not mention limitations, and 79% had a mostly optimistic tone. Racine et al. posit that if the limitations and risks are not understood by the broader public as neurotechnology gains commercial popularity, problems will arise when society believes that neurotechnology can accurately reveal the “true self” because the “brain can’t lie”.[37] Additionally, a case study by Allyson Purcell-Davis found that “the initial results of scientific experiments conducted within laboratories were often projected as potential cures for those with neurodegenerative disease, without any further discussion of the cost of such procedures or how long it might be before they become available.”[38]


References

  1. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved February 6, 2022.
  2. Friedrich, Orsolya (2021). "Clinical Neurotechnology meets Artificial Intelligence". Retrieved January 25, 2022.
  3. White, Susan (October 17, 2014). "The Promise of Neurotechnology in Clinical Translational Science". Retrieved February 5, 2022.
  4. Friedrich, Orsolya (2021). "Clinical Neurotechnology meets Artificial Intelligence". Retrieved January 25, 2022.
  5. Friedrich, Orsolya (2021). "Clinical Neurotechnology meets Artificial Intelligence". Retrieved January 25, 2022.
  6. Vlek, Rutger (June 2012). "Ethical Issues in Brain–Computer Interface Research, Development, and Dissemination". Retrieved January 26, 2022.
  7. Vlek, Rutger (June 2012). "Ethical Issues in Brain–Computer Interface Research, Development, and Dissemination". Retrieved January 26, 2022.
  8. Ienca, Marcello (October 13, 2019). "Direct-to-Consumer Neurotechnology: What Is It and What Is It for?". Retrieved February 6, 2022.
  9. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved February 6, 2022.
  10. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  11. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  12. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  13. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  14. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  15. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  16. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  17. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  18. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  19. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  20. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  21. Goering, Sara (March 10, 2021). "Neurotechnology ethics and relational agency". Retrieved January 24, 2022.
  22. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  23. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  24. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  25. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  26. D'Ignazio, Catherine (February 21, 2020). "Data Feminism". Retrieved February 6, 2022.
  27. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  28. Eaton, Margaret (April 2007). "Commercializing cognitive neurotechnology—the ethical terrain". Retrieved February 6, 2022.
  29. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  30. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  31. Buchanan, Allen (2011). "The Ethics of Biomedical Enhancement". Retrieved January 25, 2022.
  32. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  33. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  34. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  35. Goering, Sara (March 10, 2021). "Neurotechnology ethics and relational agency". Retrieved January 24, 2022.
  36. Postan, Emily (September 28, 2020). "Narrative Devices: Neurotechnologies, Information, and Self-Constitution". Retrieved February 1, 2022.
  37. Racine, Eric (February 1, 2005). "fMRI in the public eye". Retrieved February 5, 2022.
  38. Purcell-Davis, Allyson (April 21, 2015). "The Representations of Novel Neurotechnologies in Social Media". Retrieved February 5, 2022.