Neurotechnology

[An artist's visualization of neurotechnology. Image: Axios]
Neurotechnology is defined as any technology that “provides greater insight into brain or nervous system activity, or affects brain or nervous system function.”[1] It is used in a variety of contexts including research, therapeutic processes, rehabilitation, and, more recently, gaming and self-monitoring. Neural devices' potential to improve the treatment of mental and neurological disorders has made exploration into neurotechnology widespread, as researchers analyze capabilities ranging from restoring motor function to facilitating brain-controlled augmented reality in recreational devices. Neurotechnology is a relatively new field, and as it grows to enable the alteration of consciousness and thought, ethical concerns arise surrounding privacy, autonomy, liability, security, and more.

Background

Noninvasive neurotechnologies, first formally defined by Jonathan Wolpaw in the 1990s, began gaining traction amongst the scientific community, which looked to explore their potential capabilities.[2] Wolpaw's definition followed the work of many scientists who had contributed to discoveries of EEGs and electrical signals in the brain, notably Jacques Vidal, whose 1973 paper “Toward Direct Brain-Computer Communications” detailed the brain's ability to communicate with external devices.[3] These devices do so by detecting and interpreting neural activity.[4] The most common measures used in neural devices to interpret brain patterns are the electroencephalogram (EEG) and functional magnetic resonance imaging (fMRI), because both are already widely used in the medical field and are low risk: EEG tracks the electrical potential between two electrodes, while fMRI monitors changes in blood flow.[5] Since the discovery that a user's brain can be connected directly to a computer, the field has progressed rapidly, creating new ways for users to act with their minds without using muscles and nerves. Neural devices now come in a variety of forms and functions, among them implants, monitoring devices that use electrodes attached to the scalp, and physical devices such as brain-powered limbs. Researchers in the medical and entertainment fields alike are interested in harnessing the capabilities of neural devices, especially after studies such as one conducted by Davimar Borducchi found that neurotechnologies have the potential to improve sports performance and academic success, and even to offer military advantages.[6] The market for neurotechnology products is projected to reach $13.3 billion in 2022,[7] and as the field progresses, so do its accompanying ethical discussions.
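As a rough illustration of the signal processing behind EEG-based devices, the sketch below estimates the power in the conventional EEG frequency bands from one simulated electrode channel. It is a minimal Python example using NumPy and SciPy; the simulated signal, sampling rate, and band boundaries are illustrative assumptions rather than a description of any particular device.

  import numpy as np
  from scipy.signal import welch

  # Assumed setup: 4 seconds of a single EEG channel sampled at 256 Hz.
  FS = 256
  t = np.arange(0, 4, 1 / FS)
  rng = np.random.default_rng(0)
  # Simulated recording: a 10 Hz alpha rhythm buried in noise (in volts).
  eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * rng.standard_normal(t.size)

  # Estimate the power spectral density with Welch's method.
  freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)

  # Conventional band boundaries; exact values vary across the literature.
  bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
  df = freqs[1] - freqs[0]
  for name, (lo, hi) in bands.items():
      band = psd[(freqs >= lo) & (freqs < hi)]
      print(f"{name:>5}: {band.sum() * df:.2e} V^2")  # approximate band power

Band powers like these are the kind of low-level features that neural devices map onto commands or estimates of a user's state.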


Types

Despite the great variety of neurotechnology, these devices are classified into three main types, which refer to how they interact with and interpret a user's brain waves.

Active

To utilize these devices, users “actively generate” brain signals to issue a specific command.[8]

Reactive

These devices interpret how a user's brain responds to “specific probe stimuli,”[9] controlling an application based on the user's brain activity in response to those stimuli.[10]

Passive

These devices read the user's ongoing brain activity, allowing them to recognize the user's emotions and mental state.[11] They are not used to control devices or applications, but to improve the interaction between a user and a device by tracking the user's current state.[12]
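To make the taxonomy concrete, here is a toy Python sketch contrasting the three types. The decoder inputs are hard-coded stand-ins (real systems would derive them from EEG or fMRI features), and all names and numbers are hypothetical.

  # Active: the user deliberately generates a signal (e.g., motor imagery)
  # that maps directly to a command.
  def active_command(imagery_scores: dict) -> str:
      return max(imagery_scores, key=imagery_scores.get)

  # Reactive: the system presents probe stimuli and selects the one that
  # evoked the strongest brain response (as in a P300 speller).
  def reactive_select(probe_responses: dict) -> str:
      return max(probe_responses, key=probe_responses.get)

  # Passive: no command is issued; ongoing activity is monitored to
  # estimate the user's state (here, a toy workload index).
  def passive_state(theta_power: float, alpha_power: float) -> str:
      return "high workload" if theta_power > alpha_power else "relaxed"

  print(active_command({"left_hand": 0.8, "right_hand": 0.3}))   # left_hand
  print(reactive_select({"A": 0.1, "B": 0.7, "C": 0.2}))         # B
  print(passive_state(theta_power=2.4, alpha_power=1.1))         # high workload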


Applications

Medical Devices

Assistive Applications

These devices assist users who have lost function by providing them with technological alternatives. Current forms of assistive applications include neural devices that enable users to communicate via “mental typing,” as well as auditory assistance for those with neurodegenerative diseases that affect communication. Newer possibilities for assistive neurotechnology include the ability to control wheelchairs or similar devices.[13]

Rehabilitative Applications

These devices trigger neural plasticity, which over time allows the user's own neural pathways to recover control over a limb. They function by detecting and interpreting a user's attempt at or thought of movement, then artificially moving the limb with a robotic aid or by stimulating the muscles.[14]
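A minimal sketch of the closed loop such rehabilitative devices implement; the intent decoder and the actuator here are hypothetical stubs standing in for a real classifier and a real robotic aid or muscle stimulator.

  # Toy closed loop: decode attempted movement, then actuate the limb so
  # that intent and movement are paired, which is what promotes plasticity.
  def decode_intent(features: list) -> bool:
      # Stand-in for a classifier over sensorimotor EEG features.
      return sum(features) / len(features) > 0.5

  def actuate_limb() -> None:
      print("driving robotic aid / stimulating muscles")

  for window in ([0.2, 0.3, 0.4], [0.7, 0.8, 0.6]):  # simulated feature windows
      if decode_intent(window):
          actuate_limb()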

Civilian Devices

Direct to Consumer Products

Direct to Consumer Products (DCPs) are neural devices that can be purchased by a consumer without the involvement of a researcher or doctor.[15] A variety of neurotechnology devices are becoming available to the greater public, along with accompanying apps and accessories. DCP applications range from gaming to self-tracking of medical data to improving “brain fitness and performance.” These devices usually rely on Bluetooth connections between the neural device and a home computer or machine.[16]

Emerging Applications

As further research and progress is made in the neurotechnology field, new applications are being explored: “biofeedback, sleep control, treatment of learning disorders, functional and stroke rehabilitation, and the use of brain signals as biomarkers for diagnosis of diseases or their progression” are among the trending applications.[17] More recent commercial applications include neuromarketing and defense applications.[18] The ability of neural devices to “induce cortical plasticity” has also led to the development of potential therapeutic applications: reducing seizures, treating ADD and ADHD, improving cognitive function in elderly patients, managing pain, and more.[19] These newer applications are being introduced to broader markets than traditional medical BCIs.[20]


Ethical Implications

As neurotechnology is a rapidly growing field, gaining in popularity and expanding beyond clinical usage, many ethical implications have arisen. Because “brain hacking” can occur at various phases along the neurotechnology feedback cycle, distinct concerns arise for differing ethical dilemmas.[21]

Privacy

Historically, in the beginning phases of technological innovation, security risks are higher because well-developed security protocols are not yet built into the technologies, and legal structures are not fully prepared for the new issues the technology brings. These concerns are grounded in past breaches of technological privacy, as “large corporations already provide third-party access, usually through customer's unknowing approval, that infringes on patients' right to privacy.”[22] Notoriously, “technology innovates faster than the regulatory system can adapt,” and disruptive technological advancements can make current privacy and security norms obsolete.[23]

Hackers of BCIs could gain access to sensitive or private information of the BCI user. As privacy and data protection are often valued by individuals, this type of hacking poses a threat to users of neurotechnology.[24] Private information could include financial, medical, and locational data. This information could be abused not only by criminals but also by employers who use it to make hiring decisions, leading to potential hiring discrimination. In this context, if neural information reveals that a patient is predisposed to certain disorders or diseases, should they, or could they, be obligated to share that information with their employer, insurance company, or elsewhere? Customary practices of setting privacy standards and choosing which data one is comfortable sharing will become obsolete when users are not fully cognizant of what data is being extracted from them.[25]

The specifics of how neural data is used after it enters a system are often ambiguous to users in the general population, as there is inadequate transparency in the policies dictating how companies share data.[26] There are particular concerns regarding devices marketed toward the general public: consumer devices do not have to adhere to the stringent regulations that many clinical devices do, enabling Direct to Consumer companies to share large amounts of user data with third parties without their users' consent.[27] There is a concern that those in charge of neurotechnology development will not prioritize the privacy of users over scientific research or profit. A study conducted by Katherine MacDuffie found that general-public respondents were significantly more likely to prioritize privacy than members of the neural device industry. This finding may have implications for further advances in neurotechnology with regard to whose beliefs - stakeholders', users', or researchers' - should be prioritized.[28]


Autonomy

If neurotechnology is hacked, the hacker could alter a user's decisions or behavior.[29] The threat of lost agency is potentially heightened in clinical settings, where patients with severe neurological disorders are vulnerable.[30] Similarly, a user could be coerced into performing involuntary actions. These threats to individual autonomy are worth noting because they cast doubt on the willingness of the user to perform an action, which in turn raises liability issues: when an individual does not have total control over their actions, who is liable for those actions and their consequences? Some users have felt “unsure about the authenticity or authorship of their feelings and behaviors.”[31]

Liability

Neurotechnologies may necessitate changes in legal systems, as their capabilities and the issues they raise regarding agency complicate questions of liability and responsibility. While laws have changed in response to technological advancements in the past, neural devices are unlike many other forms of autonomous technology: whereas other forms operate independently from user input, some neural devices are ingrained into a user's cognitive system. Thus, there is ambiguity surrounding who is responsible if harm is caused or damage is done by a user of neurotechnology; as of now, there are no clear guidelines for when the manufacturer, user, or device is at fault.[32] Users themselves may not feel certain whether they are responsible for their actions. A study conducted by Luke Bashford found that BCI users gained a false sense of control over a virtual hand even though there was no causal connection between their BCI and the hand; conversely, other users reported not feeling in control of their device when they were.[33] Because brain-computer interfaces are capable of moving a user's body via brain activity instead of physical movements, these new actions are uncharted territory and do not fall under current legal definitions of “willed bodily movements and thus, as actions.”[34]


Hacking

As neurotechnology becomes more commonplace, the possibility of viewing or manipulating the cognitive information of a neurotech user emerges.[35] Neurocrime takes advantage of the neural device to access and alter cognitive information, similar to the hacking of a computer.[36] This “brain hacking” can occur at different stages of the BCI cycle. “Misusing neural devices for cybercriminal purposes may not only threaten the physical security of the users but also influence their behavior and alter their self-identification as persons.”[37] Neurocrime manifests in different forms, ranging from gaining control over a user's prostheses to uncovering sensitive information by reading the user's brain signals.

Input Manipulation

A study done by Ivan Martinovic at the University of Oxford showed that it was possible to extract banking information, addresses, and other sensitive information from BCI users. This type of hacking could facilitate fraud, password hacking, identity theft, and more.[38]
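A hedged sketch of the side-channel idea behind such attacks: present candidate items as probe stimuli and flag the one whose averaged evoked response stands out. The data below is simulated and the decision rule deliberately naive; the actual study's stimulus design and classifiers were far more involved.

  import numpy as np

  rng = np.random.default_rng(1)

  # Simulated evoked-response amplitudes for four probed digits; a familiar
  # item (e.g., a digit of the user's real PIN) tends to evoke a stronger
  # response. All values are made up for illustration.
  responses = {digit: rng.normal(1.0, 0.1, size=20) for digit in "1479"}
  responses["7"] += 0.5  # the "recognized" digit

  means = {digit: trials.mean() for digit, trials in responses.items()}
  guess = max(means, key=means.get)
  print(f"attacker's guess for the PIN digit: {guess}")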


Matrix of Domination

“The matrix of domination works to uphold the undue privilege of dominant groups while unfairly oppressing minoritized groups.”[39] As new neurotechnologies continue to be developed, there is potential for their creators to influence the data and functionalities of the devices, and ethical issues have arisen concerning the risk of biased devices. As with other products in the technological sphere, there is a lack of diversity among developers, which may introduce biases into the architecture of the devices. Because the technology will be principally developed by a select few, devices may, intentionally or inadvertently, attempt to correct “certain behaviors” in ways that unfairly target and oppress minority groups.[39] An opposing argument holds that neurotechnology could reduce problems of injustice by providing more affordable access to products that benefit lower socioeconomic groups.[40] There is also a concern that neurotechnologies may result in increased discrimination or social isolation if information from neural devices containing signs of a user's cognitive impairments, disorders, or personality traits is unintentionally shared.[41] There has been notable apprehension surrounding the potential use of functional brain imaging to predict an individual's future behavior - such as an employer deciding whether to hire a person whose brain activity suggests they are prone to combative behavior. The prospect of using brain activity to make assumptions about what people will do is highly controversial, as it may unfairly target those with depression or other disorders that they would not otherwise disclose.[42]

Distributive Justice

The allocation of improved health is always a social concern, and in a society where neural technologies that can enhance, rehabilitate, and provide therapy and entertainment are available to consumers, it is unlikely that everyone would have equal access to them.[43] Wealthier classes will likely have better access to these technologies, as they do to the majority of innovations in the healthcare field.[44] This unequal distribution may deepen already existing divides and inequalities among social classes, furthering the disparate access to information that often hurts lower socioeconomic groups. The issue is especially relevant in countries with privatized healthcare systems, such as the United States, or when neural devices are distributed by private corporations; in those cases, poorer patients may be unable to purchase these devices, furthering a “monetary divide within the society.”[45] Similarly, in his book “The Ethics of Biomedical Enhancement,” Allen Buchanan argues that the public should not assume that the developers of neurotechnology will be inclined to distribute their products in a way that promotes justice, but rather in ways that may result in “unjust exclusion or domination.”[46] Thus, there are concerns that “those who already possess certain traits, attributes, and or resources will likely and quickly get even more.”[47]


Normality

The ability to make enhancements and changes to an individual's brain raises ethical questions about society's view of what is “normal” and “good.”[48] “How much can humans be enhanced without deforming or destroying aspects of the social or natural world on which life relies?”[49] Striving for all people to desire and display “normal” functioning requires a societal definition of what “normal” is and should be. People are discriminated against if they have differently functioning brains or bodies, and this is expected to extend into neurotechnology. Allowing people to drastically change their mental, sensory, and physical abilities will “likely change societal norms,” and there may be pressure to use neurotechnology for self-enhancement as progress in the field creates new types of discrimination.[50] Writing in the journal Brain Communications, Shujhat Khan argues that this new ability to self-“enhance” with neurotechnology may alter human nature - or what separates “healthy humans” from “animals” - because these devices can change cognitive reasoning, the ability to make moral judgments, and emotional perception.[51] At an Oxford University symposium, Dr. FitzGerald highlighted the importance of knowing the truth about the capacities and limits of changing one's self, as well as the process of patients' informed consent and the “ethical dilemmas that could occur as we respond to escalating socio-economic pressures to ‘use what we've got’ in practical applications.”[52]


Identity

“Identity encompasses the memories, experiences, relationships, and values that create one's sense of self.”[53] The ethical issue of maintaining a consistent identity when neurotechnology can alter a user's brain activity is under debate. Users of neural devices have reported feeling as if their movements and accomplishments are not their own, and as if they are artificial or robotic; some have felt unsure what is a result of themselves and what is a result of their neural device.[54] Research by Emily Postan, posted to the Social Science Research Network, argues that monitored neural data can yield “self descriptors” - such as an indication that a user is a light sleeper - that can modify an individual's identity and sense of self, which may in turn alter the individual's perceptions and actions.[55] Dr. Michael Decker found that altering brain function via brain stimulation could cause unintended changes in mental states integral to a user's personality, and can therefore affect personal identity.[56] A study conducted by Michael Frank found that brain stimulation through neural devices resulted in behavioral changes such as increased aggressiveness, quicker decision-making in high-conflict conditions, and altered sexual behaviors.[57] Marcello Ienca suggests implementing a right to “psychological continuity” to protect the continuity of an individual's thoughts, preferences, and choices.[58]

Informed Consent

“Informed consent is regarded as a continuous agreement to go forward with a particular treatment modality.”[59] Because one of the largest anticipated applications of neurotechnology is treating damaged cognitive functioning, ensuring that a patient is completely aware of all ramifications of their treatment so they can give informed consent becomes complicated. Dr. James Giordano says that “ongoing research is needed to enable clinicians to communicate the relative values, benefits, and risks of particular treatments and to enable patients to make well-informed decisions.” Even if a healthcare provider can ensure that a patient is aware of potential consequences, there are doubts about whether individuals should have the option to opt into unnecessary treatments or applications of neurotechnology. Shujhat Khan notes that this question could become relevant in contexts such as sports, where a user may feel pressured to give consent in order to improve their motor performance and therefore be influenced to undergo treatment.[60]


Public Perception

There are concerns that the capabilities of BCIs and other forms of neurotechnology are being overestimated by the media and the general public. A study by Eric Racine et al. found that popular media conveys an exaggerated emphasis on the positives of neurotechnology and its supposed capability to reveal a person's true character: 67% of the articles they studied did not mention limitations, and 79% had a mostly optimistic tone. Racine et al. posit that if the limitations and risks are not understood by the broader public as neurotechnology gains commercial popularity, problems arise when society believes that neurotechnology can accurately reveal the “true self” because the “brain can't lie.”[61] Additionally, a case study by Allyson Purcell-Davis found that “the initial results of scientific experiments conducted within laboratories were often projected as potential cures for those with neurodegenerative disease, without any further discussion of the cost of such procedures or how long it might be before they become available.”[62] This tendency for the public to view neurotechnologies as powerful and astonishing could threaten a person's ability to decide logically, taking the potential risks into account, whether the use of these technologies is beneficial for them.[63] While intending to highlight the wide breadth of capabilities that neurotechnology offers, some representations may have negative consequences: in a review published in the Journal of Cognitive Enhancement, Anna Wexler and Robert Thibault argue that if the companies that sell neural devices do not maintain transparency regarding a device's outcomes, consumers are more likely to be misled.[64]



References

  1. Institute of Electrical and Electronics Engineers (May 26, 2021). "Neurotechnologies: The Next Technology Frontier". Retrieved February 2, 2022.
  2. Friedrich, Orsolya (2021). "Clinical Neurotechnology meets Artificial Intelligence". Retrieved January 25, 2022.
  3. Friedrich, Orsolya (2021). "Clinical Neurotechnology meets Artificial Intelligence". Retrieved January 25, 2022.
  4. Friedrich, Orsolya (2021). "Clinical Neurotechnology meets Artificial Intelligence". Retrieved January 25, 2022.
  5. White, Susan (October 17, 2014). "The Promise of Neurotechnology in Clinical Translational Science". Retrieved February 5, 2022.
  6. Borducchi, Davimar (November 30, 2016). "Transcranial Direct Current Stimulation Effects on Athletes’ Cognitive Performance: An Exploratory Proof of Concept Trial". Retrieved February 9, 2022.
  7. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved February 6, 2022.
  8. Van Erp, J.B.F. (July 7, 2012). "Framework for BCIs in Multimodal Interaction and Multitask Environments". Retrieved February 2, 2022.
  9. Van Erp, J.B.F. (July 7, 2012). "Framework for BCIs in Multimodal Interaction and Multitask Environments". Retrieved February 2, 2022.
  10. Zander, Thorsten O. (March 24, 2011). "Towards passive brain–computer interfaces: applying brain–computer interface technology to human–machine systems in general". Retrieved February 2, 2022.
  11. Kawala-Sterniuk, Aleksandra (January 3, 2021). "Summary of over Fifty Years with Brain-Computer Interfaces—A Review". Retrieved February 2, 2022.
  12. Zander, Thorsten O. (March 24, 2011). "Towards passive brain–computer interfaces: applying brain–computer interface technology to human–machine systems in general". Retrieved February 2, 2022.
  13. Vlek, Rutger (June 2012). "Ethical Issues in Brain–Computer Interface Research, Development, and Dissemination". Retrieved January 26, 2022.
  14. Vlek, Rutger (June 2012). "Ethical Issues in Brain–Computer Interface Research, Development, and Dissemination". Retrieved January 26, 2022.
  15. Ienca, Marcello (October 13, 2019). "Direct-to-Consumer Neurotechnology: What Is It and What Is It for?". Retrieved February 6, 2022.
  16. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved February 6, 2022.
  17. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  18. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  19. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  20. Brunner, P. (March 24, 2011). "Current trends in hardware and software for brain–computer interfaces (BCIs)". Retrieved January 26, 2022.
  21. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  22. Khan, Shujhat (September 16, 2019). "Transcending the brain: is there a cost to hacking the nervous system?". Retrieved February 9, 2022.
  23. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  24. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  25. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  26. Dasgupta, Ishan (2020). "Developments in Neuroethics and Bioethics". Retrieved February 9, 2022.
  27. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  28. MacDuffie, Katherine (March 31, 2021). "Neuroethics Inside and Out: A Comparative Survey of Neural Device Industry Representatives and the General Public on Ethical Issues and Principles in Neurotechnology". Retrieved February 9, 2022.
  29. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  30. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  31. Goering, Sara (March 10, 2021). "Neurotechnology ethics and relational agency". Retrieved January 24, 2022.
  32. Bublitz, Christoph (November 16, 2018). "Legal liabilities of BCI-users: Responsibility gaps at the intersection of mind and machine?". Retrieved February 9, 2022.
  33. Bashford, Luke (June 15, 2016). "Ownership and Agency of an Independent Supernumerary Hand Induced by an Imitation Brain-Computer Interface". Retrieved February 9, 2022.
  34. Bublitz, Christoph (November 16, 2018). "Legal liabilities of BCI-users: Responsibility gaps at the intersection of mind and machine?". Retrieved February 9, 2022.
  35. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  36. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  37. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  38. Ienca, Marcello (April 16, 2016). "Hacking the brain: brain–computer interfacing technology and the ethics of neurosecurity". Retrieved January 24, 2022.
  39. Gianfrancesco, Milena (November 2018). "Potential Biases in Machine Learning Algorithms Using Electronic Health Record Data". Retrieved February 9, 2022.
  40. Fitz, Nicholas (2015). "The challenge of crafting policy for do-it-yourself brain stimulation". Retrieved February 9, 2022.
  41. Ienca, Marcello (September 2018). "Brain leaks and consumer neurotechnology". Retrieved January 24, 2022.
  42. Eaton, Margaret (April 2007). "Commercializing cognitive neurotechnology—the ethical terrain". Retrieved February 6, 2022.
  43. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  44. Wexler, Anna (January 18, 2019). "Oversight of direct-to-consumer neurotechnologies". Retrieved January 25, 2022.
  45. Khan, Shujhat (September 16, 2019). "Transcending the brain: is there a cost to hacking the nervous system?". Retrieved February 9, 2022.
  46. Buchanan, Allen (2011). "The Ethics of Biomedical Enhancement". Retrieved January 25, 2022.
  47. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  48. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  49. Shook, John (December 5, 2014). "Cognitive enhancement kept within contexts: neuroethics and informed public policy". Retrieved January 25, 2022.
  50. Yuste, Rafael (November 9, 2017). "Four ethical priorities for neurotechnologies and AI". Retrieved February 9, 2022.
  51. Khan, Shujhat (September 16, 2019). "Transcending the brain: is there a cost to hacking the nervous system?". Retrieved February 9, 2022.
  52. Palchik, Guillermo (May 8, 2019). "Technology, Neuroscience & the Nature of Being: Considerations of Meaning, Morality and Transcendence Part I: The Paradox of Neurotechnology". Retrieved February 9, 2022.
  53. Psychology Today (2018). "Identity". Retrieved January 24, 2022.
  54. Goering, Sara (March 10, 2021). "Neurotechnology ethics and relational agency". Retrieved January 24, 2022.
  55. Postan, Emily (September 28, 2020). "Narrative Devices: Neurotechnologies, Information, and Self-Constitution". Retrieved February 1, 2022.
  56. Decker, Michael (December 15, 2008). "Contacting the brain – aspects of a technology assessment of neural implants". Retrieved February 1, 2022.
  57. Frank, Michael (November 23, 2007). "Hold Your Horses: Impulsivity, Deep Brain Stimulation, and Medication in Parkinsonism". Retrieved February 1, 2022.
  58. Ienca, Marcello (April 26, 2017). "Towards new human rights in the age of neuroscience and neurotechnology". Retrieved February 1, 2022.
  59. Khan, Shujhat (September 16, 2019). "Transcending the brain: is there a cost to hacking the nervous system?". Retrieved February 9, 2022.
  60. Khan, Shujhat (September 16, 2019). "Transcending the brain: is there a cost to hacking the nervous system?". Retrieved February 9, 2022.
  61. Racine, Eric (February 1, 2005). "fMRI in the public eye". Retrieved February 5, 2022.
  62. Purcell-Davis, Allyson (April 21, 2015). "The Representations of Novel Neurotechnologies in Social Media". Retrieved February 5, 2022.
  63. Giattino, Charles (January 2019). "The Seductive Allure of Artificial Intelligence-Powered Neurotechnology". Retrieved February 9, 2022.
  64. Wexler, Anna (September 18, 2018). "Mind-Reading or Misleading? Assessing Direct-to-Consumer Electroencephalography (EEG) Devices Marketed for Wellness and Their Ethical and Regulatory Implications". Retrieved February 9, 2022.