Neuroprivacy

Neuroprivacy, or "brain privacy," refers to the rights people have regarding the imaging, extraction and analysis of neural data from their brains.[1] The concept is closely related to fields such as neuroethics, neurosecurity, and neurolaw, and has become increasingly relevant with the development of neuroimaging technologies. As an aspect of neuroethics, neuroprivacy specifically concerns the use of neural information in legal cases, neuromarketing, surveillance and other external applications, as well as the corresponding social and ethical implications.

History

Neuroethical concepts such as neuroprivacy emerged in the 2000s, following the invention and development of neuroimaging techniques such as positron emission tomography (PET), electroencephalography (EEG), and functional magnetic resonance imaging (fMRI).[2] As neuroimaging became widely studied and popularized in the 1990s, it also began entering the commercial market, with entrepreneurs seeking to market practical applications of neuroscience such as neuromarketing, neuroenhancement and lie detection. Neuroprivacy encompasses the privacy issues raised by both neuroscience research and applied uses of neuroimaging techniques. The neuroprivacy debate gained significant relevance after the 9/11 terrorist attacks, which led to a push for increased neuroimaging in the context of information and threat detection and surveillance.[3][4]

Neuroanalysis techniques

Brain fingerprinting

Brain fingerprinting is a controversial and unproven EEG technique that relies on identifying the P300 event-related potential,[5] which is correlated with recognition of a stimulus.[6] The purpose of the technique is to determine whether a person holds incriminating information or memories. In its current state, brain fingerprinting can only determine the existence of such information; it cannot provide any specific details about it.[7] Its creator, Dr. Lawrence Farwell, claims brain fingerprinting is highly reliable and nearly impossible to fool,[6] but some studies dispute both its reliability and its claimed resistance to countermeasures.[8][9] Possible countermeasures include thinking of something else instead of processing the real stimuli, mentally suppressing recognition, or simply not cooperating with the test.[8] There have also been concerns over the potential use of memory-dampening drugs such as propranolol to beat brain fingerprinting.[10] However, some studies have shown that propranolol dampens the emotional arousal associated with a memory rather than the memory itself, which could even improve recollection of the memory.[11]
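
The underlying signal-processing idea can be sketched briefly. The following Python example is a minimal, hypothetical illustration rather than Farwell's actual method: EEG epochs time-locked to "probe" and "irrelevant" stimuli are averaged into event-related potentials, and the mean amplitude in a typical P300 window (roughly 300–600 ms after stimulus onset) is compared between conditions. The sampling rate, window and synthetic data are illustrative assumptions.

```python
import numpy as np

FS = 500  # assumed sampling rate in Hz

def mean_p300_amplitude(epochs, fs=FS, window=(0.300, 0.600)):
    """Average stimulus-locked epochs (trials x samples) into an ERP and
    return the mean amplitude in a typical P300 window."""
    erp = epochs.mean(axis=0)                      # average over trials -> ERP
    start, stop = int(window[0] * fs), int(window[1] * fs)
    return erp[start:stop].mean()

def concealed_information_score(probe_epochs, irrelevant_epochs):
    """Positive scores suggest probe items evoke a larger P300 than
    irrelevant items, i.e. possible recognition of the probe."""
    return (mean_p300_amplitude(probe_epochs)
            - mean_p300_amplitude(irrelevant_epochs))

# Illustrative use with synthetic data: 30 one-second trials per condition.
rng = np.random.default_rng(0)
probe = rng.normal(size=(30, FS))
irrelevant = rng.normal(size=(30, FS))
print(concealed_information_score(probe, irrelevant))
```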

A related EEG technique is brain electrical oscillation signature (BEOS) profiling, which, like brain fingerprinting, aims to detect the presence of specific information or memories. Despite a significant lack of scientific studies confirming the validity of BEOS profiling, the technique has been used in India to provide evidence in criminal investigations.[9][12]

Evaluation and prediction of mental and moral faculties

Current neuroimaging technology can detect neural correlates of human attributes such as memory and morality.[13][14] Neurodata can be used to diagnose and predict behavioral disorders and patterns such as psychopathy and antisocial behavior, both of which are factors in estimating the likelihood of future criminal behavior.[15][16] This ability to evaluate mental proficiencies, biases and faculties could be relevant to government or corporate entities for the purposes of surveillance or neuromarketing, especially if neurodata can be collected without the subjects' knowledge or consent.[17] Using neurodata to predict future behaviors and actions could help create or inform preventive measures to treat people before problems arise; however, this raises ethical questions about how society defines "moral" or "acceptable" behavior.[16]
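
As a schematic illustration of how neurodata might feed such a predictive model (the cited studies use far more elaborate imaging features and validation methods), the following Python sketch trains a simple classifier on entirely synthetic, hypothetical features standing in for per-subject neuroimaging measures.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for per-subject neuroimaging features
# (e.g. regional activation or connectivity measures) and outcome labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))                        # 200 subjects, 10 features (assumed)
y = (X[:, 0] + rng.normal(size=200) > 0).astype(int)  # synthetic behavioral outcome

# A simple predictive model; real studies would validate far more carefully.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5)           # 5-fold cross-validated accuracy
print(scores.mean())
```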

Lie detection

It is possible to use neuroimaging as a form of lie detection. The approach assumes that deception requires additional cognitive processing to construct an alternate story, so the difference in mental states between telling the truth and lying should be detectable.[7] However, this relies on assumptions that have yet to be conclusively verified, and as such neurological lie detection is not yet reliable or fully understood. This contrasts with the standard polygraph, which relies on analyzing biological mechanisms that are well understood but still not necessarily reliable.[18]
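
A minimal sketch of this reasoning, assuming hypothetical per-trial activation measures (for example, mean BOLD signal in a region associated with cognitive control): if deception really does demand extra cognitive processing, the two conditions should differ measurably. The data and effect size below are fabricated for illustration only.

```python
import numpy as np
from scipy import stats

# Hypothetical per-trial activation measures for truth and lie conditions.
rng = np.random.default_rng(2)
truth_trials = rng.normal(loc=1.0, scale=0.5, size=40)
lie_trials = rng.normal(loc=1.3, scale=0.5, size=40)

# A larger mean in the lie condition would be consistent with the extra
# cognitive load assumed by neuroimaging-based lie detection.
t_stat, p_value = stats.ttest_ind(lie_trials, truth_trials)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```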

Applications of personal neurodata

The legal systems of most countries generally do not accept neuroimaging data as admissible evidence, with some exceptions. India has allowed BEOS tests as legal evidence, and an Italian court of appeals used neuroimaging evidence in a 2009 case, becoming the first European court to do so.[7] Canadian and US courts have been more cautious about admitting neuroimaging data as legal evidence.[18] One reason legal systems have been slow to adopt neuroimaging data as an accepted form of evidence is the potential for error and misinterpretation inherent in such a new technology; courts in the US typically follow the Daubert standard for evidence evaluation set by the Daubert v. Merrell Dow Pharmaceuticals, Inc. Supreme Court case, which established that the validity of scientific evidence must be determined by the trial judge.[9] The Daubert standard serves as a safeguard for the reliability of scientific evidence, and requires a significant amount of testing before any neuroimaging technique can be considered as evidence. While brain fingerprinting was technically accepted in the Harrington v. Iowa case, the judge specifically stated that the EEG evidence was not to be presented to a jury, so the ruling did not set a significant precedent.[7]

Surveillance and security

Neurological surveillance is relevant to governmental, corporate, academic and technological entities, as technological improvements increase the amount of information that can be extrapolated from neuroimaging.[19] Surveillance with current neuroimaging technology is considered difficult, since fMRI data are hard to collect and interpret even in laboratory settings; fMRI studies generally require subjects to be motionless and cooperative.[17] However, as technology improves, it may become possible to overcome these requirements.

In theory, there are benefits in using neuroscience in the context of surveillance and security.[4] However, there is debate over whether doing so would violate neuroprivacy to an unacceptable extent.[3][20]

Neuromarketing

Neurodata is valuable to advertising and marketing entities because of its potential to identify how and why people react to different stimuli, allowing consumers to be influenced more effectively.[21] The ability to examine reactions and perceptions directly from the brain creates new ethical debates, such as how to define the acceptable limits of mental manipulation and how to avoid targeting vulnerable or especially receptive demographics. These could be seen not as entirely new debates but as added dimensions to previously existing discussions.

Controversy and debate

Scientific arguments

The main scientific arguments regarding neuroprivacy revolve around the limits of the current understanding of neurodata. Many of the arguments against using neuroimaging in legal, surveillance and other contexts are based on the lack of a solid scientific foundation, which means the potential for error and misinterpretation is too high.[9] Brain fingerprinting, one of the most popularized forms of neuroanalysis, has been promoted by its creator, Dr. Lawrence Farwell, despite a lack of scientific agreement on its reliability.[22][23][8] There is currently even a lack of scientific understanding of what can be interpreted from neurodata, which makes limiting and categorizing different types of neurodata difficult and thus complicates neuroprivacy.[24] Another complication is that neurodata is highly personal and essentially inseparable from the subject, making it extremely sensitive and difficult to anonymize. One possible way to regulate and protect neuroprivacy is to focus on the different uses of neurodata.

Another issue is the conflation of scientific knowledge with beliefs about the relations between philosophical, neural and societal constructs.[3] Popularization of and overconfidence in scientific techniques may lead to assumptions or misinterpretations of what neurodata actually describe, when in reality there are limits to what can be inferred from correlations between neural activity and semantic meaning.[25]

Legal arguments

There are various legal arguments about how neuroprivacy is covered under current protections and rights and how future laws should be implemented to define and protect it, as neuroscience has the potential to significantly change the legal status quo.[7] The legal definition of neuroprivacy has yet to be properly established, but there appears to be a general consensus that a legal and ethical foundation for neuroprivacy rights should be established before neuroimaging becomes widely accepted across legal, corporate and security contexts.[19][3][18][9][24][1][13][17][4] As neuroprivacy is an international issue, an international consensus may be required to establish the necessary legal and ethical foundation.[7]

Bringing neuroscience into legal contexts has been argued to have certain benefits. Current types of legal testimony, such as eyewitness testimony and polygraph testing, have significant flaws that may currently be overlooked because of historical and traditional precedents.[26][27] Neuroscience could potentially address some of these issues by directly examining the brain, provided there is scientific confidence in the neuroimaging techniques used.[4] However, this raises questions about balancing legal uses of neuroscience against neuroprivacy protections.[17]

In the US, there are certain existing rights that could be interpreted to protect neuroprivacy. The Fifth Amendment, which protects citizens from self-incrimination, could be interpreted as protecting citizens from being incriminated by their own brains.[17] However, the current interpretation is that the Fifth Amendment protects citizens from self-incriminating testimony; if neuroimaging constitutes physical evidence rather than testimony, the Fifth Amendment may not protect against neuroimaging evidence.[20] The Ninth and Fourteenth Amendments help protect unspecified rights and fair procedures, which may or may not extend to neuroprivacy.[17]

One interpretation categorizes neuroimaging evidence as forensic evidence rather than scientific expert testimony; detecting memories and information about a crime could be compared to collecting forensic residue from a crime scene. This distinction would make it categorically different from a polygraph test and could increase its admissibility in Canadian and US legal systems.[18]

Ethical arguments

Some general ethical concerns regarding neuroprivacy revolve around personal rights and control over personal information. As technology improves, collecting neurodata without consent or knowledge may become easier or more common. One argument is that the collection of neurodata violates both personal property and intellectual property, as it involves both scanning the body and analyzing thought.[20]

One of the main ethical controversies regarding neuroprivacy relates to the issue of free will and the mind-body problem. A possible concern is the unknown extent to which neurodata can predict actions and thoughts; it is not currently known whether the physical activity of the brain is conclusively or solely responsible for thoughts and actions.[28] Examining the brain as a way to prevent crimes or disorders before they manifest raises the question of whether people can exercise their agency despite their neurological condition. Even using neurodata to treat certain disorders and diseases preemptively raises questions about identity, agency and how society defines morality.[15]

In popular culture

  • In the television show Westworld, hats are used as neuroimaging devices that record experiences and data without the consent or knowledge of the users.[29] This data is mainly used for neuromarketing research and commercial pursuits, namely the pursuit of immortality.
  • In the novel The Dark Forest by Liu Cixin, one of the projects developed to ensure the survival of humanity involves extensive human brain mapping to develop ways of improving cognition.[30] The project is eventually used to imprint human brains with "mental seals", unshakeable beliefs artificially implanted in a person's psyche.
  • In the Harry Potter series by J. K. Rowling, brain privacy can be invaded through Legilimency, which involves extracting the contents of the mind, such as thoughts and emotions.[31] One way to increase neuroprivacy in the Harry Potter world is by practicing Occlumency, which involves defending the mind against Legilimency and other forms of mental invasion.[32]


References

  1. The Committee on Science and Law (2005). "Are Your Thoughts Your Own?: 'Neuroprivacy' and the Legal Implications of Brain Imaging" (PDF). The Record of the Association of the Bar of the City of New York. 60 (2): 407–37.
  2. Vidal, Fernando (2015). "Historical and Ethical Perspectives of Modern Neuroimaging". Handbook of Neuroethics. pp. 535–550. doi:10.1007/978-94-007-4707-4_27. ISBN 978-94-007-4706-7.
  3. Littlefield, Melissa (29 April 2008). "Constructing the Organ of Deceit". Science, Technology, & Human Values. 34 (3): 365–392. doi:10.1177/0162243908328756.
  4. McCormick, Brian (2006). "Your Thoughts May Deceive You: The Constitutional Implications of Brain Fingerprinting Technology and How it May Be Used to Secure Our Skies". Law & Psychology Review. 30: 171–184.
  5. Brandom, Russell (2015-02-02). "Is 'brain fingerprinting' a breakthrough or a sham?". The Verge.
  6. Farwell, Lawrence A.; Richardson, Drew C.; Richardson, Graham M. (5 December 2012). "Brain fingerprinting field studies comparing P300-MERMER and P300 brainwave responses in the detection of concealed information". Cognitive Neurodynamics. 7 (4): 263–299. doi:10.1007/s11571-012-9230-0. PMC 3713201. PMID 23869200.
  7. Church, Dominique J. (2012). "Neuroscience in the Courtroom: An International Concern". William & Mary Law Review. 53 (5): 1825–54.
  8. Rosenfeld, J. Peter (11 March 2019). "P300 in detecting concealed information and deception: A review". Psychophysiology: e13362. doi:10.1111/psyp.13362. PMID 30859600.
  9. Gaudet, Lyn M. (2011). "Brain fingerprinting, scientific evidence, and Daubert: a cautionary lesson from India". Jurimetrics. 51 (3): 293–318. JSTOR 41307131.
  10. McGorrery, Paul (19 September 2017). "A further critique of brain fingerprinting: The possibility of propranolol usage by offenders". Alternative Law Journal. 42 (3): 216–220. doi:10.1177/1037969X17730204.
  11. Elsey, James; Kindt, Merel (30 July 2018). "Can criminals use propranolol to erase crime-related memories? A response to McGorrery (2017)". Alternative Law Journal. 43 (2): 136–138. doi:10.1177/1037969X18765204.
  12. Pulice, Erin B. (2010). "The Right to Silence at Risk: Neuroscience-Based Lie Detection in the United Kingdom, India, and the United States" (PDF). George Washington International Law Review. 42 (4): 865–96.
  13. Bzdok, Danilo; Groß, Dominik; Eickhoff, Simon B. (2015). "The Neurobiology of Moral Cognition: Relation to Theory of Mind, Empathy, and Mind-Wandering". Handbook of Neuroethics. pp. 127–148. doi:10.1007/978-94-007-4707-4_161. ISBN 978-94-007-4706-7.
  14. Murty, Vishnu P.; Ritchey, Maureen; Adcock, R. Alison; LaBar, Kevin S. (March 2011). "Reprint of: fMRI studies of successful emotional memory encoding: A quantitative meta-analysis". Neuropsychologia. 49 (4): 695–705. doi:10.1016/j.neuropsychologia.2011.02.031. PMID 21414466.
  15. Jotterand, Fabrice; Giordano, James (2015). "Real-Time Functional Magnetic Resonance Imaging–Brain-Computer Interfacing in the Assessment and Treatment of Psychopathy: Potential and Challenges". Handbook of Neuroethics. pp. 763–781. doi:10.1007/978-94-007-4707-4_43. ISBN 978-94-007-4706-7.
  16. Glenn, Andrea L.; Focquaert, Farah; Raine, Adrian (2015). "Prediction of Antisocial Behavior". Handbook of Neuroethics. pp. 1689–1701. doi:10.1007/978-94-007-4707-4_149. ISBN 978-94-007-4706-7.
  17. Roskies, Adina L. (2015). "Mind Reading, Lie Detection, and Privacy". Handbook of Neuroethics. pp. 679–695. doi:10.1007/978-94-007-4707-4_123. ISBN 978-94-007-4706-7.
  18. Frederiksen, Soren (June 2011). "Brain fingerprint or lie detector: does Canada's polygraph jurisprudence apply to emerging forensic neuroscience technologies?". Information & Communications Technology Law. 20 (2): 115–132. doi:10.1080/13600834.2011.578930.
  19. Pearlman, Ellen (12 November 2015). "The brain as site-specific surveillant performative space". International Journal of Performance Arts and Digital Media. 11 (2): 219–234. doi:10.1080/14794713.2015.1084810.
  20. Moore, Adam D. (4 November 2016). "Privacy, Neuroscience, and Neuro-Surveillance". Res Publica. 23 (2): 159–177. doi:10.1007/s11158-016-9341-2.
  21. Matthews, Steve (2015). "Neuromarketing: What is It and is It a Threat to Privacy?". Handbook of Neuroethics. pp. 1627–1645. doi:10.1007/978-94-007-4707-4_154. ISBN 978-94-007-4706-7.
  22. Meijer, Ewout H.; Ben-Shakhar, Gershon; Verschuere, Bruno; Donchin, Emanuel (14 August 2012). "A comment on Farwell (2012): brain fingerprinting: a comprehensive tutorial review of detection of concealed information with event-related brain potentials". Cognitive Neurodynamics. 7 (2): 155–158. doi:10.1007/s11571-012-9217-x. PMC 3595430. PMID 23493984.
  23. Farwell, Lawrence A.; Richardson, Drew C. (9 January 2013). "Brain fingerprinting: let's focus on the science—a reply to Meijer, Ben-Shakhar, Verschuere, and Donchin". Cognitive Neurodynamics. 7 (2): 159–166. doi:10.1007/s11571-012-9238-5. PMC 3595431. PMID 23494087.
  24. Hallinan, Dara; Schütz, Philip; Friedewald, Michael; De Hert, Paul (20 November 2013). "Neurodata and Neuroprivacy: Data Protection Outdated?". Surveillance & Society. 12 (1): 55–72. doi:10.24908/ss.v12i1.4500.
  25. Roskies, Adina L. (2015). "Neuroimaging Neuroethics: Introduction". Handbook of Neuroethics. pp. 659–663. doi:10.1007/978-94-007-4707-4_34. ISBN 978-94-007-4706-7.
  26. Christianson, SA (September 1992). "Emotional stress and eyewitness memory: a critical review". Psychological Bulletin. 112 (2): 284–309. doi:10.1037/0033-2909.112.2.284. PMID 1454896.
  27. Proverbio, Alice Mado; La Mastra, Francesca; Zani, Alberto; Hills, Peter James (21 September 2016). "How Negative Social Bias Affects Memory for Faces: An Electrical Neuroimaging Study". PLOS ONE. 11 (9): e0162671. Bibcode:2016PLoSO..1162671P. doi:10.1371/journal.pone.0162671. PMC 5031436. PMID 27655327.
  28. Reyna, Stephen (2015). "Free Will, Agency, and the Cultural, Reflexive Brain". Handbook of Neuroethics. pp. 323–342. doi:10.1007/978-94-007-4707-4_138. ISBN 978-94-007-4706-7.
  29. Barrett, Brian (2018). "The Real 'Westworld' Villain Has Always Been Its Privacy Policy". Wired. Retrieved April 4, 2019 (https://www.wired.com/story/westworld-privacy-policy/).
  30. Liu, Cixin (2008). The Dark Forest. Tor Books.
  31. "Legilimens". Pottermore. Retrieved April 4, 2019 (https://www.pottermore.com/explore-the-story/legilimens).
  32. "A Helpful Guide to Occlumency". Pottermore. Retrieved April 4, 2019 (https://www.pottermore.com/features/a-guide-to-occlumency).