Danger of discrimination from neurotech monitoring the brain
Technology to monitor neurodata, the information coming directly from the brain and nervous system, will become widespread over the next decade.
Today, the ICO published a report explaining how the collection and use of such data carries a major risk of bias that could lead to discrimination, including in the workplace. These risks apply particularly to neurodivergent people, such as those who are autistic or have dyslexia, attention deficit disorder, or atypical ways of learning or communicating with others.
Stephen Almond, Executive Director of Regulatory Risk at the Information Commissioner’s Office, said:
“Neurotechnology collects intimate personal information that people are often not aware of, including emotions and complex behaviour. The consequences could be dire if these technologies are developed or deployed inappropriately.”
“We want to see everyone in society benefit from this technology. It’s important for organisations to act now to avoid the real danger of discrimination.”
Workplaces, home entertainment and wellbeing services are expected to use neurotechnology to provide more personalised services in the years to come. However, there is a risk of inherent bias and inaccurate data becoming embedded in these technologies.
The report examines neurotechnologies and neurodata and analyses their impact on privacy by exploring plausible scenarios and use cases for emerging neurotechnologies.