Brain Hack Devices Must Be Scrutinised, Say Top Scientists

Devices that merge machines with the human brain need to be investigated, a study has said.

In future, “people could become telepathic to some degree”, and the ability to read someone else’s thoughts raises ethical issues, experts said.

This could become especially worrying if those thoughts were shared with corporations.

Commercial products should not come from “a handful of companies”, they added.

In the study – iHuman: Blurring Lines between Mind and Machine – leading scientists at the UK’s Royal Society lay out the opportunities and risks of brain-to-computer devices.

Such interfaces are devices, either implanted in the body or worn externally, that stimulate activity in the brain or nervous system.

‘Benefit of humanity’

Among the risks highlighted by the report was the possibility of thoughts or moods being accessed by big corporations, as well as the broader question of whether such devices fundamentally change what it means to be human.

“While advances like seamless brain-to-computer communication seem a much more distant possibility, we should act now to ensure our ethical and regulatory safeguards are flexible enough for any future development.

“In this way we can guarantee these emerging technologies are implemented safely and for the benefit of humanity.”

In July, Elon Musk announced that his firm Neuralink was applying to start human trials in the US, with electrodes inserted into the brains of patients with paralysis.

And Facebook is supporting research that aims to create a headset able to transcribe words at a rate of 100 per minute, just by thinking.

In the US, an estimated 60,000 spinal-cord stimulators are implanted annually, and around the world some 400,000 people have benefited from cochlear implants.

Thousands of people with Parkinson’s disease and similar conditions have been treated with deep brain stimulation, and artificial pancreases and wireless heart monitors are also common.