Neurotechnology: The Goal Of Human Enhancement
At the World Government Summit in Dubai in February 2017, Tesla and SpaceX chief executive Elon Musk said that people would need to become cyborgs to be relevant in an artificial intelligence age. He said that a “merger of biological intelligence and machine intelligence” would be necessary to ensure we stay economically valuable. Soon afterwards, the serial entrepreneur created Neuralink, with the intention of connecting computers directly to human brains. He wants to do this using “neural lace” technology – implanting tiny electrodes into the brain for direct computing capabilities. Brain-computer interfaces (BCIs) aren’t a new idea. Various forms of BCI are already available, from ones that sit on top of your head and measure brain signals to devices that are implanted into your brain tissue. They are mainly one-directional, with the most common uses enabling motor control and communication tools for people with brain injuries. In March, a man who was paralysed below the neck moved his hand using the power of concentration.
But Musk’s plans go beyond this: he wants to use BCIs in a bi-directional capacity so that plugging in could make us smarter, improve our memory, help with decision-making and eventually provide an extension of the human mind. “Musk’s goals of cognitive enhancement relate to healthy or able-bodied subjects, because he is afraid of AI and that computers will ultimately become more intelligent than the humans who made the computers,” explains BCI expert Professor Pedram Mohseni of Case Western Reserve University, Ohio, who sold the rights to the name Neuralink to Musk. “He wants to directly tap into the brain to read out thoughts, effectively bypassing low-bandwidth mechanisms such as speaking or texting to convey the thoughts. This is pie-in-the-sky stuff, but Musk has the credibility to talk about these things,” he adds.
Musk is not alone in believing that “neurotechnology” could be the next big thing. Silicon Valley is abuzz with similar projects. Bryan Johnson, for example, has also been testing “neural lace.” He founded Kernel, a startup to enhance human intelligence by developing brain implants linking people’s thoughts to computers. In 2015, Facebook CEO Mark Zuckerberg said that people would one day be able to share “full sensory and emotional experiences” online – not just photos and videos. Facebook has been hiring neuroscientists for an undisclosed project at its secretive hardware division, Building 8. However, it is unlikely this technology will be available anytime soon, and some of the more ambitious projects may be unrealistic, according to Mohseni.
“In my opinion, we are at least 10 to 15 years away from the cognitive enhancement goals in healthy, able-bodied subjects. It certainly appears, from the more immediate goals of Neuralink, that the neurotechnology focus will continue to be on patients with various neurological injuries or diseases,” he says. Mohseni says one of the best current examples of cognitive enhancement is the work of Professor Ted Berger, of the University of Southern California, who has been working on a memory prosthesis to replace the damaged parts of the hippocampus in patients who have lost their memory due to, for example, Alzheimer’s disease. “In this case, a computer is to be implanted in the brain that acts similarly to the biological hippocampus from an input and output perspective,” he says. “Berger has results from both rodent and non-human primate models, as well as preliminary results in several human subjects.” Mohseni adds: “The [US government’s] Defense Advanced Research Projects Agency (DARPA) currently has a programme that aims to do cognitive enhancement in their soldiers – i.e. enhance the learning of a wide range of cognitive skills, through various mechanisms of peripheral nerve stimulation that facilitate and encourage neural plasticity in the brain. This would be another example of cognitive enhancement in able-bodied subjects, but it is quite pie-in-the-sky, which is exactly how DARPA operates.”
Understanding the brain
In the UK, research is ongoing. Davide Valeriani, a senior research officer at University of Essex’s BCI-NE Lab, is using an electroencephalogram (EEG)-based BCI to tap into the unconscious minds of people as they make decisions. “Everyone who makes decisions wears the EEG cap, which is part of a BCI, a tool to help measure EEG activity. It measures electrical activity to gather patterns associated with confident or non-confident decisions,” says Valeriani. “We train the BCI – the computer basically – by asking people to make decisions without knowing the answer and then tell the machine, ‘Look, in this case, we know the decision made by the user is correct, so associate those patterns to confident decisions’ – as we know that confidence is related to probability of being correct. So during training, the machine knows which answers were correct and which ones were not. The user doesn’t know all the time.”
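The training procedure Valeriani describes – recording EEG patterns during decisions whose correctness is known, then teaching a classifier to associate those patterns with confident (correct) or non-confident decisions – is, at its core, supervised learning. The sketch below illustrates that idea only: the features, numbers, and logistic-regression model are illustrative stand-ins, not the Essex lab’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for per-trial EEG features (e.g. band-power values from the cap's
# electrodes). Trials from correct decisions are drawn with a shifted mean,
# mimicking a confidence-related signature in the brain activity.
def make_trials(n, shift):
    return rng.normal(loc=shift, scale=1.0, size=(n, 8))

X = np.vstack([make_trials(200, 0.0), make_trials(200, 0.8)])
y = np.concatenate([np.zeros(200), np.ones(200)])  # 1 = decision was correct

# Shuffle, then hold out some trials to check generalisation.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
X_train, X_test, y_train, y_test = X[:300], X[300:], y[:300], y[300:]

# Train a logistic-regression classifier by gradient descent. During training
# the machine knows which answers were correct; the user need not.
w, b, lr = np.zeros(X.shape[1]), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X_train @ w + b)))     # predicted P(correct)
    w -= lr * X_train.T @ (p - y_train) / len(y_train)
    b -= lr * np.mean(p - y_train)

# On new trials, the predicted probability serves as a per-decision
# "confidence" readout.
p_test = 1.0 / (1.0 + np.exp(-(X_test @ w + b)))
accuracy = np.mean((p_test > 0.5) == y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

On synthetic data like this the classifier separates the two trial types well above chance; with real EEG the signal is far noisier, which is why the training phase with known answers matters.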
Valeriani adds: “I hope more resources will be put into supporting this very promising area of research. BCIs are not only an invaluable tool for people with disabilities, but they could be a fundamental tool for going beyond human limits, hence improving everyone’s life.” He notes, however, that one of the biggest challenges with this technology is that we first need to understand better how the human brain works before deciding where and how to apply BCI. “This is why many agencies have been investing in basic neuroscience research – for example, the BRAIN Initiative in the US and the Human Brain Project in the EU.” Whenever there is talk of enhancing humans, moral questions remain – particularly around where the human ends and the machine begins. “In my opinion, one way to overcome these ethical concerns is to let humans decide whether they want to use a BCI to augment their capabilities,” Valeriani says. “Neuroethicists are working to advise policymakers about what should be regulated. I am quite confident that, in the future, we will be more open to the possibility of using BCIs if such systems provide a clear and tangible advantage to our lives.”
Credit: Sarah Marsh for The Guardian, 1 January 2018.