The educational revolution with Elon Musk’s brain chip
Imagine a future where people no longer learn languages. Instead, just as many of us today stream music or movies on demand, in the future we will stream language directly from the Internet to a neural implant in our heads. This would require brain-computer interface technology: a language chip implanted in the cerebral cortex, plus a modem chip, possibly behind the ear, capable of receiving signals from the language stream. The idea builds on current technology and ongoing research connecting the human brain to computers.
But such technology raises many ethical questions: the potential for brain chips to be used and abused by governments or big tech, the risk of hacking individual brains, and ultimately the prospect of implants that alter language itself. Language is a vital element of being human, shaping the way we think, feel, learn, and interact with each other and the world around us. In fact, more than half of the world's population speaks at least two languages, and often more; in South Africa, for example, many people speak five.
From a psychological perspective, it is well established that speaking and learning another language has significant benefits for our cognitive capacity, beyond the obvious practical benefits of being able to communicate more broadly. Speaking two or more languages sharpens the mind, from making it easier to grasp new skills to improving the ability to concentrate. It also improves memory and offers some protection against dementia in old age: bilingual people are less likely to suffer cognitive impairment in late life than their monolingual peers.
Can language learning be replaced by technology?
In 2016, Elon Musk founded Neuralink, a neurotechnology company working on implantable brain-computer interface chips. Initially, Neuralink focused on technology that would allow people with paralysis to operate a computer or smartphone without needing to move. Today, this involves implanting computer chips in the brain that can communicate wirelessly with external computers.
Neural implants are increasingly used to bypass areas of the brain that have become dysfunctional, serving as a biomedical prosthesis in cases of stroke or head injury. Other applications include implants that provide deep brain stimulation for Parkinson's disease or even for treating depression. Elon Musk's vision is for brain implants to go beyond medical treatment and enhance the human experience itself.
One such enhancement is language learning: Musk has claimed that brain implants could soon spell the end of old-fashioned language learning. He has gone so far as to suggest that, perhaps within five to ten years, human languages could be replaced by a single universal language via brain implants, making communication more effective and convenient. While Musk's claims and predictions are unlikely to come true, at least within the predicted time frame, it is worth asking what the psychological consequences would be. If future technology could solve the problem of language learning by transmitting language directly to our brains, what would that mean for the way we communicate?
And what would it mean for the way we think, feel, and experience the world around us? The basic function of language is to facilitate communication, and language does this well: by encoding and externalizing concepts, thoughts, and ideas, it turns the formless thoughts locked inside our heads into something we can speak, write, or sign, so that our ideas can reach other minds in the real world. Psychologists describe this core function of language as serving a communicative intention; for simplicity, we can refer to the process as meaning transfer.
But recent research in the psychological sciences has shown that the meanings conveyed by language are rooted in our lived, everyday experiences. For example, when I say "hammer a nail into a plank," you understand these words in part by activating the motor areas of your brain that know what it feels like to pick up a hammer and strike a nail. In other words, understanding language involves activating the embodied experience of the very situation the words are intended to convey. Given that the meanings conveyed by language depend on actual experience, that is, on the very brain states that encode lived experience, the question remains: how would brain implants facilitate language if words are not tied to those same brain states in the living brain that formed them? If language is broadcast to us through a computer, perhaps via the Internet, where do the meanings of words come from? This is known as the symbol grounding problem. Words are not abstract entities.
Words can be used to do things, to make us fall in love or make someone laugh, because of the meanings attached to them. If we simply download language from a computer, the problem is how to ground it in meanings that genuinely take hold in our minds. And finally, if language learning can be handed over to brain implants, it may irreversibly change the nature of language, of communication, and of what it means to be human.