Scientists from the University of Texas at Austin have developed a method for reconstructing a person's stream of thought from functional magnetic resonance imaging (fMRI) scans. For this purpose they used the GPT-1 language model.
With the help of artificial intelligence, fMRI scans can be used to read brain activity. The technology is capable of converting thoughts into text and thus literally reading people's minds, say the scientists, who recently published a paper on the study in Nature Neuroscience under the title "Semantic reconstruction of continuous language from non-invasive brain recordings."
The researchers were able to reconstruct not only individual words and phrases, but also long sequences of them. To train the artificial intelligence, participants lying in the MRI scanner listened to a variety of podcasts for a long time, 16 hours in total.
The decoder, a program based on the GPT-1 model (a predecessor of ChatGPT), is trained to relate each participant's brain activity to the words they hear in the podcasts.
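The general idea behind this kind of decoding can be illustrated with a toy sketch. All names, feature vectors, and the "encoding model" below are hypothetical stand-ins: a model predicts brain activity from candidate word sequences, and a beam search keeps the sequences whose predicted activity best matches the measured scan. The real system uses GPT-1 to propose word continuations and an fMRI encoding model fitted on the 16 hours of podcast data.

```python
# Toy sketch (all vectors and the vocabulary are invented for illustration).
# An "encoding model" predicts brain activity from word features; a beam
# search keeps candidate word sequences whose predicted activity is closest
# to the measured scan.

# Hand-made "semantic feature" vectors for a tiny vocabulary (assumption).
FEATURES = {
    "i": [1.0, 0.0], "still": [0.5, 0.5], "have": [0.2, 0.8],
    "no": [0.0, 1.0], "license": [0.9, 0.4],
}

def predict_activity(words):
    """Toy encoding model: predicted scan = sum of word feature vectors."""
    total = [0.0, 0.0]
    for w in words:
        f = FEATURES[w]
        total = [total[0] + f[0], total[1] + f[1]]
    return total

def mismatch(pred, measured):
    """Squared error between predicted and measured activity."""
    return sum((p - m) ** 2 for p, m in zip(pred, measured))

def beam_decode(measured, length, beam_width=2):
    """Keep the beam_width sequences whose predicted activity fits best."""
    beams = [[]]
    for _ in range(length):
        candidates = [seq + [w] for seq in beams for w in FEATURES]
        candidates.sort(key=lambda s: mismatch(predict_activity(s), measured))
        beams = candidates[:beam_width]
    return beams[0]

# Simulate a "scan" of someone hearing "i still have no license",
# then try to decode it back from the activity alone.
measured = predict_activity(["i", "still", "have", "no", "license"])
decoded = beam_decode(measured, 5)
print(decoded)
```

Notably, even this toy decoder recovers a word sequence whose predicted activity closely matches the scan without reproducing the exact original words, which mirrors the paraphrase-style errors the researchers report.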
Research teams have previously had some success with implantable human-machine interfaces. In this new study, however, no implants were used at all, and it is the non-invasiveness of the method that makes it particularly promising.
Such technologies would allow people who have lost the ability to speak or the mobility of their limbs to write with the power of thought.
At this stage, the Austin team's results are not perfect: the decoder sometimes turns the phrases it hears into quite different ones. For example, the phrase "I still don't have a license" was converted by the decoder, based on indicators of brain activity, into the sentence "She hasn't even started learning to drive."
The decoder can also oversimplify a complex phrase. For example, the original line "I didn't know what to do, scream, cry or run. I said, 'Leave me alone!'" came out as "I started screaming and crying, and then I just said, 'I asked you to leave me alone.'"
Importantly, these are not mere AI fabrications: the resulting phrases were generated strictly from fMRI brain-scan data. In some experiments, participants were asked to watch a video with the sound off, and the decoder managed to describe roughly what they were seeing.
It will take some time before the method becomes truly practical, but experts in the field are already calling it an impressive breakthrough. At the same time, concerns have been raised that the technology could be used for purposes other than helping people with disabilities.
"We take concerns that the described method could be used for evil very seriously and are working to prevent this," says one of the study's authors. "We want to make sure that this kind of technology is used only to help people, and in good faith."