New study
With a brain scanner and AI: How we could read minds in the future

US researcher Jerry Tang collecting data on brain activity at the University of Texas Biomedical Imaging Center in Austin

© Nolan Zuk / University of Texas at Austin / DPA

Will we one day be able to read minds? With the help of artificial intelligence, researchers have come a step closer. But they also warn of the consequences.

US scientists say they have succeeded in using brain scanners and artificial intelligence to roughly capture what people are thinking – and have thus come a good deal closer to reading minds. The researchers at the University of Texas at Austin use a speech decoder that works “on a completely different level” than current devices, Alexander Huth, a neuroscientist at the university and co-author of the study, said at a press conference. The study was published in the journal Nature Neuroscience on Monday.

Speech decoders could in the future enable people who have lost the ability to speak to spell out words and even whole sentences using a brain implant. Such brain-computer interfaces focus on the part of the brain that controls the mouth as it tries to form words.

“Our system works (…) on the level of ideas, semantics, meaning,” said Huth. It is the first system that can reconstruct speech without a brain implant.

Researchers used a predecessor of ChatGPT

For the study, three subjects each spent a total of 16 hours in a functional magnetic resonance imaging (fMRI) scanner – a device that measures brain activity by tracking changes in blood flow – and listened to spoken podcasts. In this way, the researchers were able to record the responses that words, sentences and meanings triggered in the brain regions responsible for processing language.

They fed this data into a neural network language model based on GPT-1, a predecessor of the AI technology that ChatGPT also builds on. The model was trained to predict how the brain would respond to the speech it heard; to decode, it then generates candidate sentences and selects those whose predicted brain responses best match the recorded activity. For example, when a study participant heard the sentence “I don’t have a driver’s license yet”, the model responded: “She hasn’t even started learning to drive”.
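To make the decoding idea concrete, here is a minimal, hypothetical Python sketch of the candidate-scoring loop described above: an encoding model predicts the fMRI response a sentence would evoke, and decoding keeps the candidate whose prediction best correlates with the recorded scan. The function names, the toy hash-based embedding and the random linear weights are all illustrative assumptions; the actual study derived semantic features from GPT-1 and fitted its encoding model on the hours of training scans.

```python
import numpy as np

# Hypothetical sketch only: all names and numbers are illustrative,
# not taken from the published code.
rng = np.random.default_rng(0)
N_VOXELS, N_FEATURES = 500, 64

def embed(sentence: str) -> np.ndarray:
    """Stand-in for semantic features of a sentence (assumption: in the
    study these come from GPT-1's internal representations)."""
    local = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return local.standard_normal(N_FEATURES)

# Linear encoding model: weights mapping semantic features to predicted
# voxel activity (in the study, fitted on the fMRI training data).
W = rng.standard_normal((N_VOXELS, N_FEATURES))

def predict_response(sentence: str) -> np.ndarray:
    """Predict the fMRI activity pattern a sentence would evoke."""
    return W @ embed(sentence)

def decode(observed: np.ndarray, candidates: list[str]) -> str:
    """Keep the candidate whose predicted response correlates best
    with the actually recorded brain activity."""
    return max(candidates,
               key=lambda s: np.corrcoef(predict_response(s), observed)[0, 1])

# Toy usage: simulate a noisy scan for one sentence, then recover it
# from a small candidate pool.
truth = "I don't have a driver's license yet"
observed = predict_response(truth) + 0.1 * rng.standard_normal(N_VOXELS)
pool = [truth, "She ordered a pizza", "The train was late again"]
print(decode(observed, pool))  # -> picks the sentence closest in meaning
```

In the system the researchers describe, candidate sentences are not drawn from a fixed pool as in this toy example; the language model proposes likely continuations word by word, and the best-matching ones are kept.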

Huth explained that even when the subjects made up their own stories, the decoder was able to capture the gist. “This shows that we are decoding something deeper than speech and then turning it into speech,” the researcher said.

Machines make it possible to read minds

David Rodriguez-Arias Vailhen, a bioethics professor at the University of Granada in Spain who was not involved in the experiment, confirmed that the system goes beyond what previous brain-computer interfaces had achieved.

“This brings us closer to a future in which machines are able to read and transcribe minds,” he said, warning that this could potentially happen against a person’s will, for example while they are asleep.

cl
AFP
