Meta is working on ways to read minds using AI

6 Sep 2022


In a pre-print study, Meta scientists said their AI model was able to decode speech segments from three seconds of brain activity.

New exploratory research at Meta is looking at ways to understand what’s happening in people’s minds using AI.

In an ambitious study, which has not been peer reviewed, Meta scientists sought to decode speech from non-invasive brain recordings using an AI model.

The parent company of Facebook said this is a “long-awaited goal” in both healthcare and neuroscience, as it could improve the lives of those with brain injuries that make them unable to communicate through speech, typing or gestures.

In the pre-print study, researchers tested the AI model on four public datasets, which consisted of 169 volunteers listening to audiobooks and isolated sentences in English and Dutch.

The brain activity of these volunteers was captured using electroencephalography (EEG) and magnetoencephalography (MEG), which are non-invasive ways to record brain activity.

The Meta researchers said that from three seconds of brain activity, the AI model was able to decode corresponding speech segments with up to 73pc “top-10 accuracy” from a vocabulary of 793 words.
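"Top-10 accuracy" here means the correct speech segment appears among the model's ten highest-ranked candidates. A minimal sketch of how such a metric can be computed (the function name and toy data are illustrative, not from the study):

```python
import numpy as np

def top_k_accuracy(scores, true_indices, k=10):
    """Fraction of samples whose true segment ranks in the top k.

    scores: (n_samples, n_candidates) similarity between each brain
            recording and every candidate speech segment.
    true_indices: (n_samples,) index of the correct segment per sample.
    """
    # Indices of the k highest-scoring candidates for each sample
    top_k = np.argsort(scores, axis=1)[:, -k:]
    hits = [true_indices[i] in top_k[i] for i in range(len(true_indices))]
    return float(np.mean(hits))

# Toy example: 3 brain-recording samples, 5 candidate speech segments
scores = np.array([
    [0.1, 0.9, 0.2, 0.3, 0.4],
    [0.8, 0.1, 0.2, 0.3, 0.4],
    [0.1, 0.2, 0.3, 0.4, 0.5],
])
print(top_k_accuracy(scores, np.array([1, 0, 0]), k=2))  # 2 of 3 correct
```

In the study's setting, the candidate pool would be drawn from the 793-word vocabulary, and a 73pc score means the right segment was in the model's top ten guesses almost three times out of four.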

The research is still in its early stages, however. Jean-Rémi King, a researcher in Meta’s Brain and AI group, told Time that a number of challenges were found in the study, including the fact that brain signals are “noisy”.

“The sensors are pretty far away from the brain,” King said. “There is a skull, there is skin, which can corrupt the signal that we can pick up. So picking them up with a sensor requires super advanced technology.

“The other big problem is more conceptual in that we actually don’t know how the brain represents language to a large extent.”

Meta said the work is part of a “broader effort by the scientific community” to use AI to better understand the human brain. The company described the work as a “first step”, since enabling patients to communicate will require extending it from speech perception to speech production.

“I take this more as a proof of principle that there may be pretty rich representations in these signals – more than perhaps we would have thought,” King told Time.

Meta has been looking at AI as a means to improve communication in various ways. In July, the company said it had developed an AI model that can translate 200 different languages, and that one of its AI models can automatically verify hundreds of thousands of Wikipedia citations at once.


Leigh Mc Gowran is a journalist with Silicon Republic
