By analysing specific signals in your brain, researchers can now tell whether you’ve actually understood what you’ve just heard.
At some point, you’ve probably had a conversation where the other person said they knew what you were talking about, but you could tell deep down that wasn’t the case.
Well now, a team of neuroscientists at Trinity College Dublin and the University of Rochester in the US has delved deep into the mind to identify a specific brain signal associated with the conversion of speech into understanding.
When the signal is present, it confirms that the listener has understood what they’ve just been told; when it’s absent, the message went straight over their head or they weren’t paying attention.
During our everyday interactions, we routinely speak at rates of between 120 and 200 words per minute (roughly one word every 300 to 500 milliseconds), meaning our brains have to compute the meaning of each of these words very quickly.
Although we take this incredible skill for granted, how our brains compute the meaning of words in context has, until now, remained unclear.
Computer v human
In a paper published in Current Biology, the international team took inspiration from the voice-recognition technology found in computers and smartphones, which ‘understands’ speech by extracting patterns from huge volumes of text and learning from them.
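The article doesn’t spell out the underlying technique, but the general idea of building word representations from patterns in large bodies of text can be sketched in a few lines of Python. Everything below (the toy corpus, the context window, the counting scheme) is an illustrative assumption, not the researchers’ actual method.

```python
# Illustrative sketch only: build word vectors from co-occurrence counts
# in a (tiny) text corpus. Real systems train on billions of words; the
# corpus, window size and counting scheme here are assumptions.
from collections import Counter, defaultdict

import numpy as np

corpus = ("the bat flew out of the cave while "
          "the player swung the bat at the ball").split()
vocab = sorted(set(corpus))

window = 2  # how many neighbouring words count as "context"
counts = defaultdict(Counter)
for i, word in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            counts[word][corpus[j]] += 1

# Each word's "meaning" is its pattern of co-occurrence with the vocabulary.
vectors = {w: np.array([counts[w][v] for v in vocab], float) for w in vocab}
print(vectors["bat"])  # the learned pattern for 'bat'
```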
This is quite different from how we comprehend spoken language: human babies are hardwired to learn to speak from only a small number of speech examples.
That ability allows us to tell the difference between identical words that have very different meanings depending on context, such as ‘bat’ (the animal or the piece of sporting equipment).
So, using a series of sensors to monitor electrical brainwave signals (a technique known as electroencephalography, or EEG), the team asked participants in the study to listen to a number of audiobooks.
By analysing their brain activity, the neuroscientists identified a specific brain response that reflected how similar or different a given word was from the words that preceded it in the story.
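As a rough illustration of what such a measure might look like, the sketch below scores each word by one minus the cosine similarity between its vector and the average vector of the words that came before it. Both the toy vectors and the scoring formula are assumptions for illustration; the paper’s exact computation may differ.

```python
# Hedged sketch: score each word in a story by how dissimilar its vector
# is from the average vector of the words that preceded it. The scoring
# formula (1 - cosine similarity) is an assumption for illustration.
import numpy as np

def semantic_dissimilarity(word_vectors):
    """Return one score per word after the first: high = unlike its context."""
    scores = []
    for i in range(1, len(word_vectors)):
        word = word_vectors[i]
        context = np.mean(word_vectors[:i], axis=0)  # average of prior words
        cosine = word @ context / (np.linalg.norm(word) * np.linalg.norm(context))
        scores.append(1.0 - cosine)
    return scores

# Toy example: the third word points in a very different direction
# from the first two, so it gets a much higher dissimilarity score.
story = [np.array([1.0, 0.0]), np.array([0.9, 0.1]), np.array([0.0, 1.0])]
print(semantic_dissimilarity(story))  # roughly [0.006, 0.947]
```

Word-by-word scores like these could then be lined up against the recorded brainwave activity to see whether the brain tracks them.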
This signal disappeared when the speech became incomprehensible to the subject, either because there was too much background noise or because they had stopped paying attention.
Testing for stressful roles
There are a number of potential applications for such a breakthrough, including tracking language development in infants, assessing brain function in unresponsive patients and detecting early-onset dementia in older people.
Prof Ed Lalor, who led the research, even thinks it could be expanded to include candidate testing for jobs that require quick reaction times.
“The presence or absence of the signal may also confirm if a person in a job that demands precision and speedy reactions – such as an air traffic controller, or soldier – has understood the instructions they have received.”
The team said there is more work to be done before the science is fully understood, particularly regarding the full range of computations that our brains perform when we understand speech.