In a room crowded with people, we can still tune into the voice of just one person, even when everyone is masked, as at a pandemic-era holiday get-together. Despite the cacophony of chatter, we somehow pick out the sounds of a single speaker.
Researchers from the University of Rochester Medical Center have gained fresh insight into how the brain deliberately tunes in to one speaker while shutting out all the others.
According to their recent study, the brain takes an extra step to grasp the words coming from the speaker being listened to, but it doesn’t take that step with the other words swirling around the conversation. Put simply, the brain hears everyone, but listening is a separate step entirely.
“Our findings suggest that the acoustics of both the attended story and the unattended or ignored story are processed similarly,” Edmund Lalor, who led the research, said in a statement. “But we found there was a clear distinction between what happened next in the brain.”
What they did: Participants in the study, which was published in The Journal of Neuroscience, were asked to listen to two stories at the same time but focus their attention on only one.
Using EEG brainwave recordings, the researchers found that the participants’ brains converted the story they were concentrating on into linguistic units known as phonemes, the distinct units of sound that distinguish one word from another.
“That conversion [into phonemes] is the first step towards understanding the attended story,” Lalor says. “Sounds need to be recognized as corresponding to specific linguistic categories like phonemes and syllables so that we can ultimately determine what words are being spoken.”
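The paper’s actual analysis pipeline isn’t reproduced here, but the general shape of this kind of EEG work can be sketched in a few lines of Python. The toy example below uses entirely synthetic data and a hypothetical phoneme-onset feature: it fits a ridge-regression “temporal response function” that maps time-lagged stimulus features onto an EEG signal, a standard building block in auditory EEG studies.

```python
# Toy sketch of a temporal response function (TRF) analysis: regress an
# EEG channel on time-lagged stimulus features via ridge regression.
# All data below are synthetic; this is not the study's actual code.
import numpy as np

rng = np.random.default_rng(0)
fs = 64                        # assumed EEG sampling rate in Hz
n = fs * 60                    # one minute of data
lags = np.arange(fs // 2)      # model lags spanning 0-500 ms

# Hypothetical stimulus feature: a sparse spike train of phoneme onsets.
phoneme_onsets = (rng.random(n) < 0.05).astype(float)

# Simulate one EEG channel as the onsets filtered through an unknown
# neural response (peaking ~125 ms after each onset) plus noise.
true_trf = np.exp(-((np.arange(len(lags)) - 8) ** 2) / 20.0)
eeg = np.convolve(phoneme_onsets, true_trf)[:n] + rng.normal(0, 0.5, n)

# Design matrix: each column is the feature delayed by one lag.
X = np.column_stack([np.roll(phoneme_onsets, lag) for lag in lags])
X[: len(lags)] = 0             # discard samples corrupted by wrap-around

# Ridge regression recovers the response: w = (X'X + aI)^-1 X'y
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(len(lags)), X.T @ eeg)
print(f"estimated response peak: {1000 * np.argmax(w) / fs:.0f} ms after onset")
```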
The study demonstrated that, in a multi-speaker scenario, EEG recordings could be used to detect which speaker a person was paying attention to.
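Here is a similarly minimal sketch of that decoding step: given EEG recorded while two people talk at once, decide which speaker the listener is attending to by checking which speech envelope the neural signal tracks more closely. Again, the data are synthetic and the “decoder” is deliberately trivial; real attention-decoding systems reconstruct the envelope from many EEG channels with a trained model.

```python
# Toy attention-decoding sketch: correlate simulated EEG with two
# candidate speech envelopes and pick the better-matching speaker.
import numpy as np

rng = np.random.default_rng(1)
n = 64 * 60                     # one minute at an assumed 64 Hz

def fake_envelope(rng, n, width=16):
    """Smoothed noise standing in for a speech amplitude envelope."""
    return np.convolve(rng.random(n), np.ones(width) / width, mode="same")

env_a = fake_envelope(rng, n)   # speaker A (attended in this simulation)
env_b = fake_envelope(rng, n)   # speaker B (ignored)

# Simulate EEG that tracks the attended stream more strongly.
eeg = 1.0 * env_a + 0.2 * env_b + rng.normal(0, 0.3, n)

# Trivial "decoder": the speaker whose envelope the EEG follows best wins.
r_a = np.corrcoef(eeg, env_a)[0, 1]
r_b = np.corrcoef(eeg, env_b)[0, 1]
print(f"correlation with A: {r_a:.2f}, with B: {r_b:.2f}")
print("decoded attention:", "speaker A" if r_a > r_b else "speaker B")
```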
Sound in the brain: This isn’t the first time the team has used EEG to understand how the brain processes sound. Last year, they used it to identify and track the brain waves that indicate whether a listener understands what is being said.
People speak, on average, about 120 to 200 words per minute. That is a lot of words: at that rate, a new one arrives roughly every 300 to 500 milliseconds, and the fact that we can follow along and grasp each word’s meaning in that fraction of a second is impressive.
Now, the ability to use EEG to recognize both that something has been heard and that it was understood could have many research applications, like testing for dementia or studying infant language development. According to the researchers, it could even be used to confirm that someone in a high-risk position (such as a pilot or soldier) understands the orders they’ve been given.