A specially trained “decoder” slashes the time it takes a brain-computer interface (BCI) to start working, potentially making the tech accessible to more people.
The challenge: BCIs translate brain signals into commands for computers, robotic limbs, and other machines. Some use implants to detect these signals, while others rely on noninvasive tech, like EEG caps.
One thing they all have in common, though, is the need for calibration — no two brains are identical, and before a person can use a BCI, it needs to be trained on their specific brain data.
This usually means thinking about the same thing over and over again until the BCI learns to recognize the pattern of brain waves generated by the thought — a time-consuming, tedious process.
“You do [it] 600, 700 times until you want to cry,” Perrikaryal, a Twitch streamer who trained a BCI to allow her to play video games with her mind, told Freethink, laughing. “It’s fueled by sadness.”
Only after the BCI is trained to recognize the pattern can you program it to do something in response, like move a cursor on a screen, control a prosthetic arm, or attack a baddie in a video game.
What’s new? Engineers at UT Austin have now developed a “one-size-fits-all” BCI that works without extensive calibration, potentially eliminating a major hurdle to using the systems. They’ve published their work in PNAS Nexus.
“When we think about this in a clinical setting, this technology will make it so we won’t need a specialized team to do this calibration process, which is long and tedious,” said researcher Satyam Kumar. “It will be much faster to move from patient to patient.”
How it works: The UT Austin team recorded brain data from one person as they used a noninvasive BCI to perform a simple task on a computer screen with their mind. The researchers then used that data to build a machine learning program they call a “decoder.”
Over the course of five online sessions, 18 healthy volunteers attempted the same task with the BCI. Their brain data was run through the decoder, which served as a foundation for the system, allowing it to determine each volunteer’s intention without extensive calibration.
The decoder was so effective that the volunteers could even use the BCI to play a more complex racing game with their minds, even though the original trainee had never played that specific game.
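The article doesn’t spell out the team’s actual model, but the general idea — calibrate a decoder once on one person’s brain data, then reuse it for new users without retraining — can be sketched roughly as below. The classifier choice (linear discriminant analysis), the feature layout, and the random stand-in data are illustrative assumptions, not the UT Austin team’s implementation.

```python
# Minimal sketch of a "calibrate once, reuse for everyone" decoder.
# All data here is random stand-in data; real systems would use EEG features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Stand-in recordings from the one "expert" user: rows are trials, columns are
# EEG features (e.g., band power per channel); labels are the intended command
# (0 = "left", 1 = "right").
expert_features = rng.normal(size=(600, 32))
expert_labels = rng.integers(0, 2, size=600)

# 1) Calibrate once, on the expert's recordings.
decoder = LinearDiscriminantAnalysis()
decoder.fit(expert_features, expert_labels)

# 2) A new user skips calibration: their incoming brain data goes straight
#    through the pre-trained decoder to produce a command.
new_user_trial = rng.normal(size=(1, 32))
command = decoder.predict(new_user_trial)[0]
print("decoded command:", "left" if command == 0 else "right")
```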
Looking ahead: Neither task was particularly complex — even in the racing game, volunteers could only steer their car left or right — but the UT Austin team plans to continue developing their decoder and then trial it with people who have motor disabilities.
“The point of this technology is to help people, help them in their everyday lives,” said researcher José del R. Millán. “We’ll continue down this path wherever it takes us in the pursuit of helping people.”