This is your brain on a computer
- By Kevin McCaney
- May 11, 2015
An EEG-based brain-computer interface enables direct communication.
The human interface with a computer has gone from the keyboard to the mouse to a touchpad to a still-uncertain flirtation with gesture control. Could thought control be next?
The Army Research Laboratory is taking the possibility seriously, exploring ways to meld the mind with software in a brain-computer interface, or BCI. Although it’s a long way off—requiring both human training and serious algorithm upgrades—researchers think it has the potential to transform applications ranging from medical treatments to how soldiers communicate on the battlefield.
BCI is not entirely new, although to date it's mostly been used with people who are paralyzed and can communicate only with their eyes. But thanks to improvements in both computing technologies and neuroscience, those users can now perform tasks such as writing, making a phone call or controlling a robotic arm with their thoughts, according to an ARL release.
ARL, along with teams of university researchers, is now exploring how to expand those capabilities for wider use as part of a multimillion-dollar effort.
“ARL recognizes that BCI is an emerging area with a high potential for revolutionizing the way we communicate with machines and that the potential exists for larger scale real-world applications such as brain-based communication through everyday devices,” said Dr. Liyi Dai, program manager in the Computer Sciences Division at ARL’s Army Research Office in Research Triangle Park, N.C.
The challenges facing the research involve both the methods of recording brain activity and the software algorithms necessary to interpret that activity.
The two primary methods for recording brain activity—non-invasive electroencephalography, or EEG, which uses electrodes placed on the scalp, and invasive electrocorticography, or ECoG, in which electrodes are placed on the exposed surface of the brain—have worked only in laboratory settings and aren't currently suitable for everyday use, ARL said.
In addition, current algorithms can interpret only a narrow range of activities and lack a feedback mechanism that could help train users to work with the software. ARL researchers are working on “creating advanced computation algorithms so that, with the new algorithms, BCI capabilities are moving a step closer toward real applications,” Dai said. “The new algorithms put greater emphasis on the dynamics of brain signals and the interaction of different parts of the brain.”
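To give a sense of what “interpreting brain activity” means in practice, the sketch below shows one of the simplest techniques used in EEG-based BCI research: extracting frequency-band power from a recorded signal and classifying an epoch by its dominant band. This is purely illustrative and not ARL's method—the sampling rate, band definitions, synthetic signals and threshold rule are all assumptions for the example; real pipelines use many electrode channels, artifact removal and trained classifiers.

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz, a common choice for EEG hardware

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def classify(epoch):
    """Label a one-second epoch by whichever band carries more power:
    alpha (8-12 Hz, associated with relaxed states) or beta (13-30 Hz)."""
    alpha = band_power(epoch, FS, 8, 12)
    beta = band_power(epoch, FS, 13, 30)
    return "alpha" if alpha > beta else "beta"

# Synthesize two one-second epochs standing in for recorded EEG:
# one dominated by a 10 Hz (alpha) rhythm, one by a 20 Hz (beta) rhythm,
# each with additive noise.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
alpha_epoch = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(FS)
beta_epoch = np.sin(2 * np.pi * 20 * t) + 0.3 * rng.standard_normal(FS)

print(classify(alpha_epoch))  # alpha
print(classify(beta_epoch))   # beta
```

A feedback mechanism of the kind the researchers describe would close the loop on this: the system would show the user its classification in real time, letting the user learn to produce more distinguishable signals while the algorithm adapts in turn.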
The initiative is in its nascent stages and will likely take many years to come to fruition, but researchers ultimately hope it could lead to military systems controlled by thought alone.
The two specific projects within the initiative are pointed that way. The first, called “A Brain-Based Communication and Orientation System,” aims to develop a prototype system to detect imagined speech and monitor a user’s attention and orientation through real-time recordings of brain activities. The second, “Silent Spatialized Communication among Dispersed Forces,” is studying the physiological biomarkers of brain signals to detect imagined speech, which would help design the algorithms necessary to allow such silent communication.
“Progress in BCI based communication is potentially of great importance to the warfighter because BCIs would eliminate the intermediate steps required in traditional human-machine interfaces,” Dai said. “Having a soldier gain the ability to communicate without any overt movement would be invaluable both in the battlefield as well as in combat casualty care.”
Kevin McCaney is a former editor of Defense Systems and GCN.