(August 18, 2015) Scientists at Korea University in Korea and TU Berlin in Germany have developed a brain-computer interface (BCI) that controls a lower limb exoskeleton by decoding specific signals from the user’s brain.
Using an electroencephalogram (EEG) cap, the system allows users to move forwards, turn left and right, sit and stand simply by staring at one of five flickering light-emitting diodes (LEDs).
The results are published today (Tuesday 18th August) in the Journal of Neural Engineering.
Each of the five LEDs flickers at a different frequency, and when the user focuses their attention on a specific LED, that frequency appears in the EEG readout as a steady-state visual evoked potential (SSVEP). This signal is identified and used to control the exoskeleton.
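The decoding idea can be illustrated with a minimal sketch: pick the LED whose flicker frequency carries the most spectral power in the recorded EEG. The sampling rate, window length and flicker frequencies below are illustrative assumptions, not values from the paper, and a single simulated channel stands in for real EEG.

```python
import numpy as np

FS = 250            # sampling rate in Hz (assumed)
DURATION = 2.0      # analysis window in seconds (assumed)
LED_FREQS = [9.0, 11.0, 13.0, 15.0, 17.0]  # hypothetical flicker frequencies

def classify_ssvep(eeg, fs=FS, freqs=LED_FREQS):
    """Return the index of the LED whose flicker frequency shows the most
    spectral power in the EEG segment (a naive single-channel detector)."""
    spectrum = np.abs(np.fft.rfft(eeg))
    fft_freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    powers = []
    for f in freqs:
        bin_idx = np.argmin(np.abs(fft_freqs - f))  # nearest FFT bin
        powers.append(spectrum[bin_idx])
    return int(np.argmax(powers))

# Simulate a user attending to the 13 Hz LED: a weak 13 Hz oscillation
# buried in broadband noise.
rng = np.random.default_rng(0)
t = np.arange(0, DURATION, 1.0 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 13.0 * t) + rng.normal(0, 1.0, t.size)
print(classify_ssvep(eeg))  # prints 2, the index of the 13 Hz LED
```

Each detected index would then map to one of the five exoskeleton commands (forward, left, right, sit, stand). A deployed system would use multiple channels and a more robust detector, but the principle is the same.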
Figure 1. (a) Components of an SSVEP-based exoskeleton system. (b) State machine diagram. (c) Visual stimulation unit. (Source: Journal of Neural Engineering, IOPscience.)
A key problem has been separating these precise brain signals from those associated with other brain activity, and from the highly artificial signals generated by the exoskeleton itself.
“Exoskeletons create lots of electrical ‘noise’,” explains Klaus Muller, an author on the paper. “The EEG signal gets buried under all this noise, but our system is able to separate not only the EEG signal, but also the frequency of the flickering LED within this signal.”
Although the paper reports tests on healthy individuals, the system has the potential to assist people with illness or mobility impairments.