Restoration of Whole Body Movement: Toward a Noninvasive Brain-Machine Interface System
This article highlights recent advances in the design of noninvasive neural interfaces based on the scalp electroencephalogram (EEG). The simplest of physical tasks, such as turning the page to read this article, requires an intense burst of brain activity. It happens in milliseconds and requires little conscious thought. But for amputees and stroke victims with diminished motor-sensory skills, this process can be difficult or impossible.
Our team at the University of Maryland, in conjunction with the Johns Hopkins Applied Physics Laboratory (APL) and the University of Maryland School of Medicine, hopes to offer these people newfound mobility and dexterity. In separate research thrusts, we’re using data gleaned from scalp EEG to develop reliable brain–machine interface (BMI) systems that could soon control modern devices such as prosthetic limbs or powered robotic exoskeletons.
EEG externally measures the electrical activity generated by large neural networks in the brain, and research in our laboratory was the first to demonstrate the feasibility of inferring voluntary natural movement from EEG signals, essentially decoding the human brain activity that drives physical movement (Figure 1). Invasive neural interface technologies under development allow users to think commands that are sent to sophisticated upper- or lower-limb prosthetics or used to control computer cursors; we recently reported the first EEG-based neural interface that continuously decodes imagined hand movements and requires only a single training session before subjects can operate it.

Figure 1. A noninvasive EEG-based neural interface is easier to repair or replace, if needed, and the technology is very user friendly, requiring only a fabric cap and the slight inconvenience of some goo on a person’s head where the sensors are attached. (Photograph by John T. Consoli, University of Maryland.)
Noninvasive Neural Decoding of Movement
Decoding of Upper-Limb Movements
There are more than 1.8 million amputees in the United States, according to the National Limb Loss Information Center. At least 1,600 of them are U.S. veterans wounded in Iraq and Afghanistan, and their rehabilitation is a priority of the Defense Advanced Research Projects Agency (DARPA), the military’s research and development agency.
Our team is in the preliminary stages of pairing these EEG-based findings with ongoing research at APL in Laurel, Maryland, where a team of engineers and medical experts is working on the DARPA-funded Modular Prosthetic Limb (MPL), a next-generation robotic arm designed to function like a natural limb. The APL team is building the arm, while our group is developing one of the options for its control system using our EEG-based decoders. Our team members come from backgrounds in bioengineering, electrical engineering, neuroscience, and kinesiology, as well as the university’s program in neuroscience and cognitive science.
To design the neural interfaces, we use EEG to acquire brain signals that are fed to neural decoders (filters), which translate them into control signals for driving the dexterous finger movements of the robotic arm. The filters are designed from data collected during a short calibration period that involves imagined and observed movement.
Though EEG monitoring is safer than invasive approaches, many in the scientific community had deemed it unreliable for a brain-computer interface, mainly because they believed that the human skull blocks much of the detailed brain activity needed for precision-controlled prosthetics. An article we published in the Journal of Neuroscience [1] questioned that premise, showing that we could capture and decode three-dimensional hand motions from the amplitude modulations of the smoothed EEG signals in the low-frequency delta band (<4 Hz) recorded at the scalp. Our method produces data comparable to those from invasive methods, data that can be used to drive complex robotic devices like the MPL.
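To give a concrete flavor of this preprocessing step, the short Python sketch below low-pass filters a single EEG channel to isolate the delta band. It is a minimal illustration, not our laboratory’s exact pipeline; the sampling rate and filter order are assumptions chosen for the example.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0  # assumed EEG sampling rate in Hz (illustrative only)
# Fourth-order low-pass filter with a 4-Hz cutoff (delta band).
b, a = butter(4, 4.0 / (fs / 2.0), btype='low')

def delta_band(eeg_channel):
    """Smooth one EEG channel (1-D array) by zero-phase filtering to <4 Hz."""
    return filtfilt(b, a, eeg_channel)

# Example: filter a simulated 10-s single-channel EEG trace.
eeg = np.random.randn(int(10 * fs))
smoothed = delta_band(eeg)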
To get these data, we instructed subjects to select at random and reach toward eight different targets on a computer screen. Horizontal, vertical, and depth hand velocities were recorded, and the EEG was used to predict their trajectories. In addition, standardized low-resolution brain electromagnetic tomography (sLORETA) showed that the distributed current density sources related to hand velocity are concentrated in the contralateral precentral gyrus, postcentral gyrus, and inferior parietal lobule, brain areas typically associated with sensorimotor performance.
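Decoding accuracy in studies like these is commonly summarized as the squared correlation (r²) between measured and predicted kinematics, as in the scalp map shown later in Figure 2. The following minimal sketch computes r² for a pair of hypothetical velocity traces; the synthetic data stand in for real recordings.

import numpy as np

def r_squared(measured, predicted):
    """Squared Pearson correlation between two 1-D time series."""
    r = np.corrcoef(measured, predicted)[0, 1]
    return r ** 2

# Example with synthetic data: a prediction that tracks the signal plus noise.
t = np.linspace(0, 10, 1000)
measured = np.sin(2 * np.pi * 0.5 * t)            # stand-in for hand velocity
predicted = measured + 0.5 * np.random.randn(t.size)
print(r_squared(measured, predicted))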
In subsequent articles and presentations, we provided further validation of the effectiveness of EEG signals. An article in the Journal of Neural Engineering [2] demonstrated that people using a noninvasive EEG-based brain-computer interface, with minimal training, were able to control a computer cursor with performance comparable to that of invasive brain-computer interfaces using implanted electrodes.
Decoding of Human Bipedal Locomotion
In work presented at the 2010 Neural Engineering, Science, and Technology Forum sponsored by DARPA, we showed that the linear and angular movements of the ankle, knee, and hip joints during human treadmill walking can be inferred from high-density scalp EEG, with decoding accuracies comparable to those from similar studies of nonhuman primates with electrodes implanted in their brains [3], [4] (Figure 2).

Figure 2. A representative scalp map of the spatial distribution (r²) of decoding accuracies across scalp electrodes for the angle of the right ankle joint in a healthy individual. Note the sparse network in EEG sensor space that carries information about ankle position.
We like to think that our interface will come in two flavors. One is for the restoration of function and includes the robotics work with the Johns Hopkins laboratory. The other is for rehabilitation, specifically for stroke victims with brain injuries that affect motor-sensory control and for patients with spinal cord injury (SCI). There is a big push in brain science to understand what exercise does in terms of motor learning and motor retraining of the human brain.
In the past year, we have partnered with colleagues in the Department of Physical Therapy and Rehabilitation at the University of Maryland School of Medicine in Baltimore. The research, funded in part by the National Institutes of Health and by a seed grant program between the University of Maryland and the University of Maryland, Baltimore, has tracked the neural activity of people in motion.
Subjects were instructed to walk for five minutes on a treadmill while EEG and kinematic parameters, such as Cartesian positions, joint angles, and angular velocities, were recorded. The EEG and kinematics signals were filtered offline in the low-frequency delta band, and a linear decoder (a Wiener filter with memory, in which signals up to 100 ms in the past served as inputs) was trained offline to find the optimal weights for decoding the kinematic parameters [4] (Figure 3).
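The decoder itself is compact. The sketch below fits a Wiener-style linear decoder by ordinary least squares, using the current EEG sample plus lags spanning 100 ms as inputs, as described above. The sampling rate, channel count, and synthetic calibration data are illustrative assumptions, not the exact parameters of [4].

import numpy as np

fs = 100                         # assumed sampling rate in Hz
n_lags = int(0.100 * fs) + 1     # current sample plus ten past samples (100 ms)

def lagged_design(eeg, n_lags):
    """Build a design matrix from EEG (time x channels) with time lags."""
    T, C = eeg.shape
    X = np.zeros((T - n_lags + 1, n_lags * C))
    for k in range(n_lags):
        X[:, k * C:(k + 1) * C] = eeg[n_lags - 1 - k : T - k, :]
    return X

def train_decoder(eeg, kinematics, n_lags):
    """Least-squares fit of decoder weights on calibration data.
    Assumes zero-mean signals (no intercept term)."""
    X = lagged_design(eeg, n_lags)
    y = kinematics[n_lags - 1:]          # align targets with lagged inputs
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def decode(eeg, w, n_lags):
    """Predict kinematics from EEG using the trained weights."""
    return lagged_design(eeg, n_lags) @ w

# Example with synthetic calibration data (34 channels, 60 s).
eeg = np.random.randn(60 * fs, 34)
angle = np.random.randn(60 * fs)         # stand-in for ankle joint angle
w = train_decoder(eeg, angle, n_lags)
predicted = decode(eeg, w, n_lags)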
Figure 3. Decoded ankle joint kinematics of a healthy able-bodied participant using a linear decoder. Blue is the ankle joint angle recorded during treadmill walking, and red is the angle predicted by our EEG-based neural interface.
Once again, we are intent on matching specific brain activity recorded in real time with exact intended lower-limb movements. We believe that these data could help stroke victims in several ways. One involves a robotic device called an anklebot, or ankle robot, which currently stores data from a normal human gait and assists partially paralyzed people. We are working to augment the anklebot with our BMI to actively include the patient in the control loop, thereby enhancing cortical plasticity. Providing movement feedback to the patient is also important for rehabilitation, and we are investigating ways to reflect motion and contact signals from the anklebot back to the patient. People who are less mobile commonly suffer from other health issues, such as obesity, diabetes, or cardiovascular problems, so rehabilitation experts normally want to get stroke survivors up and moving by whatever means possible.
The second use of the EEG data in stroke victims is more complex but still offers exciting possibilities. By decoding the motion of a normal gait, rehabilitation experts can try to teach stroke victims to think in ways that match their own EEG signals to the normal signals. This could retrain healthy areas of the brain through neuroplasticity.
In addition to improving the rehabilitation of stroke patients, our BMIs could be used to control lower-limb powered exoskeletons to restore walking after SCI. In this scenario, EEG decoders are calibrated using visual perception of unsteady locomotion, such as observing an avatar turning during walking, stepping up or down stairs, and so on (Figure 4). These decoders could then be used to control smart powered lower-limb exoskeletons fitted to patients with SCI so they can regain walking capabilities. It is likely that a shared control approach that allows human-machine confluence will be critical to minimize cognitive effort and achieve multitasking capabilities (Figure 5).
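One way to picture shared control in code: the minimal sketch below blends a BMI-decoded user command with an autonomous controller’s output through a fixed arbitration weight. The weight and both command sources are hypothetical placeholders; a deployed system would adapt the weighting to context and decoder confidence.

import numpy as np

def shared_control(u_user, u_machine, alpha=0.7):
    """Weighted blend of user intent and machine autonomy (0 <= alpha <= 1)."""
    return alpha * np.asarray(u_user) + (1.0 - alpha) * np.asarray(u_machine)

# Example: the user intends to veer left while the machine favors straight ahead.
u_user = [0.4, 1.0]       # decoded (turn rate, forward speed); hypothetical
u_machine = [0.0, 1.0]    # stabilizing controller output; hypothetical
print(shared_control(u_user, u_machine))   # -> [0.28, 1.0]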

Figure 4. Noninvasive BMI systems for the control of walking avatars that could soon control sophisticated prosthetic devices. Shown are bioengineering student Steve Graff (foreground, with cap), kinesiology doctoral student Alessandro Presacco (center), and lead researcher and electrical engineer José L. Contreras-Vidal, Ph.D. (background). (Photograph by John T. Consoli, University of Maryland.)

Figure 5. Demonstration of the Rex powered robotic exoskeleton (Rex Bionics) by a wheelchair user wearing a wireless 64-channel active EEG cap used in our neural interface development. Shown are the time series of the user’s brain waves as recorded by the EEG cap. From left: Faisal Almesfer, Steve Holbert, and Jedy Shishbaradaran. (Photograph by Joy Wilson, Department of Health and Human Performance, University of Houston.)
Conclusions
We strongly believe that our laboratory is on track to develop, test, and make available to the public, within the next few years, safe, reliable, and noninvasive BMI-controlled robotic systems that can bring life-changing technology to millions of people who have difficulty moving freely.
To accomplish this vision, our team is pursuing new partnerships with The Methodist Hospital Research Institute (Houston, Texas) and the Department of Electrical and Computer Engineering at the University of Houston (UH), the new home for our Laboratory for Noninvasive BMI Systems. We are collaborating with Rex Bionics, innovators in advanced robotics, to provide a noninvasive neural interface to their independent robotic exoskeleton [Rex (www.rexbionics.com), see Figure 5]. This partnership will demonstrate the capability to achieve thought-controlled (hands-free) exoskeleton use and establish the medical benefits and quality-of-life enhancements realized by exoskeleton users. We will team with our clinical partners at Methodist and engineers at UH to conduct clinical trials of BMI-controlled robotic systems to assess their safety and efficacy, and to use our noninvasive neural interfaces as tools for reverse-translational studies of brain plasticity and human-machine interaction and confluence.
Acknowledgments
The University of Maryland communications specialists Lee Tune and Tom Ventsias contributed to this article. This work was supported in part by the National Institute of Neurological Disorders and Stroke under award number R01NS075889, the National Science Foundation under award number IIS-1064703, the University of Maryland at College Park-University of Maryland at Baltimore Seed Grant Program, and the National Academies Keck Futures Initiative on Smart Prosthetics.
Contributor Information
José L. Contreras-Vidal, Department of Electrical and Computer Engineering, University of Houston, Texas.
Alessandro Presacco, Department of Kinesiology, University of Maryland, College Park, Maryland.
Harshavardhan Agashe, Department of Electrical and Computer Engineering, University of Houston, Texas.
Andrew Paek, Department of Electrical and Computer Engineering, University of Houston, Texas.

