One main focus is to bring research out of the lab and have a positive impact on society. In a recent project, I have been working with several students to design a brain-machine interface (BMI) and robotic system, built from off-the-shelf EEG and robotic components, in which robots assist users by navigating a room and grasping objects, with potential applications as assistive devices for elderly and disabled patients.
Specifically, the agent is an iRobot Create augmented with a rotatable camera and a robotic arm. Using EEG signals, subjects quickly learn to direct the robot to a desired location in a room. At the current stage, the robot uses bio-inspired visual attention mechanisms to propose candidate objects for the robotic arm to pick up, and the user selects the target among them via EEG. Planned extensions of the visual system will allow the user to simply name a target object from a large class of objects that the robot has learned to classify. To accomplish this complex task, we employ a two-way co-adaptation paradigm in which both the subject and the robot adapt to each other.
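The EEG-based target selection described above can be sketched at a toy level. This is a hypothetical illustration, not the project's actual pipeline: each candidate object is highlighted repeatedly, each highlight yields a scored EEG epoch, and the candidate with the largest mean score wins, the basic idea behind P300-style selection. All names (`score_epoch`, `select_target`) and the noise model are assumptions for the sketch.

```python
import random

def score_epoch(candidate, attended, rng):
    """Toy epoch score: the attended object evokes a larger response plus noise."""
    signal = 1.0 if candidate == attended else 0.0
    return signal + rng.gauss(0.0, 0.5)

def select_target(candidates, attended, n_flashes=20, seed=0):
    """Average epoch scores over repeated highlights and pick the best candidate."""
    rng = random.Random(seed)
    means = {}
    for obj in candidates:
        scores = [score_epoch(obj, attended, rng) for _ in range(n_flashes)]
        means[obj] = sum(scores) / len(scores)
    return max(means, key=means.get)

objects = ["cup", "book", "phone"]
print(select_target(objects, attended="book"))
```

Averaging over many highlights is what lets a weak, noisy single-trial signal accumulate into a reliable choice, which is also why such interfaces trade speed for accuracy.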
The subject learns, through practice, to modulate EEG signals to improve control; this type of learning has proven crucial to BMI performance in other domains. A computer onboard the robot runs adaptive algorithms based on synaptic plasticity that a) learn visually guided reaching with the robotic arm through a phase of motor babbling, and later b) learn to classify objects of interest independently of viewpoint. The project has significant potential for clinical applications: elderly individuals and patients with severe motor impairment, such as disabled veterans or stroke survivors, could regain some agency by controlling these mobile devices, improving their quality of life. Novel commercial applications for healthy subjects are also possible.
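The motor-babbling idea mentioned above can be sketched minimally. This is an illustration under assumed simplifications, not the lab's algorithm: a two-link planar arm moves to random joint angles, records which hand position each posture produces, and then reaches new targets by recalling the babbled posture whose hand position was closest (a nearest-neighbor inverse model). The link lengths, sample count, and function names are all assumptions.

```python
import math
import random

L1, L2 = 1.0, 1.0  # link lengths of the toy two-link arm

def forward(theta1, theta2):
    """Forward kinematics: joint angles -> hand position (x, y)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def babble(n=2000, seed=0):
    """Motor babbling: try random postures, logging (posture, hand) pairs."""
    rng = random.Random(seed)
    memory = []
    for _ in range(n):
        angles = (rng.uniform(-math.pi, math.pi), rng.uniform(-math.pi, math.pi))
        memory.append((angles, forward(*angles)))
    return memory

def reach(target, memory):
    """Recall the babbled posture whose hand position lies nearest the target."""
    return min(memory, key=lambda m: math.dist(m[1], target))[0]

memory = babble()
angles = reach((1.2, 0.8), memory)
print(forward(*angles))  # hand position of the recalled posture
```

The appeal of babbling is that the arm needs no analytic inverse kinematics: the mapping from vision to action is discovered from the robot's own exploratory movements, which is what makes the approach adaptive to the particular body it runs on.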
The following NSF Science Nation episode features the CELEST work on BCI, including the Neuromorphics Lab's work on adaptive robotics and BCI.
Shown below is a preliminary video of the iRobot Create and robotic arm controlled by an EEG-based BMI, depicting the first successful attempts by Byron Galbraith and Sean Lorenz to control the robot with BCI.
Abstracts & Conference papers
- Galbraith B., Versace M., and Chandler B. (2011) Asimov: Middleware for Modeling the Brain on the iRobot Create. Submitted to PyCon, Atlanta, March 11-13th 2011.