Neuromorphic hardware for robots

Mobile land and aerial robots suffer from the large quantities of data generated by their sensors and from their inability to process and evaluate this data locally. To address this problem, we suggest combining two ideas: first, evaluating only the sensor information that seems most relevant to the current task; and second, implementing biologically inspired algorithms in customized hardware to meet computation-time, power, and weight constraints that general-purpose hardware cannot satisfy.

Optic flow as input to the robot visual system

Optic flow is a powerful cue to help a robot navigate, avoid obstacles, and estimate its distance from salient objects in its environment.
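One classical way optic flow yields distance-related estimates, for instance: under pure forward translation, image features stream radially away from the focus of expansion (FOE), and a point's distance r from the FOE divided by its radial flow gives the time to contact tau = Z/V. The sketch below illustrates this relation with made-up numbers; it is not the lab's algorithm, just a minimal NumPy example.

```python
import numpy as np

# Under pure forward translation, radial flow grows linearly with distance
# from the focus of expansion:  flow(r) = r / tau,  where tau = Z / V is
# the time to contact (distance to the surface over approach speed).
def time_to_contact(r, radial_flow):
    """Least-squares estimate of tau from radial positions and flows."""
    return float(np.dot(r, r) / np.dot(r, radial_flow))

# Hypothetical scenario: a wall 2 m ahead approached at 0.5 m/s -> tau = 4 s.
r = np.linspace(5.0, 50.0, 10)   # pixel distances from the FOE
flow = r / 4.0                   # ideal noise-free radial flow (px/s)
tau = time_to_contact(r, flow)   # recovers ~4.0 seconds
```

Note that tau is obtained from image measurements alone, without knowing the metric distance or speed separately, which is what makes it attractive for lightweight obstacle avoidance.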

Data from the video stream of a passive sensor provides rich information about the environment and the movement of the sensor; optic flow, and models thereof, are key concepts for accessing this information. We develop biologically inspired algorithms for computing optic flow from video data, extracting information about sensor movement and the environment from the computed flow, and integrating this extracted information into a reinforcement learning strategy that trains for obstacle avoidance and, thus, safe navigation in small cluttered environments. In collaboration with the Integrated Circuits and System Design Group at the Boston University Electrical and Computer Engineering Department, we deploy our algorithms on Field Programmable Gate Arrays (FPGAs), preserving the flexibility to modify the algorithms. Refined algorithms are then used to design Application Specific Integrated Circuits (ASICs).
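To give a flavor of the kind of flow computation involved (this is a textbook gradient-based, Lucas-Kanade-style estimate, not the lab's biologically inspired algorithm), the following self-contained NumPy sketch recovers a single flow vector from a synthetic one-pixel shift:

```python
import numpy as np

def patch_flow(I1, I2):
    """Least-squares flow vector (u, v) for a patch pair, from the
    brightness-constancy constraint  Ix*u + Iy*v + It = 0."""
    Ix = np.gradient(I1, axis=1)   # horizontal spatial gradient
    Iy = np.gradient(I1, axis=0)   # vertical spatial gradient
    It = I2 - I1                   # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    u, v = np.linalg.lstsq(A, -It.ravel(), rcond=None)[0]
    return u, v

# Synthetic input: a horizontal sinusoidal grating shifted right by one pixel.
x = np.arange(16, dtype=float)
row = np.sin(2 * np.pi * x / 16)
I1 = np.tile(row, (16, 1))
I2 = np.roll(I1, 1, axis=1)   # exact 1-pixel shift (pattern is periodic)
u, v = patch_flow(I1, I2)     # u near 1.0, v near 0.0
```

A dense flow field is obtained by solving the same tiny least-squares problem in a sliding window over the image, which is exactly the kind of regular, local computation that maps well onto FPGA and ASIC fabrics.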

The long-term goal is to develop and test neuromorphic hardware that adapts its behavior to different environments, using visual navigation of land and aerial vehicles as an example.



I am the leader of the Neuromorphics Lab, a highly collaborative lab with connections across both academia and industry.