About the Event
Over the years, we have shown that detailed predictive information about the arm's trajectory can be extracted from populations of single-unit recordings in motor cortex. By developing techniques to record these populations and process the signals in real time, we have demonstrated that these recordings serve as an effective control signal for intended movements in 3D space.

Having shown that closed-loop control of a cortical prosthesis can produce high-quality brain-controlled movements in virtual reality, we have been extending this work to robot control. By introducing an anthropomorphic robot arm into our closed-loop system, we showed that a monkey can readily control the robot's movement through direct brain control while watching the movement in virtual reality. The animal learned this rapidly and produced good movements in 3D space.

The next step was to have the animal view and move the arm directly, without the VR display. This was much more difficult to learn, as the animal seemed to have difficulty understanding that the robot was to act as a tool. Once trained, however, the animal was able to use the robot to reach for targets held by the investigator. We are now training monkeys and developing the hardware and software to demonstrate a prosthetic device that can reach out for food targets at different locations in space and retrieve them to be eaten.
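To give a feel for how trajectory information can be read out from a recorded population, here is a minimal toy sketch in the style of the classic cosine-tuning / population-vector approach. Everything here (unit count, tuning parameters, the decoder itself) is an illustrative assumption, not a description of the group's actual real-time pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: each recorded unit is assigned a random
# "preferred direction" (unit vector) in 3D space.
n_units = 64
pref_dirs = rng.normal(size=(n_units, 3))
pref_dirs /= np.linalg.norm(pref_dirs, axis=1, keepdims=True)

BASELINE = 10.0  # assumed baseline firing rate (spikes/s)
DEPTH = 8.0      # assumed modulation depth of the tuning curve


def firing_rates(movement_dir):
    """Cosine-tuning model: each unit's rate rises with the cosine of the
    angle between the movement direction and its preferred direction."""
    return BASELINE + DEPTH * (pref_dirs @ movement_dir)


def decode_direction(rates):
    """Population vector: sum preferred directions weighted by each unit's
    deviation from baseline, then normalize to get a movement direction."""
    pv = (rates - BASELINE) @ pref_dirs
    return pv / np.linalg.norm(pv)


# Simulate one intended movement and decode it back from the rates.
true_dir = np.array([1.0, 0.0, 0.0])
est_dir = decode_direction(firing_rates(true_dir))
```

With enough units whose preferred directions cover the sphere roughly uniformly, the decoded vector closely tracks the intended direction; a real closed-loop system would compute this (or a more sophisticated estimate) on binned spike counts many times per second to drive the cursor or robot.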