Odest Chadwicke Jenkins is an Associate Professor at CSE, where he leads the Laboratory for Progress (Perception, Robotics, and Grounded Reasoning Systems). His research aims to discover methods for computational reasoning and perception that will enable robots to effectively assist people in common human environments.
Prof. Jenkins' active projects include perceptual reasoning for goal-directed robotic manipulation, interactive systems for assisted robot teleoperation, and independent living technologies for aging populations.
His past projects include robotic person following and gesture recognition, robot learning from demonstration, physics-based tracking of human motion from video, protocols and libraries for web/cloud robotics, markerless model and motion capture, inertial motion capture, balance control of simulated humanoids, and humanoid imitation learning.
In 2013, Prof. Jenkins presented at the National Geographic Explorers Symposium about his work in robotics, where he stated that the impact of robotics is in an early stage and predicted that its growth and pervasiveness would mirror the impact of the computer. He envisioned a World Wide Web of Robotics in which users would be able to leverage web-based apps that would allow them to control robots through customized interfaces to accomplish remote or otherwise challenging tasks.
In one example, he spotlighted his work with Henry Evans, an individual who became quadriplegic in 2002. Prof. Jenkins' team developed an interface tailored to Mr. Evans' physical constraints, enabling him to use a custom web interface to pilot a quadrotor drone and thereby navigate and view spaces beyond his reach.
In a paper presented at the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prof. Jenkins and his collaborators addressed a long-standing challenge in robotics: executing manipulation tasks involving sequential pick-and-place actions in human environments. Central to this problem is the difficulty robots have perceiving cluttered scenes, where objects that are physically touching, stacked, or occluded cannot be distinguished individually, impairing the robot's ability to execute the desired pick-and-place actions. In the paper, the researchers introduce the Axiomatic Particle Filter, a method for simultaneously perceiving objects in clutter and performing the sequential reasoning needed for manipulation.

Prof. Jenkins has a strong belief in the importance of shared knowledge to facilitate progress. He has stated, "Our job is to build things, to build knowledge, to create ideas – even crazy ideas – and put them out there so that others can develop the next generation technology, the next generation set of ideas."
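To give a flavor of the underlying technique: the Axiomatic Particle Filter builds on the generic particle-filtering loop of predict, weight, and resample. The sketch below is a minimal 1-D illustration of that generic loop only, with made-up noise parameters; it is not the paper's method, which reasons over full scene states and manipulation actions rather than a scalar position.

```python
import math
import random

def particle_filter(observations, n_particles=500, motion_noise=0.1, obs_noise=0.5):
    """Minimal bootstrap particle filter estimating a 1-D object position.

    Illustrative sketch only: real manipulation-perception filters (like the
    Axiomatic Particle Filter) maintain distributions over entire scene
    configurations, not a single scalar.
    """
    # Initialize particles uniformly over a plausible range.
    particles = [random.uniform(-5.0, 5.0) for _ in range(n_particles)]
    for z in observations:
        # Predict: diffuse each particle with motion noise.
        particles = [p + random.gauss(0.0, motion_noise) for p in particles]
        # Weight: Gaussian likelihood of the observation under each particle.
        weights = [math.exp(-0.5 * ((z - p) / obs_noise) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Resample: draw a new particle set proportional to the weights.
        particles = random.choices(particles, weights=weights, k=n_particles)
    # Estimate: mean of the posterior particle set.
    return sum(particles) / n_particles

if __name__ == "__main__":
    random.seed(0)
    # Noisy observations of an object sitting near position 2.0.
    obs = [2.0 + random.gauss(0.0, 0.3) for _ in range(30)]
    print(particle_filter(obs))
```

The estimate converges toward the true position as observations accumulate; the paper's contribution is extending this style of inference to cluttered scenes where the state space is combinatorial.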
Prof. Jenkins' interest in robots and computer science has roots in an early love of video games. When he was a child, his parents bought him an Atari 2600 game system; he loved it and soon began dreaming about how he might create his own games. Learning to program video games and to think about the creation of virtual worlds led to an interest in coding for the real world – and robots, as systems in the real world.
Prof. Jenkins' research has been published in numerous scholarly and professional venues, including the Journal of Human-Robot Interaction and the IEEE Conference on Computer Vision and Pattern Recognition. In 2010, he co-authored the book Creating Games: Mechanics, Content, and Technology. He has served on the editorial boards of the International Journal of Robotics Research and the International Journal of Humanoid Robotics.
Prof. Jenkins received his PhD in Computer Science from the University of Southern California in 2003. He served on the Computer Science faculty at Brown University from 2004 to 2015 before joining the faculty at Michigan in 2015.
Prof. Jenkins was named a Sloan Research Fellow in 2009 and one of Popular Science's "Brilliant 10" in 2011. He is a recipient of the Presidential Early Career Award for Scientists and Engineers (PECASE) for his work in physics-based human tracking from video. His work has also been supported by Young Investigator awards from the Office of Naval Research (ONR) for his research in learning dynamical primitives from human motion, the Air Force Office of Scientific Research (AFOSR) for his work in manifold learning and multi-robot coordination, and the National Science Foundation (NSF) for robot learning from multivalued human demonstrations.
Posted: February 3, 2017