Prof. Chad Jenkins has been awarded an NSF National Robotics Initiative grant of $400,000 for his project, “Sketching Geometry and Physics Informed Inference for Mobile Robot Manipulation in Cluttered Scenes.”
Under the grant, Prof. Jenkins will improve the ability of robots to manipulate and interact with objects, such as when assisting people with their daily activities. The key idea is that people can provide robots with important information about their environment and the objects within it. Specifically, the project will develop a natural user interface that enables people to provide such information by drawing and sketching on top of the robot's view of the world. Physical simulation will then be used to fill in the gaps needed for a robot to complete autonomous manipulation tasks.
This project aims to improve goal-directed dexterous robotic manipulation in cluttered and unstructured environments through sketching and physical simulation. Robots operating in human environments face considerable uncertainty in perception due to physical contact and occlusions between objects. This project will address such perceptual uncertainty by combining methods for probabilistic inference with natural sketch-based interfaces to extract, label, and automatically infer the geometry, pose, and behavior of objects in complicated scenes.
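As a rough illustration of how a sketch-based prior and a physical-plausibility check might be combined in such an inference, the toy sketch below weights sampled candidate object poses by whether they fall inside a user-drawn region and rest on a support surface. All names, numbers, and the inference scheme itself are invented for this example; the actual project methods are not specified here.

```python
import random

# Hypothetical illustration only: combine a user's sketched 2D region
# with a simple physical-support check to weight candidate object poses.
# The constants and weighting scheme below are invented for the sketch.

TABLE_HEIGHT = 0.75                    # assumed support-surface height (m)
SKETCH_REGION = (0.2, 0.6, 0.1, 0.5)   # user-drawn box: x_min, x_max, y_min, y_max

def in_sketch(pose):
    """True if the pose's (x, y) position falls inside the sketched region."""
    x, y, _ = pose
    x0, x1, y0, y1 = SKETCH_REGION
    return x0 <= x <= x1 and y0 <= y <= y1

def physically_plausible(pose, tol=0.02):
    """True if the object's base rests on the table (neither floats nor sinks)."""
    return abs(pose[2] - TABLE_HEIGHT) <= tol

def weight(pose):
    """Combine the sketch prior and the physics check multiplicatively."""
    sketch_prior = 1.0 if in_sketch(pose) else 0.1
    physics_term = 1.0 if physically_plausible(pose) else 0.0
    return sketch_prior * physics_term

def infer_pose(n_samples=5000, seed=0):
    """Pick the highest-weight pose among uniformly sampled candidates."""
    rng = random.Random(seed)
    best, best_w = None, -1.0
    for _ in range(n_samples):
        pose = (rng.uniform(0, 1), rng.uniform(0, 1), rng.uniform(0.5, 1.0))
        w = weight(pose)
        if w > best_w:
            best, best_w = pose, w
    return best

pose = infer_pose()
```

With enough samples, the highest-weight candidate lies inside the sketched region and on the table, showing how human annotation and physical constraints can jointly narrow down an otherwise ambiguous pose estimate.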
More information about the project is available in the NSF award posting.
Chad is an experimental roboticist focusing on human-robot interaction. His research explores methods that enable robots to learn human skills, and in recent years his work has focused on robot systems for assisting disabled people. His work has been supported through prestigious awards, including an NSF PECASE award, an ONR Young Investigator Award, and a Sloan Research Fellowship.