The Robot Perception and Learning (RPL) Lab studies limbed robots (e.g., legged robots) designed to function in challenging environments.
Our research focuses on perception and learning for robotics, in particular on new estimation and planning algorithms for mobile and articulated robots that locomote and manipulate in uncertain natural environments. We develop new theoretical results on sensing, real-time map building, and self- and environment-modeling of surface contact areas, including statistical models of uncertainty, in support of mobile robot navigation, articulated manipulation, and legged locomotion.