The Robot Perception and Learning (RPL) Lab develops robots with limbs (e.g., legged robots) that can function in challenging environments.
Our research focuses on perception and learning for robotics, in particular on new estimation and planning algorithms for mobile and articulated robots that locomote and manipulate in uncertain natural environments. We develop new theoretical results on sensing, real-time map building, and self/environment modeling of surface contact areas, including statistical models of uncertainty, in support of mobile robot navigation, articulated manipulation, and legged locomotion.
- "ASFM - Augmented Social Force Model for Legged Robot Social Navigation" [Webpage, Repo] (a social-force sketch appears at the end of this section)
- Learning-based Grasping of Grocery Objects with a Novel Gripper (Papers: RL grasping method, RL design optimisation, gripper design)
| Training repository | Deployment repositories |
| --- | --- |
| Train grasping agents in simulation using RL | Deploy with ROS; gripper Arduino code |
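
The grasping project follows the usual simulate-then-deploy split shown in the table. As a rough illustration only: the environment, action space, and learning algorithm below are stand-ins (the actual repositories presumably use a full physics simulator and an RL algorithm such as PPO or SAC), and `ToyGraspEnv` and `random_policy` are hypothetical names.

```python
import numpy as np

# Illustrative stand-in for the train-in-simulation step; the real
# repositories presumably use a physics simulator and a learned policy.
# `ToyGraspEnv` and `random_policy` are hypothetical.
class ToyGraspEnv:
    """Toy simulator: reward 1.0 when the gripper reaches the object."""
    def reset(self):
        self.gripper = np.zeros(3)
        self.target = np.random.uniform(-1.0, 1.0, size=3)
        self.t = 0
        return np.concatenate([self.gripper, self.target])

    def step(self, action):
        # Action: a small Cartesian displacement of the gripper.
        self.gripper += 0.1 * np.clip(action, -1.0, 1.0)
        self.t += 1
        success = np.linalg.norm(self.gripper - self.target) < 0.05
        obs = np.concatenate([self.gripper, self.target])
        return obs, float(success), success or self.t >= 100

def random_policy(obs):
    # Placeholder for the learned grasping policy.
    return np.random.uniform(-1.0, 1.0, size=3)

env = ToyGraspEnv()
for ep in range(5):
    obs, done, ret = env.reset(), False, 0.0
    while not done:
        obs, reward, done = env.step(random_policy(obs))
        ret += reward
    print(f"episode {ep}: return {ret:.1f}")
```

The structural point of the table's split is that the same step interface the agent is trained against in simulation is what the ROS and Arduino deployment code has to reproduce on the physical gripper.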
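
For the ASFM entry above: as its title indicates, ASFM builds on the social force model, in which an agent's motion is driven by an attractive force toward its goal plus repulsive forces from nearby pedestrians. The sketch below shows only a classic Helbing-style base step; whatever augmented terms ASFM adds for legged robots are not reproduced here, and the function name and parameter values (`A`, `B`, `relax_time`) are assumptions.

```python
import numpy as np

# Classic social force step (Helbing-style); ASFM's augmentations are
# not modeled. Parameter values below are illustrative assumptions.
def social_force_step(pos, vel, goal, others, dt=0.05,
                      desired_speed=1.0, relax_time=0.5, A=2.0, B=0.3):
    """One Euler step for a single agent among pedestrians at `others`."""
    # Attraction: relax toward the desired velocity aimed at the goal.
    to_goal = goal - pos
    desired_vel = desired_speed * to_goal / (np.linalg.norm(to_goal) + 1e-9)
    f_goal = (desired_vel - vel) / relax_time

    # Repulsion: exponential decay with distance to each other agent.
    f_social = np.zeros(2)
    for other in others:
        diff = pos - other
        dist = np.linalg.norm(diff) + 1e-9
        f_social += A * np.exp(-dist / B) * diff / dist

    vel = vel + dt * (f_goal + f_social)
    return pos + dt * vel, vel

# Example: one agent heading to (5, 0) past a pedestrian at (1, 0.2).
pos, vel = social_force_step(np.zeros(2), np.zeros(2),
                             np.array([5.0, 0.0]),
                             np.array([[1.0, 0.2]]))
```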