This package presents a complete software pipeline for picking thin, rigid objects. The perception-to-manipulation system is fully autonomous: a visual perception module handles object detection and localization, and a force/torque sensing module handles environment perception. At the core of the system is the manipulation module, an implementation of the Tilt-and-Pivot technique: a novel robotic object-handling method for picking thin objects lying on a flat surface through dexterous, in-hand manipulation. Picking thin objects is an important manipulation capability with a wide range of practical applications, such as bin picking and product packaging.
This ROS package is directly applicable to an ordinary robotic setting featuring a conventional two- or three-fingered gripper installed on a UR10 robot arm. The system has been demonstrated picking a range of objects (acrylic boards, plastic container lids, and paper cartons) and can be applied to bin picking problems, in which individual objects are picked one by one out of clutter.
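At a high level, the pipeline loops over perception, localization, and Tilt-and-Pivot execution. The sketch below illustrates this structure only; all function names are hypothetical stubs, not identifiers from this package (the real entry points are the notebooks under `scripts/`):

```python
# Hypothetical perception-to-manipulation loop (structure only).

def detect_objects(image):
    """Stub: instance detection and segmentation (e.g. via Mask R-CNN)."""
    return []

def localize(detection):
    """Stub: recover the target object's pose on the support surface."""
    return None

def tilt_and_pivot(robot, ft_sensor, target_pose):
    """Stub: tilt the object off the surface, then pivot about the
    contact point to reach a secure grasp, guided by F/T feedback."""

def pick_all(camera, robot, ft_sensor):
    """Pick thin objects one by one until none are detected."""
    while True:
        detections = detect_objects(camera.capture())
        if not detections:
            break
        target = localize(detections[0])
        tilt_and_pivot(robot, ft_sensor, target)
```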
Authors: Zhekai Tong (ztong@connect.ust.hk), Tierui He (theae@connect.ust.hk), Chung Hee Kim (chkimaa@connect.ust.hk), Yu Hin Ng (yhngad@connect.ust.hk), Qianyi Xu (qxuaj@connect.ust.hk), and Jungwon Seo
- Universal Robot UR10
- Robotiq 140mm Adaptive parallel-jaw gripper
- Robotiq FT300 Force/Torque Sensor
- Intel RealSense SR300 Camera
Our software is developed on Ubuntu 16.04 and requires the following:
- ROS Kinetic
- Driver for the UR10 robot arm from Universal Robots
- Universal Robot package for ROS Kinetic
- MoveIt!
- Robotiq ROS package
- Mask R-CNN
- AprilTag ROS package
`.ipynb` files can be run in Jupyter Notebook.
In your catkin workspace:

```bash
cd ~/catkin_ws/src
git clone https://github.com/HKUST-RML/pickpack.git
cd ..
catkin_make
source devel/setup.bash   # make the newly built package visible to ROS
```
- Picking acrylic board script:

  Note: In a multiple-board bin picking scenario, run
  `~/catkin_ws/src/pickpack/Mask_RCNN/scripts/board_detection.ipynb`
  first for instance detection and segmentation (see the detection sketch after this item). Refer to the `Mask_RCNN/samples` folder to train the vision perception module with your own dataset. For a single-object picking scenario, run the following:

  ```bash
  cd ~/catkin_ws/src/pickpack/scripts
  jupyter notebook
  ```

  Open `picking_acrylic_board.ipynb`.
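For reference, the instance detection step can be sketched as below. This assumes the bundled `Mask_RCNN` folder is the Matterport implementation and that trained weights are available; the weights filename `mask_rcnn_board.h5`, the config values, and the image path are illustrative assumptions, not files shipped with this package:

```python
# Minimal Mask R-CNN inference sketch (Matterport-style API assumed).
import skimage.io
import mrcnn.model as modellib
from mrcnn.config import Config

WEIGHTS_PATH = "mask_rcnn_board.h5"   # hypothetical trained weights file

class BoardInferenceConfig(Config):
    NAME = "board"
    NUM_CLASSES = 1 + 1               # background + board (assumed)
    GPU_COUNT = 1
    IMAGES_PER_GPU = 1

config = BoardInferenceConfig()
model = modellib.MaskRCNN(mode="inference", config=config, model_dir="logs")
model.load_weights(WEIGHTS_PATH, by_name=True)

image = skimage.io.imread("scene.png")        # RGB image of the bin
results = model.detect([image], verbose=0)[0]
# results["rois"], results["masks"], results["class_ids"], results["scores"]
# give per-instance boxes, segmentation masks, labels, and confidences.
```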
- Picking carton board script:

  ```bash
  cd ~/catkin_ws/src/pickpack/scripts
  jupyter notebook
  ```

  Open `picking_carton.ipynb`.

  Note: In a multiple-carton bin picking scenario, run
  `~/catkin_ws/src/pickpack/Mask_RCNN/scripts/carton_detection.ipynb`
  first for instance detection and segmentation.
- Picking book script:

  ```bash
  cd ~/catkin_ws/src/pickpack/scripts
  jupyter notebook
  ```

  Open `picking_book.ipynb`.
- Opening container lid script:

  ```bash
  cd ~/catkin_ws/src/pickpack/scripts
  jupyter notebook
  ```

  Open `opening_boxlid.ipynb`.
- Picking acrylic board script with 3-finger gripper:

  ```bash
  cd ~/catkin_ws/src/pickpack/scripts
  jupyter notebook
  ```

  Open `picking_acrylic_board_with_3finger_gripper.ipynb`.
- Start the UR10 robot-gripper scene:

  ```bash
  roslaunch tilt_pivot_collision_check demo.launch
  ```

- Run the collision check script:

  ```bash
  rosrun tilt_pivot_collision_check tilt_pivot_collision_check.py
  ```
If no collision is involved, the collision checker visualizes the path without any warning.
If a collision occurs, for example between a joint of the robot and the ground surface as shown in the figure below, the colliding area is highlighted in red.
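Under the hood, this kind of check can be made against the MoveIt planning scene. The sketch below queries MoveIt's standard `/check_state_validity` service for a single robot state; it is a generic illustration of the mechanism, not the contents of `tilt_pivot_collision_check.py`, and it assumes the default `manipulator` group name from the UR10 MoveIt configuration:

```python
#!/usr/bin/env python
# Generic MoveIt state-validity query (illustrative; not this package's script).
# Requires the MoveIt scene from demo.launch above to be running.
import rospy
from moveit_msgs.srv import GetStateValidity, GetStateValidityRequest
from moveit_msgs.msg import RobotState
from sensor_msgs.msg import JointState

rospy.init_node("state_validity_demo")
rospy.wait_for_service("/check_state_validity")
check = rospy.ServiceProxy("/check_state_validity", GetStateValidity)

# Build a RobotState for the UR10 joint configuration to be tested.
js = JointState()
js.name = ["shoulder_pan_joint", "shoulder_lift_joint", "elbow_joint",
           "wrist_1_joint", "wrist_2_joint", "wrist_3_joint"]
js.position = [0.0, -1.57, 1.57, -1.57, -1.57, 0.0]  # example pose (radians)

req = GetStateValidityRequest()
req.robot_state = RobotState(joint_state=js)
req.group_name = "manipulator"   # assumed default UR10 MoveIt group name

res = check(req)
if res.valid:
    rospy.loginfo("State is collision-free.")
else:
    rospy.logwarn("State in collision: %s", res.contacts)
```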
The parameters of the Tilt-and-Pivot process can be specified in every `.ipynb` file in the `scripts` folder. The parameters are as follows:
- Robot parameter:
  - `global_speed`: robot tool center point (TCP) speed
- Object dimension:
  - `object_length`: object length in meters
- Tilt-and-Pivot configuration:
  - `psi`: angle between the ground and the end-effector before the tilt phase
  - `phi`: angle between the object and the ground in the tilt phase
  - `alpha`, `beta`, `gamma`: rotation angles about the contact point in the pivot phase
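As a concrete example, a notebook cell setting these parameters might look like the sketch below. The values are illustrative only, and whether the angles are expected in degrees or radians should be confirmed in the notebooks themselves:

```python
# Illustrative Tilt-and-Pivot parameter cell (example values, not defaults).
global_speed  = 0.1    # robot TCP speed
object_length = 0.30   # object length in meters

# Tilt-and-Pivot configuration (degrees assumed here; confirm units in the notebooks)
psi = 60.0             # angle between ground and end-effector before the tilt phase
phi = 20.0             # angle between object and ground in the tilt phase
alpha, beta, gamma = 0.0, 30.0, 0.0  # rotation angles about the contact point (pivot)
```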
For any technical issues, please contact: Zhekai Tong (ztong@connect.ust.hk), Tierui He (theae@connect.ust.hk), Qianyi Xu (qxuaj@connect.ust.hk) and Yu Hin Ng (yhngad@connect.ust.hk).