
Automated Pick and Placing of Colored cubes for UFactory Lite 6 using YOLO Object Recognition


mNandhu/BlockPicking


BlockPicking

Mapping camera coordinates to world coordinates and picking up a block

This project implements control algorithms for the UFactory Lite 6 robot. Picking up a block requires:

  1. Identifying a block using a YOLO model
  2. Finding the center of the bounding box (Px, Py)
  3. Converting the pixel coordinates (Px, Py) to robot coordinates (Rx, Ry)
  4. Instructing the arm to move to that position, pick up the block, and place it at a fixed position
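Step 2 above reduces to averaging the opposite corners of the detection box. A minimal sketch (the box coordinates are made-up illustrative values, not from the repo):

```python
def box_center(x1, y1, x2, y2):
    """Center (Px, Py) of an axis-aligned bounding box given opposite corners."""
    return ((x1 + x2) / 2, (y1 + y2) / 2)

# Illustrative detection box: top-left (100, 150), bottom-right (180, 230)
px, py = box_center(100, 150, 180, 230)  # → (140.0, 190.0)
```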

Methodology

The YOLO model was trained on a public dataset available on Roboflow. Find it here

Conversion of pixel coordinates to robot coordinates is done using regression.

Using Linear Regression - PseudoInverse

A set of points is collected by using the robot to place the block at random positions within the camera's field of view. For each placement, the pixel coordinates of the block and the corresponding robot coordinates are recorded. From these pairs, we find the transformation matrix from pixel to robot coordinates.

Plot showing Px vs Rx and Py vs Ry

Assuming no translation between the coordinate frames (only rotation/scaling):

$$ \begin{bmatrix} R_x \\ R_y \end{bmatrix} = \begin{bmatrix} m_1 & c_1 \\ m_2 & c_2 \end{bmatrix} \begin{bmatrix} P_x \\ P_y \end{bmatrix} $$

We find the transformation by $$ w = R \, P^{\dagger} $$

where ${\dagger}$ denotes the Moore–Penrose pseudoinverse.
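The pseudoinverse fit can be sketched in NumPy (not named in the repo, but the natural choice). The calibration samples below are made-up values generated by the mapping Rx = 0.5·Px, Ry = −0.5·Py, so the fit should recover that matrix exactly:

```python
import numpy as np

# Calibration samples, one point per column (illustrative values).
P = np.array([[100., 200., 300., 150.],    # pixel x-coordinates
              [ 50., 120., 240., 300.]])   # pixel y-coordinates
R = np.array([[ 50., 100., 150.,  75.],    # robot x-coordinates
              [-25., -60., -120., -150.]]) # robot y-coordinates

# w = R P^+  (least-squares fit via the Moore–Penrose pseudoinverse)
w = R @ np.linalg.pinv(P)

# Map a new pixel coordinate to robot coordinates
rx, ry = w @ np.array([400., 80.])  # → (200.0, -40.0)
```

With at least two non-collinear calibration points, `np.linalg.pinv` gives the least-squares-optimal 2×2 transform; more points average out measurement noise.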

Using Multivariate Regression to account for translation

$$ \begin{bmatrix} R_{1_x} & R_{2_x} & \cdots & R_{n_x} \\ R_{1_y} & R_{2_y} & \cdots & R_{n_y} \\ 1 & 1 & \cdots & 1 \end{bmatrix} = \begin{bmatrix} m_{1_x} & m_{1_y} & c_1 \\ m_{2_x} & m_{2_y} & c_2 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} P_{1_x} & P_{2_x} & \cdots & P_{n_x} \\ P_{1_y} & P_{2_y} & \cdots & P_{n_y} \\ 1 & 1 & \cdots & 1 \end{bmatrix} $$

$$ w = R \, P^{\dagger} $$

where $P$ is the matrix of pixel coordinates in 2D homogeneous form and $R$ the corresponding robot coordinates; the fitted $w$ is a 2D homogeneous (affine) transformation matrix.
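The homogeneous fit follows the same pattern as the 2×2 case, with a row of ones appended so the offsets c1, c2 (translation) are recovered too. Again the sample data is made up, generated from Rx = 0.5·Px + 10, Ry = −0.5·Py + 200:

```python
import numpy as np

# Pixel samples in homogeneous form, one point per column (illustrative values).
P = np.array([[100., 200., 300., 150.],   # pixel x
              [ 50., 120., 240., 300.],   # pixel y
              [  1.,   1.,   1.,   1.]])  # homogeneous row

# Robot samples generated by a scale + translation: Rx = 0.5*Px + 10, Ry = -0.5*Py + 200
R = np.vstack([0.5 * P[0] + 10,
               -0.5 * P[1] + 200,
               np.ones(4)])

# 3x3 affine transform: w = R P^+
w = R @ np.linalg.pinv(P)

# Map a new pixel coordinate (in homogeneous form) to robot coordinates
rx, ry, _ = w @ np.array([400., 80., 1.])  # → (210.0, 160.0)
```

The bottom row of the fitted matrix comes out as [0, 0, 1], matching the fixed row in the equation above.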

Installation

```shell
pip install -r requirements.txt
```

Usage

  1. Run test_movement.ipynb to test if the robot is connected and can move.
  2. Run coord_calibration.ipynb to calibrate the camera and robot coordinate system.
  3. Run live_detection.ipynb to start the picking up process.
