Stretch AI


This repository is currently under active development and is subject to change.

It is a pre-release codebase designed to enable developers to build intelligent behaviors on mobile robots in real homes. It contains code for:

  • grasping
  • manipulation
  • mapping
  • navigation
  • LLM agents
  • text to speech and speech to text
  • visualization and debugging

This code is licensed under the Apache 2.0 license. See the LICENSE file for more information. Parts of it are derived from the Meta HomeRobot project and are licensed under the MIT license.

Quickstart

After following the installation instructions, start the server on your robot:

ros2 launch stretch_ros2_bridge server.launch.py

Make sure the core test app runs:

python -m stretch.app.view_images --robot_ip $ROBOT_IP

You should see windows pop up with camera views from the robot, and the arm should move into a default position. The head should also turn to face the robot's hand. If all of this happens, you are good to go! Press q to quit the app.

Then, on your PC, you can easily send commands and stream data:

from stretch.agent import RobotClient
robot = RobotClient(robot_ip="192.168.1.15")  # Replace with your robot's IP
# On future connection attempts, the IP address can be left blank
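# e.g. robot = RobotClient()  # on later runs; reuses the saved IP (see ~/.stretch/robot_ip.txt)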

# Turn head towards robot's hand
robot.move_to_manip_posture()

# Move forward 0.1 m along the robot x axis in manipulation mode, and move the arm to 0.5 m height
robot.arm_to([0.1, 0.5, 0, 0, 0, 0])

# Turn head towards robot's base and switch base to navigation mode
# In navigation mode, we can stream velocity commands to the base for smooth motions, and base
# rotations are enabled
robot.move_to_nav_posture()

# Move the robot back to origin
# navigate_to() is only allowed in navigation mode
robot.navigate_to([0, 0, 0])

# Move the robot 0.5m forward
robot.navigate_to([0.5, 0, 0], relative=True)

# Rotate the robot 90 degrees to the left
robot.navigate_to([0, 0, 3.14159/2], relative=True)

# And to the right
robot.navigate_to([0, 0, -3.14159/2], relative=True)
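
Putting these calls together, here is a minimal end-to-end sketch of a scripted routine. It only uses the methods shown above; adjust the IP address for your robot:

from stretch.agent import RobotClient

robot = RobotClient(robot_ip="192.168.1.15")  # or RobotClient() after the first connection

# Drive a short out-and-back loop in navigation mode
robot.move_to_nav_posture()
robot.navigate_to([0.5, 0, 0], relative=True)      # forward 0.5 m
robot.navigate_to([0, 0, 3.14159], relative=True)  # turn around
robot.navigate_to([0, 0, 0])                       # return to the origin

# Switch to manipulation mode and raise the arm to 0.5 m
robot.move_to_manip_posture()
robot.arm_to([0.0, 0.5, 0, 0, 0, 0])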

Apps

After installation, on the robot, run the server:

# If you did a manual install
ros2 launch stretch_ros2_bridge server.launch.py

# Alternatively, via Docker -- this will be slow the first time, while the image is downloaded!
./scripts/run_stretch_ai_ros2_bridge_server.sh

Then, try the view_images app to make sure connections are working properly. Next, you can run the AI demos described below, such as mapping and pickup. There are also some apps for debugging.

Installation

Stretch AI supports Python 3.10. We recommend using mamba to manage dependencies, or starting with Docker.

If you do not start with Docker, follow the install guide.

In short, on the PC you will:

# Install Git LFS
sudo apt-get install git-lfs
git lfs install

# Clone the repository
git clone git@github.com:hello-robot/stretch_ai.git --recursive

# Install system dependencies
sudo apt-get install libasound-dev portaudio19-dev libportaudio2 libportaudiocpp0 espeak ffmpeg

# Run install script to create a conda environment and install dependencies
./install.sh
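
The install script creates a conda/mamba environment; activate it before running any of the apps. The environment name below is an assumption -- check the script's output for the exact name it prints:

mamba activate stretch_ai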

Stretch AI Apps

Stretch AI is a collection of tools and applications for the Stretch robot. These tools are designed to be run on the robot itself, or on a remote computer connected to the robot. The tools are designed to be run from the command line, and are organized as Python modules. You can run them with python -m stretch.app.<app_name>.

Some, like print_joint_states, are simple tools that print out information about the robot. Others, like mapping, are more complex and involve the robot moving around and interacting with its environment.

All of these take the --robot_ip flag to specify the robot's IP address. You should only need to do this the first time you run an app for a particular IP address; the app will save the IP address in a configuration file at ~/.stretch/robot_ip.txt. For example:

export ROBOT_IP=192.168.1.15
python -m stretch.app.print_joint_states --robot_ip $ROBOT_IP
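
Since the IP address is saved after the first run, subsequent invocations can omit the flag:

python -m stretch.app.print_joint_states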

Visualization and Streaming Video

Visualize output from the cameras and other sensors on the robot. This will open multiple windows with the wrist camera feed and both low- and high-resolution head camera feeds.

python -m stretch.app.view_images --robot_ip $ROBOT_IP

You can also visualize it with semantic segmentation (this defaults to Detic):

python -m stretch.app.view_images --robot_ip $ROBOT_IP --run_semantic_segmentation

You can visualize the gripper ArUco markers as well; these markers can be used to determine the finger locations in the image.

python -m stretch.app.view_images --robot_ip $ROBOT_IP --aruco

Dex Teleop for Data Collection

Dex teleop is a low-cost system for providing user demonstrations of dexterous skills right on your Stretch. This app requires the use of the dex teleop kit.

You need to install mediapipe for hand tracking:

python -m pip install mediapipe

Then run the dex teleop leader:

python -m stretch.app.dex_teleop.ros2_leader -i $ROBOT_IP --teleop-mode base_x --save-images --record-success --task-name default_task

Read the data collection documentation for more details.

After this, read the learning from demonstration instructions to train a policy.

Automatic 3d Mapping

python -m stretch.app.mapping

You can show visualizations with:

python -m stretch.app.mapping --show-intermediate-maps --show-final-map

The flag --show-intermediate-maps shows the 3d map after each large motion (waypoint reached), and --show-final-map shows the final map after exploration is done.

It will record a PCD/PKL file which can be interpreted with the read_map script; see below.

Another useful flag when testing is the --reset flag, which will reset the robot to the starting position of (0, 0, 0). This is done blindly before any execution or mapping, so be careful!
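
For example, to reset before mapping (again, the motion back to (0, 0, 0) is executed blindly):

python -m stretch.app.mapping --reset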

Voxel Map Visualization

You can test the voxel code on a captured pickle file. We recommend trying with the included hq_small.pkl or hq_large.pkl files, which contain a short and a long captured trajectory from Hello Robot.

python -m stretch.app.read_map -i hq_small.pkl

Optional open3d visualization of the scene:

python -m stretch.app.read_map -i hq_small.pkl --show-svm

You can visualize instances in the voxel map with the --show-instances flag:

python -m stretch.app.read_map -i hq_small.pkl --show-instances

You can also re-run perception with the --run-segmentation flag and provide a new export file with the --export flag:

python -m stretch.app.read_map -i hq_small.pkl --export hq_small_v2.pkl --run-segmentation

You can test motion planning, frontier exploration, etc., as well. Use the --start flag to set the robot's starting position:

# Test motion planning
python -m stretch.app.read_map -i hq_small.pkl --test-planning --start 4.5,1.3,2.1
# Test planning to frontiers with current parameters file
python -m stretch.app.read_map -i hq_small.pkl --test-plan-to-frontier --start 4.0,1.4,0.0
# Test sampling movement to objects
python -m stretch.app.read_map -i hq_small.pkl --test-sampling --start 4.5,1.4,0.0
# Test removing an object from the map
python -m stretch.app.read_map -i hq_small.pkl --test-remove --show-instances --query "cardboard box"

Pickup Objects

This will have the robot move around the room, explore, and pick up toys in order to put them in a box.

python -m stretch.app.pickup --target_object toy

You can add the --reset flag to make it go back to the start position. The default object is "toy", but you can specify other objects as well, like "bottle", "cup", or "shoe".

python -m stretch.app.pickup --reset
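
For example, to target a different object and use the reset flag together:

python -m stretch.app.pickup --target_object cup --reset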

Development

Clone this repo on your Stretch and PC, and install it locally using pip with the "editable" flag:

cd stretch_ai/src
pip install -e .[dev]
pre-commit install

Then follow the quickstart section. See CONTRIBUTING.md for more information.

Updating Code on the Robot

See the update guide for more information. There is an update script which should handle some aspects of this. Code installed from git must be updated manually, including code from this repository.

Docker

Docker build and other instructions are located in the docker guide. Generally speaking, from the root of the project, you can run the docker build process with:

docker build -t stretch-ai_cuda-11.8:latest .

See the docker guide for more information and troubleshooting advice.

Acknowledgements

Parts of this codebase were derived from the Meta HomeRobot project and are licensed under the MIT license. We thank the Meta team for their contributions.

The stretch_ros2_bridge package is based on the OK robot project's Robot Controller, and is licensed under the Apache 2.0 license.

We use LeRobot from HuggingFace for imitation learning, though we use our own fork.

License

This code is licensed under the Apache 2.0 license. See the LICENSE file for more information.
