Accompanying code for training VisuoSkin policies as described in the paper:
Learning Precise, Contact-Rich Manipulation through Uncalibrated Tactile Skins
ViSk is a framework for learning visuotactile policies for fine-grained, contact-rich manipulation tasks. It uses a transformer-based architecture in conjunction with AnySkin, and significantly outperforms both vision-only policies and visuotactile policies that use high-dimensional tactile sensors like DIGIT.
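For intuition, here is a minimal sketch of what a transformer policy fusing camera and tactile tokens can look like. Everything in it (the module names, feature dimensions, the assumed 15-dimensional tactile reading, and the pooling into a single action) is an illustrative assumption, not the architecture used in this repository:

```python
# Illustrative sketch only: NOT the ViSk implementation. Module names, sizes,
# and the tokenization scheme are assumptions made for exposition.
import torch
import torch.nn as nn

class VisuoTactilePolicy(nn.Module):
    def __init__(self, tactile_dim=15, embed_dim=256, action_dim=7):
        super().__init__()
        # Project each modality into a shared token space.
        self.image_proj = nn.Linear(512, embed_dim)           # e.g. CNN patch features
        self.tactile_proj = nn.Linear(tactile_dim, embed_dim) # low-dim skin reading
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=8, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=4)
        self.action_head = nn.Linear(embed_dim, action_dim)

    def forward(self, image_feats, tactile):
        # image_feats: (B, num_patches, 512); tactile: (B, tactile_dim)
        tokens = torch.cat(
            [self.image_proj(image_feats), self.tactile_proj(tactile).unsqueeze(1)],
            dim=1,
        )
        fused = self.transformer(tokens)             # joint attention over both modalities
        return self.action_head(fused.mean(dim=1))   # pooled sequence -> action

policy = VisuoTactilePolicy()
action = policy(torch.randn(2, 49, 512), torch.randn(2, 15))  # shape: (2, 7)
```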
- Clone this repository
```
git clone https://github.com/raunaqbhirangi/visuoskin.git
```
- Create a conda environment and install dependencies
```
conda env create -f env.yml
pip install -r requirements.txt
```
- Move raw data to your desired location and set `DATA_DIR` in `utils.py` to point to this location. Similarly, set `root_dir` in `cfgs/local_config.yaml`, as in the example below.
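For concreteness, the two settings could look like the following; the paths are placeholders to replace with your own, and the assumption that `DATA_DIR` is a plain module-level string is mine, not something stated above:

```python
# utils.py -- placeholder path; replace with wherever you moved the raw data
DATA_DIR = "/path/to/your/data"
```

```yaml
# cfgs/local_config.yaml -- placeholder path, same location as DATA_DIR
root_dir: /path/to/your/data
```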
- Process data for the `current-task` (the name of the directory containing demonstration data for the current task) and convert it to pkl
```
python process_data.py -t current-task
python convert_to_pkl.py -t current-task
```
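A quick way to sanity-check the conversion is to open the resulting pickle and inspect its top-level structure. The output filename below is a guess; the actual path depends on `convert_to_pkl.py` and your `DATA_DIR` setting:

```python
import pickle

# Hypothetical output path -- check where convert_to_pkl.py actually writes.
with open("current-task.pkl", "rb") as f:
    data = pickle.load(f)

# Print the top-level structure without assuming a particular schema.
print(type(data))
if isinstance(data, dict):
    for key, value in data.items():
        print(key, type(value))
```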
- Install `xarm-env` using `pip install -e envs/xarm-env`
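To verify the install, try importing the package; the module name `xarm_env` is an assumption based on the package name, so adjust if the package exposes a different module:

```
python -c "import xarm_env"  # module name assumed from the package name
```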
- Run BC training
```
python train_bc.py 'suite.task.tasks=[current-task]'
```
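The bracketed value suggests a list-style config override, so training on multiple tasks at once would presumably look like the following; this multi-task usage is an extrapolation from the syntax above, not a documented feature:

```
# Assumed multi-task invocation, extrapolated from the list syntax above.
python train_bc.py 'suite.task.tasks=[task-one,task-two]'
```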