Zihao Yan · Fubao Su · Mingyang Wang · Hao Zhang · Ruizhen Hu · Hui Huang*
A fully automatic, active 3D reconstruction method.
We introduce a fully automatic, active 3D reconstruction method that integrates interaction perception from depth sensors, real robot-object interaction (e.g., opening drawers), and on-the-fly scanning and reconstruction to acquire complete geometry of both object exteriors and interiors.
# <1> clone project
git clone https://github.com/Salingo/Interaction-Driven-Reconstruction.git
cd Interaction-Driven-Reconstruction
# <2> [OPTIONAL] create conda environment
conda create -n IDR python=3.9
conda activate IDR
# <3> install pytorch according to instructions
# https://pytorch.org/get-started/
# <4> install requirements
pip install -r requirements.txt
# <5> install openpoints according to instructions
# https://github.com/guochengqian/openpoints
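After installation, it can be useful to confirm that the required packages are importable before moving on. A minimal sketch (the package names `torch` and `openpoints` are taken from the install steps above; the helper function itself is not part of the repo):

```python
# Check whether the packages installed above are importable,
# without actually loading them.
from importlib.util import find_spec

def check_packages(names):
    """Return a dict mapping each package name to its availability."""
    return {name: find_spec(name) is not None for name in names}

if __name__ == "__main__":
    for name, ok in check_packages(["torch", "openpoints"]).items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```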
Our dataset was obtained by processing PartNet-Mobility. The main processing script is based on the virtual-3d-scanner project. You can download the dataset from this link: Google Drive.
After unzipping, the file structure should look like this:
Interaction-Driven-Reconstruction
> data_raw
> motion
> pc_vscan_iter_front
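To catch unpacking mistakes early, you can verify that the layout above is in place. A small sketch using the directory names from this README (the helper is illustrative, not part of the repo):

```python
# Verify the unzipped dataset layout described above.
from pathlib import Path

EXPECTED = ["data_raw", "motion", "pc_vscan_iter_front"]

def missing_dirs(root, expected=EXPECTED):
    """Return the expected sub-directories that are absent under root."""
    root = Path(root)
    return [name for name in expected if not (root / name).is_dir()]

if __name__ == "__main__":
    absent = missing_dirs("Interaction-Driven-Reconstruction")
    print("all present" if not absent else f"missing: {absent}")
```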
Rather than downloading our data, it may be more convenient to generate it yourself.
# In the root directory.
python gen_data/action/gen_score_batch.py
After executing the above command, you should have a dataset in ./data/action.
Before starting training, you need to modify the training configuration in ./configs/experiment. Our code is based on lightning-hydra-template; if you are unsure how to modify the configuration, please refer to that repository.
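For orientation, experiment configs in lightning-hydra-template are YAML files that override project-wide defaults. A hypothetical sketch of such a file (the key names follow the template's conventions and are illustrative only; check the actual files under ./configs/experiment):

```yaml
# @package _global_
# Hypothetical sketch -- key names follow lightning-hydra-template
# conventions and may not match this repo's actual configs.
defaults:
  - override /trainer: default

tags: ["action_critic"]

trainer:
  max_epochs: 100

data:
  batch_size: 32
```

Individual values can also be overridden on the Hydra command line, e.g. `python src/train.py experiment=action_critic trainer.max_epochs=50`.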
# In the root directory
python src/train.py experiment=action_critic
After training, you can find the results in ./logs.
Training the action network relies on the critic network. Find the best checkpoint (.ckpt) file of the critic network in ./logs, and then set its path in the corresponding config in ./configs/experiment.
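Locating the checkpoint by hand can be tedious. Assuming the default lightning-hydra-template layout (`logs/**/checkpoints/*.ckpt` — an assumption, adjust the glob to your setup), a small helper can pick out the most recent one:

```python
# Find the most recently written checkpoint under ./logs.
# The logs/**/*.ckpt layout is an assumption based on
# lightning-hydra-template defaults.
from pathlib import Path

def latest_ckpt(log_root="logs"):
    """Return the most recently modified .ckpt under log_root, or None."""
    ckpts = sorted(Path(log_root).rglob("*.ckpt"),
                   key=lambda p: p.stat().st_mtime)
    return ckpts[-1] if ckpts else None

if __name__ == "__main__":
    print(latest_ckpt())
```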
# In the root directory
python src/train.py experiment=action_critic
In order to train the segmentation neural network more conveniently, we preprocessed the original data and obtained the dataset of the segmentation network.
# In the root directory.
python gen_data/seg/gen_seg_data_batch.py
After modifying the configuration file in ./configs/experiment, you can use one of the following commands to train:
# In the root directory
python src/train.py experiment=seg
# Or
python src/train.py experiment=seg_wom
# Or
python src/train.py experiment=seg_baseline
Generating the dataset for the completion network depends on the dataset of the segmentation network. After generating the segmentation dataset in 4.1, run the following commands:
# In the root directory
python gen_data/com/gen_com_data.py
# Normalize the data
python gen_data/com/norm_com_data.py
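A common normalization for completion networks is to center each point cloud and scale it into the unit sphere. A minimal sketch of that idea (the actual transform applied by gen_data/com/norm_com_data.py may differ):

```python
# Center a point cloud at the origin and scale it into the unit sphere.
# This mirrors a typical completion-network normalization; it is a
# sketch, not the repo's actual implementation.
import math

def normalize_unit_sphere(points):
    """points: list of (x, y, z) tuples -> centered copies with max radius 1."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    centered = [(x - cx, y - cy, z - cz) for x, y, z in points]
    radius = max(math.sqrt(x * x + y * y + z * z) for x, y, z in centered)
    scale = 1.0 / radius if radius > 0 else 1.0
    return [(x * scale, y * scale, z * scale) for x, y, z in centered]
```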
# In the root directory
python src/train.py experiment=com