
Interaction-Driven Active 3D Reconstruction with Object Interiors

Zihao Yan · Fubao Su · Mingyang Wang · Hao Zhang · Ruizhen Hu · Hui Huang*

SIGGRAPH Asia 2023

Paper · arXiv · Project Page

A fully automatic, active 3D reconstruction method.


📌 Introduction

We introduce a fully automatic, active 3D reconstruction method that integrates interaction perception from depth sensors, real robot-object interaction (e.g., opening drawers), and on-the-fly scanning and reconstruction to acquire complete geometry of both object exteriors and interiors.


🚀 Quickstart

1 Set up the Environment

# <1> clone project
git clone https://github.com/Salingo/Interaction-Driven-Reconstruction.git
cd Interaction-Driven-Reconstruction

# <2> [OPTIONAL] create conda environment
conda create -n IDR python=3.9
conda activate IDR

# <3> install pytorch according to instructions
# https://pytorch.org/get-started/
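# As a hedged example (check the PyTorch site for the command matching
# your CUDA version; this assumes the CUDA 11.8 wheels):
# pip install torch torchvision --index-url https://download.pytorch.org/whl/cu118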

# <4> install requirements
pip install -r requirements.txt

# <5> install openpoints according to instructions
# https://github.com/guochengqian/openpoints

2 Download dataset

Our dataset was obtained by processing PartNet-Mobility. The main processing script is based on the project virtual-3d-scanner. You can download the dataset from this link: Google Drive.
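If you prefer the command line, the archive can be fetched with gdown (a hedged sketch: gdown is not part of this repo's requirements, <FILE_ID> stands for the ID in the Google Drive share link, and the output name is arbitrary):

# install the downloader and fetch the archive
pip install gdown
gdown <FILE_ID> -O data_raw.zip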

After unzipping, the directory structure should look like this:

Interaction-Driven-Reconstruction
  > data_raw
      > motion
      > pc_vscan_iter_front

3 Action Network

3.1 Generate the dataset

Rather than downloading preprocessed data, it is more convenient to generate it from the raw data:

# In the root directory.
python gen_data/action/gen_score_batch.py

After executing the above script, you should have a dataset in ./data/action.
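A quick way to sanity-check the generation step is to list a few of the produced files:

ls ./data/action | head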

3.2 Train the Critic Network

Before starting training, you need to modify the training configuration in ./configs/experiment. Our code is based on lightning-hydra-template; if you are unsure how to modify the configuration, please refer to that repository.

# In the root directory
python src/train.py experiment=action_critic
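Hydra also lets you override individual options from the command line without editing the config files; a sketch assuming standard lightning-hydra-template keys (verify the exact names against the configs):

python src/train.py experiment=action_critic trainer.max_epochs=200 data.batch_size=16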

After training, you can find the results in ./logs.
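With the default lightning-hydra-template layout (an assumption; the timestamp and checkpoint names below are placeholders), checkpoints land under something like:

logs
  > train
      > runs
          > <timestamp>
              > checkpoints
                  > epoch_042.ckpt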

3.3 Train the Action Network

Training the action network builds on the critic network. Find the best checkpoint (ckpt) file of the critic network under ./logs, then set its path in the corresponding config in ./configs/experiment.

# In the root directory
python src/train.py experiment=action_critic
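The experiment name for this step and the config key that holds the critic checkpoint are repo-specific; as a purely hypothetical sketch of a command-line override (both experiment=action and model.critic_ckpt are assumed names, so check ./configs/experiment for the real ones):

python src/train.py experiment=action model.critic_ckpt=logs/train/runs/<run>/checkpoints/best.ckpt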

4 Segmentation Network

4.1 Generate the dataset

To make training the segmentation network more convenient, we preprocess the raw data into a dedicated segmentation dataset.

# In the root directory.
python gen_data/seg/gen_seg_data_batch.py

4.2 Train the Segmentation Network

After modifying the configuration file in ./configs/experiment, you can train with any of the following commands (a loop that runs all three variants is sketched after them):

# In the root directory
python src/train.py experiment=seg

# Or 
python src/train.py experiment=seg_wom

# Or 
python src/train.py experiment=seg_baseline
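To train all three variants back to back, a small shell loop (a convenience sketch, not part of the repo) works:

# train each segmentation variant in sequence
for exp in seg seg_wom seg_baseline; do
    python src/train.py experiment=$exp
done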

5 Completion Network

5.1 Generate the dataset

Generating the completion dataset builds on the segmentation dataset. After generating the segmentation dataset in 4.1, run the following:

# In the root directory
python gen_data/com/gen_com_data.py

# Normalize the data
python gen_data/com/norm_com_data.py
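For reference, a common normalization for completion point clouds is centering plus unit-sphere scaling; the sketch below illustrates that idea and is only an assumption about what norm_com_data.py does (check the script for the exact scheme):

import numpy as np

def normalize_unit_sphere(points: np.ndarray) -> np.ndarray:
    # center the cloud at the origin
    points = points - points.mean(axis=0)
    # scale so the farthest point lies on the unit sphere
    return points / np.linalg.norm(points, axis=1).max()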

5.2 Train the Completion Network

# In the root directory
python src/train.py experiment=com
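If the repo follows the lightning-hydra-template layout (an assumption), a trained model can then be evaluated with the template's src/eval.py by pointing it at a checkpoint:

# In the root directory
python src/eval.py ckpt_path=logs/train/runs/<run>/checkpoints/best.ckpt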
