
THUD++: Large-Scale Dynamic Indoor Scene Dataset and Benchmark for Mobile Robots

Zeshun Li1* · Fuhao Li1* · Wanting Zhang1*
Zijie Zheng · Xueping Liu1 · Tao Zhang2 · Yongjin Liu1 · Long Zeng1†

1THU   2PUDU Tech
*equal contribution †corresponding author

Paper PDF · Project Page

🧭 Overview

THUD++ comprises three primary components: an RGB-D dataset, a pedestrian trajectory dataset, and a robot navigation emulator. By sharing this dataset, we aim to accelerate the development and testing of mobile robot algorithms, contributing to real-world robotic applications.

📃 Usage

1. RGB-D Dataset

Please refer to THUD_Dataset_Overview

2. Pedestrian Trajectory Prediction

Preparation

git clone https://github.com/jackyzengl/THUD-plus-plus.git
conda create -n thud++ python=3.8
conda activate thud++
pip install -r requirements.txt

Dataset Structure

The dataset follows the same construction as ETH/UCY. Each row records one observation as frameID, pedID, x, y. The data file for each scene is named after the scene and its world-coordinate range, as shown in the tree below; a minimal loading sketch follows the tree.

Dataset
├── eth
│   ├── train
│   ├── val
│   ├── test_eth
│   ├── test_gym
│   ├── test_hotel
│   ├── test_office
│   ├── test_supermarket
├── gym
│   ├── gym_x[-7,7]_y[-14.4,14.8].txt
├── office
│   ├── office_x[-43.6,-35.5]_y[0.25,17].txt
├── supermarket
│   ├── supermarket_x[-26,-3]_y[-8,8].txt
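As a minimal sketch (not part of the released tooling), the snippet below shows one way to load a scene file into per-pedestrian trajectories. It assumes the same whitespace/tab-delimited layout as the ETH/UCY files, with columns frameID, pedID, x, y; the function names load_scene and trajectories_by_pedestrian are illustrative only.

import numpy as np

# Loading sketch (assumption: each .txt file is whitespace- or tab-delimited
# with one observation per row: frameID, pedID, x, y, following the
# ETH/UCY convention described above).
def load_scene(path):
    data = np.loadtxt(path)           # shape (num_rows, 4)
    frames = data[:, 0].astype(int)
    ped_ids = data[:, 1].astype(int)
    xy = data[:, 2:4]                 # x, y positions in world coordinates
    return frames, ped_ids, xy

def trajectories_by_pedestrian(path):
    # Group observations into per-pedestrian trajectories sorted by frame.
    frames, ped_ids, xy = load_scene(path)
    trajs = {}
    for pid in np.unique(ped_ids):
        mask = ped_ids == pid
        order = np.argsort(frames[mask])
        trajs[pid] = xy[mask][order]  # (T_i, 2) array of (x, y) positions
    return trajs

# Example, using a path from the tree above:
# trajs = trajectories_by_pedestrian("Dataset/gym/gym_x[-7,7]_y[-14.4,14.8].txt")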

Evaluation

cd traj_pred/tools/FLA/sgan/scripts && python evaluate_model.py
cd traj_pred/tools/FLA/pecnet/scripts && python test_pretrained_model.py
cd traj_pred/tools/FLA/stgcnn && python test.py

3. Navigation Emulator

The code of our Emulator is coming soon.

🖊 Citation

If you find this project useful, please consider citing:

@article{li2024thud++,
  title={THUD++: Large-Scale Dynamic Indoor Scene Dataset and Benchmark for Mobile Robots},
  author={Li, Zeshun and Li, Fuhao and Zhang, Wanting and Zheng, Zijie and Liu, Xueping and Liu, Yongjin and Zeng, Long},
  journal={arXiv preprint arXiv:2412.08096},
  year={2024}
}

@inproceedings{tang2024mobile,
  title={Mobile robot oriented large-scale indoor dataset for dynamic scene understanding},
  author={Tang, Yi-Fan and Tai, Cong and Chen, Fang-Xing and Zhang, Wan-Ting and Zhang, Tao and Liu, Xue-Ping and Liu, Yong-Jin and Zeng, Long},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={613--620},
  year={2024},
  organization={IEEE}
}
