Zeshun Li1* · Fuhao Li1* · Wanting Zhang1*
Zijie Zheng · Xueping Liu1 · Tao Zhang2 · Yongjin Liu1 · Long Zeng1†
1THU 2PUDU Tech
*equal contribution †corresponding author
THUD++ comprises three primary components: an RGB-D dataset, a pedestrian trajectory dataset, and a robot navigation emulator. By sharing this dataset, we aim to accelerate the development and testing of mobile robot algorithms, contributing to real-world robotic applications.
Please refer to THUD_Dataset_Overview
git clone https://github.com/jackyzengl/THUD-plus-plus.git
conda create -n thud++ python=3.8
conda activate thud++
pip install -r requirements.txt
The trajectory dataset follows the same construction as ETH/UCY. Each row is recorded as frameID, pedID, x, y. Each scene's data file is named after the scene and its world-coordinate range.
Dataset
├── eth
│ ├── train
│ ├── val
│ ├── test_eth
│ ├── test_gym
│ ├── test_hotel
│ ├── test_office
│ ├── test_supermarket
├── gym
│ ├── gym_x[-7,7]_y[-14.4,14.8].txt
├── office
│ ├── office_x[-43.6,-35.5]_y[0.25,17].txt
├── supermarket
│ ├── supermarket_x[-26,-3]_y[-8,8].txt
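The row format and filename convention above can be parsed with a short script. This is a minimal sketch, not part of the released code: it assumes whitespace-separated columns (as in standard ETH/UCY files) and the `x[...]_y[...]` naming shown in the tree; adjust the delimiter if your files differ.

```python
import re
from collections import defaultdict

def load_trajectories(path):
    """Parse an ETH/UCY-style file (frameID pedID x y per row,
    assumed whitespace-separated) into per-pedestrian tracks."""
    tracks = defaultdict(list)  # pedID -> [(frameID, x, y), ...]
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 4:
                continue  # skip blank or malformed rows
            frame, ped, x, y = map(float, parts)
            tracks[ped].append((frame, x, y))
    for ped in tracks:
        tracks[ped].sort(key=lambda t: t[0])  # time-order each track
    return dict(tracks)

def parse_scene_range(filename):
    """Extract the world-coordinate range from a scene filename,
    e.g. 'gym_x[-7,7]_y[-14.4,14.8].txt' -> ((-7, 7), (-14.4, 14.8))."""
    m = re.search(r"x\[(-?[\d.]+),(-?[\d.]+)\]_y\[(-?[\d.]+),(-?[\d.]+)\]",
                  filename)
    if m is None:
        raise ValueError(f"unrecognized filename: {filename}")
    x_min, x_max, y_min, y_max = map(float, m.groups())
    return (x_min, x_max), (y_min, y_max)
```

For example, `parse_scene_range("supermarket_x[-26,-3]_y[-8,8].txt")` yields the bounds listed in the tree above.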
cd traj_pred/tools/FLA/sgan/scripts && python evaluate_model.py
cd traj_pred/tools/FLA/pecnet/scripts && python test_pretrained_model.py
cd traj_pred/tools/FLA/stgcnn && python test.py
The code for our Emulator is coming soon.
If you find this project useful, please consider citing:
@article{li2024thud++,
title={THUD++: Large-Scale Dynamic Indoor Scene Dataset and Benchmark for Mobile Robots},
author={Li, Zeshun and Li, Fuhao and Zhang, Wanting and Zheng, Zijie and Liu, Xueping and Liu, Yongjin and Zeng, Long},
journal={arXiv preprint arXiv:2412.08096},
year={2024}
}
@inproceedings{tang2024mobile,
title={Mobile robot oriented large-scale indoor dataset for dynamic scene understanding},
author={Tang, Yi-Fan and Tai, Cong and Chen, Fang-Xing and Zhang, Wan-Ting and Zhang, Tao and Liu, Xue-Ping and Liu, Yong-Jin and Zeng, Long},
booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
pages={613--620},
year={2024},
organization={IEEE}
}