Project page available here.
Evaluation code and dataset from "RGB-D-E: Event Camera Calibration for Fast 6-DOF Object Tracking" [arXiv paper].
Download the evaluation dataset here (12 GB).
The dataset contains multiple sequences, each in a separate folder. Each sequence contains the following files:
- `camera.json`: RGB-D sensor (Microsoft Kinect Azure) intrinsic calibration
- `dvs.json`: event-based sensor (DAVIS346) intrinsic calibration
- `transfo_mat.npy`: extrinsic calibration between the two sensors
- `fevents.npz`: event data of shape 4xN (timestamp, x, y, polarity)
- `davis_frame.npz`: grayscale frames recorded by the event sensor
- `davis_frame_ts.npz`: timestamps associated with each grayscale frame
- `frames.npz`: RGB-D frames from the Microsoft Kinect Azure
- `ts_frames.npz`: timestamps associated with each RGB-D frame
- `poses.npy`: ground-truth 6-DoF poses
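As a rough sketch of how this data is shaped, the snippet below builds a synthetic stand-in for the 4xN event array and the extrinsic calibration, slices events by timestamp, and transforms a point between sensor frames. The `.npz` key names, the seconds unit for timestamps, and the 4x4 homogeneous form of `transfo_mat.npy` are assumptions; with real data, inspect the archives first (e.g. `np.load(path).files`).

```python
import numpy as np

# Synthetic stand-in for the contents of fevents.npz: a 4xN array
# laid out as (timestamp, x, y, polarity), as described above.
n = 1000
rng = np.random.default_rng(0)
events = np.vstack([
    np.sort(rng.uniform(0.0, 1.0, n)),  # timestamps (assumed seconds)
    rng.integers(0, 346, n),            # x (DAVIS346 resolution is 346x260)
    rng.integers(0, 260, n),            # y
    rng.integers(0, 2, n),              # polarity (0 = off, 1 = on)
])

# Slice out the events falling between two RGB-D frame timestamps.
t0, t1 = 0.25, 0.50
ts = events[0]
window = events[:, (ts >= t0) & (ts < t1)]

# Extrinsic calibration: assumed here to be a 4x4 homogeneous transform
# (rotation + translation) mapping event-camera coordinates to Kinect
# coordinates; the 5 cm baseline below is purely hypothetical.
transfo = np.eye(4)
transfo[:3, 3] = [0.05, 0.0, 0.0]

point_event_cam = np.array([0.1, 0.2, 1.0, 1.0])  # homogeneous 3D point
point_rgbd_cam = transfo @ point_event_cam
print(window.shape, point_rgbd_cam[:3])
```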
Download and extract:
This repository uses a submodule, which must be initialized:
git submodule update --init --recursive
Update your `PYTHONPATH`:
export PYTHONPATH=$PYTHONPATH:./6DOF_tracking_evaluation
To run the tracker on the whole dataset and compute the tracking failures for both networks:
python tracking_event_6dof/inference/tracker_failure.py \
    -e ./model/event \
    -f ./model/frame \
    -d ./dataset \
    -m ./dragon
To generate a video of the results for each sequence:
python tracking_event_6dof/inference/tracker_failure.py \
    -e ./model/event \
    -f ./model/frame \
    -d ./dataset \
    -m ./dragon \
    -a /path/to/folder/to/save/videos
Note: these examples assume that the assets are extracted in the repository's root folder.
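Based on the paths passed via the `-e`, `-f`, `-d`, and `-m` flags above, the expected layout would look roughly like this (folder names beyond those flags are not specified here):

```
.
├── dataset/        # extracted evaluation sequences
├── model/
│   ├── event/      # event-network weights
│   └── frame/      # frame-network weights
└── dragon/         # tracked object's 3D model
```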
@misc{dubeau2020rgbde,
    title={RGB-D-E: Event Camera Calibration for Fast 6-DOF Object Tracking},
    author={Etienne Dubeau and Mathieu Garon and Benoit Debaque and Raoul de Charette and Jean-François Lalonde},
    year={2020},
    eprint={2006.05011},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}