# Depth Estimation with VPD

## Getting Started

1. Install the mmcv-full library and the other required packages:

```shell
pip install openmim
mim install mmcv-full
pip install -r requirements.txt
```
2. Prepare the NYUDepthV2 dataset following GLPDepth and BTS:

```shell
mkdir nyu_depth_v2
wget http://horatio.cs.nyu.edu/mit/silberman/nyu_depth_v2/nyu_depth_v2_labeled.mat
python extract_official_train_test_set_from_mat.py nyu_depth_v2_labeled.mat splits.mat ./nyu_depth_v2/official_splits/
```

Download sync.zip, provided by the authors of BTS, from this url and unzip it in the ./nyu_depth_v2 folder.

Your dataset directory should be:

```
│nyu_depth_v2/
├──official_splits/
│  ├── test
│  ├── train
├──sync/
```
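To sanity-check the layout before training, a minimal sketch (the helper name `check_nyu_layout` is hypothetical, not part of this repo):

```python
from pathlib import Path

# Subdirectories the README's tree above expects under nyu_depth_v2/.
EXPECTED = ["official_splits/train", "official_splits/test", "sync"]

def check_nyu_layout(root):
    """Return the list of expected subdirectories missing under root.

    An empty list means the dataset directory matches the layout above.
    """
    root = Path(root)
    return [p for p in EXPECTED if not (root / p).is_dir()]
```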

## Results and Fine-tuned Models

| Model | RMSE | d1 | d2 | d3 | REL | log_10 | Fine-tuned Model |
| ----- | ---- | -- | -- | -- | --- | ------ | ---------------- |
| VPD | 0.254 | 0.964 | 0.995 | 0.999 | 0.069 | 0.030 | Tsinghua Cloud |
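The metrics reported above (RMSE, the delta accuracies d1–d3, REL, and log_10) are the standard NYU depth benchmarks. A sketch of how they are conventionally computed, assuming metric depths with invalid pixels already masked out (this helper is illustrative, not the repo's evaluation code):

```python
import numpy as np

def depth_metrics(pred, gt):
    """Standard monocular depth metrics over flattened valid pixels."""
    pred, gt = np.asarray(pred, float).ravel(), np.asarray(gt, float).ravel()
    # Threshold accuracy: fraction of pixels whose ratio to GT is within 1.25^k.
    thresh = np.maximum(gt / pred, pred / gt)
    return {
        "d1": float((thresh < 1.25).mean()),
        "d2": float((thresh < 1.25 ** 2).mean()),
        "d3": float((thresh < 1.25 ** 3).mean()),
        "rmse": float(np.sqrt(((pred - gt) ** 2).mean())),
        "rel": float((np.abs(pred - gt) / gt).mean()),          # mean absolute relative error
        "log_10": float(np.abs(np.log10(pred) - np.log10(gt)).mean()),
    }
```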

We also provide the predicted depths in 16-bit format for the NYU-Depth-v2 official test set here.
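16-bit depth files on NYU-Depth-v2 conventionally store depth in millimeters with zero marking invalid pixels; assuming that convention (a scale of 1000, not confirmed by this README), a conversion sketch:

```python
import numpy as np

def decode_depth(raw16, scale=1000.0, max_depth=10.0):
    """Convert raw uint16 depth values to meters.

    Assumes millimeter encoding (scale=1000) and an NYU-style 10 m range;
    zero values are treated as invalid. Returns (depth_m, valid_mask).
    """
    raw16 = np.asarray(raw16)
    depth = raw16.astype(np.float32) / scale
    valid = (raw16 > 0) & (depth <= max_depth)
    return depth, valid
```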

## Training

Run the following instruction to train the VPD-Depth model. We recommend using 8 NVIDIA V100 GPUs to train the model with a total batch size of 24:

```shell
bash train.sh <LOG_DIR>
```

## Evaluation

Command format:

```shell
bash test.sh <CHECKPOINT_PATH>
```