GT-MilliNoise: Graph Transformer for Point-wise Denoising of Indoor Millimeter-Wave Point Clouds

(Paper under submission) [Project] [Paper]

This repository contains the code to implement the GT-MilliNoise architecture for millimetre-wave point cloud denoising using the MilliNoise dataset.
It also provides benchmarks of state-of-the-art architectures (PointNet, PointNet++, DGCNN, Transformer) on MilliNoise point cloud denoising.

MilliNoise Dataset: [Dataset] [Paper]

Overview

Citation

Please cite this paper if you use it in your work:

@article{coming soon,
  }

Installation

Install TensorFlow. The code has been tested with Python 3.6, TensorFlow 1.10.0, CUDA 9.0, and cuDNN 7.3.
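
For example, the GPU build can be installed with pip (assuming a compatible Python 3.6 environment):

pip install tensorflow-gpu==1.10.0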

Compile the custom TensorFlow operators. Make sure the correct CUDA version and TensorFlow are installed on your machine, then edit the Makefiles so they point to your CUDA and TensorFlow directories. The Makefiles are in modules/tf_ops.
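
If you are unsure of the TensorFlow paths, the snippet below prints the directories that custom-op Makefiles typically need (a minimal sketch; the exact variable names expected by the modules/tf_ops Makefiles may differ):

# Print TensorFlow's header and library directories,
# useful when editing the Makefiles in modules/tf_ops.
import tensorflow as tf
print("TF include dir:", tf.sysconfig.get_include())
print("TF library dir:", tf.sysconfig.get_lib())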

Directory paths

You must specify the following directories (passed as arguments to train.py and test.py; see the example invocation after this list):

  1. --data-dir: Path where the dataset is stored.
  2. --log-dir: Path where the model outputs should be saved.
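
For example (the paths below are placeholders; substitute your own):

python train.py --data-dir /path/to/MilliNoise --log-dir /path/to/logs --version v0 --data-split 4 --model GT --seq-length 12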

Usage

To train a model for millimeter-wave point cloud denoising:

python train.py --version <name_of_model> --data-split <#split> --model <architecture_name>  --seq-length <input_frames>

For example:

python train.py --version v0 --data-split 4 --model GT  --seq-length 12 

This trains a GT-MilliNoise model using dataset split #4 (Fold 4).

To evaluate the model:

python test.py --version v0 --data-split 4 --model GT --seq-length 12 --manual-restore 2 

--manual-restore 2 loads the model with the best validation performance; --manual-restore 1 lets you choose a specific checkpoint.

Splits:

#Split            K-Fold       Training Scenarios   Test Scenarios
--data-split 11   Fold 1-6     All [1-6]            All [1-6]
--data-split 17   Fold 1,2,3   [4,5,6]              [1,2,3]
--data-split 4    Fold 4       [1,2,3,5,6]          [4]
--data-split 16   Fold 5       [1,2,3,4,6]          [5]

Available Models:

Model Name             Description
--model GT             Standard GT-MilliNoise architecture
--model PointNet       PointNet
--model PointNet_2     PointNet++
--model DGCNN          Dynamic Graph CNN (DGCNN)
--model Transformer    Vanilla Transformer
--model GT_intensity   GT-MilliNoise with intensity as input
--model GT_velocity    GT-MilliNoise with velocity as input
--model GT_noTC        GT-MilliNoise without temporal block

Datasets

The models were evaluated on the MilliNoise dataset, which we provide in two formats:

  1. Complete raw data (JSON): the original JSON files with complete information (intensity, velocity, roto-translation coordinates).
  2. Pre-processed data (NumPy): the data converted to NumPy arrays, used in our experiments.
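
A minimal sketch of loading one pre-processed file (the file name below is hypothetical; check the downloaded archive for the actual names and array layout):

import numpy as np

# Hypothetical example: load one pre-processed file and inspect it.
# Actual file names and array shapes depend on the released archive.
data = np.load("path/to/preprocessed/sequence_000.npy", allow_pickle=True)
print(type(data), getattr(data, "shape", None))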

Acknowledgement

Parts of this codebase are borrowed from the following related repositories:

  1. PointRNN TensorFlow implementation: https://github.com/hehefan/PointRNN
  2. PointNet++ TensorFlow implementation: https://github.com/charlesq34/pointnet2
  3. Dynamic Graph CNN for Learning on Point Clouds: https://github.com/WangYueFt/dgcnn
  4. MilliNoise: https://github.com/c3lab/MilliNoise
