MH-RELU-INR

This is the TensorFlow 2.x implementation of our paper "Multi-Head ReLU Implicit Neural Representation Networks", accepted at ICASSP 2022.

In this paper, a novel multi-head multi-layer perceptron (MLP) structure is presented for implicit neural representation (INR). Since conventional rectified linear unit (ReLU) networks are shown to exhibit a spectral bias towards learning the low-frequency features of a signal, we aim at mitigating this defect by taking advantage of the local structure of signals. More specifically, an MLP is used to capture the global features of the underlying generator function of the desired signal. Then, several heads are utilized to reconstruct disjoint local features of the signal, and to reduce the computational complexity, sparse layers are deployed for attaching the heads to the body. Through various experiments, we show that the proposed model does not suffer from the spectral bias of conventional ReLU networks and has superior generalization capabilities. Finally, simulation results confirm that the proposed multi-head structure outperforms existing INR methods with considerably lower computational cost.
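To make the architecture concrete, below is a minimal TensorFlow 2.x sketch of the idea described above: a shared ReLU body followed by several heads, where each head predicts the pixel at the same relative coordinate within its own disjoint image patch. The layer sizes, head count, and per-head Dense attachment are illustrative assumptions, not the paper's exact configuration (the paper attaches the heads through sparse layers).

import tensorflow as tf

def build_multi_head_inr(num_heads=64, body_units=256, head_units=32, channels=3):
    # Input: one 2-D relative coordinate, shared by all patches.
    coords = tf.keras.Input(shape=(2,))

    # Body MLP: captures the global features of the signal.
    x = coords
    for _ in range(3):
        x = tf.keras.layers.Dense(body_units, activation="relu")(x)

    # Heads: each reconstructs the pixel at this relative coordinate inside
    # its own disjoint patch; a small Dense layer per head stands in for the
    # paper's sparse attachment layers.
    outputs = []
    for _ in range(num_heads):
        h = tf.keras.layers.Dense(head_units, activation="relu")(x)
        outputs.append(tf.keras.layers.Dense(channels)(h))

    # A single forward pass yields num_heads pixel values, one per patch.
    return tf.keras.Model(coords, tf.keras.layers.Concatenate()(outputs))

model = build_multi_head_inr()
model.compile(optimizer="adam", loss="mse")

Because each coordinate is evaluated once by the shared body and then fanned out to all patches, one body pass serves as many pixels as there are heads, which is the intuition behind the reduced computational cost.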

Convergence

Training convergence of the multi-head network: Training_Convergence_MultiHead.mp4

Run

Demo

Run experiments on Google Colab: Open In Colab

1. Clone Repository

$ git clone https://github.com/AlirezaMorsali/MH-RELU-INR.git
$ cd MH-RELU-INR/

2. Requirements

  • Tensorflow >= 2.3.0
  • Numpy >= 1.19.2
  • Scikit-image >= 4.50.2
  • Matplotlib >= 3.3.1
  • Opencv-python >= 4.5.1
$ pip install -r requirements.txt

3. Set hyperparameters and training config:

You only need to change the constants in hyperparameters.py to set the hyperparameters and the training config, as sketched below.
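As a rough illustration only, the constants might look like the following; the names and values here are hypothetical, so check hyperparameters.py itself for the real ones.

# Hypothetical excerpt of hyperparameters.py; the actual constant
# names and values in the repository may differ.
EPOCHS = 1000          # number of training iterations
LEARNING_RATE = 1e-3   # optimizer step size
BATCH_SIZE = 8192      # coordinates per training batch
NUM_HEADS = 64         # number of heads in the multi-head network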

4. Run experiments:

Use the following commands to run the experiments.

  • Note: The results of the experiments are saved in the result folder.

Comparison experiments:

python run_comparison_experiments.py -i [path of input image] \
                                     -nh [number of heads for the multi-head network] \
                                     -bh [root number of heads for the base multi-head network (for fair comparison)] \
                                     -ba [alpha parameter for the base multi-head network (for fair comparison)] \
                                     -ub [use bias for the head part of the multi-head network]

Example:

python run_comparison_experiments.py -i pics/sample1.jpg \
                                     -nh 64 \
                                     -bh 64 \
                                     -ba 256 \
                                     -ub true

Generalization experiments:

python run_generalization_experiments.py -i [path of input image] \
                                         -bh [root number of heads for the base multi-head network (for fair comparison)] \
                                         -ba [alpha parameter for the base multi-head network (for fair comparison)] \
                                         -ub [use bias for the head part of the multi-head network]

Example:

python run_generalization_experiments.py -i pics/sample1.jpg \
                                         -bh 256 \
                                         -ba 32 \
                                         -ub false

Spectral bias experiments:

python run_spectral_bias_experiments.py
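For intuition about what this experiment measures, the sketch below shows a generic spectral-bias probe in the spirit of the paper's motivation: fit a plain ReLU MLP to a superposition of sinusoids and watch the low-frequency components converge first. It is an illustration of the concept, not the repository's actual procedure; the chosen frequencies, network size, and training schedule are all arbitrary assumptions.

import numpy as np
import tensorflow as tf

# Target: a 1-D signal with three frequency components (assumed values).
# endpoint=False makes each frequency k land exactly on FFT bin k.
x = np.linspace(0, 1, 1024, endpoint=False, dtype=np.float32)[:, None]
freqs = [5, 25, 50]
y = sum(np.sin(2 * np.pi * k * x) for k in freqs)

# Plain ReLU MLP, the kind that exhibits spectral bias.
model = tf.keras.Sequential(
    [tf.keras.layers.Dense(256, activation="relu") for _ in range(4)]
    + [tf.keras.layers.Dense(1)]
)
model.compile(optimizer="adam", loss="mse")

# Train in rounds and track the amplitude the network has recovered at each
# target frequency bin: low frequencies are typically learned much earlier.
for round_ in range(5):
    model.fit(x, y, epochs=200, verbose=0)
    spectrum = np.abs(np.fft.rfft(model.predict(x, verbose=0).ravel()))
    print(f"round {round_}:", [round(float(spectrum[k]), 1) for k in freqs])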

Citation

If you find our code useful for your research, please consider citing:

@inproceedings{aftab2022multi,
  title={Multi-Head ReLU Implicit Neural Representation Networks},
  author={Aftab, Arya and Morsali, Alireza and Ghaemmaghami, Shahrokh},
  booktitle={ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  pages={2510--2514},
  year={2022},
  organization={IEEE}
}
