
ResFields: Residual Neural Fields for Spatiotemporal Signals

Marko Mihajlovic · Sergey Prokudin · Marc Pollefeys · Siyu Tang

ICLR 2024 (spotlight ✨)

PyTorch Lightning · Paper PDF · Project Page · Google Colab

ResField layers incorporate time-dependent weights into MLPs to effectively represent complex temporal signals.

Applications

  • 2D video approximation (Video)
  • Temporal SDF capture (TSDF)
  • Dynamic NeRFs from 4 RGB views (TNeRF)
  • Dynamic NeRFs from 3 RGB-D views (TNeRF_RGBD)

News 🚩

  • [2023/10/01] Code released.

Key idea of ResFields

Our key idea is to substitute one or several MLP layers with time-dependent layers whose weights are modeled as trainable residual parameters added to the existing layer weights.

We propose to implement the residual parameters as a global low-rank spanning set and a set of time-dependent coefficients. This modeling enhances the generalization properties and further reduces the memory overhead of maintaining additional network parameters.

These residual weights are modeled as a learnable low-rank composition.
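
As a rough illustration, the PyTorch sketch below shows one way such a time-conditioned linear layer could look: the effective weights at frame t are the base weights plus a residual sum_r v_r(t) * M_r built from a global spanning set of R matrices and per-frame coefficients. The class and argument names (ResFieldLinear, rank, num_frames) are illustrative assumptions, not the repository's actual API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ResFieldLinear(nn.Module):
    """Linear layer with per-frame low-rank residual weights (illustrative sketch)."""

    def __init__(self, in_dim, out_dim, num_frames, rank):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)  # base, time-independent layer
        # Global low-rank spanning set: R matrices shared across all frames.
        self.spanning_set = nn.Parameter(0.01 * torch.randn(rank, out_dim, in_dim))
        # Time-dependent coefficients: one R-dimensional vector per frame.
        self.coeffs = nn.Parameter(torch.zeros(num_frames, rank))

    def forward(self, x, frame_id):
        # Residual weights for this frame: sum_r v_r(t) * M_r.
        delta_w = torch.einsum('r,roi->oi', self.coeffs[frame_id], self.spanning_set)
        # Effective weights W + dW(t); the base MLP keeps its original size.
        return F.linear(x, self.linear.weight + delta_w, self.linear.bias)

With the coefficients initialized to zero, the layer starts out identical to the underlying MLP layer, and extra capacity is introduced only through the learned residuals.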

Increasing the model capacity in this way offers three key advantages:

  1. Runtime: the underlying MLP does not increase in size and hence maintains the inference and training speed.
  2. Generalizability: retains the implicit regularization and generalization properties of MLPs.
  3. Universality: ResFields are versatile, easily extendable, and compatible with most MLP-based methods for spatiotemporal signals.

Instructions

Remaining tasks:

  • Release RGB-D data
  • Release data preprocessing code

Citation

Please consider citing our work if you find it useful:

@inproceedings{mihajlovic2024ResFields,
  title={{ResFields}: Residual Neural Fields for Spatiotemporal Signals},
  author={Mihajlovic, Marko and Prokudin, Sergey and Pollefeys, Marc and Tang, Siyu},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2024}
}

Acknowledgments

We thank Hongrui Cai and Ruizhi Shao for providing additional details about the baseline methods, and Anpei Chen, Shaofei Wang, and Songyou Peng for proofreading the manuscript and providing useful suggestions.

We also build upon and benefit from several great prior works.

This project has been supported by the Innosuisse Flagship project PROFICIENCY.

License

The code and models are available for use without any restrictions. See the LICENSE file for details.

Contact

Please open a PR or contact Marko Mihajlovic with any questions; we greatly appreciate everyone's feedback and insights.
