Source code for: J. Walchessen, A. Zammit-Mangion, R. Huser, M. Kuusela (2024). Neural Conditional Simulation for Complex Spatial Processes. arXiv preprint arXiv:2505.24556.
This repository contains the code and training details for each case study in "Neural Conditional Simulation for Complex Spatial Processes." The package environment for this project is specified in requirements.txt.
We recommend creating a Python virtual environment via the following commands:
- Create the python environment.
python3 -m venv ~/ncs
- Activate the environment.
source ~/ncs/bin/activate
- Install main dependencies.
python -m pip install ninja torch torchvision torchaudio
- Install the remaining dependencies from requirements.txt.
python -m pip install -r requirements.txt
In our experiments, how well the U-Net approximates the true conditional score function was not sensitive to any of the hyperparameters. The hyperparameters vary between the U-Nets and spatial process types only because of computational constraints, not training issues. The one crucial training choice we encountered pertains to amortization: the amortized variable should be sampled from a continuous, positive distribution.
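To illustrate the amortization point, the sketch below samples the amortized variable from a continuous, positive distribution, here a log-uniform on a bounded interval. The function name, bounds, and choice of log-uniform (rather than, say, a uniform or Gamma) are illustrative assumptions, not the repository's actual sampler.

```python
import numpy as np

def sample_amortized_variable(n, low=0.05, high=2.0, rng=None):
    """Draw n samples of the amortized variable from a continuous,
    positive distribution -- here log-uniform on (low, high).
    NOTE: the bounds and distribution are illustrative assumptions."""
    rng = np.random.default_rng() if rng is None else rng
    # Uniform in log-space, then exponentiate: values stay in (low, high) > 0.
    return np.exp(rng.uniform(np.log(low), np.log(high), size=n))
```

Any similar choice works, provided the distribution is continuous and supported on the positive reals.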
As in the Gaussian process case study, we did not find training to be sensitive to any hyperparameters except with respect to amortization: the amortized variable should be sampled from a continuous, positive distribution. We therefore used the same hyperparameters as in the Gaussian process case study, with some noted exceptions for computational efficiency.
The batch size, number of data draws, and epochs per data draw are the same as for the parameter U-Net in the Gaussian process case study.
The batch size, number of data draws, and epochs per data draw are also the same as for the proportion U-Net in the Gaussian process case study.
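The training schedule referenced above (a number of data draws, with several epochs per draw) can be sketched as an outer loop that periodically re-simulates the training data. The callback names `simulate_batch` and `train_step` and the default counts are hypothetical placeholders, not the repository's API.

```python
def train_with_data_draws(train_step, simulate_batch,
                          n_draws=50, epochs_per_draw=5,
                          batches_per_epoch=100):
    """Outer training loop: re-simulate training data each 'data draw',
    then train for a fixed number of epochs on that draw.
    `train_step(data)` and `simulate_batch()` are hypothetical callbacks."""
    losses = []
    for _ in range(n_draws):
        data = simulate_batch()           # fresh simulated training data
        for _ in range(epochs_per_draw):
            for _ in range(batches_per_epoch):
                losses.append(train_step(data))
    return losses
```

The total number of gradient steps is n_draws * epochs_per_draw * batches_per_epoch, so the batch size, number of data draws, and epochs per draw jointly determine the compute budget.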