Neural Networks as Gaussian Processes

A NumPy implementation of the Bayesian inference approach of Deep Neural Networks as Gaussian Processes (Lee et al., 2018).

We focus on infinitely wide neural networks with the ReLU nonlinearity, which allows for an analytic computation of the layer kernels.
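For the ReLU nonlinearity, the layer-to-layer kernel recursion of Lee et al. has a closed form (the arc-cosine kernel of Cho & Saul). The following is a minimal NumPy sketch of that recursion for a single pair of inputs, given for illustration only; the function name and the handling of the depth L are assumptions, not the repository's own code.

import numpy as np

def relu_kernel(x, x_prime, L, sigma_w_2, sigma_b_2):
    """Illustrative L-layer ReLU (arc-cosine) kernel recursion for two inputs."""
    d_in = x.shape[0]
    # Base case: kernel of the input layer.
    k_xy = sigma_b_2 + sigma_w_2 * np.dot(x, x_prime) / d_in
    k_xx = sigma_b_2 + sigma_w_2 * np.dot(x, x) / d_in
    k_yy = sigma_b_2 + sigma_w_2 * np.dot(x_prime, x_prime) / d_in
    for _ in range(L):
        # Angle between the two inputs under the previous layer's kernel.
        theta = np.arccos(np.clip(k_xy / np.sqrt(k_xx * k_yy), -1.0, 1.0))
        # Closed-form expectation of ReLU(u) * ReLU(v) for jointly Gaussian (u, v).
        k_xy = sigma_b_2 + sigma_w_2 / (2 * np.pi) * np.sqrt(k_xx * k_yy) \
            * (np.sin(theta) + (np.pi - theta) * np.cos(theta))
        # Diagonal entries simplify because theta = 0 there.
        k_xx = sigma_b_2 + sigma_w_2 * k_xx / 2
        k_yy = sigma_b_2 + sigma_w_2 * k_yy / 2
    return k_xy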

Usage

Requirements

  • Python 3
  • numpy

Installation

Clone the repository

git clone https://github.com/MB-29/NN-gaussian-process.git

and move to the root directory

cd NN-gaussian-process

Use our module

from nngp import NNGP

# ... 

regression = NNGP(
    training_data,              # Data
    training_targets,
    test_data,
    L,                          # Neural network depth
    sigma_eps_2=sigma_eps**2,   # Observation noise variance
    sigma_w_2=sigma_w_2,        # Weight hyperparameter
    sigma_b_2=sigma_b_2         # Bias hyperparameter
    )

regression.train()
predictions, covariance = regression.predict()
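Under the hood, train() and predict() amount to standard Gaussian-process regression with the analytic NNGP kernel: conditioning on the noisy training targets gives a Gaussian posterior over the test outputs. The sketch below spells out that posterior; the kernel matrices and function name are hypothetical placeholders, not the module's internals.

import numpy as np

def gp_posterior(K_DD, K_xD, K_xx, y, sigma_eps_2):
    """Posterior mean and covariance of the test outputs (illustrative sketch).

    K_DD: train/train kernel, K_xD: test/train kernel, K_xx: test/test kernel.
    """
    A = K_DD + sigma_eps_2 * np.eye(K_DD.shape[0])    # noisy training kernel
    mean = K_xD @ np.linalg.solve(A, y)               # K_*D (K_DD + s^2 I)^-1 y
    cov = K_xx - K_xD @ np.linalg.solve(A, K_xD.T)    # K_** - K_*D (K_DD + s^2 I)^-1 K_D*
    return mean, cov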

Examples

  • A classification script for MNIST is provided in the file classify_MNIST.py. It relies on the additional requirement python-mnist, available via pip; a hypothetical outline of such a run is sketched below, after the figures.
  • A 1D regression script is provided in the file 1D_regression.py. We obtained the following results.

Figure: Network expressivity

Figure: Fixed point analysis

Figure: Test uncertainty and test error
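As mentioned above, a hypothetical outline of an MNIST run, regressing one-hot labels as in the paper, could look as follows. The data path, subsample sizes and hyperparameter values are placeholders, and the actual classify_MNIST.py script may be organised differently.

import numpy as np
from mnist import MNIST          # provided by the python-mnist package
from nngp import NNGP

data = MNIST('./mnist_data')                       # path to the raw MNIST files (placeholder)
train_images, train_labels = data.load_training()
test_images, test_labels = data.load_testing()

n_train, n_test = 1000, 100                        # subsample to keep the kernel matrices small
X_train = np.array(train_images[:n_train]) / 255.0
X_test = np.array(test_images[:n_test]) / 255.0
Y_train = np.eye(10)[np.array(train_labels[:n_train])]   # one-hot regression targets

regression = NNGP(
    X_train,
    Y_train,
    X_test,
    3,                           # network depth L (placeholder value)
    sigma_eps_2=1e-2,            # placeholder hyperparameters
    sigma_w_2=1.0,
    sigma_b_2=1.0
    )
regression.train()
predictions, covariance = regression.predict()

# Classify each test point by the largest predicted output.
accuracy = np.mean(np.argmax(predictions, axis=1) == np.array(test_labels[:n_test]))
print(f'test accuracy: {accuracy:.3f}')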
