
Deep-Probabilistic-Scaling


Deep Probabilistic Scaling (DPS) is an uncertainty quantification tool for controlling the misclassification error in binary neural classification. The algorithm relies on probabilistic scaling, a branch of order statistics for non-parametric inference that provides confidence bounds on the prediction.

DPS is a direct application of Scalable Classification to convolutional neural networks for binary classification: the predicted class of an input is obtained by thresholding the network's predictor function $\hat \varphi$.

This framework allows one to define a special region $\mathcal{S}_\varepsilon$ such that the probability of observing a false negative is bounded by $\varepsilon$:

$$\Pr\{\text{false negative}\} \le \varepsilon.$$
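The threshold that delivers such a bound can be computed by probabilistic scaling, i.e. as an order statistic of calibration scores. Below is a minimal sketch (function and variable names are illustrative, not the repository's API) of the generic guarantee: given `n` i.i.d. calibration scores, it returns a threshold that a fresh score exceeds with probability at most `eps`, with confidence at least `1 - delta`.

```python
import math


def binom_cdf(r, n, p):
    """Pr{Binomial(n, p) <= r}, computed with exact binomial coefficients."""
    return sum(math.comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(r + 1))


def probabilistic_threshold(scores, eps, delta):
    """Order-statistic threshold via probabilistic scaling (illustrative sketch).

    Discards the r largest scores, where r is the largest integer with
    Pr{Binomial(n, eps) <= r} <= delta, and returns the maximum of the rest.
    """
    n = len(scores)
    r = -1
    while binom_cdf(r + 1, n, eps) <= delta:
        r += 1
    if r < 0:
        raise ValueError("not enough calibration samples for this (eps, delta)")
    s = sorted(scores)
    return s[n - 1 - r]  # keep the max after discarding the r largest
```

For DPS the score would be derived from the network output $\hat\varphi$ on the relevant calibration class, so that the tail event `score > threshold` corresponds to a false negative.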

Content of the Repository

This repository contains the code for the experiments to validate the DPS algorithm.

We considered six benchmark datasets, on each of which we defined a binary classification problem.

As an example, MNIST data are also available in the data folder in the .npy format.
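Since the benchmark datasets are multi-class, each one must be mapped to a binary problem first. The repository's exact class splits are defined in the notebooks; a generic sketch of such a mapping (the helper name and the `+1`/`-1` label convention are illustrative assumptions) is:

```python
import numpy as np


def binarize(labels, positive_classes):
    """Map multi-class labels to a binary problem (illustrative helper):
    +1 for any class listed in positive_classes, -1 otherwise."""
    positive = np.isin(labels, positive_classes)
    return np.where(positive, 1, -1)
```

For example, on MNIST one could declare a subset of digits as the positive class and all remaining digits as the negative class.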

The following notebooks are available:

  • get_pneumoniaMNIST_data.ipynb: shows how to download and save the pneumoniaMNIST data using the medmnist Python library.

  • DeepSC_NNtraining.ipynb: contains the training of the convolutional models (3-layer CNNs) used for DPS; the trained models are also shared in the models folder.

  • DeepSC_ProbScaling.ipynb: the main code implementing DPS.

  • EvaluationMetricsPlot.ipynb: computes the evaluation metrics and plots the results for all the considered datasets.
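The evaluation in the last notebook centers on the false-negative rate, which is exactly the quantity DPS is designed to bound by $\varepsilon$. A minimal version of that metric (names are illustrative, not the notebook's API):

```python
import numpy as np


def false_negative_rate(y_true, y_pred, positive=1):
    """Fraction of truly positive samples that were predicted non-positive."""
    pos = y_true == positive
    if pos.sum() == 0:
        return 0.0  # no positives: the rate is undefined; report 0 by convention
    return float(np.mean(y_pred[pos] != positive))
```

Validating DPS then amounts to checking that this empirical rate on held-out data stays below the prescribed $\varepsilon$.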

References

DPS was implemented for a research paper presented at COPA 2024, building on the concept paper:

Carlevaro, Alberto, et al. "Probabilistic Safety Regions Via Finite Families of Scalable Classifiers." arXiv preprint arXiv:2309.04627 (2023).

The full paper will be available in the proceedings of the conference.
