The official implementation of my computer vision semester project. Using a ResNet50 model on the Oxford Flowers102 dataset, I quantify uncertainty on out-of-distribution and noise-corrupted data to verify the robustness of this method.
- Added support for Noise Detection
- Added support for Confusion Matrix Visualization, showing how uncertainty quantification helps avoid wrong predictions
- Out-of-Distribution Data Detection
- Training and Testing scripts
- Replaced the training loss of ResNet with an EDL loss adapted from pytorch-classification-uncertainty
- During inference, the logits generated by the model are no longer fed to a SoftMax layer; instead, they are interpreted as the alpha parameters of a Dirichlet distribution
- Inspired by NVIDIA's tutorial on Exploring Uncertainty Quantification in Deep Learning for Medical Imaging
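The EDL loss itself is not reproduced in this README. As a minimal sketch, the MSE-based evidential loss of Sensoy et al. (the formulation pytorch-classification-uncertainty implements) looks like the following; the function name is mine, and the repo's version likely also includes the KL-annealing regularizer that is omitted here:

```python
import torch
import torch.nn.functional as F

def edl_mse_loss(logits, targets, num_classes):
    """MSE-based evidential loss (Sensoy et al., 2018), without the
    annealed KL regularizer of the full formulation."""
    evidence = F.relu(logits)                    # non-negative evidence per class
    alpha = evidence + 1.0                       # Dirichlet concentration parameters
    strength = alpha.sum(dim=1, keepdim=True)    # S = sum_k alpha_k
    y = F.one_hot(targets, num_classes).float()
    p_hat = alpha / strength                     # expected class probabilities
    err = ((y - p_hat) ** 2).sum(dim=1)          # squared-error term
    var = (p_hat * (1.0 - p_hat) / (strength + 1.0)).sum(dim=1)  # variance term
    return (err + var).mean()
```

Evidence placed on the correct class lowers the loss, while evidence on a wrong class raises it, which is what lets the Dirichlet interpretation double as an uncertainty signal.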
Assuming a Linux operating system:

Download the Oxford Flowers102 dataset archive and place it in the directory `data/images/`. Then `cd` to `data/images/` and run the following command to extract the images:

```
tar -xvzf 102flowers.tgz
```

The images will be saved in the directory `data/images/jpg`.
Create a new Python environment:

```
conda create -n unc python=3.8
conda activate unc
```

Run these commands after activating `unc`:
```
pip install torch==1.13.0+cu116 torchvision==0.14.0+cu116 torchaudio==0.13.0 --extra-index-url https://download.pytorch.org/whl/cu116
pip install matplotlib
pip install pandas
pip install opencv-python
pip install scikit-image
pip install scipy
```

We show the effectiveness of this approach to:
- Recognize OOD
- Perform Noise Detection
- Avoid Wrong Predictions
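All three capabilities rely on a scalar uncertainty score derived from the Dirichlet interpretation of the logits. A minimal sketch of that score (the helper name is mine; the repo's scripts may compute it slightly differently):

```python
import torch
import torch.nn.functional as F

def dirichlet_uncertainty(logits):
    """Total uncertainty u = K / S for a Dirichlet with alpha = relu(logits) + 1.
    u lies in (0, 1]: u = 1 means no evidence at all; small u means the model
    has accumulated strong evidence for some class."""
    evidence = F.relu(logits)
    alpha = evidence + 1.0
    num_classes = alpha.shape[1]
    strength = alpha.sum(dim=1)          # S = sum_k alpha_k
    return num_classes / strength
```

A flat logit vector yields u = 1 (maximal uncertainty), while a confident logit vector drives u toward 0, which is what makes thresholding on u useful for OOD and noise detection.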
Use this command to run inference on custom OOD images stored in `data/images/custom/` with the model `unc900.pth` and store the results in `data/images/results/`:

```
python inference_custom.py
```

For example, feeding in an OOD image (from `data/images/custom`) will result in:

*(example output image)*
To compare the uncertainty of noised and denoised images, run

```
python inference.py
```

For example, the uncertainty score on a noisy vs. a clean version of one Flowers102 test image:

*(example comparison image)*
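The exact noising procedure lives in `inference.py`; as a sketch (assuming images are tensors in `[0, 1]`), Gaussian noise can be injected like this before re-running the uncertainty computation:

```python
import torch

def add_gaussian_noise(image, sigma=0.1):
    """Add zero-mean Gaussian noise to a [0, 1] image tensor and clamp
    the result back into the valid pixel range."""
    noisy = image + sigma * torch.randn_like(image)
    return noisy.clamp(0.0, 1.0)
```

Feeding the clean and the noised tensor through the model should then yield a visibly higher uncertainty score on the noised version, which is the behavior the figure above illustrates.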
To run the model's inference on the Flowers102 test images and apply uncertainty filtering to produce confusion matrices, use

```
python confusion_matrix.py
```

For example, there are more wrong predictions in the unfiltered confusion matrix (left) than with the uncertainty filter (right):
| No Uncertainty Filter | Uncertainty Filter (0.65) |
|---|---|
| *(confusion matrix image)* | *(filtered confusion matrix image)* |
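The filtering idea behind the right-hand matrix can be sketched as follows: predictions whose uncertainty exceeds the threshold (0.65 above) are rejected rather than counted. The function name and signature are mine, not the repo's:

```python
import numpy as np

def filtered_confusion_matrix(y_true, y_pred, uncertainty, num_classes, threshold=0.65):
    """Confusion matrix over predictions whose uncertainty is below the
    threshold; high-uncertainty samples are rejected instead of counted."""
    keep = uncertainty < threshold
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true[keep], y_pred[keep]):
        cm[t, p] += 1
    return cm, int((~keep).sum())
```

Because uncertain predictions tend to be the wrong ones, rejecting them removes more off-diagonal mass than diagonal mass, which is why the filtered matrix looks cleaner.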
Pretraining with Cross Entropy is required before fine-tuning with the EDL loss; in my experiments, this method does not work when training from scratch with the EDL loss. To start pretraining, run

```
python pretraining_50.py
```

This will save a model named `ce.pth`, trained for 50 epochs with Cross Entropy loss.
To start training on `ce.pth` with the custom loss function, i.e. `EDLLoss()`, for 500 epochs with lr = 2e-2, run

```
python train_EDL500.py
```

This will save a model named `unc500.pth`.

To continue training `unc500.pth` with `EDLLoss()` for another 200 epochs with lr = 2e-5, run

```
python resume500_700.py
```

This will save a model named `unc700.pth`.

To continue training `unc700.pth` with `EDLLoss()` for another 200 epochs with lr = 1e-2, run

```
python resume700_900.py
```

This will save a model named `unc900.pth`.
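The three training stages above follow the same resume pattern: load the checkpoint from the previous stage, then train with a stage-specific learning rate. A minimal sketch of that plumbing (the function name and the SGD choice are my assumptions; the actual scripts may use a different optimizer):

```python
import torch
import torch.nn as nn

def resume_stage(model, ckpt_path, lr):
    """Load the weights saved by the previous stage and return a fresh
    optimizer with this stage's learning rate. Only the checkpoint and
    optimizer plumbing is sketched; the training loop itself is omitted."""
    model.load_state_dict(torch.load(ckpt_path))
    return torch.optim.SGD(model.parameters(), lr=lr)
```

For example, the `resume700_900.py` stage corresponds to something like `resume_stage(model, "unc700.pth", lr=1e-2)` followed by 200 epochs of training with `EDLLoss()` and a final `torch.save(model.state_dict(), "unc900.pth")`.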








