HazeSpace2M: A Dataset for Haze Aware Single Image Dehazing [Paper]
Md Tanvir Islam 1, Nasir Rahim 1, Saeed Anwar 2, Muhammad Saqib 3, Sambit Bakshi 4, Khan Muhammad 1, *
| 1. Sungkyunkwan University, South Korea | 2. KFUPM, KSA | 3. UTS, Australia | 4. NIT Rourkela, India || *Corresponding Author |
pip install -r requirements.txt
We are preparing the complete dataset with a structured naming convention. We will upload the full dataset as soon as all files are renamed to match the format shown in the last images on this page.
All pre-trained weights of the classifiers and specialized dehazers are available for download:
Google Drive: | Classifier | Specialized Dehazers |
python inference.py --gt_folder <path_to_gt> --hazy_folder <path_to_hazy> --output_dir <output_dir> --classifier <path_to_classifier> --cloudSD <path_to_cloudSD> --ehSD <path_to_ehSD> --fogSD <path_to_fogSD>
Note: Each variable is explained in the inference.py file.
To use your custom classifier, follow these steps:
- Implement your classifier architecture in the classifier.py file in the models folder.
- Instantiate your classifier in the classification_inference method inside the conditionalDehazing.py file under the models folder.
- Finally, set the path to your classifier's weights inside the inference.py file.
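As an illustration of the first step, the sketch below shows a minimal haze-type classifier that could live in models/classifier.py. The class name, layer sizes, and the three-way cloud/EH/fog class layout are assumptions for demonstration, not the architecture used in the paper.

```python
# Hypothetical sketch of a custom classifier for models/classifier.py.
# The architecture and names here are illustrative assumptions only.
import torch
import torch.nn as nn


class CustomHazeClassifier(nn.Module):
    """Classifies a hazy RGB image as cloud, environmental haze (EH), or fog."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (B, 16, 1, 1)
        )
        self.head = nn.Linear(16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x).flatten(1)  # (B, 16)
        return self.head(f)              # raw logits, shape (B, num_classes)


# Example: predict the haze type for a dummy batch of two 256x256 images
model = CustomHazeClassifier()
logits = model(torch.randn(2, 3, 256, 256))
pred = logits.argmax(dim=1)  # class indices in {0, 1, 2}
```

An object of this class would then be created in classification_inference, and its state dict loaded from the weight path you set in inference.py.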
To use your custom specialized dehazers, follow these steps:
- Implement your dehazer architecture in the dehazer.py file in the models folder.
- Instantiate your dehazer in the load_model method inside the helper.py file under the utils folder.
- Finally, set the path to your dehazer's weights inside the inference.py file.
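For the first step, a specialized dehazer is simply an image-to-image network. The sketch below shows a minimal residual dehazer that could be placed in models/dehazer.py; the class name and architecture are illustrative assumptions, not the dehazers shipped with this repository.

```python
# Hypothetical sketch of a custom specialized dehazer for models/dehazer.py.
# The residual design and layer sizes are illustrative assumptions only.
import torch
import torch.nn as nn


class CustomDehazer(nn.Module):
    """Maps a hazy RGB image to a dehazed RGB image of the same size."""

    def __init__(self, channels: int = 16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Predict a haze residual and add it back to the hazy input,
        # so the output keeps the input's spatial resolution.
        return x + self.body(x)


# Example: dehaze a dummy 128x128 image
model = CustomDehazer()
out = model(torch.randn(1, 3, 128, 128))
```

An object of this class would then be constructed in load_model (one instance per haze type, e.g. cloudSD, ehSD, fogSD), with its weight path supplied through inference.py.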
If you find our work useful in your research, please consider citing our paper:
@inproceedings{hazespace2m,
title={HazeSpace2M: A Dataset for Haze Aware Single Image Dehazing},
author={Islam, Md Tanvir and Rahim, Nasir and Anwar, Saeed and Saqib, Muhammad and Bakshi, Sambit and Muhammad, Khan},
booktitle={Proceedings of the 32nd ACM International Conference on Multimedia},
year={2024},
doi={10.1145/3664647.3681382}
}