TIC-TAC

TIC-TAC: A Framework For Improved Covariance Estimation In Deep Heteroscedastic Regression


Code repository for "TIC-TAC: A Framework For Improved Covariance Estimation In Deep Heteroscedastic Regression". We address the problem of sub-optimal covariance estimation in deep heteroscedastic regression by proposing a new parameterisation (TIC) and metric (TAC). We derive a new expression, the Taylor Induced Covariance (TIC), which expresses the randomness of the prediction through its gradient and curvature. The Task Agnostic Correlations (TAC) metric leverages the conditioning property of the normal distribution to evaluate the covariance quantitatively.

Table of contents

  1. Installation: Docker (recommended) or PIP
  2. Organization
  3. Code Execution
  4. Acknowledgement
  5. Citation

Installation: Docker (recommended) or PIP

Docker: We provide a Docker image pre-installed with all required packages, and we recommend using it to ensure reproducibility of our results. This requires Docker to be set up on the host (see the official installation guide for Ubuntu). Once Docker is installed, use the provided `docker-compose.yaml` file to start the environment with: `docker-compose run --rm tictac`
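For orientation, a compose file backing this workflow typically has the following shape. This is a hypothetical sketch, not the repository's actual `docker-compose.yaml`: the service name matches the run command above, but the image tag and mount path are placeholders.

```yaml
# Hypothetical sketch only -- image name and mount path are assumptions.
services:
  tictac:
    image: tictac:latest      # placeholder image tag
    volumes:
      - .:/workspace          # mount the repository into the container
    working_dir: /workspace
```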

PIP: In case using Docker is not possible, we provide a `requirements.txt` file listing all the packages, which can be installed with pip. We recommend setting up a new virtual environment and installing the packages using: `pip install -r requirements.txt`
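As a sketch, a typical pip-based setup from a fresh clone might look like this. The environment name `.venv` is a choice, not prescribed by the repository:

```shell
# Create an isolated environment for the project (name is arbitrary).
python3 -m venv .venv
# Activate it (POSIX shells; on Windows use .venv\Scripts\activate).
. .venv/bin/activate
# Install the pinned dependencies if the requirements file is present.
if [ -f requirements.txt ]; then
  pip install -r requirements.txt
fi
```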

Organization

The repository contains four main folders corresponding to the four experiments: `Univariate`, `Multivariate`, `UCI` and `HumanPose`. While the first three are self-contained, `HumanPose` requires downloading the images of the MPII Dataset, LSP Dataset and LSPET Dataset. For each of these datasets, copy all the `*.jpg` images into `HumanPose/data/{mpii OR lsp OR lspet}/images/`. Within `HumanPose`, a separate folder `cached` holds the generated file `mpii_cache_imgs_{True/False}.npy`, which stores the post-processed MPII dataset so it does not need to be rebuilt on every run. Running `python main.py` in the code folder executes the code, with configurations specified in `configuration.yml`
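The expected image layout can be prepared as follows. This is a sketch: the loop creates the directories named above, and the commented `cp` line illustrates the copy step with a placeholder source path (the actual extraction paths depend on how you downloaded each dataset):

```shell
# Create the image directories the HumanPose experiment expects.
for d in mpii lsp lspet; do
  mkdir -p "HumanPose/data/$d/images"
done
# Then copy each dataset's images in, e.g. (source path is a placeholder):
# cp /path/to/mpii_extracted/images/*.jpg HumanPose/data/mpii/images/
```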

Code Execution

We first need to activate the environment. For Docker, start the container with `docker-compose run --rm tictac`, which loads our image containing all the pre-installed packages. Alternatively, activate the virtual environment containing the packages installed via pip.

The main files to run the experiments are `univariate.py`, `multivariate.py`, `uci.py` and `main.py`, located in the `Univariate`, `Multivariate`, `UCI` and `HumanPose` directories respectively. Each experiment can be run using `python <filename>.py`. The configurations for the Univariate, Multivariate and UCI experiments are defined in their respective main files, whereas the configuration for the human pose experiments is stored in `configuration.yml` in the `HumanPose` directory. The results of the univariate sinusoidal experiments are saved as PDF files for each epoch, the multivariate results as `TAC.pdf`, the UCI results as `output.txt` for each dataset, and the human pose results in `output_*.txt`.
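To make the directory-to-entry-point mapping explicit, the loop below simply prints the assumed invocation for each experiment (run each command from the repository root); it does not execute anything itself:

```shell
# Print the run command for each experiment. The directory:script pairs
# mirror the list above; nothing here runs the experiments themselves.
for pair in Univariate:univariate.py Multivariate:multivariate.py \
            UCI:uci.py HumanPose:main.py; do
  dir=${pair%%:*}; script=${pair#*:}
  echo "cd $dir && python $script"
done
```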

Stopping a container once the code execution is complete can be done using:

  1. `docker ps`: List running containers
  2. `docker stop <container id>`: Stop the container by its ID

Acknowledgement

We thank https://github.com/jaehyunnn/ViTPose_pytorch for their easily customizable implementation of ViTPose. We also borrow code from the Active Learning for Human Pose library.

Citation

If you find this work useful, please consider starring this repository and citing this work!

@InProceedings{shukla2024tictac,
  title = {TIC-TAC: A Framework for Improved Covariance Estimation in Deep Heteroscedastic Regression},
  author = {Shukla, Megh and Salzmann, Mathieu and Alahi, Alexandre},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning (ICML)},
  year = {2024},
  series = {Proceedings of Machine Learning Research},
  month = {21--27 Jul},
  publisher = {PMLR}
}