Reproducibility Study on Adversarial Attacks against Robust Transformer Trackers

Project webpage

OpenReview

This repository contains the code for the TMLR 2024 paper "Reproducibility Study on Adversarial Attacks against Robust Transformer Trackers". It includes the code for three experiments on the adversarial robustness of transformer trackers. Links to the trackers, attack methods, and datasets are listed below:

Transformer trackers:

Adversarial attacks:

Datasets:

Docker Image:

We provide a Docker image that includes all the packages needed to run the experiments. To build the Docker image, run:

docker build . -f mixformer24.base -t mixformer24

To mount the local directory into the Docker container, run:

nvidia-docker run -it --rm --user root --mount type=bind,source="$(pwd)",target=/mnt mixformer24:latest

To run the code, first export the required paths and then use the following sample command:

conda run -n mixformer24 /bin/bash -c "vot evaluate TransT"
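
The exact environment variables to export depend on the tracker and attack under evaluation. The sketch below is illustrative only: the variable names and the /mnt mount point are assumptions based on the Docker command above, not documented settings of this repository.

```bash
# Minimal sketch (assumptions): the repository is mounted at /mnt as in the
# nvidia-docker command above, and PYTHONPATH must include it so the tracker
# and attack modules can be imported. Adjust names and paths to your setup.
export REPO_ROOT=/mnt
export PYTHONPATH=$REPO_ROOT:$PYTHONPATH

# Run the VOT evaluation for the TransT tracker inside the conda environment.
conda run -n mixformer24 /bin/bash -c "vot evaluate TransT"
```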

BibTeX Record:

Please cite our paper as follows:

@article{nokabadi2024reproducibility,
  title={Reproducibility Study on Adversarial Attacks Against Robust Transformer Trackers},
  author={Fatemeh Nourilenjan Nokabadi and Jean-Francois Lalonde and Christian Gagn{\'e}},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=FEEKR0Vl9s},
  note={Reproducibility Certification}
}

Contact:

Fatemeh Nokabadi