[TMLR 2024] Reproducibility study on adversarial attacks against robust transformer trackers

Reproducibility Study on Adversarial Attacks against Robust Transformer Trackers

Project webpage

OpenReview

This repository contains the code for the TMLR 2024 paper "Reproducibility Study on Adversarial Attacks against Robust Transformer Trackers". It includes three experiments on the adversarial robustness of transformer trackers, along with the code to reproduce them. Links to the trackers, attack methods, and datasets are listed below:

Transformer trackers:

Adversarial attacks:

Datasets:

Docker Image:

We provide a Docker image that includes all the packages needed to run the experiments. To build the Docker image, run:

docker build . -f mixformer24.base -t mixformer24

To mount the local directory into the Docker container, run:

nvidia-docker run -it --rm --user root --mount type=bind,source="$(pwd)",target=/mnt mixformer24:latest

To run the code, first export the required paths, then use the following example:

conda run -n mixformer24 /bin/bash -c "vot evaluate TransT"
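The build, mount, and evaluation steps above can be collected into one helper script. A minimal sketch, assuming the Dockerfile `mixformer24.base` sits in the repository root and a VOT workspace has already been initialised in the current directory (the guard line and the combined single-command form are our additions, not part of the original instructions):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Exit quietly on machines without nvidia-docker installed.
command -v nvidia-docker >/dev/null || { echo "nvidia-docker not found; skipping"; exit 0; }

# 1. Build the image from the provided Dockerfile.
docker build . -f mixformer24.base -t mixformer24

# 2. Run the container with the current directory mounted at /mnt,
#    evaluating TransT inside the mixformer24 conda environment.
nvidia-docker run -it --rm --user root \
  --mount type=bind,source="$(pwd)",target=/mnt \
  mixformer24:latest \
  conda run -n mixformer24 /bin/bash -c "vot evaluate TransT"
```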

BibTeX Record:

Please cite our paper as follows:

@article{nokabadi2024reproducibility,
  title={Reproducibility Study on Adversarial Attacks Against Robust Transformer Trackers},
  author={Fatemeh Nourilenjan Nokabadi and Jean-Francois Lalonde and Christian Gagn{\'e}},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=FEEKR0Vl9s},
  note={Reproducibility Certification}
}

Contact:

Fatemeh Nokabadi
