Inference with TorchInferencer - Downloading weights of Backbone #947

tominator95 asked this question in Q&A · Unanswered · 0 replies
Hi,
I wrote a REST API for running inferences with different models. When loading the models onto the device (I tried it with PaDiM and PatchCore), the weights for the feature extractor are downloaded (via torchvision or the timm package). I am just wondering: aren't the weights already saved within the model in the `model.ckpt` after training? If so, why are they downloaded again when I initialize a new `TorchInferencer`? How can I prevent this? Background: my API should also run offline.

This is how I create the inferencer:
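(The original snippet was not preserved in this copy of the discussion. Below is a minimal sketch of the kind of call meant, assuming the anomalib 0.x API in which `TorchInferencer` takes the training config and a `model_source` checkpoint path; all paths are placeholders.)

```python
# Minimal sketch, not the author's original snippet: creating a
# TorchInferencer from a trained checkpoint with the anomalib 0.x API.
from anomalib.config import get_configurable_parameters
from anomalib.deploy import TorchInferencer

# Placeholder paths; point these at your own config and checkpoint.
config = get_configurable_parameters(config_path="anomalib/models/padim/config.yaml")
inferencer = TorchInferencer(
    config=config,
    model_source="results/padim/mvtec/bottle/weights/model.ckpt",
)

# Run a prediction on a single image file.
prediction = inferencer.predict(image="path/to/image.png")
```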