I am trying to run the model in Docker (Docker Desktop on Windows, via WSL2) with an RTX 4070 (12 GB). I always hit the error "torch.cuda.OutOfMemoryError: Allocation on device", and although the predictions report "succeeded", no output files are produced.
Is 16 GB of VRAM the minimum for this model?
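As a rough sanity check (the issue does not name the model or its parameter count, so the 7B figure below is purely a hypothetical example), you can estimate whether a model's weights alone fit in a 12 GB card, using the common rule of thumb of 4 bytes per parameter in fp32 and 2 bytes in fp16:

```python
def weights_vram_gib(num_params: float, bytes_per_param: int) -> float:
    """Rough VRAM needed for model weights alone.

    Ignores activations, CUDA context, and framework overhead, which
    typically add at least a few extra GiB on top of this number.
    """
    return num_params * bytes_per_param / 1024**3

# Hypothetical 7B-parameter model (assumption, not from the issue):
fp32_gib = weights_vram_gib(7e9, 4)  # ~26 GiB: far too big for 12 GB
fp16_gib = weights_vram_gib(7e9, 2)  # ~13 GiB: still just over 12 GB
print(f"fp32: {fp32_gib:.1f} GiB, fp16: {fp16_gib:.1f} GiB")
```

If the weights alone exceed free VRAM even in half precision, an OOM on a 12 GB card is expected regardless of Docker/WSL2 settings, and options like 8-bit quantization or CPU offload would be needed.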