Description
After installing the environment following the instructions in README.md, I get the error:

```
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for: PyTorch 2.6.0+cu124 with CUDA 1204 (you have 2.6.0+cu126) Python 3.10.16 (you have 3.10.16) Please reinstall xformers (see https://github.com/facebookresearch/xformers#installing-xformers) Memory-efficient attention, SwiGLU, sparse and more won't be available. Set XFORMERS_MORE_DETAILS=1 for more details
```
I installed the CUDA toolkit using

```
conda install nvidia/label/cuda-12.6.2::cuda-toolkit
```

assuming CUDA 12.6 was used, based on the line

```
pip install torch==2.6.0 torchvision==0.21.0 --index-url https://download.pytorch.org/whl/cu126
```

in README.md.
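For reference, as far as I can tell the mismatch is only in the CUDA build tag, not the PyTorch version itself. A minimal sketch of the comparison xformers seems to be complaining about (the version strings are hard-coded from the warning above, not queried live):

```python
# Sketch: compare the CUDA build tag ("cuXYZ") of two PyTorch version strings.
# The strings below are copied verbatim from the xformers warning.
def cuda_tag(version: str) -> str:
    """Return the local build tag after '+', e.g. '2.6.0+cu124' -> 'cu124'."""
    _, _, tag = version.partition("+")
    return tag

built_for = "2.6.0+cu124"   # what the installed xformers wheel was built against
installed = "2.6.0+cu126"   # what pip installed from the cu126 index

print(cuda_tag(built_for), cuda_tag(installed))    # cu124 cu126
print(cuda_tag(built_for) == cuda_tag(installed))  # False -> xformers disables its C++/CUDA extensions
```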
Did you ignore this error and not use xformers in your experiments?
If so, how much VRAM is required per GPU for your experiments? After the xformers warning, training terminates with an OOM error (I am using 24 GB VRAM GPUs).
If you did use xformers, I would appreciate further help with the package installation.
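In case it is useful, this is the reinstall I was considering, following the warning's own suggestion: pull xformers from the same cu126 index used for torch, so the build tags match. I have not confirmed that this index hosts an xformers wheel compatible with torch 2.6.0, so treat it as a sketch:

```shell
# Hedged sketch: reinstall xformers from the cu126 PyTorch wheel index,
# so its CUDA build tag matches the installed torch 2.6.0+cu126.
# (Assumes a compatible xformers wheel exists on this index.)
pip uninstall -y xformers
pip install xformers --index-url https://download.pytorch.org/whl/cu126
```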