`make install` insufficient for running llama3-8B-Instruct #484
Labels: documentation
System Info

`lorax-launcher --env` output:
`cargo version` output:
Model being used: meta-llama/Meta-Llama-3-70B-Instruct
GPUs: 8× A100 on CoreWeave (can't get more details, since I accidentally broke the NVIDIA driver).
CUDA: 12.2, I believe.
Reproduction
```shell
make install
lorax-launcher --model-id meta-llama/Meta-Llama-3-70B-Instruct --port 8080
```
The initial failure reports that the module `dropout_layer_norm` can't be found. From reading the Docker instructions, I believe the full installation requires more than just `make install` (roughly the sequence of install targets used in the Dockerfile). However, when doing this, the `install-vllm` step ran into an issue: it expected `torch==2.2.1`, whereas `make install` actually runs `pip install torch==2.2.0`, which breaks the vllm step.

Expected behavior
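To illustrate the kind of fix this implies, here is a minimal Makefile sketch. It is hypothetical: the target names `install` and `install-vllm` mirror the ones mentioned above, but the variable and recipe bodies are assumptions, not lorax's actual Makefile.

```makefile
# Hypothetical sketch: define the torch pin once, so that `install` and
# `install-vllm` cannot drift apart (2.2.0 vs 2.2.1 in this report).
TORCH_VERSION := 2.2.1

install:
	pip install torch==$(TORCH_VERSION)

install-vllm: install
	# Build/install vllm against the same torch pin it expects.
	pip install vllm
```

The point of the sketch is the shared `TORCH_VERSION` variable: with a single pin, any target that installs or builds against torch uses the same version by construction.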
The following steps work successfully:

1. `make install`
2. `lorax-launcher --model-id meta-llama/Meta-Llama-3-70B-Instruct --port 8080`
Alternatively, step 2 could be something like `make install-comprehensive`, to include the full vllm and flash-attention set of dependencies.
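The suggested aggregate target could be sketched like this (the prerequisite target names are assumptions based on the steps discussed above, not lorax's actual Makefile):

```makefile
# Hypothetical: one target that pulls in the base install plus the
# extra dependency builds needed for models like Llama-3.
install-comprehensive: install install-flash-attention install-vllm
```

With a target like this, the two-step flow in "Expected behavior" stays simple while still covering the optional CUDA-kernel dependencies.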