[Feature Request] Use llama.cpp directly on the lxc without docker #10

@eikaramba

Description

Describe the feature

I adapted the great script from https://github.com/kyuz0/amd-strix-halo-toolboxes/blob/main/toolboxes/Dockerfile.rocm-7.2 (I used the 7.1.1 script, which has since been removed) to compile and use llama.cpp directly in the LXC without Docker. If anyone is interested, here it is:
https://github.com/eikaramba/proxmox-setup-scripts/blob/main/lxc/install-rocm-llamacpp-in-lxc.sh
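For anyone who wants the gist without reading the full script, the core build steps look roughly like this. This is a minimal sketch, not a verbatim excerpt: the `gfx1151` target (Strix Halo) and the assumption that ROCm is already installed and the GPU is passed through to the LXC are mine; the `HIPCXX`/`HIP_PATH` invocation follows llama.cpp's documented HIP build instructions.

```shell
# Sketch: build llama.cpp with the HIP/ROCm backend directly in an LXC.
# Assumes ROCm is installed in the container and /dev/kfd + /dev/dri
# are passed through from the Proxmox host.

git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# gfx1151 is the Strix Halo GPU target; adjust for your hardware.
HIPCXX="$(hipconfig -l)/clang" HIP_PATH="$(hipconfig -R)" \
cmake -B build \
    -DGGML_HIP=ON \
    -DAMDGPU_TARGETS=gfx1151 \
    -DCMAKE_BUILD_TYPE=Release

cmake --build build --config Release -j"$(nproc)"

# Quick smoke test: serve a model (path is a placeholder) and try a
# simple "hello world" prompt against the built-in web UI / API.
./build/bin/llama-server -m /path/to/model.gguf --port 8080
```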

Maybe this is something of interest that could be adopted :) Feel free to close this "issue" if the repo should stick to the bare minimum and not include application-specific scripts.

I'll see if I can update it soon, but I wanted to say it works, at least for super simple "hello world" messages. I'm still waiting for Proxmox kernels newer than 6.17 before doing more testing with ROCm 7.2 (see the compatibility chart here).

Additional information

  • Would you be willing to help implement this feature?

Final checks

  • I understand that this is a feature request. I also understand that if I use this to report a bug, my issue might be closed unanswered.

Metadata

Assignees

No one assigned

    Labels

    enhancement (New feature or request), help wanted (Extra attention is needed)
