Issues: eth-sri/lmql

Unload model from GPU
#362 opened Sep 16, 2024 by miqaP
[Feature Request] Group chat in lmql
#361 opened Aug 29, 2024 by whpy
Starting out support for lmql with conda on wsl
#355 opened Jun 8, 2024 by Gayanukaa (2 tasks)
Support for GPT-4o
#354 opened May 17, 2024 by mainpyp
Support for Ollama
#353 opened May 17, 2024 by fjfricke
Difficulty getting started!
#352 opened May 4, 2024 by mcchung52
Docs for cache behaviour
#342 opened Mar 16, 2024 by benwhalley
Use external files and avoid quoting? [question]
#341 opened Mar 14, 2024 by benwhalley
Add a normal llama.cpp server endpoint option. [enhancement]
#338 opened Mar 7, 2024 by lastrosade
Possible error with nests of nested queries in Python [bug]
#329 opened Feb 23, 2024 by mlinegar
[BUG] sub-queries fail with temperature=0 [bug]
#328 opened Feb 23, 2024 by gamendez98
Quotes aren't always parsed properly [bug, good first issue]
#324 opened Feb 16, 2024 by nopepper