Expose prompt and parser separately #1403

Open
aaronvg opened this issue Feb 1, 2025 · 2 comments
Labels
good first issue Good for newcomers

Comments

@aaronvg
Contributor

aaronvg commented Feb 1, 2025

Some folks want a more modular approach to using BAML, where they can modify the API request themselves.

This issue tracks exposing the API request BAML makes for a given function, as well as the parser.

@hellovai
Contributor

hellovai commented Feb 1, 2025

Specifically, we could expose this:

# Get the raw request BAML would have fired.
prompt = b.prompt.SomeFunction(...)

# Call the LLM yourself, however you like.
llm_response = call_llm_yourself(prompt)

# Parse the response into the function's return type using SAP.
result = b.parse.SomeFunction(llm_response=llm_response)

# Or, for streaming, parse into the partial type:
partial_result = b.stream.parse.SomeFunction(llm_response=llm_response)

This would also enable support for the Realtime API and Batch APIs in the meantime while we wait (or any other providers as well).
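
For context, a hypothetical end-to-end sketch of the "call the LLM yourself" step using the OpenAI Python client, assuming the proposed b.prompt / b.parse methods above and assuming the exposed request object carries a model name and a chat-style message list (neither shape is settled by this issue):

from openai import OpenAI

client = OpenAI()

def call_llm_yourself(request) -> str:
    # Send the BAML-rendered prompt through your own client, so the
    # request can be modified (headers, retries, batching) before it
    # goes out.
    completion = client.chat.completions.create(
        model=request.model,        # assumed field on the exposed request
        messages=request.messages,  # assumed field on the exposed request
    )
    return completion.choices[0].message.content

llm_response = call_llm_yourself(b.prompt.SomeFunction(...))
result = b.parse.SomeFunction(llm_response=llm_response)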

@hellovai hellovai added the good first issue Good for newcomers label Feb 1, 2025
@PGCodehub

Yes, this would be amazing; it would make the library more modular.
