Conversion for OpenAI-compliant outputs in Spring AI #1379

Open
bruno-oliveira opened this issue Sep 19, 2024 · 1 comment
Labels
enhancement New feature or request openai

Comments

@bruno-oliveira
Contributor

Expected Behavior

Essentially, as the title says: Spring AI should offer a built-in way to convert its responses into the OpenAI-compliant response format.

Many modern UI frameworks for LLMs standardize on what's called an "OpenAI-compliant API" response format: you can use any combination of LLM and server-side framework you want, as long as the responses adhere to the OpenAI format.

Current Behavior

The ChatResponse class is not directly compatible with the OpenAI format. That is understandable, since Spring AI aims to be a generic framework that abstracts over many different providers with a unified API.

The cost of writing a small "wrapper record" is negligible and it works well enough, but is there a function or method somewhere in the framework that can make the responses OpenAI-compliant?

Context

Essentially, many clients standardize on OpenAI-compliant formats, which means some of Spring AI's output needs to be adapted. It could make sense to offer this out of the box.
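For reference, this is roughly what a single streaming chunk looks like on the wire in the OpenAI format (all values here are illustrative, not taken from a real response):

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion.chunk",
  "created": 1726700000,
  "model": "gpt-4o",
  "system_fingerprint": "fp_example",
  "choices": [
    {
      "index": 0,
      "delta": { "role": "assistant", "content": "Hello" },
      "logprobs": null,
      "finish_reason": null
    }
  ]
}
```

The wrapper record below mirrors this shape field by field.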

Example of a wrapper record for streaming "OpenAI chunks":

import java.util.List;

import com.fasterxml.jackson.annotation.JsonProperty;

public class OpenAIStreamingResponse {
    public record ChatCompletionChunk(
            String id,
            String object,
            long created,
            String model,
            @JsonProperty("system_fingerprint") String systemFingerprint,
            List<ChunkChoice> choices) {}

    public record ChunkChoice(
            int index, Delta delta, Object logprobs, @JsonProperty("finish_reason") String finishReason) {}

    public record Delta(String role, String content) {}
}
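To illustrate what the adapter side of this could look like, here is a minimal self-contained sketch. It assumes nothing about Spring AI's internal API; the `OpenAIChunkAdapter` class and its `fromContent` helper are hypothetical names, and the records are redeclared locally (without Jackson annotations) just to keep the example runnable on its own:

```java
import java.time.Instant;
import java.util.List;
import java.util.UUID;

// Hypothetical adapter: wraps a plain content delta from any provider
// into the OpenAI "chat.completion.chunk" shape shown above.
class OpenAIChunkAdapter {
    record Delta(String role, String content) {}
    record ChunkChoice(int index, Delta delta, Object logprobs, String finishReason) {}
    record ChatCompletionChunk(
            String id, String object, long created, String model,
            String systemFingerprint, List<ChunkChoice> choices) {}

    // Builds one streaming chunk for a single content delta.
    // finishReason is null for intermediate chunks, e.g. "stop" for the last one.
    static ChatCompletionChunk fromContent(String model, String content, String finishReason) {
        return new ChatCompletionChunk(
                "chatcmpl-" + UUID.randomUUID(),       // synthetic id, OpenAI-style prefix
                "chat.completion.chunk",                // fixed object type for streaming
                Instant.now().getEpochSecond(),         // unix timestamp
                model,
                null,                                   // system_fingerprint not available here
                List.of(new ChunkChoice(0, new Delta("assistant", content), null, finishReason)));
    }
}
```

In a real integration one would call a helper like this inside a `Flux#map` over the streamed responses, serializing each chunk with Jackson before writing it to the client.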
@bruno-oliveira
Contributor Author

@tzolov @markpollack Do you feel like this is something that would be valuable to have in the framework?
Essentially, having a parameter on the stream() or call() methods that makes them return an OpenAICompliantChatResponse.

For instance:

Flux<OpenAICompliantChatResponse> response = chatModel.prompt("Example").withOpenAICompliantFormat();

Or a sort of similar construct.

As described above, the main reason for this request is that many "out of the box" front-end solutions for LLMs have standardized on the OpenAI format specification, which currently requires a small amount of extra adapter code to work around.

@asaikali asaikali added openai enhancement New feature or request labels Oct 30, 2024