Expected Behavior
Essentially, title.
Many modern UI frameworks for LLMs standardize on what is called an "OpenAI-compliant API" response format: you can use any combination of LLM and server-side framework you like, as long as the responses adhere to the OpenAI format.
Current Behavior
The ChatResponse class is not directly compatible with the OpenAI response format. That is understandable, of course, since Spring AI aims to be a generic framework that abstracts over many different providers behind a unified API.
The cost of writing a small "wrapper record" is negligible and it works okay, but is there some function or method somewhere in the framework that we can call to make the responses OpenAI-compliant?
Context
Essentially, many clients standardize on OpenAI-compliant formats, which means some of Spring AI's output needs to be adapted. It could make sense to offer this out of the box.
Example of a wrapper record for streaming "OpenAI chunks":
import com.fasterxml.jackson.annotation.JsonProperty;

import java.util.List;

public class OpenAIStreamingResponse {

    public record ChatCompletionChunk(
            String id,
            String object,
            long created,
            String model,
            @JsonProperty("system_fingerprint") String systemFingerprint,
            List<ChunkChoice> choices) {}

    public record ChunkChoice(
            int index, Delta delta, Object logprobs, @JsonProperty("finish_reason") String finishReason) {}

    public record Delta(String role, String content) {}
}
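For reference, the OpenAI streaming endpoint delivers chunks like the ones above as server-sent-event frames ("data: {json}" lines terminated by "data: [DONE]"). A minimal sketch of that wire format, building the JSON by hand for illustration (a real implementation would serialize the records above with Jackson; the id, model, and timestamp values here are made up):

```java
import java.util.List;
import java.util.stream.Collectors;

public class SseChunkDemo {

    // Wraps one JSON payload as a server-sent-event frame, the way the
    // OpenAI streaming endpoint does: "data: {json}\n\n".
    static String sseFrame(String json) {
        return "data: " + json + "\n\n";
    }

    // Builds the minimal JSON body of one streaming chunk by hand.
    // A null deltaContent produces an empty delta (as in the final chunk);
    // a null finishReason is emitted as JSON null (as in intermediate chunks).
    static String chunkJson(String id, String model, long created,
                            String deltaContent, String finishReason) {
        String delta = deltaContent == null ? "{}" : "{\"content\":\"" + deltaContent + "\"}";
        String finish = finishReason == null ? "null" : "\"" + finishReason + "\"";
        return "{\"id\":\"" + id + "\",\"object\":\"chat.completion.chunk\""
                + ",\"created\":" + created + ",\"model\":\"" + model + "\""
                + ",\"choices\":[{\"index\":0,\"delta\":" + delta
                + ",\"finish_reason\":" + finish + "}]}";
    }

    public static void main(String[] args) {
        long created = 1700000000L; // fixed timestamp for a reproducible example
        // Two content deltas, a final chunk with finish_reason "stop", then the sentinel.
        List<String> frames = List.of(
                sseFrame(chunkJson("chatcmpl-1", "my-model", created, "Hel", null)),
                sseFrame(chunkJson("chatcmpl-1", "my-model", created, "lo", null)),
                sseFrame(chunkJson("chatcmpl-1", "my-model", created, null, "stop")),
                sseFrame("[DONE]"));
        System.out.print(frames.stream().collect(Collectors.joining()));
    }
}
```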
@tzolov @markpollack Do you feel this is something that would be valuable to have in the framework?
Essentially, having a parameter on the stream() or call() method that would make it return an OpenAICompliantChatResponse.
As described above, the main reason for this request is that many "out of the box" front-end solutions for LLMs have standardized on the OpenAI format specification, which forces a small amount of extra adapter code on the server side.
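To illustrate the extra code in question, here is a hypothetical adapter sketch. The GenericChatDelta input type is a stand-in for whatever streaming payload the framework emits, not a real Spring AI class, and the nested records mirror the wrapper records from the issue body:

```java
import java.util.List;

public class OpenAiAdapterSketch {

    // Stand-in for a framework-agnostic streaming delta; NOT a Spring AI type.
    record GenericChatDelta(String role, String content, String finishReason) {}

    // Mirror the wrapper records from the issue body (Jackson annotations omitted).
    record Delta(String role, String content) {}
    record ChunkChoice(int index, Delta delta, Object logprobs, String finishReason) {}
    record ChatCompletionChunk(String id, String object, long created, String model,
                               String systemFingerprint, List<ChunkChoice> choices) {}

    // Maps one generic delta onto one OpenAI-style chunk. The "object" field is
    // fixed to "chat.completion.chunk", as in OpenAI's streaming responses.
    static ChatCompletionChunk toOpenAiChunk(String id, String model, long created,
                                             GenericChatDelta in) {
        ChunkChoice choice = new ChunkChoice(
                0, new Delta(in.role(), in.content()), null, in.finishReason());
        return new ChatCompletionChunk(
                id, "chat.completion.chunk", created, model, null, List.of(choice));
    }

    public static void main(String[] args) {
        ChatCompletionChunk chunk = toOpenAiChunk("chatcmpl-1", "my-model", 1700000000L,
                new GenericChatDelta("assistant", "Hello", null));
        System.out.println(chunk.choices().get(0).delta().content()); // prints "Hello"
    }
}
```

Built into the framework, a method like this could run once per element of the streaming response, so callers never see the provider-specific shape.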