
Add OpenAiChatModel stream observability #1191

Closed

Conversation

chemicL (Member) commented Aug 8, 2024

Integrated Micrometer's Observation into the OpenAiChatModel#stream reactive chain.

Included changes:

  • Added the ability to aggregate streaming responses for use in Observation metadata (see the sketch after this list).
  • Improved error handling and logging for chat response processing.
  • Updated unit tests to include the new observation logic and to subscribe to the Flux responses.
  • Refined validation of observations in both normal and streaming chat operations.
  • Disabled retry for streaming, which relied on RetryTemplate; the .retryWhen operator should be introduced as the next step (a sketch follows the description below).
  • Added an integration test.
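
In broad strokes, the change works as in the sketch below: an Observation is started before the stream begins, streamed chunks are aggregated so the final response can enrich the observation, and the observation is stopped only when the Flux terminates. This is a minimal illustrative sketch, not the PR's actual code; the class, method, and observation names (StreamObservationSketch, observedStream, "chat.model.stream") are assumed placeholders.

```java
import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;
import reactor.core.publisher.Flux;

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only; names are placeholders, not the PR's actual code.
class StreamObservationSketch {

    Flux<String> observedStream(Flux<String> responseChunks, ObservationRegistry registry) {
        return Flux.defer(() -> {
            // Start an observation that covers the whole streamed exchange.
            Observation observation = Observation.createNotStarted("chat.model.stream", registry);
            observation.start();

            List<String> aggregated = new ArrayList<>();
            return responseChunks
                    // Collect chunks so the aggregated response can enrich the observation.
                    .doOnNext(aggregated::add)
                    .doOnComplete(() -> observation.highCardinalityKeyValue(
                            "chat.response", String.join("", aggregated)))
                    // Record errors on the observation instead of swallowing them.
                    .doOnError(observation::error)
                    // Stop the observation on complete, error, or cancellation.
                    .doFinally(signal -> observation.stop());
        });
    }
}
```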

This PR is a joint effort with @tzolov.

Resolves #1190
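
On the retry follow-up mentioned in the description: moving the streaming path off RetryTemplate would presumably mean applying Reactor's retryWhen operator to the response Flux. A minimal sketch under assumed settings (three attempts, one-second initial backoff, a placeholder transient-error predicate), not a definitive implementation:

```java
import java.time.Duration;

import reactor.core.publisher.Flux;
import reactor.util.retry.Retry;

// Hypothetical follow-up sketch: retry the streaming call with Reactor instead of RetryTemplate.
class StreamRetrySketch {

    Flux<String> streamWithRetry(Flux<String> responseChunks) {
        return responseChunks
                // Exponential backoff: up to 3 retries, starting at 1 second between attempts.
                .retryWhen(Retry.backoff(3, Duration.ofSeconds(1))
                        // Only retry errors that are plausibly transient; placeholder predicate.
                        .filter(ex -> ex instanceof RuntimeException));
    }
}
```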

chemicL and others added 5 commits August 8, 2024 11:35
- Integrated Micrometer's Observation for tracing in the OpenAiChatModel#stream method.
- Enhanced aggregation of streaming responses and handling of their metadata.
- Improved error handling and logging for chat response processing.
- Updated unit tests to include the new observation logic and subscribe to Flux responses (see the test sketch below).
- Refined validation of observations in both normal and streaming chat operations.
- Updated test configurations and removed unnecessary imports.
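
Because the observation is only stopped when the stream terminates, the updated tests need to actually subscribe to the returned Flux rather than just build it. A hedged sketch of that pattern using Reactor's StepVerifier; the values and class name are placeholders rather than the PR's test code:

```java
import reactor.core.publisher.Flux;
import reactor.test.StepVerifier;

// Illustrative test sketch; not the actual Spring AI test code.
class StreamObservationTestSketch {

    void streamEmitsChunksAndCompletes() {
        Flux<String> chunks = Flux.just("Hello", " ", "world");

        // Subscribing via StepVerifier drives the stream to completion,
        // which is what allows the observation to be stopped and then asserted on.
        StepVerifier.create(chunks)
                .expectNext("Hello", " ", "world")
                .verifyComplete();

        // After completion, an ObservationRegistry-based assertion (e.g. via
        // micrometer-observation-test's TestObservationRegistry) could verify
        // that the observation was started and stopped; omitted here.
    }
}
```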

tzolov (Contributor) commented Aug 8, 2024

Thank you @chemicL
Rebased, squashed and merged at 478f180

tzolov closed this Aug 8, 2024
Linked issue: OpenAI Streaming Observability support (#1190)