
Conversation

@Sameerlite (Collaborator)

Title

Implemented native XAI Responses API support in LiteLLM

Relevant issues

Fixes LIT-1442

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement; see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🆕 New Feature

Changes

  1. Created XAI Responses API Configuration Class
  2. Registered XAI Provider Config
  3. Updated Exports
  4. Added tests

Reference: https://docs.x.ai/docs/guides/responses-api

Non-Streaming
[Screenshot: 2025-11-06 at 3:17:47 PM]
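
For context, a minimal non-streaming usage sketch (not taken from this PR; the model id `xai/grok-4` and the API-key setup are assumptions used only to illustrate the call shape) of how a Responses API request might be routed to xAI through LiteLLM:

```python
# Minimal sketch, not from the PR: model id and API-key handling are assumptions.
import os
import litellm

os.environ["XAI_API_KEY"] = "your-xai-api-key"  # placeholder credential

# The "xai/" prefix routes the request to the xAI Responses API provider config.
response = litellm.responses(
    model="xai/grok-4",
    input="Summarize the Responses API in one sentence.",
)

print(response)
```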

Streaming
[Screenshot: 2025-11-06 at 3:19:33 PM]
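
A streaming counterpart under the same assumptions; since event payloads are provider-dependent, the sketch simply prints each event as it arrives:

```python
# Minimal streaming sketch; model id and event handling are assumptions.
import litellm

stream = litellm.responses(
    model="xai/grok-4",
    input="Write a short haiku about rockets.",
    stream=True,
)

# LiteLLM yields Responses API events as they arrive from xAI.
for event in stream:
    print(event)
```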


@Sameerlite marked this pull request as ready for review on November 8, 2025 at 07:34
@ishaan-jaff merged commit 6fb0a8f into main on Nov 8, 2025
48 of 53 checks passed