
llm-client

Here are 19 public repositories matching this topic...

FlexiProxy is a service proxy that exposes an OpenAI-compatible API in front of different backend providers. It lets users connect alternative backend services to existing LLM clients, addressing cases where a client's expected large-language-model backend is expensive or unavailable in a given region.

  • Updated Sep 29, 2025
  • TypeScript
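The key idea behind an OpenAI-compatible proxy is that the request shape stays fixed while only the base URL changes, so existing clients keep working against a different backend. Below is a minimal, hypothetical sketch of a client-side request builder; the base URL, model name, and API-key placeholder are illustrative assumptions, not FlexiProxy specifics:

```typescript
// A standard OpenAI-style chat-completion payload. An OpenAI-compatible
// proxy accepts exactly this shape and forwards it to the real backend.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
}

// baseUrl is a placeholder for wherever the proxy is deployed; the
// path `/v1/chat/completions` is the standard OpenAI endpoint.
function buildChatRequest(
  baseUrl: string,
  model: string,
  messages: ChatMessage[]
): { url: string; init: RequestInit } {
  const body: ChatRequest = { model, messages };
  return {
    url: `${baseUrl}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // The proxy would map this key to the backend's real credentials.
        Authorization: "Bearer <proxy-api-key>",
      },
      body: JSON.stringify(body),
    },
  };
}

const req = buildChatRequest("https://proxy.example.com", "some-backend-model", [
  { role: "user", content: "Hello" },
]);
// The request could then be sent with fetch(req.url, req.init).
```

Because the payload matches the OpenAI schema, switching a client from the upstream provider to the proxy is just a base-URL change.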
