Best way to use an OpenAI-compatible embedding API #11809

@BeautyyuYanli @WinPooh32 @david1542 @juanluisrosaramos @vikrantdeshpande09876

Here's the non-LangChain method for using OpenAI-like APIs:

pip install llama-index-llms-openai-like llama-index-embeddings-openai
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.openai_like import OpenAILike

embed_model = OpenAIEmbedding(
  model="some model",  # use `model` instead of `model_name` -- janky I know
  api_base="...",
  api_key="fake",
  embed_batch_size=10,
)

llm = OpenAILike(
  model="my model",
  api_key="fake",
  api_base="...",
  # context window should match whatever LLM you are using
  context_window=32000,
  # specifies whether or not to use the chat completions endpoint
  is_chat_model=True,
)
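For context, here is a minimal sketch of the request/response shape that an "OpenAI-compatible" /v1/embeddings endpoint is expected to speak, which is what `OpenAIEmbedding` sends and parses under the hood. The model name, texts, and vector values below are placeholders for illustration, not real data from any server:

```python
import json

# A hypothetical request body for POST /v1/embeddings on an
# OpenAI-compatible server (model name is a placeholder).
request_body = {
    "model": "some-model",
    "input": ["hello world", "second text"],
}

# The response shape such a server is expected to return:
# one embedding object per input, index-aligned with the inputs.
response_body = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, 0.2]},
        {"object": "embedding", "index": 1, "embedding": [0.3, 0.4]},
    ],
    "model": "some-model",
    "usage": {"prompt_tokens": 4, "total_tokens": 4},
}

# Client libraries recover the vectors from the `data` list.
vectors = [item["embedding"] for item in response_body["data"]]
print(json.dumps(vectors))
```

Once `embed_model` and `llm` are constructed as above, they can be passed wherever llama-index accepts them (for example assigned to `Settings.embed_model` and `Settings.llm` from `llama_index.core`).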

Answer selected by logan-markewich