
Temperature #2

Open
7robots opened this issue Jun 28, 2024 · 1 comment

Comments


7robots commented Jun 28, 2024

What is the maximum temperature value that easy-llms supports?

Although the OpenAI documentation says that gpt-4o supports temperature values from 0 to 2, I found that easy-llms errors out at 1.7 and above.

I was able to query successfully with a temperature of up to 2 using a direct chat-completion API request to OpenAI.
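
For comparison, a direct request along these lines (a minimal sketch using the official openai Python client; the prompt here is just a placeholder) accepts temperature=2 without error:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Direct chat-completion request at the documented maximum temperature
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "what llm are you?"}],
    temperature=2,
)
print(response.choices[0].message.content)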

ventz (Owner) commented Jun 28, 2024

@7robots I seem to be able to call it with temperatures of 1.9 and 2:

% cat test.py 
from llms.openai import *

answer = gpt_4o(temperature=1.9).run("what llm are you?")
print(answer)


answer = gpt_4o(temperature=2).run("what llm are you?")
print(answer)

This is using the latest version:

% pip list | grep easy-llms
easy-llms                     0.1.7

Can you try a new/clean folder and create a new venv with the latest version (if you're not already using it)?
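
For example (assuming Python 3 and pip are on your PATH), something like:

% python3 -m venv .venv
% source .venv/bin/activate
% pip install --upgrade easy-llms
% pip list | grep easy-llms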
