
Max token length exceeded error #1

Open
salvatoto opened this issue Apr 29, 2023 · 2 comments

Comments

@salvatoto

Running pycodeagi.py, got a max token length exceeded error at the "APP CODE" step:

Traceback (most recent call last):
  File "pycodeagi.py", line 165, in <module>
    pycode_agi({"objective": objective})

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4170 tokens (1170 in your prompt; 3000 for the completion). Please reduce your prompt; or completion length.

@chakkaradeep (Owner)

The GPT-3 version can only use 4,097 tokens, and unfortunately there is no way to increase that limit. The GPT-4 version has higher token limits and can give you better results.
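One workaround within the fixed 4,097-token window is to clamp the requested completion length to whatever budget remains after the prompt, instead of always asking for 3,000 tokens. A minimal sketch, not code from pycodeagi.py: the `fit_completion_budget` helper is hypothetical, and the token count here is a crude ~4-characters-per-token estimate (a real implementation would use a tokenizer such as tiktoken for accuracy):

```python
# Rough sketch: shrink the requested completion length so that
# prompt tokens + completion tokens stay under the model's context window.

CONTEXT_LIMIT = 4097  # limit reported in the error message


def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token (assumption)."""
    return max(1, len(text) // 4)


def fit_completion_budget(prompt: str, requested: int) -> int:
    """Clamp the completion length to the remaining context budget."""
    available = CONTEXT_LIMIT - estimate_tokens(prompt)
    return max(0, min(requested, available))
```

With the numbers from the traceback, a 1,170-token prompt leaves 4097 − 1170 = 2,927 tokens, so a 3,000-token completion request would be clamped to 2,927 and the request would fit.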

@chakkaradeep (Owner)

Curious to know what you asked it to build. I can try it with the GPT-4 version if you do not have access to GPT-4.
