How to increase the timeout waiting time #517

Closed
Crestina2001 opened this issue Nov 7, 2024 · 2 comments

Comments

@Crestina2001

I am using o1, and it can take a long time to respond at times. How can I increase the timeout so that it won't give an Error: Failed to fetch?
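
For illustration only (this is not chatgpt-web's actual request code, and the names below are hypothetical), a browser fetch call can be given an explicit, generous client-side timeout with AbortSignal.timeout:

```ts
// Sketch only, not the project's actual code; all names are hypothetical.
// Gives the request an explicit, generous client-side timeout so a slow
// o1 reply is not aborted early by our own code.
async function fetchWithTimeout (
  endpoint: string,
  body: unknown,
  timeoutMs: number = 10 * 60 * 1000 // 10 minutes
): Promise<Response> {
  return fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
    // Aborts only after timeoutMs has elapsed; note that "Failed to fetch"
    // can also come from a proxy or network limit outside the browser.
    signal: AbortSignal.timeout(timeoutMs)
  })
}
```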

@Crestina2001
Author

I am unfamiliar with JS. ChatGPT told me that 'Error: Failed to fetch' can be thrown for various reasons and that the code does not explicitly set a timeout.

The issue may be related to Max tokens: when I set it to 131072 (as suggested by the UI), it sometimes throws the error, but when I set it to 131072000, it works fine. I am still curious, though, as it is unlikely for the answer to exceed 131072 tokens.
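
For reference, here is a hypothetical sketch of where such a setting could end up in the request; how chatgpt-web actually maps its Max tokens field is an assumption here, not something confirmed in this thread. For o1-series models the cap is sent as max_completion_tokens and also covers the hidden reasoning tokens, not just the visible answer:

```ts
// Hypothetical request body, not chatgpt-web's actual code. For o1-series
// models the cap is max_completion_tokens, and it counts the hidden
// reasoning tokens as well as the visible answer.
const requestBody = {
  model: 'o1',
  messages: [{ role: 'user', content: 'Hello' }],
  max_completion_tokens: 131072 // the value suggested by the UI above
}
```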

@Niek
Owner

Niek commented Nov 7, 2024

Unfortunately, streaming is not available for o1 yet, and it can take a really long time for the model to return a reply. There is no hard timeout in the code, but we need to add some logging to show when the limit was surpassed (see #513 (comment)).
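
A minimal sketch of that kind of logging (placeholder names, not the project's code), timing the non-streaming request so the log shows how long the reply took, or how long the request lived before failing:

```ts
// Minimal timing/logging sketch around a non-streaming request;
// `endpoint` and `init` are placeholders, not chatgpt-web's code.
async function timedRequest (endpoint: string, init: RequestInit): Promise<Response> {
  const started = performance.now()
  try {
    const response = await fetch(endpoint, init)
    console.log(`Reply received after ${((performance.now() - started) / 1000).toFixed(1)}s`)
    return response
  } catch (err) {
    // "Failed to fetch" surfaces here; the elapsed time hints at which
    // limit (browser, proxy, network) cut the request off.
    console.error(`Request failed after ${((performance.now() - started) / 1000).toFixed(1)}s`, err)
    throw err
  }
}
```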

Niek closed this as completed Nov 7, 2024