AI-Proxy plugin: "An unexpected error occurred" when upstream URL is missing port/path #12869
Comments
It is tracked by KAG-4312.
Hey @Water-Melon, I don't have access to Kong JIRA. How can I get updates on KAG-4312? I am also facing the same issue with a self-hosted llama2 model.
#12903 may fix this issue, could you check it? Thanks.
@chronolaw I have tested against this branch, but it throws another 500: when no path is specified, parsed_url.path is nil.
Hi everyone; many apologies for this bug. I had originally fixed it and knew the fix was coming in 3.7.0, so I wasn't watching this issue, and then while refactoring the URL parser to fix OLLAMA token streaming I broke it again. This is my fault. I have opened the suggested fix and tested it again: #12998. However, we may be frozen for 3.7.0 features now, so it might roll into fixes for 3.7.1 instead. See the linked pull request for updates on this one.
Hi all @chronolaw @dascole @rohitrsh @Water-Melon, this is fixed in Kong 3.7.0 and onwards; I have verified it works with and without a port, path, etc.
Is there an existing issue for this?
Kong version ($ kong version): 3.6.1
Current Behavior
When the AI-Proxy plugin is enabled and configured with a Model.Options.Upstream Url that lacks a port or path, the behavior below is observed.
2024/04/16 13:58:38 [error] 1279#0: *20351 [kong] init.lua:405 [ai-proxy] /usr/local/share/lua/5.1/kong/llm/drivers/llama2.lua:266: path must be a string, client: 172.25.0.1, server: kong, request: "POST /echo HTTP/1.1", host: "localhost:8000", request_id: "3c5c232285a29358a4d7567573a04219"
2024/04/16 13:59:23 [error] 1279#0: *20271 [kong] init.lua:405 [ai-proxy] /usr/local/share/lua/5.1/kong/llm/drivers/llama2.lua:268: port must be an integer, client: 172.25.0.1, server: kong, request: "POST /echo HTTP/1.1", host: "localhost:8000", request_id: "9a948902ff31d42d49899041b1079f5f"
These errors are the result of nil path and port values being passed through when they are absent from the configured upstream URL.
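For illustration, here is a minimal sketch of why those values end up nil. It assumes a LuaSocket-style socket.url parser and a made-up hostname, and may not match the driver's internals exactly:

```lua
-- Minimal illustration (not the driver code): parsing a bare upstream URL
-- with LuaSocket's socket.url leaves both port and path as nil.
local socket_url = require("socket.url")

local parsed_url = socket_url.parse("http://my-llama-host")  -- hypothetical host
print(parsed_url.scheme)  --> "http"
print(parsed_url.host)    --> "my-llama-host"
print(parsed_url.port)    --> nil
print(parsed_url.path)    --> nil
```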
Expected Behavior
The AI-Proxy plugin should handle missing upstream URL components more gracefully:
Missing Path: when the upstream URL lacks a path, the plugin should default to "/" instead of throwing a server error. This provides a fallback that maintains functionality.
Missing Port: if the upstream URL does not specify a port, the plugin should infer it from the protocol (e.g. HTTP → 80, HTTPS → 443).
Steps To Reproduce
Response:
Anything else?
I'm happy to submit a PR for this if we agree on the approach. My thought is to simply perform the checks mentioned above, something along these lines:
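Roughly, a sketch of the defaulting logic; the function, variable, and field names here are assumptions for illustration, not the exact driver code:

```lua
-- Sketch only (not the actual driver implementation): normalize a parsed
-- upstream URL by filling in a default path and port.
local socket_url = require("socket.url")

local function parse_upstream_url(upstream_url)
  local parsed_url = socket_url.parse(upstream_url)

  -- Missing path: fall back to "/" instead of erroring
  parsed_url.path = parsed_url.path or "/"

  -- Missing port: infer from the scheme (HTTP -> 80, HTTPS -> 443)
  if not parsed_url.port then
    parsed_url.port = (parsed_url.scheme == "https") and 443 or 80
  end

  return parsed_url
end

-- Example: a bare host URL now gets sane defaults
local u = parse_upstream_url("http://my-llama-host")  -- hypothetical host
print(u.host, u.port, u.path)  --> my-llama-host  80  /
```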
This has been working well in my testing, but I'm open to other ideas and would like to help drive this forward.