Closed by dascole 4 months ago
It is tracked by KAG-4312
Hey @Water-Melon, I don't have access to Kong JIRA. How can I get updates on KAG-4312? I am also facing the same issue with a self-hosted llama2 model.
https://github.com/Kong/kong/pull/12903 may fix this issue, could you check it? thanks.
@chronolaw I have tested against this branch, but it throws another 500:

```json
{
  "message": "An unexpected error occurred",
  "request_id": "b1dc385ad051903a0a30441d9b7dc97d"
}
```

```
2024/05/06 10:24:52 [error] 2660914#0: *12887 [kong] init.lua:405 [ai-proxy] ./kong/llm/drivers/llama2.lua:267: bad argument #1 to 'string_gsub' (string expected, got nil), client: 127.0.0.1, server: kong, request: "POST /echo HTTP/1.1", host: "localhost:8000", request_id: "b1dc385ad051903a0a30441d9b7dc97d"
```
When no path is specified, `parsed_url.path` is nil:
```
2024/05/06 10:24:52 [notice] 2660914#0: *12887 [lua] llama2.lua:265: configure_request(): parsed_url = {
  authority = "myserver:11434",
  host = "myserver",
  port = "11434",
  scheme = "http"
}
```
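That nil path is what `string_gsub` later chokes on. A minimal plain-Lua illustration, using a stand-in table for the `parsed_url` logged above (not Kong's actual driver code):

```lua
-- Stand-in for the parsed_url table from the log above: the URL
-- "http://myserver:11434" has no path component, so path is nil.
local parsed_url = {
  scheme = "http",
  host = "myserver",
  port = "11434",
  authority = "myserver:11434",
}

-- Passing the nil path to string.gsub reproduces the 500:
-- "bad argument #1 to 'gsub' (string expected, got nil)"
local ok, err = pcall(string.gsub, parsed_url.path, "^/+", "/")
print(ok, err)

-- Defaulting the path first avoids the crash:
local path = parsed_url.path or "/"
local cleaned = path:gsub("^/+", "/")
print(cleaned)  --> "/"
```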
```
$ kong version
3.7.0

$ git branch
* feat/streamline_ai_proxy
```
Thank you @dascole, we have recorded it in ticket KAG-4312 and are still working on a fix.
Hi everyone; many apologies for this bug. I had originally fixed it and knew the fix was coming in 3.7.0, so I wasn't watching this issue, and then while refactoring the URL parser to fix Ollama token streaming I broke it again. This is my fault.
I have opened the suggested fix and tested it again: https://github.com/Kong/kong/pull/12998. However, we may be frozen for 3.7.0 features now, so it might roll into fixes for 3.7.1 instead.
See the linked pull request for updates on this one.
Hi all @chronolaw @dascole @rohitrsh @Water-Melon
This is fixed in Kong 3.7.0 and onwards; I have verified it works with and without a port, path, etc.
Is there an existing issue for this?
Kong version (`$ kong version`)

3.6.1
Current Behavior
When the AI-Proxy plugin is enabled and configured with a Model.Options.Upstream Url that lacks a port/path, the behavior below is observed:
```
2024/04/16 13:58:38 [error] 1279#0: *20351 [kong] init.lua:405 [ai-proxy] /usr/local/share/lua/5.1/kong/llm/drivers/llama2.lua:266: path must be a string, client: 172.25.0.1, server: kong, request: "POST /echo HTTP/1.1", host: "localhost:8000", request_id: "3c5c232285a29358a4d7567573a04219"

2024/04/16 13:59:23 [error] 1279#0: *20271 [kong] init.lua:405 [ai-proxy] /usr/local/share/lua/5.1/kong/llm/drivers/llama2.lua:268: port must be an integer, client: 172.25.0.1, server: kong, request: "POST /echo HTTP/1.1", host: "localhost:8000", request_id: "9a948902ff31d42d49899041b1079f5f"
```
These are the result of passing nil values to
Expected Behavior
The AI-Proxy plugin should handle cases where the components of the upstream URL are missing more gracefully:
Missing Path: When the upstream URL lacks a path, the plugin should default to "/" instead of throwing a server error. This fallback would keep the plugin functional.
Missing Port: If the upstream URL does not specify a port, the plugin should infer the port from the scheme (e.g., 80 for HTTP, 443 for HTTPS).
Steps To Reproduce
Response:
Anything else?
I'm happy to submit a PR for this if we agree on the approach. My thought is to simply perform the checks mentioned above, something along these lines.
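The original snippet isn't reproduced here, but a minimal sketch of those checks might look like the following. The `apply_url_defaults` helper name is hypothetical; field names follow the `parsed_url` table shown in the logs, and this is an illustration of the approach, not the actual patch:

```lua
-- Hypothetical helper: fill in defaults for a parsed upstream URL
-- before it is used to configure the request.
local function apply_url_defaults(parsed_url)
  -- Missing path: fall back to "/" rather than erroring out.
  parsed_url.path = parsed_url.path or "/"

  if parsed_url.port then
    -- The URL parser returns the port as a string; downstream code
    -- expects an integer ("port must be an integer").
    parsed_url.port = tonumber(parsed_url.port)
  else
    -- Missing port: infer it from the scheme.
    parsed_url.port = (parsed_url.scheme == "https") and 443 or 80
  end

  return parsed_url
end

local u = apply_url_defaults({ scheme = "http", host = "myserver" })
print(u.path, u.port)  --> "/"  80
```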
This has been working well in testing, but open to other ideas and would like to help drive this forward.