Open jami3f opened 2 months ago
Hi @jami3f,
Thanks for reaching out, we're on it and will update you as soon as it's fixed.
I have the same problem.
Hi,
The issue is fixed in yesterday's package release, 0.2.2. Let me know if the issue persists.
Nope.
If I do:
```python
from langchain_nvidia_ai_endpoints import ChatNVIDIA

llm = ChatNVIDIA(
    base_url="https://myserver.com/llama/v1",
    model="meta/llama-3.1-8b-instruct",
)
```
I get an error indicating it is using https://myserver.com/v1/chat/completions:
```
...
'raw': <urllib3.response.HTTPResponse object at 0x1727f9900>, 'url': 'https://myserver.com/v1/chat/completions',
'encoding': 'ISO-8859-1', 'history': [], 'reason': 'Not Found', 'cookies': <RequestsCookieJar[]>, 'elapsed':
datetime.timedelta(microseconds=17269), 'request': <PreparedRequest [POST]>, 'connection':
<requests.adapters.HTTPAdapter object at 0x172746300>}
...
```
I believe the additional necessary portions of the path get removed. In @renambot's case, /llama would need to be reinserted before /v1 in the URL that is used to query the NIM.
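The removal can be reproduced in isolation. A minimal sketch (not the library's actual code) of how rebuilding the base URL from scheme and host alone drops the /llama prefix, producing exactly the URL seen in the error above:

```python
from urllib.parse import urlparse, urlunparse

base_url = "https://myserver.com/llama/v1"
parsed = urlparse(base_url)

# Rebuilding from scheme and netloc alone discards parsed.path ("/llama/v1"),
# so the proxied /llama segment is lost:
stripped = urlunparse((parsed.scheme, parsed.netloc, "/v1", None, None, None))
print(stripped)  # https://myserver.com/v1
```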
Yes, I would recommend keeping the path in the URL (it is removed right now) and telling people not to add 'v1' in the base_url. See line 142 in _common.py:
```python
# Keep the path and add it to '/v1'
v = urlunparse(
    (parsed.scheme, parsed.netloc, parsed.path + "/v1", None, None, None)
)
```
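For reference, a standalone check that this reconstruction preserves the extra path segment (the base URL here is a hypothetical proxied endpoint, not from the thread):

```python
from urllib.parse import urlparse, urlunparse

parsed = urlparse("https://myserver.com/llama")  # hypothetical base_url without /v1

# Keep the path and add it to '/v1', as in the snippet above:
v = urlunparse(
    (parsed.scheme, parsed.netloc, parsed.path + "/v1", None, None, None)
)
print(v)  # https://myserver.com/llama/v1
```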
Still in v0.3.1
Hi, I am using a self-hosted NIM behind a reverse proxy, meaning that to access the LLM I use the following base URL: http://{host}/llm/v1. When using ChatNVIDIA, this base URL worked until version 0.2.1; since then, the URL is expected to have no trailing path. The error is as follows:

```
Base URL path is not recognized. Expected format is: http://host:port (type=value_error)
```

In my case, it is not possible to use this format. For clarity's sake, my code looks like the following:
The error originates from _common.py, line 147, after the validation fails.
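A sketch of the strict base-URL check that the error message describes (an assumption about the validation logic, not the actual code in _common.py; `is_bare_host` is a hypothetical name):

```python
from urllib.parse import urlparse

def is_bare_host(base_url: str) -> bool:
    """Accept only URLs with no trailing path, e.g. http://host:port."""
    parsed = urlparse(base_url)
    return parsed.path in ("", "/")

print(is_bare_host("http://localhost:8000"))       # accepted
print(is_bare_host("http://myserver.com/llm/v1"))  # rejected with value_error
```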