**Closed** — xnaron closed this issue 9 months ago
@xnaron - Grab the latest version (0.4.1--just published) and give it a try. Let me know how things go.
Thanks, I think that worked, but I'm not seeing what I expected. When I list models I see only gpt-3.5-turbo, even though I have the models shown in the attached list. Is this because all models will just appear with gpt-3.5-turbo as the id?
Looks like TGWUI doesn't yet have response parity with the OpenAI List models API endpoint. Here's what I found:
```python
response = OAImodels.list_models()  # remove the dummy reference from the function name
```
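TGWUI's own implementation isn't reproduced here, but as a rough sketch of what "a list of file names" amounts to (the directory path and the lack of any filtering are assumptions of mine, not TGWUI's actual logic):

```python
from pathlib import Path

def list_model_files(models_dir: str) -> list[str]:
    """Return the file/folder names found under models_dir, sorted --
    roughly the plain list of strings that list_models() hands back."""
    return sorted(entry.name for entry in Path(models_dir).iterdir())
```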
You'll get back a list of strings containing the file names in your TGWUI _/models_ directory, though that's still not true parity with OpenAI's [response structure](https://platform.openai.com/docs/api-reference/models/list), which is an object of this form:
```python
{
  "object": "list",
  "data": [
    {
      "id": "model-id-0",
      "object": "model",
      "created": 1686935002,
      "owned_by": "organization-owner"
    },
    {
      "id": "model-id-1",
      "object": "model",
      "created": 1686935002,
      "owned_by": "organization-owner"
    },
    {
      "id": "model-id-2",
      "object": "model",
      "created": 1686935002,
      "owned_by": "openai"
    }
  ]
}
```
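In the meantime, a caller could bridge the gap itself. A minimal sketch that wraps a plain list of model names in the shape above (the `created` timestamp and `owned_by` values are illustrative placeholders, not anything TGWUI reports):

```python
import time

def to_openai_model_list(model_names: list[str]) -> dict:
    """Wrap a plain list of model names in the OpenAI /v1/models
    list-response shape: {"object": "list", "data": [...]}."""
    now = int(time.time())
    return {
        "object": "list",
        "data": [
            {"id": name, "object": "model", "created": now, "owned_by": "user"}
            for name in model_names
        ],
    }
```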
If the TGWUI community sorts this out, the updated response structure should be returned to you without an update to this Node-RED node. Do let me know if you discover otherwise.
Closing this one out. Cheers.
I have a working text-generation-webui API URL that I am already using with many other things locally. I just installed node-red-openai-api and configured it per the docs for a local LLM, as below, and I get the error specified in the title.