danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, OpenAI, Assistants API, Azure, Groq, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

OpenRouter model fetching and assignment #971

Closed: jordantgh closed this issue 11 months ago

jordantgh commented 11 months ago

Contact Details

No response

What is your question?

I am trying to access the gpt-4-32k model via OpenRouter. I have set my OpenRouter API key, and a couple of test messages confirm that I am able to communicate with GPT-4 via the LibreChat interface (screenshot below).

However, I'd like to access the 32k model. Is it as simple as uncommenting the OPENAI_MODELS key and adding gpt-4-32k? I will try that now and report back, but I thought it was worth logging an issue either way since the behaviour isn't transparent.
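Concretely, the change I have in mind in my .env looks something like this (key and model list are illustrative):

```
# .env (illustrative values)
OPENROUTER_API_KEY=<your OpenRouter key>

# Uncomment and extend the model list:
OPENAI_MODELS=gpt-3.5-turbo,gpt-4,gpt-4-32k
```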

More Details

N/A

What is the main subject of your question?

Documentation, Endpoints

Screenshots

[screenshot]


jordantgh commented 11 months ago

OK, it seems that my hypothesis was right: when you set an OpenRouter API key, it still draws from the list of models specified by OPENAI_MODELS, and changing this is all you need.

danny-avila commented 11 months ago

OK, it seems that my hypothesis was right: when you set an OpenRouter API key, it still draws from the list of models specified by OPENAI_MODELS, and changing this is all you need.

It should be fetching the OpenRouter API for a list of models so it can show all of its supported models. Was it not doing that for you? They are formatted differently, e.g. openai/gpt-xxx.
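For reference, the list it should be pulling from can be inspected directly. Here's a minimal sketch (not LibreChat code), assuming OpenRouter's public models endpoint and a response shaped like { data: [{ id }] }:

```ts
// Minimal sketch: print the model IDs OpenRouter advertises.
// Assumes the public models endpoint at https://openrouter.ai/api/v1/models
// and a response shaped like { data: [{ id: string, ... }] }.
async function listOpenRouterModels(): Promise<string[]> {
  const res = await fetch("https://openrouter.ai/api/v1/models");
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id); // e.g. "openai/gpt-4", "openai/gpt-4-32k"
}

listOpenRouterModels().then((ids) => console.log(ids));
```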

danny-avila commented 11 months ago

I found that when you make the change and set OpenRouter, you restart your server as usual, but the client/web app page needs to be refreshed once or twice after each server restart for the list to update from the default.

danny-avila commented 11 months ago

In this case, I didn't even have to refresh the page after the first server restart, and I don't have OPENAI_MODELS set.

[screenshot]

jordantgh commented 11 months ago

Hm, that's not what I see.

Here is how things look with OPENAI_MODELS commented out and OPENROUTER_API_KEY set:

[screenshot]

And here is an example where I set both:

[.env screenshot]

[UI screenshot]

Note that I fully restarted the container and cleared the browser cache after any changes.

I am using Docker Desktop on Windows.

I'd note that it does appear to be working, and the chats are showing up in my openrouter activity.

jordantgh commented 11 months ago

IDK if this is helpful, but it seems that the model is automatically appended to the openai/ route, hence:

[screenshot]

danny-avila commented 11 months ago

Hm, that's not what I see.

Here is how things look with OPENAI_MODELS commented out and OPENROUTER_API_KEY set:

Try refreshing the page to see if the list updates. Also, I guess it appends on their end; that's good to know.

jordantgh commented 11 months ago

Try refreshing the page to see if the list updates.

As I said, I was fully restarting the container and clearing the browser cache after any changes (and reloading, of course). I can confirm that, at least on my system, setting the OPENROUTER_API_KEY variable alone isn't sufficient to populate the web frontend with the list of OpenRouter models; I have to set the list manually.

jordantgh commented 11 months ago

I haven't set any OpenAI keys in the first place, as I went straight to OpenRouter. Hence the page still prompts me to "Set API Key First":

[screenshot]

Is it possible there is some path dependency where you are assuming an OpenAI key is present first and is then being overwritten?

jordantgh commented 11 months ago

From quickly scanning the code base, it isn't clear to me how modelsConfig gets updated if OPENROUTER_API_KEY is set; if you can point me in the right direction, I can take a closer look and see if I can figure something out.

danny-avila commented 11 months ago

Is it possible there is some path dependency where you are assuming an OpenAI key is present first and is then being overwritten?

Oh I see. Yes, you need an OPENAI_API_KEY variable set, and then models will fetch. The reason is that an API call shouldn't be made if no OpenAI key is set, but OpenRouter should be treated separately in this regard rather than just overriding this condition.
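Roughly, the gating described here would look something like this (a hypothetical sketch, not the actual LibreChat code; names are illustrative):

```ts
// Hypothetical sketch of the gating described above, not the actual LibreChat code.
// Currently the model fetch runs only when OPENAI_API_KEY is set; the idea is to
// treat OpenRouter as its own case so the fetch also runs with only OPENROUTER_API_KEY.
const openAIKey = process.env.OPENAI_API_KEY;
const openRouterKey = process.env.OPENROUTER_API_KEY;

const shouldFetchModels = Boolean(openAIKey) || Boolean(openRouterKey);

// When only an OpenRouter key is present, the fetch would target OpenRouter's
// model list rather than OpenAI's.
console.log({ shouldFetchModels });
```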

jordantgh commented 11 months ago

I see. Does this need looking at further? I can open another issue if so; otherwise it's fine. For me, the issue is solved :)

danny-avila commented 11 months ago

I see. Does this need looking at further? I can open another issue if so; otherwise it's fine. For me, the issue is solved :)

It was a quick fix, so I'm pushing it with my new PR!