bachp opened 6 months ago
Came here to say the same thing. I don't use Ollama, but another local backend, like many users who prefer to run models offline, I reckon.
OpenRouter can be accessed through the OpenAI API. Besides the existing API Key setting, I suggest adding an option equivalent to OPENAI_BASE_URL in the Advanced Settings. For OpenRouter this is typically set to https://openrouter.ai/api/v1. This addition would at least partially address the issue.
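For illustration, here is a minimal sketch of what honoring such a setting could look like, assuming the plugin uses the official openai npm package (the setting name `customBaseUrl` is just a placeholder, not an existing option):

```ts
import OpenAI from "openai";

// Hypothetical plugin settings: apiKey already exists, customBaseUrl is the proposed addition.
interface PluginSettings {
  apiKey: string;
  customBaseUrl?: string; // e.g. "https://openrouter.ai/api/v1"
}

function createClient(settings: PluginSettings): OpenAI {
  return new OpenAI({
    apiKey: settings.apiKey,
    // Fall back to the default OpenAI endpoint when no override is configured.
    baseURL: settings.customBaseUrl ?? undefined,
  });
}

// Usage: any OpenAI-compatible server can be targeted by changing the URL.
const client = createClient({
  apiKey: process.env.OPENROUTER_API_KEY ?? "",
  customBaseUrl: "https://openrouter.ai/api/v1",
});
```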
Allowing the user to specify an alternative base URL for the OpenAI API would make it possible to run the plugin against other OpenAI-compatible implementations such as FastChat or llama.cpp.
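As a rough example of what that would enable, the same client can be pointed at a local llama.cpp server, assuming its built-in server is running on the default port 8080 (local servers usually ignore the API key, but the client still requires a value; the model name depends on what the server has loaded):

```ts
import OpenAI from "openai";

// Point the client at a local llama.cpp server instead of api.openai.com.
const localClient = new OpenAI({
  apiKey: "not-needed-locally",
  baseURL: "http://localhost:8080/v1",
});

// Standard chat completion call against the local OpenAI-compatible endpoint.
const reply = await localClient.chat.completions.create({
  model: "local-model",
  messages: [{ role: "user", content: "Hello from an offline setup!" }],
});
console.log(reply.choices[0].message.content);
```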