Open CGH20171006 opened 6 days ago
I am having a similar problem: following the docs for a local ollama model, paper-qa still tries to use my OpenAI key, which responds with an over-quota error. Even with local-model settings, it still appears to call a remote API.
When I try to use a LiteLLM model, `Docs.add` doesn't seem to work either: calling it raises an error asking me to define `OPENAI_API_KEY`, even though I am routing through the LiteLLM API.
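For concreteness, this is roughly the configuration I'm trying, adapted from the local-model example in the docs as I understand it; the model names, port, and file path are just assumptions from my local setup, not a verified recipe:

```python
# Sketch of the LiteLLM router config I would expect paper-qa to accept
# for a fully local ollama model (model names/port are my assumptions).
local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3",
            "litellm_params": {
                "model": "ollama/llama3",
                # default local ollama port
                "api_base": "http://localhost:11434",
            },
        }
    ]
}

# Then, per the local-model section of the docs (as I read it):
#
#   from paperqa import Docs, Settings
#
#   settings = Settings(
#       llm="ollama/llama3",
#       llm_config=local_llm_config,
#       summary_llm="ollama/llama3",
#       summary_llm_config=local_llm_config,
#       # The embedding model defaults to an OpenAI one, which would
#       # still require OPENAI_API_KEY -- so it presumably also needs
#       # to be pointed at a local embedding model:
#       embedding="ollama/mxbai-embed-large",
#   )
#   docs = Docs()
#   docs.add("paper.pdf", settings=settings)
```

My suspicion is that the embedding step is what still reaches out to OpenAI, since only `llm`/`summary_llm` get overridden in the examples, but I may be misreading the settings.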
error message: