Closed caufieldjh closed 5 months ago
This is the case for other operations requiring GPT access too, like Curate.
Setting the key with `runoak set-apikey -e openai (key)` instead does not solve the issue.
Ah, here's the issue: `llm` handles storage and retrieval of API keys in all places except the chromadb_adapter, so getting embeddings works as expected, but the key the adapter uses is pulled directly from the local environment. All other instances, like curategpt's Extractor class, call `llm.get_key()` - but the key is stored with the model object returned from `llm.get_model()`. In practice they should all be using the same environment and have the same keys, but that may not always be true.
`llm` stores its own configs, including credentials, in a user directory, and these get used preferentially over environment variables. On my system they're in `~/.config/io.datasette.llm`.
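To see which credentials `llm` has saved, you can inspect its keys file directly. A small sketch, assuming the Linux config path mentioned above (the directory differs on macOS and Windows) and that the file is plain JSON keyed by alias; the helper name is illustrative:

```python
import json
from pathlib import Path

# Default location on Linux, per the path reported above.
KEYS_PATH = Path.home() / ".config" / "io.datasette.llm" / "keys.json"

def saved_key_aliases(path: Path = KEYS_PATH) -> list[str]:
    """Return the aliases of keys saved via `llm keys set`, without the secrets."""
    if not path.exists():
        return []
    return sorted(json.loads(path.read_text()))

print(saved_key_aliases())
```

If `openai` shows up here, that saved key is what `llm`-based calls will use, regardless of `OPENAI_API_KEY`.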
Closing this issue, as it will only happen if `llm` is set up independently of curategpt.
Setting the OpenAI API key as stated in the README may not consistently set it in a way the app can access. If I do the following:
and then use the Chat interface, I get an authentication error:
That's not my API key... but it is one I have used in the past! It's not active anymore, so it won't work here, and I'm not certain where CurateGPT is finding it or why it isn't using the one I just set as `OPENAI_API_KEY`.