templehasfallen opened this issue 3 months ago
Maybe this was a typo:

```yaml
api-key-env: API_KEY_IS_HERE_REDACTED
```

but this field should name the environment variable used to look up the OpenAI API key when `api-key` is not set. The default is:

```yaml
api-key:
api-key-env: OPENAI_API_KEY
```
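As I read it, the resolution order is: use `api-key` directly when it is non-empty, otherwise read the variable named by `api-key-env`. A minimal sketch of that (the inline comments are my interpretation of the fallback, not documented behavior):

```yaml
apis:
  openai:
    api-key:                      # used directly when non-empty
    api-key-env: OPENAI_API_KEY   # otherwise, the key is read from this env var
```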
You are right. I've now set it to `OPENAI_API_KEY` and set the env variable `OPENAI_API_KEY` to my OpenAI API key (which I do not wish to use), but the result is exactly the same:
```yaml
apis:
  openai:
    base-url: https://api.openai.com/v1
    api-key:
    api-key-env: OPENAI_API_KEY
    models: # https://platform.openai.com/docs/models
```
I can somewhat confirm this behaviour. Even when removing all API configurations other than the one for LocalAI, I get asked for an OpenAI API key (which I do not have). I guess that's because OpenAI is the default API configuration (in `mods.go`, in the `switch mod.API` block, it calls `ensureKey` to make sure the environment variable is set).
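For what it's worth, a guard like that only has to confirm the variable is non-empty, not that the key is valid. Roughly something like this (a sketch of the pattern, not mods' actual code):

```go
package main

import (
	"fmt"
	"os"
)

// ensureKey-style check: passes for any non-empty value, which is why
// a dummy OPENAI_API_KEY gets past it even though the key is never
// validated against the API.
func ensureKey(envName string) (string, error) {
	if key := os.Getenv(envName); key != "" {
		return key, nil
	}
	return "", fmt.Errorf("set the %s environment variable", envName)
}

func main() {
	if _, err := ensureKey("OPENAI_API_KEY"); err != nil {
		fmt.Println(err)
		os.Exit(1)
	}
	fmt.Println("key check passed")
}
```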
I think the environment variable does not need to hold your actual OpenAI API key; any value should do to pass this check. I got past this point by setting `OPENAI_API_KEY` to a random value, and it looks like mods is making a request to the configured API endpoint now. There's no answer displayed after the "Generating" message finishes, but I am not sure if that is a problem with mods or with my setup of LocalAI. The link they give for setup (https://github.com/mudler/LocalAI#example-use-gpt4all-j-model) is dead, so I'm not sure if my LocalAI is set up correctly.
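One way to check the LocalAI side independently of mods is to hit its OpenAI-compatible models route directly. A small sketch (assumes LocalAI's default port 8080; adjust the URL to match your `base-url`):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// If LocalAI is up, /v1/models should return a JSON list of
	// loaded models; a connection error points at the LocalAI setup
	// rather than at mods.
	resp, err := http.Get("http://localhost:8080/v1/models")
	if err != nil {
		fmt.Println("LocalAI not reachable:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}
```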
I'm having the same issue - I tried setting the env var to something random and still got this error. Even having localai be the only option in the config file gives the same error.

Same error whether I pass the model & API in via the command line or select localai via `mods -M`.
**Describe the bug** Using the latest version, for some reason, I cannot use my LocalAI endpoints at all. Having first carried over a configuration from an older version and then completely reset the settings and only added my localai endpoint (either keeping or deleting the configuration for the other APIs), whatever I do, I keep getting:
I have tried:

- setting `OPENAI_API_KEY` via config/settings (it should not be required if using localai)
- setting the `OPENAI_API_KEY` environment variable
The behavior is the same regardless of command. If I go with `mods -M`, I am able to select my model and type a prompt, and am later presented with that error again (see attached GIF).

**Setup**
**To Reproduce** Steps to reproduce the behavior:
**Source Code** Config file:
Alternatively:
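For illustration, a minimal LocalAI-only `apis` section looks roughly like this (the model name, aliases, and port are placeholders, not my exact setup):

```yaml
apis:
  localai:
    base-url: http://localhost:8080
    models:
      ggml-gpt4all-j:
        aliases: ["local"]
        max-input-chars: 12250
```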
**Expected behavior** `OPENAI_API_KEY` should be ignored, whether set or not, when using LocalAI as the API.
**Screenshots** Behavior as per the second config file:
**Additional context** I'm not sure if I am missing something, but even having generated a fresh config and having looked at the code, I see two issues: