Open thoraxe opened 2 days ago
Hi @thoraxe. Thanks for raising the issue. The OPENAI_API_KEY is definitely required. When you say that Otto8 tries to use OpenAI anyway, what do you mean?
Out of the box, the OpenAI models all appear to be present/configured and set as the defaults. If you set a "fake" OpenAI key to get past the startup error and then ask a question in the Otto8 interface, you get an error because the OpenAI API key is bad.
It's not until you go in and manually reconfigure Otto8 to use Azure OpenAI models as the default that things start to work.
If I'm using Azure OpenAI for embedding and inference, why would I need to configure an OpenAI key at all? It makes it seem like OpenAI is a hard requirement even when you want to use Azure OpenAI.
Thank you very much for explaining.
For now, this is how things are intended to work. However, we will keep this issue open as we discuss how the product should evolve.
If you want, I can convert this to an RFE: if I have only configured Azure OpenAI environment variables, then the "startup" sequence should either configure no models (an empty model list) or pre-populate a sensible set of Azure-specific defaults.
At a minimum, I think the service/server should start without an OpenAI API key, even if that means there will be other failures post-start.
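The softer behavior proposed here could look something like the following sketch. This is purely illustrative and not Otto8's actual code; the function name and warning text are assumptions.

```shell
# Illustrative sketch of the proposed behavior: warn and continue instead of
# refusing to start. NOT Otto8's actual code; names and wording are
# assumptions for illustration only.
check_openai_key() {
  if [ -z "${OPENAI_API_KEY:-}" ]; then
    echo "warning: OPENAI_API_KEY not set; OpenAI models will be unavailable" >&2
  fi
}

check_openai_key  # startup would proceed either way
```

With a check like this, users who only configure Azure OpenAI would see a warning rather than a fatal error at container start.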
Thanks for that input! I will bring it to the team as we determine how to move forward.
I am using Azure OpenAI and want to use it with Otto8. https://docs.otto8.ai/configuration/model-providers documents how to use Azure OpenAI with Otto8. However, when I set the Azure-specific environment variables, I get the following error when running the container:
OPENAI_API_KEY env is required to be set
Setting an OpenAI API key allows the startup to complete, but then Otto8 tries to use OpenAI anyway under the covers.
Here is my container run: