langchain-ai / langchainjs

πŸ¦œπŸ”— Build context-aware reasoning applications πŸ¦œπŸ”—
https://js.langchain.com/docs/
MIT License

Potential confusion on using OPENAI_API_KEY and AZURE_OPENAI_API_KEY together #4951

Open castroalves opened 5 months ago

castroalves commented 5 months ago

Example Code

How to reproduce it:

Define both OPENAI_API_KEY and AZURE_OPENAI_API_KEY in your environment (without AZURE_OPENAI_API_INSTANCE_NAME), instantiate a model, and run it. You should see the message "Azure OpenAI API instance name not found".
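A minimal repro might look like the following sketch (assuming the `@langchain/openai` package; on 0.0.x the import path is `langchain/chat_models/openai` instead):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// With both OPENAI_API_KEY and AZURE_OPENAI_API_KEY exported, and no
// AZURE_OPENAI_API_INSTANCE_NAME, the constructor throws even though we
// intended to use plain OpenAI:
const model = new ChatOpenAI({ modelName: "gpt-3.5-turbo" });
// Error: Azure OpenAI API instance name not found
```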

Error Message and Stack Trace (if applicable)

Azure OpenAI API instance name not found

Description

Hey guys, I'm using langchain in my Next.js project, on version 0.0.202, but I've also faced the same issue after bumping to the latest version (0.1.30 as of now).

After investigating the issue, I realized that when AZURE_OPENAI_API_KEY is set, its value is automatically assigned to the this.azureOpenAIApiKey property (see here).

On production, we are using OPENAI_API_KEY and it's working as expected.

We have a preview branch where we are testing something else. When making requests to OpenAI there, they fail with the error message Azure OpenAI API instance name not found.

However, we also have another preview branch where we are implementing Azure, but it's still being tested and reviewed by our team. Therefore, we have AZURE_OPENAI_API_KEY defined.

Because of that, it looks like there is a conflict between the two environment variables, and the Azure one takes priority once it's defined. That's why I'm getting the error Azure OpenAI API instance name not found.
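A sketch of the precedence as I understand it (this is a simplified paraphrase for illustration, not the actual library code; `resolveConfig` is an invented name):

```typescript
type Env = Record<string, string | undefined>;

interface ResolvedConfig {
  provider: "azure" | "openai";
  apiKey: string;
}

// Mimics the credential resolution: the Azure key, if present at all,
// wins over OPENAI_API_KEY and then requires the instance name.
function resolveConfig(env: Env): ResolvedConfig {
  const azureKey = env["AZURE_OPENAI_API_KEY"];
  if (azureKey) {
    if (!env["AZURE_OPENAI_API_INSTANCE_NAME"]) {
      throw new Error("Azure OpenAI API instance name not found");
    }
    return { provider: "azure", apiKey: azureKey };
  }
  const openaiKey = env["OPENAI_API_KEY"];
  if (!openaiKey) {
    throw new Error("OpenAI or Azure OpenAI API key not found");
  }
  return { provider: "openai", apiKey: openaiKey };
}
```

So even with a perfectly valid OPENAI_API_KEY, the mere presence of AZURE_OPENAI_API_KEY routes everything down the Azure branch.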

Important: I've been using langchain for almost a year and wasn't aware of this behavior. πŸ˜…

System Info

langchain version: 0.0.202 (also tested on 0.1.30)
Platform: macOS Ventura 13.0
Node version: v21.4.0
pnpm version: 8.10.2

dosubot[bot] commented 5 months ago

Hey @castroalves, great to see you back! I hope all is well on your end. Let's dive into this issue together.

Thank you for providing detailed information about the issue you're facing. It seems like you're encountering a conflict when both OPENAI_API_KEY and AZURE_OPENAI_API_KEY are defined in your .env file.

Based on the context provided, it appears that LangChain.js is designed to prioritize the Azure configuration over the OpenAI configuration when both keys are present. This is likely causing the error message "Azure OpenAI API instance name not found" because the AZURE_OPENAI_API_INSTANCE_NAME is not set in your .env file.

To resolve this issue, you have a couple of options:

  1. If you're using Azure OpenAI, ensure that all necessary Azure OpenAI API variables are correctly set in your .env file, especially the AZURE_OPENAI_API_INSTANCE_NAME. This error typically indicates that the AZURE_OPENAI_API_INSTANCE_NAME is either missing or not correctly set. You can refer to the example .env file in the LangChain.js repository for guidance on setting these variables.

  2. If you're not using Azure OpenAI and prefer to use OpenAI directly, ensure that only OPENAI_API_KEY is set and the Azure related keys are either commented out or removed from your .env file. This will help avoid any conflicts in the handling of these keys by LangChain.js.
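For the Azure route, a .env covering the variables mentioned above might look like this (values are placeholders; variable names follow the langchainjs conventions):

```shell
# Azure OpenAI configuration
AZURE_OPENAI_API_KEY=your-azure-key
AZURE_OPENAI_API_INSTANCE_NAME=my-instance      # the variable missing in this issue
AZURE_OPENAI_API_DEPLOYMENT_NAME=my-deployment
AZURE_OPENAI_API_VERSION=2023-05-15

# If using plain OpenAI instead, set only this and remove the Azure keys:
# OPENAI_API_KEY=sk-...
```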

I hope this helps! If you have any further questions or if the issue persists, please provide more details so I can assist you better.

oliviermills commented 4 months ago

The only workaround when both env vars are present is to use different env names for the keys, and to set AZURE_OPENAI_API_KEY only when it is actually needed.

Please prioritize this.

Solution idea

Add an explicit service property to the constructor options of the OpenAIEmbeddings and OpenAI classes, accepting 'openai' | 'azure' and defaulting to 'azure' to reflect the current behaviour.

And change the code in llm.ts here so it does not throw an error just because an Azure key exists; maybe we aren't using it! Hence the service idea for an explicit choice.
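A hypothetical sketch of what that could look like (this is NOT an existing langchainjs API; `service`, `ClientOptions`, and `selectCredentials` are names invented here to illustrate the proposal):

```typescript
type Service = "openai" | "azure";

interface ClientOptions {
  service?: Service; // defaults to "azure" when an Azure key exists, matching current behaviour
  openAIApiKey?: string;
  azureOpenAIApiKey?: string;
  azureOpenAIApiInstanceName?: string;
}

// With an explicit service, a stray Azure key no longer hijacks the request.
function selectCredentials(opts: ClientOptions): { service: Service; apiKey: string } {
  const service = opts.service ?? (opts.azureOpenAIApiKey ? "azure" : "openai");
  if (service === "azure") {
    if (!opts.azureOpenAIApiKey || !opts.azureOpenAIApiInstanceName) {
      throw new Error("Azure OpenAI API instance name not found");
    }
    return { service, apiKey: opts.azureOpenAIApiKey };
  }
  if (!opts.openAIApiKey) {
    throw new Error("OpenAI API key not found");
  }
  return { service, apiKey: opts.openAIApiKey };
}
```

With `service: "openai"` set explicitly, the Azure key is simply ignored instead of triggering the instance-name check.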

Why

Both classes are already opinionated about the Azure and OpenAI keys, so we might as well provide a non-opinionated way to choose explicitly.

Effects on code base

Limited, as the default behaviour would still be to prioritize Azure.

johnnyoshika commented 3 months ago

I created this repo to easily reproduce this: https://github.com/johnnyoshika/langchain-azure-openai-collision

Indeed, the only way around this right now is to use an environment variable name other than the recommended default, AZURE_OPENAI_API_KEY, if you want to use both OpenAI and Azure OpenAI in the same project.
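That workaround might look like the following sketch (assuming `@langchain/openai`; `MY_AZURE_OPENAI_API_KEY` is an invented name so the constructor's default AZURE_OPENAI_API_KEY lookup never fires):

```typescript
import { ChatOpenAI } from "@langchain/openai";

// Plain OpenAI client: no AZURE_OPENAI_API_KEY in the environment,
// so this resolves to the OpenAI path as expected.
const openaiModel = new ChatOpenAI({
  openAIApiKey: process.env.OPENAI_API_KEY,
});

// Azure client: the key lives under a non-default env name and is
// passed explicitly, only where Azure is actually wanted.
const azureModel = new ChatOpenAI({
  azureOpenAIApiKey: process.env.MY_AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
});
```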