langflow-ai / langflow

Langflow is a low-code app builder for RAG and multi-agent AI applications. It’s Python-based and agnostic to any model, API, or database.
http://www.langflow.org
MIT License
35.54k stars 4.23k forks

Dynamic Agent template: OpenAI key error #4395

Open L-Huber opened 2 weeks ago

L-Huber commented 2 weeks ago

Bug Description

I am currently trying to extend the Dynamic Agent template in Langflow. However, I get the following error: AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code: 401 - {'error': {'message': 'Incorrect API key provided: dummy. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}

Reproduction

  1. Choose the Dynamic Agent template as a new project in Langflow
  2. Add your OpenAI API key
  3. Run the flow

Expected behavior

There should be no error, and the flow should run.

Who can help?

@ogabrielluiz @nicoloboschi @zzzming @jo

Operating System

MacOS

Langflow Version

1.3.5

Python Version

3.11

Screenshot

Bildschirmfoto 2024-11-05 um 10 06 21

Flow File

No response

dosubot[bot] commented 2 weeks ago

Hey @L-Huber! πŸ‘‹ I'm here to help you with any bugs, questions, or contributions for langflow. Let's work together to solve the issue you're facing.

To resolve the OpenAI key error (AuthenticationError: OpenAIException - Error code: 401) when using the Dynamic Agent template in Langflow version 1.3.5, ensure that you have correctly added your OpenAI API key to the Global Variables in Langflow. Follow these steps:

  1. Open Langflow and navigate to Settings.
  2. Go to Global Variables.
  3. Click Add New.
  4. Name your variable and paste your OpenAI API key in the Value field.
  5. In the Apply To Fields field, select the field to apply this variable to.
  6. Click Save Variable.

Make sure that the API key is valid and correctly entered. This should resolve the authentication error [1].
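The steps above can also be checked outside the UI. A minimal sketch, assuming a local install where litellm falls back to the `OPENAI_API_KEY` environment variable when no key is wired into a component (the value below is a placeholder, not a real key):

```python
# Sketch: export the key into the environment that launches Langflow,
# so components without an explicit key can still authenticate.
import os

os.environ.setdefault("OPENAI_API_KEY", "sk-proj-your-key-here")  # placeholder
print("OPENAI_API_KEY" in os.environ)  # True
```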

To continue talking to Dosu, mention @dosu.

L-Huber commented 2 weeks ago

I tried this, but it did not work.

carlosrcoelho commented 2 weeks ago

Hey @L-Huber

What version are you using?

I just tried on the latest version, 1.0.19.post2, and it worked as expected.

image
L-Huber commented 2 weeks ago

@carlosrcoelho I just updated to Langflow 1.0.19.post2 (langflow-base 0.0.99). I even created a new OpenAI key and tested it, but it is still not working.

carlosrcoelho commented 2 weeks ago

@L-Huber

Can you share your flow?

L-Huber commented 2 weeks ago

Sure, @carlosrcoelho: Dynamic Agent (3) - not working.json

carlosrcoelho commented 2 weeks ago

@L-Huber Make sure all the OpenAI components have the API Key.

I tried to replicate it, but I didn't get the error.

image

However, if I forget to add an API Key to a component, the error appears.

image
L-Huber commented 2 weeks ago

@carlosrcoelho that is the odd thing: all steps before "Hierarchical Crew" work:

Bildschirmfoto 2024-11-06 um 23 17 58

so the API key should be correct.

SavvySupport commented 2 weeks ago

I have the same issue. I think it's an error with "Hierarchical Crew" component. I tried using Groq as the LLM, and there's an issue with that too.

Error building Component Hierarchical Crew:

litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=llama-3.2-1b-preview Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers
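This error comes from litellm's routing: a bare model name carries no provider, so litellm cannot pick a backend. A hedged sketch of the fix (`with_provider` is an illustrative helper, not part of litellm):

```python
def with_provider(model: str, provider: str = "groq") -> str:
    """Prefix a bare model name with its provider, as litellm's
    routing expects (illustrative helper, not a litellm API)."""
    return model if "/" in model else f"{provider}/{model}"

# "llama-3.2-1b-preview" alone triggers the BadRequestError above;
# the prefixed form tells litellm to route the call to Groq.
print(with_provider("llama-3.2-1b-preview"))  # groq/llama-3.2-1b-preview
```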

L-Huber commented 2 weeks ago

@SavvySupport Do you have any solution to this?

yannisTr commented 2 weeks ago

Hello,

Same error here with the Sequential Crew component and a custom base URL for the LLM (to use it with OpenRouter). All previous steps work fine, and the LLM works with other component types (like chains).
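For the custom-base-URL case, litellm documents an `openrouter/` model prefix alongside `api_base`. A hedged sketch of the call arguments (the model id below is hypothetical, and nothing is sent to the API here):

```python
# Sketch of the kwargs a litellm.completion() call would receive for
# an OpenRouter-hosted model; the "openrouter/" prefix is what tells
# litellm which provider to route to.
call_kwargs = dict(
    model="openrouter/meta-llama/llama-3.2-1b-instruct",  # hypothetical id
    api_base="https://openrouter.ai/api/v1",
    api_key="sk-or-placeholder",
)
provider = call_kwargs["model"].split("/", 1)[0]
print(provider)  # openrouter
```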

austin-duff-prft commented 1 week ago

Same issue.

guilegarcia commented 1 week ago

I'm experiencing the same issue. Has anyone found a solution? I believe it might be a bug.