langflow-ai / langflow

Langflow is a low-code app builder for RAG and multi-agent AI applications. It’s Python-based and agnostic to any model, API, or database.
http://www.langflow.org

Environment variables aren't picked up for flow components #3906

Closed devinbost closed 2 months ago

devinbost commented 2 months ago

Bug Description

When I start LangFlow in headless mode, it doesn't pick up the environment variables I'm providing, so I'm getting errors like the ones in this log:

error_logs.log

Reproduction

  1. Clone the airline-langflow-history-hakeem branch of this repo: https://github.com/devinbost/wikichat

  2. Create a flows directory and store this file in it: SIA_flow_v4.15.json

  3. Add this docker compose file to the repo's root: compose_temp.yaml.txt (Remove the .txt extension; it was just a workaround for uploading the file to GitHub.) A rough sketch of what the relevant services might look like appears after this list.

  4. Add this Dockerfile.langflow file similarly: Dockerfile.langflow.txt

  5. And this Dockerfile.app file: Dockerfile.app.txt

  6. Run docker-compose up --build

  7. Open the app at http://localhost:3000/

  8. Observe the errors
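
The compose file referenced in step 3 is attached rather than inlined, so for orientation only, here is a rough sketch of what its relevant services might look like (service names, build contexts, and paths are assumptions, not the actual contents of compose_temp.yaml):

# hypothetical excerpt of compose_temp.yaml; service names and paths are assumptions
services:
  langflow:
    build:
      context: .
      dockerfile: Dockerfile.langflow
    ports:
      - "7860:7860"
    env_file:
      - .env  # the variables the flow components are expected to pick up
  app:
    build:
      context: .
      dockerfile: Dockerfile.app
    ports:
      - "3000:3000"
    depends_on:
      - langflow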

Expected behavior

The environment variables should propagate through to components in the flow.

Who can help?

No response

Operating System

Mac M1 (Apple Silicon)

Langflow Version

1.0.18

Python Version

3.10

Screenshot

No response

Flow File

Flow file is attached above.

dosubot[bot] commented 2 months ago

There are known issues with running LangFlow on Mac M1 (Apple Silicon) that could affect environment variable loading. Specifically, the environment variable OBJC_DISABLE_INITIALIZE_FORK_SAFETY needs to be set to YES to avoid errors when running gunicorn. This is handled in the set_var_for_macos_issue function:

import platform

from loguru import logger  # assumed import: Langflow's module-level logger uses loguru


def set_var_for_macos_issue():
    # OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES
    # we need to set this var if we are running on macOS,
    # otherwise we get an error when running gunicorn

    if platform.system() in ["Darwin"]:
        import os

        os.environ["OBJC_DISABLE_INITIALIZE_FORK_SAFETY"] = "YES"
        # https://stackoverflow.com/questions/75747888/uwsgi-segmentation-fault-with-flask-python-app-behind-nginx-after-running-for-2 # noqa
        os.environ["no_proxy"] = "*"  # to avoid error with gunicorn
        logger.debug("Set OBJC_DISABLE_INITIALIZE_FORK_SAFETY to YES to avoid error")

Additionally, ensure that you have copied the .env.example file to .env and filled in the necessary values. Here is an example of how your .env file should look:

LANGFLOW_DATABASE_URL=sqlite:///./langflow.db
LANGFLOW_LANGCHAIN_CACHE=SQLiteCache
LANGFLOW_HOST=127.0.0.1
LANGFLOW_WORKERS=1
LANGFLOW_PORT=7860
LANGFLOW_LOG_LEVEL=critical
LANGFLOW_LOG_FILE=logs/langflow.log
LANGFLOW_FRONTEND_PATH=/path/to/frontend/build/files
LANGFLOW_OPEN_BROWSER=false
LANGFLOW_REMOVE_API_KEYS=false
LANGFLOW_CACHE_TYPE=memory
LANGFLOW_AUTO_LOGIN=true
LANGFLOW_SUPERUSER=admin
LANGFLOW_SUPERUSER_PASSWORD=123456
LANGFLOW_STORE_ENVIRONMENT_VARIABLES=false
LANGFLOW_STORE_URL=https://api.langflow.store
LANGFLOW_DOWNLOAD_WEBHOOK_URL=
LANGFLOW_LIKE_WEBHOOK_URL=
LANGFLOW_SECRET_KEY=randomly_generated_secure_key
LANGFLOW_NEW_USER_IS_ACTIVE=false

Make sure to replace the placeholder values with your actual configuration. When running LangFlow in headless mode, you can specify the path to the .env file using the --env-file option in the CLI:

langflow run --env-file /path/to/your/.env

This ensures that the environment variables are correctly loaded from the specified file [1][2][3][4][5].
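
Since the reproduction above runs Langflow inside Docker, a quick way to confirm that the .env file is actually being read is to inspect the environment of the running container. A minimal check, assuming the compose service is named langflow:

# list the Langflow-related variables visible inside the running container
docker-compose exec langflow env | grep LANGFLOW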

To continue talking to Dosu, mention @dosu.

devinbost commented 2 months ago

I get the same behavior when I update my Dockerfile.langflow file to include the .env file (with all of my variables):

FROM langflowai/langflow:1.0.18

WORKDIR /app

COPY flows /app/flows

COPY .env /app/.env

RUN pip install cassio mysql-connector-python oracledb

ENV LANGFLOW_LOAD_FLOWS_PATH=/app/flows

EXPOSE 7860

CMD ["langflow", "run", "--env-file", "/app/.env", "--backend-only", "--host", "0.0.0.0", "--port", "7860"]

italojohnny commented 2 months ago

First, you need to set the following environment variable:

LANGFLOW_STORE_ENVIRONMENT_VARIABLES=true

Next, set the environment variable that specifies which environment variables you want Langflow to convert into global variables:

LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT=OPENAI_KEY,OPENAI_API_KEY,ASTRA_DB_TOKEN,ASTRA_DB_APPLICATION_TOKEN
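
In the Docker setup above, one place to set these is Dockerfile.langflow itself (a sketch only; the ENV lines below just reuse the variable names from this comment, so adjust the list to the variables your flow actually needs):

# hypothetical additions to the Dockerfile.langflow shown earlier
ENV LANGFLOW_STORE_ENVIRONMENT_VARIABLES=true
ENV LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT=OPENAI_KEY,OPENAI_API_KEY,ASTRA_DB_TOKEN,ASTRA_DB_APPLICATION_TOKEN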

After that, you can start your containers and access your flow via the API using something like:

curl -X POST \
    "http://127.0.0.1:7860/api/v1/run/your_flow?stream=false" \
    -H 'Content-Type: application/json' \
    -d '{
      "input_value": "your input",
      "output_type": "chat",
      "input_type": "chat",
      "tweaks": {
        "ChatInput-JyTpr": {},
        "CustomComponent-NvRS3": {
          "example_token": "ASTRA_ENDPOINT",
          "token": "ASTRA_DB_TOKEN",
          "database_id": "ASTRA_DB_DATABASE_ID"
        },
        "ParseData-QhRZ0": {},
        "ChatOutput-vSK1c": {}
      }
    }'

In this command, make sure the field names in each component's tweaks are mapped to the global variables you exposed through LANGFLOW_VARIABLES_TO_GET_FROM_ENVIRONMENT in the second step.
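
For reference, the same request from Python would look roughly like this (a sketch assuming the requests library, the same local host and port, and the placeholder flow name and component IDs from the curl example above):

import requests

# same endpoint and payload as the curl call above; flow name and component IDs are placeholders
url = "http://127.0.0.1:7860/api/v1/run/your_flow"
payload = {
    "input_value": "your input",
    "output_type": "chat",
    "input_type": "chat",
    "tweaks": {
        "ChatInput-JyTpr": {},
        "CustomComponent-NvRS3": {
            # field name -> name of the global variable created from the environment
            "example_token": "ASTRA_ENDPOINT",
            "token": "ASTRA_DB_TOKEN",
            "database_id": "ASTRA_DB_DATABASE_ID",
        },
        "ParseData-QhRZ0": {},
        "ChatOutput-vSK1c": {},
    },
}

response = requests.post(url, params={"stream": "false"}, json=payload)
print(response.json())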

This should work!