BerriAI / litellm

Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
https://docs.litellm.ai/docs/

[Bug]: How do I stop import litellm from loading .env? #4361

Open paul-gauthier opened 6 days ago

paul-gauthier commented 6 days ago

What happened?

The proxy docs mention that setting LITELLM_MODE=PRODUCTION stops litellm from loading .env. But that setting doesn't seem to stop import litellm from loading a .env file.

I also can't determine which .env file is being loaded when there is no .env in the current dir or in my home dir. It's loading one of my .env files, but I can't tell which one for sure.
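For reference, python-dotenv's load_dotenv(), when given no explicit path, searches upward through parent directories for a .env file (starting from the working directory or the calling file's location, depending on configuration), which may be why a .env outside the current dir gets picked up. A minimal stdlib sketch of that upward search, useful for locating the mystery file (find_dotenv_upward is a hypothetical helper written here for illustration, not part of python-dotenv):

```python
import tempfile
from pathlib import Path
from typing import Optional

def find_dotenv_upward(start: Path) -> Optional[Path]:
    # Walk from `start` up to the filesystem root and return the first
    # .env file found -- mirroring the kind of upward search python-dotenv
    # performs when no explicit path is given.
    for directory in [start, *start.parents]:
        candidate = directory / ".env"
        if candidate.is_file():
            return candidate
    return None
```

Calling find_dotenv_upward(Path.cwd()) prints which .env would be found from the current directory, which should identify the file being loaded.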

Relevant log output

import os
from pathlib import Path

# Start clean: remove any .env in the current directory
dotenv = Path(".env")
if dotenv.exists():
    dotenv.unlink()

print(os.environ.get("FROM_DOT_ENV"))  # None -- not set yet

# Write a fresh .env and ask litellm not to load it
dotenv.write_text("FROM_DOT_ENV=True")
os.environ["LITELLM_MODE"] = "PRODUCTION"

import litellm
print(os.environ.get("FROM_DOT_ENV"))  # True -- the .env was loaded anyway


krrishdholakia commented 6 days ago

hey @paul-gauthier i haven't run your script yet - but the only place the module should be calling load_dotenv is in __init__.py, and that's behind the LITELLM_MODE flag - https://github.com/BerriAI/litellm/blob/0fd9033502c8da8759f26693298ce6d07555f1ac/litellm/__init__.py#L30

Will investigate further with your script - thanks for that
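The gating the maintainer describes would look roughly like the sketch below. This is an illustration of the intended pattern under the assumption stated in the comment above, not litellm's actual code; the load_dotenv call is stubbed out as a comment so the sketch stays self-contained:

```python
import os

def maybe_load_dotenv() -> bool:
    # Sketch of the guard described above: skip .env loading entirely
    # when LITELLM_MODE is set to PRODUCTION, load it otherwise.
    if os.environ.get("LITELLM_MODE") == "PRODUCTION":
        return False
    # from dotenv import load_dotenv; load_dotenv()
    return True
```

One subtlety worth noting when debugging this: the guard runs at import time, so LITELLM_MODE must already be set before the first import litellm executes; Python caches modules, and setting the variable after import has no effect.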