BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: `requests` missing when testing #3495

Closed: Manouchehri closed this issue 4 months ago

Manouchehri commented 4 months ago

What happened?

When following the contributing instructions at https://github.com/BerriAI/litellm?tab=readme-ov-file#contributing, the `requests` module appears to be missing:

git clone https://github.com/BerriAI/litellm.git
cd litellm
poetry install -E extra_proxy -E proxy
cd litellm/tests/
pytest test_bedrock_completion.py::test_completion_bedrock_cloudflare_ai_gateway -s -v

Relevant log output

ImportError while loading conftest '/workspaces/litellm/litellm/tests/conftest.py'.
conftest.py:9: in <module>
    import litellm
../__init__.py:2: in <module>
    import threading, requests, os
E   ModuleNotFoundError: No module named 'requests'
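A `ModuleNotFoundError` like this usually means pytest resolved to an interpreter that never had the project's dependencies installed, not that the dependency is absent from the lockfile. A quick, generic way to check (nothing here is litellm-specific) is to ask the running interpreter where it lives and whether it can see the package:

```python
import importlib.util
import sys

def can_import(name: str) -> bool:
    """True if `name` is importable by the current interpreter."""
    return importlib.util.find_spec(name) is not None

# Under `poetry run`, sys.executable points into the project virtualenv and
# `requests` should resolve; under a bare system Python it may not.
print("interpreter:", sys.executable)
print("requests importable:", can_import("requests"))
```

If this prints a system Python path rather than a Poetry virtualenv path, the test command is running outside the environment `poetry install` populated.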

Twitter / LinkedIn details

https://www.linkedin.com/in/davidmanouchehri/

Manouchehri commented 4 months ago

Oh, I needed to run this instead:

poetry run pytest test_bedrock_completion.py::test_completion_bedrock_cloudflare_ai_gateway -s -v
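This works because `poetry install` puts dependencies into a project-local virtualenv rather than the system Python, and `poetry run` executes the command inside that environment. An equivalent approach (a sketch, assuming Poetry 1.x where `poetry shell` is available) is to activate the virtualenv once and then invoke pytest directly:

```shell
# Activate the Poetry-managed virtualenv in a subshell, then run the test
# without the `poetry run` prefix. Assumes you are in litellm/litellm/tests.
poetry shell
pytest test_bedrock_completion.py::test_completion_bedrock_cloudflare_ai_gateway -s -v
```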