Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
git clone https://github.com/BerriAI/litellm.git
cd litellm
poetry install -E extra_proxy -E proxy
cd litellm/tests/
pytest test_bedrock_completion.py::test_completion_bedrock_cloudflare_ai_gateway -s -v
Relevant log output
ImportError while loading conftest '/workspaces/litellm/litellm/tests/conftest.py'.
conftest.py:9: in <module>
import litellm
../__init__.py:2: in <module>
import threading, requests, os
E ModuleNotFoundError: No module named 'requests'
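The traceback shows the failure happens while pytest is loading conftest.py, i.e. at import time, before any test runs. A minimal sketch of the same failure mode (the module name below is deliberately nonexistent):

```python
import importlib

# Importing a package whose top-level code depends on a missing module
# raises ModuleNotFoundError before any test can execute -- the same
# failure mode pytest reports when conftest.py does `import litellm`.
try:
    importlib.import_module("no_such_dependency_xyz")
except ModuleNotFoundError as exc:
    print("import failed, missing module:", exc.name)
```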
What happened?
When following https://github.com/BerriAI/litellm?tab=readme-ov-file#contributing, it looks like the `requests` module is missing.
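One way to confirm that the environment running the tests is actually missing `requests` is a small import check (a sketch, assuming it is run with the same interpreter pytest uses, e.g. via `poetry run python`):

```python
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` is importable in the current environment."""
    return importlib.util.find_spec(name) is not None

# litellm/__init__.py imports these at module level, so all must resolve:
for dep in ("threading", "requests", "os"):
    print(f"{dep}: {'ok' if module_available(dep) else 'MISSING'}")
```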
Twitter / LinkedIn details
https://www.linkedin.com/in/davidmanouchehri/