BerriAI / litellm

Python SDK, Proxy Server to call 100+ LLM APIs using the OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Feature]: add library stubs to LiteLLM #1540

Open marhar opened 7 months ago

marhar commented 7 months ago

The Feature

Please add library stubs so that mypy and other Python type checkers can verify proper litellm usage.

mh exercises/litellm_examples% mypy --strict litellm_ollama_1.py 
litellm_ollama_1.py:5: error: Cannot find implementation or library stub for module named "litellm"  [import-not-found]
litellm_ollama_1.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports
Found 1 error in 1 file (checked 1 source file)
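For reference, a PEP 561 stub could start as a single .pyi file shipped inside the package. The signature below is a hedged guess at the shape of litellm.completion, not its actual annotations:

```python
# litellm/__init__.pyi -- hypothetical stub sketch; real parameters may differ
from typing import Any

def completion(
    model: str,
    messages: list[dict[str, str]],
    **kwargs: Any,
) -> Any: ...
```

Alternatively, annotating the source inline and adding an empty py.typed marker file (per PEP 561) would let type checkers pick up the hints without maintaining separate stubs.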

Motivation, pitch

mypy --strict helps improve the quality of Python code. It would be great if calls to LiteLLM could be verified too!

Twitter / LinkedIn details

No response

jamesbraza commented 2 months ago

@marhar do you mind adding a title to this issue?


Also, here is one missing type hint that would be useful to add, observed in litellm==1.40.9:

foo.py:221:39: error: "ModelResponse" has no attribute "usage"  [attr-defined]
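Until ModelResponse is fully annotated, one generic workaround under mypy --strict is to go through getattr plus typing.cast. The class below is a self-contained stand-in for illustration, not litellm's actual ModelResponse:

```python
from typing import cast


class ModelResponse:
    """Stand-in: the real class populates attributes dynamically,
    so mypy --strict cannot see them."""

    def __init__(self) -> None:
        self.__dict__["usage"] = {"total_tokens": 42}


response = ModelResponse()

# Direct access (response.usage) fails with [attr-defined];
# getattr + cast typechecks, and still raises at runtime if absent.
usage = cast(dict[str, int], getattr(response, "usage"))
print(usage["total_tokens"])  # -> 42
```

cast is a no-op at runtime, so this only silences the checker; proper stubs remain the real fix.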
ishaan-jaff commented 2 months ago

> do you mind adding a title to this issue?

added @jamesbraza

> Also, one type hint that would be useful in litellm==1.40.9:

We'd love a PR on this @jamesbraza - it's been highly requested