BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

Got this error 'ERROR: [Errno 8] nodename nor servname provided, or not known' #4227

Closed inoue0426 closed 3 months ago

inoue0426 commented 3 months ago

Hi,

When I run litellm on my laptop, it shows the error below, even though I didn't get it two days ago and I can still import the package in Python. Could you help me deal with this?

Environment

Log

(torch) 
~ via ⬢ v16.4.0 via 🅒 torch
➜ litellm -v                 

LiteLLM: Current Version = 1.40.14

(torch) 
~ via ⬢ v16.4.0 via 🅒 torch
➜ litellm --model ollama/llama3
INFO:     Started server process [12027]
INFO:     Waiting for application startup.

#------------------------------------------------------------#
#                                                            #
#       'This feature doesn't meet my needs because...'       #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new

INFO:     Application startup complete.
ERROR:    [Errno 8] nodename nor servname provided, or not known
INFO:     Waiting for application shutdown.
INFO:     Application shutdown complete.
(torch) 
~ via ⬢ v16.4.0 via 🅒 torch
➜ 
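For context, `[Errno 8] nodename nor servname provided, or not known` is the macOS `getaddrinfo` error, typically raised when a hostname (often the machine's own hostname) cannot be resolved. A minimal sketch for checking whether local hostname resolution is the culprit (the `can_resolve` helper is illustrative, not part of LiteLLM):

```python
import socket


def can_resolve(host: str) -> bool:
    """Return True if `host` resolves to at least one address."""
    try:
        socket.getaddrinfo(host, None)
        return True
    except socket.gaierror:
        # On macOS this surfaces as:
        # [Errno 8] nodename nor servname provided, or not known
        return False


if __name__ == "__main__":
    hostname = socket.gethostname()
    status = "resolvable" if can_resolve(hostname) else "NOT resolvable"
    print(f"{hostname}: {status}")
```

If the machine's own hostname is not resolvable, adding it to `/etc/hosts` (mapped to `127.0.0.1`) is a common workaround for this class of error.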

Importing in Python

(torch) 
~ via ⬢ v16.4.0 via 🅒 torch
➜ python 
Python 3.10.14 (main, May  6 2024, 14:42:37) [Clang 14.0.6 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import litellm
>>> 
inoue0426 commented 3 months ago

I am not sure why, but I can run it after creating a new conda env.
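For anyone else landing here, a hedged sketch of that workaround (env name and Python version are illustrative; the root cause was never identified in this thread):

```shell
# Recreate a fresh conda environment and reinstall litellm
conda create -y -n litellm-fresh python=3.10
conda activate litellm-fresh
pip install litellm
litellm --model ollama/llama3   # retry the original command
```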