When I run litellm on my laptop, it shows the error below, even though it worked two days ago and I can still import litellm in Python.
Could you help me figure out what's wrong?
Environment
Python 3.10.14
litellm 1.40.14
ollama 0.1.44
Log
(torch)
~ via ⬢ v16.4.0 via 🐍 torch
❯ litellm -v
LiteLLM: Current Version = 1.40.14
(torch)
~ via ⬢ v16.4.0 via 🐍 torch
❯ litellm --model ollama/llama3
INFO: Started server process [12027]
INFO: Waiting for application startup.
#------------------------------------------------------------#
#                                                            #
#      'This feature doesn't meet my needs because...'       #
#       https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#
Thank you for using LiteLLM! - Krrish & Ishaan
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
INFO: Application startup complete.
ERROR: [Errno 8] nodename nor servname provided, or not known
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
(torch)
~ via ⬢ v16.4.0 via 🐍 torch
❯
Calling on Python
(torch)
~ via ⬢ v16.4.0 via 🐍 torch
❯ python
Python 3.10.14 (main, May 6 2024, 14:42:37) [Clang 14.0.6 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import litellm
>>>
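For reference, "[Errno 8] nodename nor servname provided, or not known" is the macOS getaddrinfo() failure message, which Python surfaces as socket.gaierror when a hostname cannot be resolved. A minimal sketch that reproduces the same error class (the hostname "example.invalid" is a placeholder; the ".invalid" TLD is reserved and never resolves):

```python
import socket

# Attempt to resolve a hostname that is guaranteed not to exist.
# On macOS this raises socket.gaierror with
# "[Errno 8] nodename nor servname provided, or not known";
# on Linux the errno and wording differ, but the exception type is the same.
try:
    socket.getaddrinfo("example.invalid", 4000)
except socket.gaierror as exc:
    print(exc)
```

This suggests the server process is failing to resolve some hostname at startup (for example the machine's own hostname, or the Ollama endpoint), rather than litellm itself being broken, which would match the import still working.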