microsoft / autogen

A programming framework for agentic AI 🤖
https://microsoft.github.io/autogen/

[Issue]: Connect error with local LLM #2149

Open nothingness6 opened 7 months ago

nothingness6 commented 7 months ago

Describe the issue

A connect error occurred while using local LLM, Ollama/Mistral.

Steps to reproduce

Tried entering a different address from the command prompt (CMD).

Screenshots and logs

[Screenshots: AutoGen-01, AutoGen-02, AutoGen-03]

Additional Information

AutoGen Version: Latest
Operating System: Windows 10 64-bit
Python Version: 3.11
Related Issues: N/A

I followed this instruction: https://www.youtube.com/watch?v=mUEFwUU0IfE&t=126s
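For context, the setup in the linked video runs a local model through the LiteLLM proxy in front of Ollama. A minimal sketch of that setup, assuming `ollama` and `litellm` are installed (the default port and flags may differ by LiteLLM version):

```shell
# Pull the Mistral model for Ollama, then expose it via the LiteLLM
# proxy so AutoGen can talk to it over an OpenAI-compatible endpoint.
ollama pull mistral
litellm --model ollama/mistral --port 4000
```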

jackgerrits commented 7 months ago

@marklysze @victordibia Have you got any ideas here?

marklysze commented 7 months ago

Hi @nothingness6 and @jackgerrits, for LiteLLM you shouldn't need to add the "/v1" to the URL.

@nothingness6, can you try http://localhost:4000 as your URL
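To illustrate the suggestion, here is a minimal sketch of an AutoGen `config_list` entry pointing at a LiteLLM proxy on port 4000, with no trailing `/v1`. The model name and `api_key` placeholder are assumptions; local proxies typically ignore the key:

```python
# Hypothetical AutoGen config_list entry for a LiteLLM proxy running
# at http://localhost:4000 (note: no "/v1" suffix on the base URL).
config_list = [
    {
        "model": "ollama/mistral",          # model name as served by LiteLLM
        "base_url": "http://localhost:4000",  # proxy root, not .../v1
        "api_key": "NULL",                  # placeholder; local proxy ignores it
    }
]
```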

nothingness6 commented 7 months ago

[Screenshot: autogen]

Hi, Mark. I gave it a shot, but it still gave me an error.

> Hi @nothingness6 and @jackgerrits, for LiteLLM you shouldn't need to add the "/v1" to the URL.
>
> @nothingness6, can you try http://localhost:4000 as your URL

marklysze commented 7 months ago

Okay, what happens when you put http://localhost:4000/ into your browser? Can you try http://localhost:4000/v1 as well?
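The same check can be run from CMD with `curl`, which shows the HTTP status codes directly. This is only a diagnostic sketch; the exact routes the proxy serves depend on the LiteLLM version:

```shell
# Probe the proxy root and the OpenAI-compatible models route.
# -i prints response headers so you can see the status code.
curl -i http://localhost:4000/
curl -i http://localhost:4000/v1/models
```

A connection-refused error here means the proxy is not listening on port 4000 at all, while a 404 means it is up but that route does not exist.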

qingyun-wu commented 5 months ago

Hi @nothingness6, is this issue addressed by @marklysze's suggestions? Thanks!