victordibia / autogen-ui

Web UI for AutoGen (A Framework for Multi-Agent LLM Applications)
MIT License

Ability to run locally #9

Open iplayfast opened 8 months ago

iplayfast commented 8 months ago

I'm able to run autogen using a config list like:

```python
config_list = [
    {
        "model": "mistral-7b-instruct-v0.1.Q5_0.gguf",  # "mistral-instruct-7b", the name of your running model
        "api_base": "http://0.0.0.0:5001/v1",  # the local address of the api
        "api_type": "open_ai",
        "api_key": "sk-111111111111111111111111111111111111111111111111",  # just a placeholder
    }
]
```

This talks to text-generation-webui with its OpenAI API emulation turned on. It would be nice to have a similar way of using a local LLM with autogen-ui.
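As a hedged sketch of the setup above: before wiring the config into autogen, it helps to confirm the local server actually answers at the expected OpenAI-compatible path (e.g. `GET <api_base>/models`). The helper below only builds that URL from the config entry; the model name, address, and placeholder key are taken from the snippet above and are assumptions about your local setup, not autogen-ui behavior.

```python
# Sketch: derive the /models probe URL from an autogen-style config entry.
# The values below mirror the config in this thread; adjust for your server.
config_list = [
    {
        "model": "mistral-7b-instruct-v0.1.Q5_0.gguf",
        "api_base": "http://0.0.0.0:5001/v1",  # local OpenAI-compatible endpoint
        "api_type": "open_ai",
        "api_key": "sk-placeholder",  # local servers typically ignore the key
    }
]

def models_endpoint(cfg: dict) -> str:
    """Return the URL an OpenAI-compatible server should answer for model listing."""
    return cfg["api_base"].rstrip("/") + "/models"

print(models_endpoint(config_list[0]))
```

You can then check the endpoint manually, e.g. with `curl http://0.0.0.0:5001/v1/models`; if that request fails, autogen (and any UI on top of it) will fail too.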

hieuminh65 commented 8 months ago

Hey, are you trying to use Runpods here? I don't get any errors with this.

iplayfast commented 8 months ago

Maybe I'm just not doing it right, but I wasn't able to get it to work.

victordibia commented 8 months ago

Recording of the autogen-ui main steps:

https://github.com/victordibia/autogen-ui/assets/1547007/d560957c-7e13-47a7-a80d-da75695217bd

dlaliberte commented 6 months ago

> Recording of the autogenui. Main steps

The OP's issue is about how to run autogen-ui with a local LLM service, not how to run autogen-ui locally. I don't see instructions for configuring a local LLM, or any LLM service besides ChatGPT.
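For anyone landing here, a hedged sketch of what such a config could look like. Note the field names in autogen's config changed across versions: older `pyautogen` releases used `api_base`/`api_type` (as in the OP's snippet), while newer releases (roughly 0.2+) use `base_url`. Whether autogen-ui picks this up depends on how it loads its `llm_config`, which is an assumption here, not documented behavior; the helper name `local_llm_config` is hypothetical.

```python
# Hedged sketch: build an AutoGen-style config entry for a local
# OpenAI-compatible server (e.g. text-generation-webui's API emulation).
# Field names vary by pyautogen version, so both shapes are supported.

def local_llm_config(model: str, url: str, legacy: bool = False) -> dict:
    """Return one config_list entry pointing at a local endpoint."""
    entry = {"model": model, "api_key": "sk-placeholder"}  # key usually ignored locally
    if legacy:
        # Older pyautogen style, matching the OP's config.
        entry.update({"api_base": url, "api_type": "open_ai"})
    else:
        # Newer pyautogen style (>= ~0.2).
        entry["base_url"] = url
    return entry

config_list = [
    local_llm_config("mistral-7b-instruct-v0.1.Q5_0.gguf", "http://0.0.0.0:5001/v1")
]
print(config_list[0]["base_url"])
```

If autogen-ui exposed a way to pass such a `config_list` (or read it from an `OAI_CONFIG_LIST`-style file, as autogen itself does), that would address this issue.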