kuafuai / DevOpsGPT

Multi-agent system for AI-driven software development. Combines LLMs with DevOps tools to convert natural language requirements into working software. Supports any development language and extends existing code.
https://www.kuafuai.net

Custom LLMs & APIs Endpoints #64

Closed LSXAxeller closed 1 year ago

LSXAxeller commented 1 year ago

Will it support local LLM models like LLaMA and RWKV through local APIs such as the oobabooga Text Generation WebUI? This would solve the problems of insufficient OpenAI credits, limited tokens, and the countries where ChatGPT access is still unavailable.

booboosui commented 1 year ago

Currently, you can use a proxy to resolve access restrictions, and we will support more model choices in the future.

LSXAxeller commented 1 year ago

> Currently, you can use proxy to resolve access restrictions, and we will support more model choices in the future

I'm not so good with Python and networking, so could you please enlighten me on how to configure this proxy? I tried to edit llm_basic.py and replace the OpenAI API calls with `requests` calls pointing at my localhost LLM, but still no luck.

booboosui commented 1 year ago

You can set up your own VPN, or the easiest way is to buy a related service from a VPN provider; sorry, I can't give you more advice.

In addition, modifying llm_basic.py is the right way. You need to make sure the output format is the same as before; however, this is limited by the capabilities of different LLMs, so it may not work.
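To illustrate the "output format must match" point, here is a minimal sketch (not DevOpsGPT's actual code) of a hypothetical helper that wraps a raw completion string from a local model in the ChatCompletion response shape that code written against the OpenAI API expects:

```python
import time
import uuid

def wrap_as_chat_completion(text: str, model: str = "local-model") -> dict:
    """Wrap a raw local-LLM completion string in an OpenAI
    ChatCompletion-style response dict (hypothetical helper)."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
        # Token counts are unknown for many local backends; zeros as placeholders.
        "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
    }
```

With a shim like this, downstream code that reads `resp["choices"][0]["message"]["content"]` keeps working regardless of which backend produced the text.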

LSXAxeller commented 1 year ago

I almost forgot about this issue, but thanks for the help. I found that the OpenAI API is actually compatible with the LLM backends I use, like RWKV Runner and the Oobabooga WebUI, without any need for a proxy: just edit the API link. Still, I can't use it because the local LLM's response format differs from ChatGPT's, so maybe I'll just wait for an official implementation.
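For reference, "just edit the API link" can look like the sketch below, which sends an OpenAI-style chat request to a local OpenAI-compatible server. The base URL, port, and model name are placeholders; check your own backend's settings (backends such as RWKV Runner and text-generation-webui expose a `/v1/chat/completions` route when their OpenAI-compatible API is enabled):

```python
import json
import urllib.request

def build_chat_request(base_url: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON payload for an OpenAI-style chat request
    against a local backend (base URL and model name are examples)."""
    url = f"{base_url.rstrip('/')}/v1/chat/completions"
    payload = {
        "model": "local-model",  # many local servers ignore this field
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload

def chat(base_url: str, prompt: str) -> str:
    """POST the request and return the assistant's reply text."""
    url, payload = build_chat_request(base_url, prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage would be e.g. `chat("http://127.0.0.1:8000", "Hello")`, assuming the local server is running on that port. The remaining gap this comment describes is exactly the response-format mismatch: the call only works end to end if the backend returns the standard `choices[0].message.content` structure.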