Closed: LSXAxeller closed this issue 1 year ago

Will it support local LLM models like LLaMA and RWKV through local APIs such as the Oobabooga Text Generation WebUI? This would solve the problems of insufficient OpenAI credits, limited tokens, and the countries where ChatGPT access is still unavailable.
Currently you can use a proxy to resolve the access restrictions, and we will support more model choices in the future.
I'm not so good with Python and networking, so can you please enlighten me on how to configure this proxy? I tried to edit llm_basic.py and replace the OpenAI API calls with `requests` calls to my localhost LLM, but still no luck.
You can set up your own VPN, or the easiest way is to buy a service from a VPN provider. Sorry I can't give you more advice than that.
In addition, modifying llm_basic.py is the right approach; you need to make sure the output format is the same as before. However, this is limited by the capabilities of different LLMs, so it may not work.
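For example, the modified call could look roughly like the sketch below. This is a minimal illustration, not the real llm_basic.py code: the endpoint URL, payload fields, and the `chat_completion` name are assumptions, and your local backend's actual request/response schema will differ.

```python
import requests

# Hypothetical local backend endpoint; adjust to whatever your server exposes.
LOCAL_API_URL = "http://localhost:5000/api/v1/generate"

def chat_completion(prompt: str, max_tokens: int = 512) -> dict:
    """Call a local LLM, then reshape its reply into the OpenAI-style
    structure that the rest of the code expects to parse."""
    resp = requests.post(
        LOCAL_API_URL,
        json={"prompt": prompt, "max_new_tokens": max_tokens},  # backend-specific fields
        timeout=120,
    )
    resp.raise_for_status()
    text = resp.json()["results"][0]["text"]  # backend-specific response shape

    # Mimic the OpenAI chat completion format so downstream code that reads
    # choices[0]["message"]["content"] keeps working unchanged.
    return {"choices": [{"message": {"role": "assistant", "content": text}}]}
```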
I had almost forgotten about this issue, but thanks for the help anyway. I found that the OpenAI API is actually compatible with the LLM backends I use, like RWKV Runner and the Oobabooga WebUI, without any need for a proxy; you just edit the API link. It still doesn't work, though, because the local LLM's response format differs from ChatGPT's, so maybe I will just wait for an official implementation.
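For anyone who finds this later, "edit the API link" amounts to pointing the stock `openai` client at the local server. Here is a minimal sketch with openai>=1.0, assuming an OpenAI-compatible backend listening on localhost; the port and model name are guesses for my setup (Oobabooga and RWKV Runner each have their own defaults):

```python
from openai import OpenAI

# Point the client at the local OpenAI-compatible server instead of
# api.openai.com. Most local backends ignore the API key, but the
# client requires a non-empty string.
client = OpenAI(base_url="http://127.0.0.1:5000/v1", api_key="sk-local")

response = client.chat.completions.create(
    model="local-model",  # placeholder; many local servers ignore or remap this
    messages=[{"role": "user", "content": "Hello from a local backend"}],
)
print(response.choices[0].message.content)
```

Even with this, small schema differences in the local server's responses can still break stricter parsers, which is the problem I hit.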