Closed — d13g4 closed this issue 1 year ago
You can already do this since https://github.com/matrixgpt/matrix-chatgpt-bot/pull/161
Just use an API for llama that is compatible with the OpenAI API, like this one, and set the bot's reverse-proxy environment variable (CHATGPT_REVERSE_PROXY).
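For anyone trying this, a sketch of what that configuration might look like. The URL, port, and key are placeholders for your own OpenAI-compatible server, and whether the variable expects the base URL or the full completions path may depend on the bot version — check the PR linked above:

```shell
# .env for matrix-chatgpt-bot (values are illustrative, not from this thread)
# Point the bot at a local OpenAI-compatible API instead of api.openai.com.
CHATGPT_REVERSE_PROXY=http://localhost:8000/v1/chat/completions
# Many local servers ignore the key, but the bot may still require one to be set.
OPENAI_API_KEY=sk-placeholder
```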
Hm, maybe I configured it wrong, but that didn't work. I get a 401 with the bot but have no problems with the API when using curl. If wanted/needed/of interest, I will investigate further and attach more information. Maybe it's a misconfiguration, maybe a bug? In any case: I tried exactly that a couple of hours before posting the message above...
Edit: Never mind; I had another configuration active. I will investigate tomorrow; it's very late for me today.
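For reference, a quick way to check the local API outside the bot (the endpoint, port, model name, and key here are assumptions for a typical OpenAI-compatible server, not values from this thread):

```shell
# Query a local OpenAI-compatible chat endpoint directly.
# If this succeeds while the bot gets a 401, the bot's key or
# Authorization-header handling is the likely culprit.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-placeholder" \
  -d '{
    "model": "vicuna-7b-v1.5",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```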
I couldn't verify whether it's working, but I'll take your word for it. Unfortunately my AI isn't working with LocalAI because... I'm not sure why, but I am not the only one and an issue is already open. Thanks for answering!
Hi! I am running my own llama-based chat AI on my PC (FastChat with Vicuna in my case), which provides an OpenAI-compatible API. I would like the option to point the bot at my own API URL; I am sure it could be a really nice feature. Sorry for posting this as an issue, I didn't know where else to ask — I hope you will consider it anyway.
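In case it helps others reproduce this setup: FastChat's OpenAI-compatible API server is typically started with something like the following (the model path and port are examples, and the exact flags may differ by FastChat version):

```shell
# Start the FastChat controller, a model worker, and the
# OpenAI-compatible API server (run each in its own terminal).
python3 -m fastchat.serve.controller
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000
```

After that, the API should answer at http://localhost:8000/v1, which is the URL you would hand to the bot.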