EniasCailliau / GirlfriendGPT

OSS AI Companion Chatbot - Build your own AI companion in Python using ChatGPT.
2.6k stars 441 forks

Communicate via the Matrix standard (instead of, or in addition to, the Telegram API). #30

Closed RokeJulianLockhart closed 1 year ago

RokeJulianLockhart commented 1 year ago

https://www.youtube.com/watch?v=LiN3D1QZGQw&lc=Ugxwren6JeJqXEFqgVt4AaABAg.9q2rujGGtVa9qR0E2utFQl

Original comment

It's connected to some centralized server, why can't it just be offline like many other open source UIs?

My response

The sole viable offline method of accomplishing this with standard tools would be a local loopback Matrix server running on your PC, interacting with a locally hosted and locally computed version of the AI model. That is certainly deployable, but inconvenient for all but strictly offline contexts. However, since the bot's source code is available at GitHub > EniasCailliau > GirlfriendGPT, only one of two things needs to be done: either create a loopback proxy that pretends to Telegram that it is the official server, spoofing just enough of the API to make it viable to use either the official Telegram client or a local Matrix server, or edit the source code so that the bot outputs to a Matrix server, thereby making it instantly able to communicate via any platform, anywhere, whether offline or not. That's potentially more work than it might seem, depending upon the route taken and your requirements. I'd just make an issue at the aforementioned repo asking the dev to use the Matrix protocol rather than the Telegram API.
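To make the "loopback proxy" route concrete, here is a minimal sketch of a local stand-in for the Telegram Bot API, so an unmodified bot could be pointed at http://localhost:8081 instead of https://api.telegram.org. The method names (getMe, sendMessage) are real Bot API methods; the port, the stored state, and the stub's responses are illustrative assumptions, not anything from this repo.

```python
# Hedged sketch: a tiny loopback stub of the Telegram Bot API.
# getMe and sendMessage are real Bot API method names; everything
# else here (port, stub state, field values) is an assumption.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

SENT_MESSAGES: list[dict] = []  # messages the bot "sent", for local display


def handle_method(method: str, params: dict) -> dict:
    """Return a Bot-API-style {"ok": ...} envelope for a few methods."""
    if method == "getMe":
        return {"ok": True, "result": {"id": 1, "is_bot": True,
                                       "first_name": "loopback-stub"}}
    if method == "sendMessage":
        SENT_MESSAGES.append(params)
        return {"ok": True, "result": {"message_id": len(SENT_MESSAGES)}}
    return {"ok": False, "error_code": 404,
            "description": f"Method not stubbed: {method}"}


class StubHandler(BaseHTTPRequestHandler):
    def do_POST(self) -> None:
        # Bot API URLs look like /bot<token>/<method>
        method = self.path.rsplit("/", 1)[-1]
        length = int(self.headers.get("Content-Length", 0))
        params = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(handle_method(method, params)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def serve(port: int = 8081) -> None:
    """Serve the stub on loopback only."""
    HTTPServer(("127.0.0.1", port), StubHandler).serve_forever()
```

Spoofing enough of the API to satisfy a real client is considerably more work than this, which is why editing the bot's output layer directly is likely the easier route.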

I realized when typing that response that it doesn't seem like a bad idea. It'd allow users to use the bot offline, since getting it to communicate with a local Matrix server via the loopback (localhost) address would be fairly trivial.
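As a rough illustration of how little is involved on the Matrix side, here is a sketch of talking to a loopback homeserver (e.g. a local Synapse on http://localhost:8008) through the Matrix client-server API using only the standard library. The endpoint paths come from the Matrix spec; the homeserver URL, account, and password are illustrative assumptions.

```python
# Hedged sketch: sending a message to a loopback Matrix homeserver.
# Endpoint paths are from the Matrix client-server spec; the URL,
# user, and password below are assumptions for illustration.
import json
import urllib.request
import uuid

HOMESERVER = "http://localhost:8008"  # loopback homeserver (assumption)


def build_message_event(text: str) -> dict:
    """Content for an m.room.message event carrying plain text."""
    return {"msgtype": "m.text", "body": text}


def login(user: str, password: str) -> str:
    """Password login (m.login.password); returns an access token."""
    body = json.dumps({
        "type": "m.login.password",
        "identifier": {"type": "m.id.user", "user": user},
        "password": password,
    }).encode()
    req = urllib.request.Request(
        f"{HOMESERVER}/_matrix/client/v3/login", data=body,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]


def send_message(token: str, room_id: str, text: str) -> dict:
    """PUT an m.room.message event into a room."""
    txn_id = uuid.uuid4().hex  # must be unique per event, per the spec
    url = (f"{HOMESERVER}/_matrix/client/v3/rooms/{room_id}"
           f"/send/m.room.message/{txn_id}")
    req = urllib.request.Request(
        url, data=json.dumps(build_message_event(text)).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="PUT")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the homeserver on localhost, any standard Matrix client pointed at the same address would then act as the chat interface.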

EniasCailliau commented 1 year ago

I can easily connect the chatbot to other messaging platforms.

Are you looking for a full offline experience or are you ok with using OpenAI GPT4 as the LLM?

RokeJulianLockhart commented 1 year ago

@EniasCailliau, I'd like to use the chatbot locally. Swapping the relevant ChatGPT API calls for code that interacts with a local LLaMA LLM would be easy enough to do myself; the difficult bit, for me, would be reprogramming it to interact with a Matrix server.
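For what it's worth, that swap can be fairly mechanical if the local model is served behind an OpenAI-compatible endpoint (llama.cpp's server and several other local runners expose /v1/chat/completions). The sketch below assumes such a server on localhost:8080; the endpoint URL and model name are assumptions, not part of this repo.

```python
# Hedged sketch: keeping the OpenAI-style chat-completion request shape
# but aiming it at a local, OpenAI-compatible server. The URL and model
# name are assumptions and depend on the local runner's configuration.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:8080/v1/chat/completions"  # assumption


def build_chat_request(persona: str, user_message: str) -> dict:
    """OpenAI-compatible chat payload: system persona plus user turn."""
    return {
        "model": "local-llama",  # illustrative; server-dependent
        "messages": [
            {"role": "system", "content": persona},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.8,
    }


def chat(persona: str, user_message: str) -> str:
    """POST the payload to the local server and return the reply text."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(build_chat_request(persona, user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```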

Having it interact with a Matrix server would allow me to simply set the desired server to localhost, and thus have it communicate locally via a standard chat interface: a Matrix client.


https://github.com/EniasCailliau/GirlfriendGPT/issues/30#event-9450737498

Is this actually complete? I'm rather surprised the issue was closed so soon.