Closed: RokeJulianLockhart closed this issue 1 year ago.
I can easily connect the chatbot to other messaging platforms.
Are you looking for a fully offline experience, or are you OK with using OpenAI's GPT-4 as the LLM?
@EniasCailliau, I'd like to use the chatbot locally. That part would be easy enough to do myself (by swapping the relevant ChatGPT API calls for code that interacts with a local LLaMA LLM), but the difficult bit, for me, would be reprogramming it to interact with a Matrix server.
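To illustrate the kind of swap meant here: many local LLaMA runners (llama.cpp's server, for example) expose an OpenAI-compatible `/v1/chat/completions` endpoint, so the ChatGPT calls could be pointed at localhost instead. This is a minimal sketch under that assumption; the base URL, port, model name, and function names are illustrative, not code from the GirlfriendGPT project:

```python
import json
import urllib.request

# Assumed local OpenAI-compatible endpoint (e.g. llama.cpp's server);
# adjust host/port to wherever the local LLM is actually listening.
LOCAL_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt: str, model: str = "llama") -> dict:
    """Build the JSON payload the Chat Completions API shape expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str) -> str:
    """POST the prompt to the local endpoint and return the reply text."""
    payload = build_chat_request(prompt)
    req = urllib.request.Request(
        f"{LOCAL_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Standard Chat Completions response shape: first choice's message.
    return body["choices"][0]["message"]["content"]
```

Because the request and response shapes match the hosted API, only the base URL (and any auth header) should need to change in the bot's existing code.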
Having it interact with a Matrix server would allow me to simply set the desired server to localhost, and thus have it communicate locally via a standard chat interface – a Matrix client.
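On the Matrix side, the Client-Server API sends a room message with a `PUT` to `/_matrix/client/v3/rooms/{roomId}/send/m.room.message/{txnId}`, which works the same against a localhost homeserver. A minimal sketch of building that request (the homeserver URL and helper names are assumptions for illustration, not code from this project):

```python
import json
import urllib.parse
import urllib.request

# Assumed local homeserver address (e.g. a Synapse instance on loopback).
HOMESERVER = "http://localhost:8008"

def build_send_url(room_id: str, txn_id: str) -> str:
    """URL for the Matrix Client-Server 'send message event' endpoint."""
    room = urllib.parse.quote(room_id, safe="")  # room IDs contain '!' and ':'
    return f"{HOMESERVER}/_matrix/client/v3/rooms/{room}/send/m.room.message/{txn_id}"

def build_text_event(body: str) -> dict:
    """Content for a plain-text m.room.message event."""
    return {"msgtype": "m.text", "body": body}

def send_message(access_token: str, room_id: str, txn_id: str, text: str) -> None:
    """PUT the event to the homeserver (requires a valid access token)."""
    req = urllib.request.Request(
        build_send_url(room_id, txn_id),
        data=json.dumps(build_text_event(text)).encode(),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="PUT",
    )
    urllib.request.urlopen(req)
```

In practice a library such as matrix-nio would handle login, sync, and transaction IDs, but the point is that nothing here cares whether the homeserver is remote or on loopback.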
https://github.com/EniasCailliau/GirlfriendGPT/issues/30#event-9450737498
Is this actually complete? I'm rather surprised the issue was closed so soon.
https://www.youtube.com/watch?v=LiN3D1QZGQw&lc=Ugxwren6JeJqXEFqgVt4AaABAg.9q2rujGGtVa9qR0E2utFQl
Original comment
My response
I realized when typing that response that it doesn't seem like a bad idea. It'd allow users to use the bot offline, since getting it to communicate with a local Matrix server via the loopback (localhost) address would be fairly trivial.