Closed · AmmarkoV closed this 3 months ago
I personally feel that adding an LLM to this bot is beyond the scope of this project. To answer your question, you're looking for the function `message_received` in `mumbleBot.py`. You should create a function that takes the input, processes it with the LLM, and then sends the message back (assuming the user is not trying to run a command, which you can check by testing whether the message starts with the command character, `!` by default).
https://github.com/azlux/botamusique/blob/f046a3c177286cfc5e0aab5782948941a1684311/mumbleBot.py#L269
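A minimal sketch of the dispatch logic described above. All names here are illustrative, not botamusique's actual API: `is_command` mirrors the command-character check, and `llm_respond` stands in for whatever llama.cpp binding you end up using (e.g. an HTTP call to a llama.cpp server), injected as a callable so it can be swapped or mocked.

```python
def is_command(message: str, command_char: str = "!") -> bool:
    """Return True if the message should be treated as a bot command."""
    return message.startswith(command_char)


def handle_chat_message(username: str, message: str, llm_respond, send_message) -> bool:
    """Dispatch one chat message.

    Commands are left alone (return False so the existing command handler
    runs); everything else is forwarded to the LLM back-end and the reply
    is sent back to the channel via `send_message`.
    """
    if is_command(message):
        return False  # let the bot's normal command handling take over
    reply = llm_respond(f"{username}: {message}")
    send_message(reply)
    return True
```

You would call something like this from inside `message_received`, before (or instead of) the command parsing for non-command text.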
Hope this helps!
I should also note that this would be a good candidate for a 3rd-party optional library (PR #228).
Hello,
First of all, thanks for the great work developing this bot! I would like to add a ChatGPT-like "feature" where the bot can "respond" to regular chat input, using llama.cpp as a back-end: https://github.com/ggerganov/llama.cpp
To set up and test llama.cpp
Where in your codebase would be the right place to implement a binding for such behaviour?
I would expect a function along the lines of newChatMessage(username, message).
Thank you very much