azlux / botamusique

Bot to play youtube / soundcloud / radio / local music on Mumble (using pymumble).
MIT License
312 stars 79 forks

Integration with a Large Language Model like LLama #372

Closed: AmmarkoV closed 3 months ago

AmmarkoV commented 1 year ago

Hello,

First of all, thanks for the great work developing this bot! I would like to add a ChatGPT-like "feature" where the bot can "respond" to regular chat input, using llama.cpp as a back-end: https://github.com/ggerganov/llama.cpp

To set up and test llama.cpp:

```shell
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
cd models
wget https://huggingface.co/eachadea/ggml-vicuna-7b-1.1/resolve/main/ggml-vic7b-uncensored-q5_1.bin
cd ..
./main -m ./models/ggml-vic7b-uncensored-q5_1.bin -n 256 --repeat_penalty 1.0 --color -i -r "User:" -f prompts/chat-with-bob.txt
```
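Since botamusique is written in Python, the simplest way to drive the compiled binary is a one-shot `subprocess` call. Here is a minimal sketch, assuming the binary path `./main` and the model file from the commands above; llama.cpp's CLI flags have changed over time, so check them against your build:

```python
import subprocess

# Paths from the setup steps above (adjust to your checkout).
LLAMA_BIN = "./main"
MODEL = "./models/ggml-vic7b-uncensored-q5_1.bin"

def build_llama_cmd(prompt, n_tokens=256):
    """Build the llama.cpp command line for a single non-interactive completion."""
    return [
        LLAMA_BIN,
        "-m", MODEL,
        "-n", str(n_tokens),
        "--repeat_penalty", "1.0",
        "-p", prompt,   # one-shot prompt instead of the interactive -i/-r mode
    ]

def ask_llama(prompt, timeout=120):
    """Run llama.cpp synchronously and return its stdout.

    Blocking: in a bot you would want to run this in a worker thread
    so the Mumble connection keeps responding while the model generates.
    """
    result = subprocess.run(
        build_llama_cmd(prompt),
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout
```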

Where in your codebase would be the right place to implement a binding for such behaviour?

I would expect a function like newChatMessage(username, message).

Thank you very much

luca0N commented 1 year ago

I personally feel that adding an LLM to this bot is beyond the scope of this project.

To answer your question: you're looking for the function message_received in mumbleBot.py. You should create a function that takes the input, processes it with the LLM, and then sends the message back, assuming the user is not trying to run a command (check whether the message starts with the command character, which is ! by default). https://github.com/azlux/botamusique/blob/f046a3c177286cfc5e0aab5782948941a1684311/mumbleBot.py#L269
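A sketch of the shape described above, with the LLM call and the message-sending helper passed in as parameters, since this does not use botamusique's real internal API (both `llm` and `send` are hypothetical stand-ins; inside message_received you would wire them to the actual llama.cpp call and the bot's channel-message helper):

```python
COMMAND_CHAR = "!"  # botamusique's default command character

def handle_chat_message(message, llm, send):
    """Answer a plain chat message with the LLM; skip bot commands.

    Returns True if the message was handled here, False if it should
    fall through to the normal command parser in message_received().
    `llm` maps a prompt string to a reply; `send` posts text to the
    channel (both injected, so this hook stays testable in isolation).
    """
    if message.startswith(COMMAND_CHAR):
        return False  # e.g. "!play" is a command, not chat input
    send(llm(message))
    return True
```

Keeping the hook free of globals makes it easy to unit-test with a fake `llm`, and mirrors the `newChatMessage(username, message)` entry point the original poster asked for.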

Hope this helps!

luca0N commented 1 year ago

I should also note that this would be a good candidate for an optional third-party library (PR #228).