innightwolfsleep / text-generation-webui-telegram_bot

LLM telegram bot
MIT License
100 stars 21 forks

Failure to load telegram_bot #212

Open Avroboros opened 2 weeks ago

Avroboros commented 2 weeks ago

Hi, I'm having difficulties loading this on the current versions of aiogram and oobabooga. I tried installing it as a standalone app and it didn't work, because the cmd window kept crashing: it tried to use CUDA 11 instead of CUDA 12 (the version I have).

After this, I decided to download it as an extension for the WebUI. The installation went flawlessly, until the extension is loaded:

image

I don't know if it's a compatibility issue (I tried installing and reinstalling the requirements.txt files of both telegram_bot and oobabooga several times), but I'd really appreciate a solution to this problem.

Avroboros commented 2 weeks ago

I figured out the problem. I've looked deep into the files of this extension, and it turns out the majority of its modules and assets are outdated or discontinued (aiogram is nowadays at 3.X, while this project uses 2.X, and other modules like backoff and urllib3 are outdated as well).

I've spent the afternoon experimenting with this project's main.py in PyCharm, and I fixed all of the issues (however, the console doesn't even appear anymore and the extension silently shuts down after 2 seconds), which suggests there's a massive version discrepancy between oobabooga and telegram_bot.

innightwolfsleep commented 2 weeks ago

I will check it.

p.s. Did you install extensions\telegram_bot\requirements_ext.txt, or run pip install -r extensions\telegram_bot\requirements_app.txt?

p.p.s. There is a workaround - you can launch telegram_bot as a standalone app, which talks to oobabooga's API.

Avroboros commented 2 weeks ago

> I will check it.
>
> p.s. Did you install extensions\telegram_bot\requirements_ext.txt, or run pip install -r extensions\telegram_bot\requirements_app.txt?
>
> p.p.s. There is a workaround - you can launch telegram_bot as a standalone app, which talks to oobabooga's API.

Launching it as an app doesn't work either. Like I said, it's because the app tries to use CUDA 11 instead of 12 (my version), so to at least work around this, the app could include a version check that detects whether the user has CUDA 11 or 12. Also, just to check it out again, I tried installing it right now, but the installation failed because the versions of telegram_bot's modules/packages are older than the ones installed on my system.
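The version check the reporter suggests could be sketched like this (a minimal sketch, not part of the project; the function names are my own, and it assumes `nvcc` is on PATH - detecting via `torch.version.cuda` would be an alternative):

```python
import re
import subprocess


def parse_cuda_major(nvcc_output: str):
    """Extract the major CUDA version from `nvcc --version` output.

    Returns None if no "release X.Y" line is found.
    """
    match = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if match is None:
        return None
    return int(match.group(1))


def detect_cuda_major():
    """Run nvcc and parse its version; None if nvcc is unavailable."""
    try:
        out = subprocess.run(
            ["nvcc", "--version"], capture_output=True, text=True, check=True
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None
    return parse_cuda_major(out)
```

With something like this, the app could warn early ("found CUDA 12, wheels built for CUDA 11") instead of crashing mid-load.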

I can understand this is a pain in the ass (because it is, it's tough), but do you have plans to update this extension to the newer versions of aiogram? That would solve all current and future compatibility issues.

innightwolfsleep commented 2 weeks ago

I rolled back aiogram from 3 to 2... 8 months ago, because oobabooga pins a different pydantic version. Seems much has changed since then.

A complete rework from aiogram2 to aiogram3 takes a lot of time, but I will try to do it later.

But it seems that changing the pydantic version to oobabooga's current pydantic==2.8.0 solves the problem. I just made a clean oobabooga install and a clean telegram-bot install with pydantic==2.8.0... and it works fine (I'm using plain llama_cpp on CPU for testing). Just updated #213.
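For anyone applying this by hand before the update lands, the fix amounts to one pin in the extension's requirements file (only the pydantic line comes from this thread; where exactly it sits in requirements_ext.txt may differ between versions):

```
pydantic==2.8.0
```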


Please share details of the inconsistencies in standalone mode. telegram-bot shouldn't have any conflict with CUDA - I made it as a front-end module for a few backends (llama_cpp, exllama, transformers, text-generation-webui or openapi)... without any chance of inconsistency.

Which backend did you use? Which module inconsistency did you see? (screenshot)

Avroboros commented 2 weeks ago

I'll gladly help you figure out the inconsistencies of the standalone mode, but for now I tried using the extension (because I've used it as an extension before and it's easier this way), and I got this error:

image

I ran --upgrade on the requirements_ext file:

image

and when I ran it again, the same problem popped up. Then, just to double-check that I had the module on my system, I ran pip, and it does indeed already exist at its latest version:

image

innightwolfsleep commented 1 week ago

Did you run the cmd_windows.bat script from oobabooga and then run pip install backoff in the opened shell, with the conda env active? Or did you just run cmd?

Extension mode should be installed inside the oobabooga conda env. Perhaps I'm mistaken, but I don't see a conda env in the last two screenshots.
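The point about the conda env can be checked programmatically: conda exports `CONDA_PREFIX` (and usually `CONDA_DEFAULT_ENV`) inside an activated environment, so a quick sanity check before `pip install` would tell whether packages are about to land in the env or in the system Python. A minimal sketch (the helper name is mine, not part of the project):

```python
import os


def active_conda_env(environ=None):
    """Return the active conda env name, or None when no env is active.

    Outside the env (e.g. a plain cmd window instead of the shell opened
    by cmd_windows.bat), pip installs into the system Python and the
    extension still won't see the packages.
    """
    if environ is None:
        environ = os.environ
    prefix = environ.get("CONDA_PREFIX")
    if not prefix:
        return None
    # CONDA_DEFAULT_ENV holds the env name; fall back to the prefix dir.
    return environ.get("CONDA_DEFAULT_ENV") or os.path.basename(prefix)
```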

Avroboros commented 1 week ago

Damn, I didn't know about the whole cmd_windows thing. I reinstalled the extension using that (and did a few tweaks here and there), and now it works without problems, thanks.

Since this is fixed, could you take the opportunity to implement an option for this extension that only lets it speak when it's directly mentioned (@) or directly replied to? (Because it can be a little annoying to have the bot reply to every single message that's sent.)
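The decision logic for that feature request is small enough to sketch here (my own helper, not telegram_bot code; in aiogram it would sit inside the message handler or a custom filter, where the reply/mention information comes from the incoming `Message` object):

```python
def should_reply(text, bot_username, is_direct_reply):
    """Decide whether the bot should answer a group message.

    Answers only when the message is a direct reply to one of the
    bot's own messages, or the bot is @-mentioned in the text.
    Note: the substring check is naive - a real implementation would
    use Telegram's mention entities to avoid matching e.g. @mybot2.
    """
    if is_direct_reply:
        return True
    return f"@{bot_username}" in (text or "")
```

A config flag (say, `reply_only_when_addressed`) could then toggle between this gate and the current answer-everything behavior.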