chrisrude / oobabot

A Discord bot which talks to Large Language Model AIs running on oobabooga's text-generation-webui
MIT License

Install problems #46

Closed: Urammar closed this 1 year ago

Urammar commented 1 year ago

Hi, I cannot for the life of me work out how to actually use this.

I've followed all the steps and now have the bot in the server and the pip install done

However... how do I launch the actual application? How do I give the bot its token? How do I point it to my local IP? I think the install guide is missing a few steps...

jmoney7823956789378 commented 1 year ago

The installation works very well on Linux systems. I saw you were trying to export a variable on Windows; try using set instead. Launching on Windows should be the same as on Linux: try oobabot <args> or python -m oobabot <args>.
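
For reference, a rough Windows equivalent of the Linux setup, assuming (as in the oobabot README) that the bot reads its token from a DISCORD_TOKEN environment variable; double-check the exact variable name for your install:

REM Set the Discord token for this cmd session ("export" on Linux, "set" on Windows cmd)
set DISCORD_TOKEN=your-bot-token-here

REM Launch the bot; "python -m oobabot" works even if the oobabot script isn't on PATH
python -m oobabot --ai-name YourBotsName --persona "You are a cat named YourBotsName"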

Urammar commented 1 year ago

Thanks, that worked! Okay, so one last step and we have everything ready to go.

Now this:

C:\MyShit\AI\Flan-t5-XXL\OobaBot>python -m oobabot --ai-name YourBotsName --persona "You are a cat named YourBotsName"
2023-06-22 03:57:51,874  INFO  Oobabooga is at ws://localhost:5005
2023-06-22 03:57:55,939 ERROR Could not connect to Oobabooga server: [ws://localhost:5005]
2023-06-22 03:57:55,940 ERROR Please check the URL and try again.
2023-06-22 03:57:55,940 ERROR Reason: Cannot connect to host localhost:5005 ssl:default [The remote computer refused the network connection]

Urammar commented 1 year ago

"api" is checked in the boolean command-line flags in the UI, and I hit apply and restarted the interface.

[screenshot: webui boolean command-line flags with "api" checked]

The final output on the command prompt says it's running, but doesn't mention the API; unsure if it's supposed to.

Urammar commented 1 year ago

Really appreciate the help, AND quick response time btw!

jmoney7823956789378 commented 1 year ago

If your oobabooga server is not on the same machine, you may need to pass the --listen and --api (just to be sure) options when launching the oobabooga server, then pass the server IP as an argument to oobabot. Even if you checked and applied the api option in the webui, if it isn't telling you "Starting API at http://127.0.0.1:5000" then it may not be running. You can confirm whether the API ports are listening with netstat -ano | findstr /i ":5005 :5000". Also, there is a built-in extension for oobabot that provides all this functionality inside the oobabooga webui.
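
For anyone following along, a minimal sketch of that setup; the 192.168.1.50 address is a placeholder for your webui machine's LAN IP, and --base-url is assumed to be the oobabot option for the server address (run oobabot --help to confirm):

REM On the machine running text-generation-webui: expose the API on all interfaces
python server.py --listen --api

REM On the machine running oobabot: point it at the webui machine's address and API port
python -m oobabot --base-url ws://192.168.1.50:5005 --ai-name YourBotsName --persona "You are a cat named YourBotsName"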

Urammar commented 1 year ago

Right you are, looks like that fixed it. Also, kill me, there's totally a UI extension!!!!

Urammar commented 1 year ago

Yup that fixed it, thanks a ton for your help! I'll let you know how it goes!

Urammar commented 1 year ago

It's very slow compared to ordinary generation, I'll open a 2nd issue for that.

jmoney7823956789378 commented 1 year ago

It's very slow compared to ordinary generation, I'll open a 2nd issue for that.

It's mostly because oobabot can't cache the continuous context in VRAM and keep it there, so the context is pulled and parsed each and every time the bot activates, adding extra time to the generation. Normal operation in the webui keeps up to 2048 (or more, as newer models are developed) tokens of context in VRAM, so new generations don't have to re-read the context.

chrisrude commented 1 year ago

Glad you got it working!