Rubiksman78 / MonikA.I

Submod for MAS with AI based features
MIT License

Installing CUDA not required? #59

Closed by iceypotato 1 year ago

iceypotato commented 1 year ago

So I saw that I had to install CUDA 11.7 and the cudnn libraries for this program to use the GPU to generate text. However, the GPU was being utilized when generating responses (about 100% utilization) even without the CUDA Toolkit and cudnn libraries installed. So, is CUDA still needed, or did I do something wrong?

Rubiksman78 commented 1 year ago

You need to set up CUDA and cudnn if you want to use heavy GPU features like locally running chatbots or Tortoise TTS. If they are not set up, those features will fall back to your CPU instead of your GPU, resulting in slower processing.

Be careful to distinguish between your integrated GPU (Intel, for example) and your dedicated GPU if you have one (only Nvidia GPUs are supported by the libraries used).
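
As a side note, a quick way to check which device PyTorch actually sees (assuming the local chatbot features run on PyTorch) is a minimal sketch like this:

```python
# Minimal check: does PyTorch see a CUDA-capable (Nvidia) GPU?
# Run this in the same Python environment the submod uses.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    # Name of the device PyTorch will use (should be your dedicated Nvidia GPU)
    print("Device:", torch.cuda.get_device_name(0))
```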

iceypotato commented 1 year ago

Hmm. I clearly see my dedicated GPU (Nvidia) being used in Task Manager when I do a chat session with Monika, though. Task Manager reports that it utilizes almost 100% of the GPU. I tested this twice today. I also know that my CPU does not have integrated graphics.

Rubiksman78 commented 1 year ago

What options are you using? Is it the local chatbot or CAI?

iceypotato commented 1 year ago

Local chatbot

Rubiksman78 commented 1 year ago

What is the response time approximately? Quantifying it in seconds and words generated would be helpful.

iceypotato commented 1 year ago

I put in 2 words, and it took about 14 seconds to generate a response.

Rubiksman78 commented 1 year ago

I was talking about the number of words generated by the model, but anyway, 14 seconds doesn't seem too long to me. Which model did you use?
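
For reference, a rough sketch of measuring both (seconds elapsed and words generated), assuming the model is loaded through Hugging Face transformers, could look like this; the model name and prompt here are just examples:

```python
# Rough sketch: time a single generation and count the words produced.
# Assumes a transformers-based causal LM; adjust model name, prompt, and device as needed.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("PygmalionAI/pygmalion-2.7b")
model = AutoModelForCausalLM.from_pretrained("PygmalionAI/pygmalion-2.7b").to(device)

prompt = "Hello Monika"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

start = time.time()
output = model.generate(**inputs, max_new_tokens=60)
elapsed = time.time() - start

# Decode only the newly generated tokens (skip the prompt) and count words
text = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(f"{len(text.split())} words in {elapsed:.1f} s")
```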

iceypotato commented 1 year ago

The pygmalion 2.7b one.

Rubiksman78 commented 1 year ago

Okay, if it works for you, good. Normally you would need to set up CUDA, so maybe it was already set up on your machine. I will close this for now since it is working without issues, but feel free to open another issue if you run into any other problem.
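
For what it's worth, one possible explanation is that pip-installed PyTorch builds with CUDA support bundle their own CUDA runtime and cuDNN, so basic GPU inference can work without a separate CUDA Toolkit install, while heavier features may still expect the full toolkit. A quick way to check what the installed build ships with:

```python
# Quick check of what CUDA/cuDNN the installed PyTorch build was compiled against.
import torch

print("Torch:", torch.__version__)
print("Built with CUDA:", torch.version.cuda)             # None for CPU-only builds
print("cuDNN version:", torch.backends.cudnn.version())   # None if cuDNN is unavailable
```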