PABannier / bark.cpp

Suno AI's Bark model in C/C++ for fast text-to-speech generation
MIT License

bark.cpp is not using gpu #181

Open sasatte opened 5 months ago

sasatte commented 5 months ago

In spite of turning on GGML_CUBLAS, it is not using the GPU. When running bark, it still says it is using the CPU backend. How do I force it to use the GPU? There is no --cuda option.

mfrederico commented 4 months ago

I was wondering the same, but found this in the issues: https://github.com/PABannier/bark.cpp/issues/165

soarek94 commented 3 months ago

Have you checked the value of n_gpu_layers? For me, that's where the problem came from: if the number of GPU layers is not greater than 0, it will run on the CPU.
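
For context, this is roughly the backend-selection pattern that ggml-based projects use at load time. The function and macro names below are assumptions about how bark.cpp wires things up, not a quote of its code, but it shows why n_gpu_layers == 0 silently falls back to the CPU backend even when cuBLAS is compiled in:

```cpp
// Sketch of typical ggml backend selection; names are assumptions, not bark.cpp's actual code.
#include "ggml-backend.h"
#ifdef GGML_USE_CUBLAS
#include "ggml-cuda.h"
#endif

static ggml_backend_t pick_backend(int n_gpu_layers) {
    ggml_backend_t backend = NULL;
#ifdef GGML_USE_CUBLAS
    if (n_gpu_layers > 0) {
        backend = ggml_backend_cuda_init(0 /* device */);  // request the CUDA backend
    }
#endif
    if (backend == NULL) {
        backend = ggml_backend_cpu_init();  // n_gpu_layers == 0 lands here, even with CUBLAS built in
    }
    return backend;
}
```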

axelatsiemens commented 3 months ago

Same issue: in the bark_context, n_gpu_layers is default-initialized to 0, but then neither main nor server offers the possibility to set it. What should it be? Does it need to be calculated from the model?
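
In other ggml-based projects (e.g. llama.cpp's -ngl flag), the value does not have to be derived from the model: 0 keeps everything on the CPU, any value at or above the model's layer count offloads all layers, and values in between offload that many layers. Assuming bark.cpp follows the same convention, a hypothetical helper for picking the value could look like this:

```cpp
// Hypothetical helper, assuming the usual ggml convention for n_gpu_layers.
static int choose_n_gpu_layers(int model_layer_count, bool want_gpu) {
    if (!want_gpu) {
        return 0;                  // 0 keeps every layer on the CPU backend
    }
    return model_layer_count;      // any value >= the layer count offloads everything
}
```

In practice people often just pass a large constant (99 is common in llama.cpp) to mean "offload everything".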

newpolygons commented 2 months ago

I'm just getting here. So I'm assuming the Metal support checkbox in the README just isn't true? Has anyone been able to implement the feature themselves, or will I have to do more digging?

vinovo commented 2 weeks ago

@PABannier I see that n_gpu_layers is configurable only via the examples, but if we want to build bark as a dynamic library, n_gpu_layers is not exposed via bark.h.
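
For what it's worth, exposing it on the public params struct would be enough for library consumers. The snippet below is only a proposal sketch of what that could look like in bark.h, not existing API; the loader calls follow the README, and the n_gpu_layers field is the one discussed in this thread:

```cpp
// Proposal sketch, not existing API: expose the offload count in bark.h.
struct bark_context_params {
    // ... existing fields from bark.h ...
    int32_t n_gpu_layers;   // layers to offload to the GPU backend; 0 = CPU only
};

// A program linking the dynamic library could then request offloading directly:
//   struct bark_context_params p = bark_context_default_params();
//   p.n_gpu_layers = 16;
//   struct bark_context * bctx = bark_load_model("./ggml_weights/", p, 0 /* seed */);
```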