Open 524125153 opened 2 months ago
Exactly right. You need CUDA to use Mamba; I had the same experience as you a while back (luckily I use our university's HPC cluster, so I have access to the needed GPUs). You could run it on Google Colab or Kaggle if you want, just watch out since you'll have limited GPU time.
You can run mamba-minimal on a Mac: https://github.com/johnma2006/mamba-minimal.git
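Since mamba-minimal is plain PyTorch (no custom CUDA kernels), it can run on whatever device is available. A minimal sketch of picking a device on an M1 Mac, falling back to CPU when CUDA is absent; `pick_device` is a hypothetical helper name, not part of the mamba-minimal repo:

```python
def pick_device() -> str:
    """Return the best available PyTorch device string.

    On an M1 MacBook this will be "mps" (Apple's Metal backend)
    if PyTorch was built with MPS support, otherwise "cpu".
    """
    try:
        import torch
    except ImportError:
        return "cpu"  # PyTorch not installed; nothing GPU-related to check
    if torch.cuda.is_available():
        return "cuda"  # NVIDIA GPU with CUDA
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return "mps"   # Apple Silicon GPU via Metal
    return "cpu"

print(pick_device())
```

You would then move the model and inputs to that device with `.to(pick_device())` before running inference; it will be slower than CUDA, but it works.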
My only device is a MacBook with an M1 chip, so there is no CUDA. In this case, can I not run this code?