Open ikamensh opened 1 year ago
Hi, I'm interested in trying to run llama models. I'm using a MacBook with an AMD GPU, so running on the CPU would probably be easiest. It would be nice if the README stated whether CPU-only inference is possible. Thanks!