Closed fdstevex closed 1 year ago
Hi,
Very strange that the torch installation failed… Could you double check whether you are running arm64 version of Python and not x86 under Rosetta?
What's the output of the following command?
file `which python3`
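If it's easier, the same architecture check can be done from inside the interpreter itself; a quick sketch (nothing here is specific to LLaMA_MPS):

```python
import platform
import sys

# "arm64" means a native Apple Silicon build; "x86_64" means an Intel
# build, which on an M-series Mac would be running under Rosetta.
print(platform.machine())

# Full version string, to confirm which interpreter is actually in use.
print(sys.version)
```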
Alternatively, could you also try with Python 3.11?
If that doesn't help, to help you figure out what's going on, please repeat all steps from the readme, and reply here with the complete output from your terminal.
Hi, thanks for the response. I've been learning how to manage my Python environment, and the changes I made to get llama.cpp working have also fixed my problem here. I'm not sure which change did it, but when I tried again today it worked fine (although very slowly). So it wasn't LLaMA_MPS, it was my setup. Thanks!
Hi - Thanks for building this, it looks like a great way to try out the model.
I wasn't able to follow the instructions exactly -
pip3 install -r requirements.txt
reported `No matching distribution found for torch`. The Python I have installed is 3.11, so I'm explicitly using `pip3.9`/`python3.9`. I don't know if this is related. Anyway, when I run the example chat command, I get prompted for my input, and when I enter it, about 30 seconds later, I get this:
Using a Mac Studio, 32 GB RAM.
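For what it's worth, one common way to pin a project to a specific interpreter and avoid this kind of pip/python version mix-up is a virtual environment. A minimal sketch using the stdlib `venv` module (the target path is just an example; the CLI equivalent is `python3 -m venv .venv`):

```python
import tempfile
import venv
from pathlib import Path

# Create an isolated environment tied to whichever interpreter runs this
# script, so `pip` and `python` inside it always match.
target = Path(tempfile.mkdtemp()) / "example-venv"
venv.create(target, with_pip=False)  # with_pip=True also bootstraps pip

# The marker file records which base interpreter the venv was built from.
print((target / "pyvenv.cfg").exists())  # prints True
```

After activating the environment (`source example-venv/bin/activate`), plain `pip install -r requirements.txt` resolves packages against that interpreter's version.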