tugsjargal1 opened 7 months ago
Hello, I am not on the team, but from what I can see, you need CUDA to use torch on the GPU, which the MacBook M1 does not have.
It has the Metal API to use the GPU part of its SoC. So maybe you should try it on a different device that has a CUDA-compatible GPU. =)
Torch without CUDA is fine; the code just needs a slight alteration.
Go to Inference.py in the MobileSamV2 folder. Line 82 reads "input_boxes = torch.from_numpy(input_boxes).cuda()"; change it to "input_boxes = torch.from_numpy(input_boxes).cpu()".
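A more flexible option is to pick the device at runtime instead of hard-coding ".cpu()", so the same script works on CUDA machines, Apple-silicon Macs (via PyTorch's MPS backend), and plain CPUs. A minimal sketch (the "pick_device" helper is my own name, not part of MobileSAM):

```python
import numpy as np
import torch


def pick_device() -> torch.device:
    # Prefer CUDA, then Apple's Metal (MPS) backend, then fall back to CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")


device = pick_device()

# Example box in xyxy format, standing in for the boxes built in Inference.py.
input_boxes = np.array([[10, 20, 120, 200]], dtype=np.float32)

# Replaces the hard-coded .cuda() call: .to(device) moves the tensor to
# whatever backend is actually available on this machine.
input_boxes = torch.from_numpy(input_boxes).to(device)
print(input_boxes.device)
```

On an M1 Pro this would select "mps" (assuming a recent torch build with MPS support), which uses the GPU through Metal rather than falling back to the CPU.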
Hi MobileSAM team,
First of all, thank you for this amazing library.
I am trying to run "bash ./experiments/mobilesamv2.sh" on my MacBook Pro (M1 Pro) laptop. I installed all the required packages without any errors, but I get this error message when I run it:
How can I resolve this? I am using an M1 Pro MacBook.
Thanks in advance! :)