dnl13 / ComfyUI-dnl13-seg

I'm working on enabling SAM-HQ and Dino for ComfyUI to easily generate masks automatically, either through automation or prompts.

SAM Model Loader node doesn't support Apple MPS #4

Closed alessandroperilli closed 11 months ago

alessandroperilli commented 11 months ago

Attempting to use the SAM Model Loader node on an Apple M2 generates the following error:

RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
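The error message itself points at the workaround: a checkpoint saved on a CUDA machine must be deserialized with an explicit `map_location` when CUDA is unavailable. A minimal sketch of that pattern (the helper name is hypothetical, not from this repo):

```python
import torch

def load_sam_checkpoint(path: str):
    # torch.load without map_location tries to restore CUDA tensors onto a
    # CUDA device, which raises this exact RuntimeError on machines where
    # torch.cuda.is_available() is False (e.g. Apple Silicon).
    # Deserializing onto the CPU first avoids it.
    return torch.load(path, map_location=torch.device("cpu"))
```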

This is an old problem that I never reported because I normally use the SAMLoader node from Impact Pack, which allows you to choose CPU as device_mode. But given that this is a new repo and this node exists...

Update:

Reading the documentation of your new suite, I discovered that you only utilize the SAM_HQ models. So I tried to use sam_hq_vit_h.pth with the SAMLoader node from Impact Pack, in an attempt to debug #6.

SAMLoader refuses to load that model and returns the same error as above:

RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.

So I'm assuming the problem is not specific to the SAM Model Loader node, but rather to the model family that the node tries to load by default.

dnl13 commented 11 months ago

Ah, I see. First of all, thank you very much for your tests! I was hopeful that moving the models to the available device in one place would be sufficient. I'll take a look at it as soon as possible and make the necessary corrections.

dnl13 commented 11 months ago

I've made some more changes. Now, all models should be initially loaded on the CPU and moved to the corresponding Torch device if one is available. A new test would be highly appreciated.
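The two-step approach described here (deserialize on the CPU, then move the model to whatever Torch device is actually present) can be sketched roughly as follows; the function names are hypothetical, and Apple MPS is included in the fallback chain since that is the device in question:

```python
import torch

def best_available_device() -> torch.device:
    # Prefer CUDA, then Apple MPS, then fall back to CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

def load_model_portably(model: torch.nn.Module, checkpoint_path: str) -> torch.nn.Module:
    # Step 1: always deserialize on the CPU, so checkpoints saved on a
    # CUDA machine load cleanly on machines without CUDA.
    state_dict = torch.load(checkpoint_path, map_location="cpu")
    model.load_state_dict(state_dict)
    # Step 2: move the assembled model to the best device actually present.
    return model.to(best_available_device())
```

With this pattern the initial `torch.load` never touches CUDA, so the `RuntimeError` from the report cannot occur; acceleration is still used when a CUDA or MPS device is available.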

alessandroperilli commented 11 months ago

This is working now. Thank you.