tencent-ailab / IP-Adapter

The image prompt adapter is designed to enable a pretrained text-to-image diffusion model to generate images with an image prompt.
Apache License 2.0

"RuntimeError: Attempting to deserialize object on a CUDA device" error in Automatic1111 v1.6, ControlNet v1.1.409 - Apple Silicon OSX #53

Open raisindetre opened 10 months ago

raisindetre commented 10 months ago

Using the SD1.5 IP-Adapter models in an Automatic1111 v1.6.0 environment (python: 3.10.13, torch: 2.0.1) with the latest ControlNet on Apple ARM architecture generates a random image and produces the console runtime error below. COMMANDLINE_ARGS="--skip-torch-cuda-test --upcast-sampling --no-half-vae --use-cpu interrogate".


```
2023-09-10 21:04:57,087 - ControlNet - STATUS - preprocessor resolution = 512
*** Error running process: /Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py
    Traceback (most recent call last):
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/modules/scripts.py", line 619, in process
        script.process(p, *script_args)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py", line 977, in process
        self.controlnet_hack(p)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py", line 966, in controlnet_hack
        self.controlnet_main_entry(p)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/controlnet.py", line 808, in controlnet_main_entry
        detected_map, is_image = preprocessor(
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/utils.py", line 75, in decorated_func
        return cached_func(*args, **kwargs)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/utils.py", line 63, in cached_func
        return func(*args, **kwargs)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/global_state.py", line 35, in unified_preprocessor
        return preprocessor_modules[preprocessor_name](*args, **kwargs)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/scripts/processor.py", line 350, in clip
        from annotator.clipvision import ClipVisionDetector
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/annotator/clipvision/__init__.py", line 81, in <module>
        clip_vision_h_uc = torch.load(clip_vision_h_uc)['uc']
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/modules/safe.py", line 108, in load
        return load_with_extra(filename, *args, extra_handler=global_extra_handler, **kwargs)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/modules/safe.py", line 156, in load_with_extra
        return unsafe_torch_load(filename, *args, **kwargs)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 809, in load
        return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 1172, in _load
        result = unpickler.load()
      File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 1213, in load
        dispatch[key[0]](self)
      File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 1254, in load_binpersid
        self.append(self.persistent_load(pid))
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 1142, in persistent_load
        typed_storage = load_tensor(dtype, nbytes, key, _maybe_decode_ascii(location))
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 1116, in load_tensor
        wrap_storage=restore_location(storage, location),
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 217, in default_restore_location
        result = fn(storage, location)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 182, in _cuda_deserialize
        device = validate_cuda_device(location)
      File "/Users/guestuser/Documents/Projects/StableDiffusion/stable-diffusion-webui/venv/lib/python3.10/site-packages/torch/serialization.py", line 166, in validate_cuda_device
        raise RuntimeError('Attempting to deserialize object on a CUDA '
    RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
```
xiaohu2015 commented 10 months ago

@raisindetre It seems to be a bug in the webui extension. Can you open an issue at https://github.com/Mikubill/sd-webui-controlnet?

Aime-ry commented 9 months ago

This happens because the file is deserialized onto a CUDA device by default, and CUDA isn't available on a Mac. To fix this, open the file located at /stable-diffusion-webui/extensions/sd-webui-controlnet/annotator/clipvision/\_\_init\_\_.py in a text editor or IDE (such as VSCode or PyCharm).

1. Navigate to line 81 and locate the line: `clip_vision_h_uc = torch.load(clip_vision_h_uc)['uc']`.
2. Modify this line to: `clip_vision_h_uc = torch.load(clip_vision_h_uc, map_location=torch.device('cpu'))['uc']` (see the sketch after this list).
3. Save your changes and exit the editor.
4. Run your program again.

This should prevent the CUDA-related error. However, even after this change, you may still encounter errors when generating images.
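For reference, a minimal sketch of the change in context (the surrounding code and exact line number may differ between ControlNet versions):

```python
# extensions/sd-webui-controlnet/annotator/clipvision/__init__.py (around line 81)

# Before: torch.load restores the tensors onto the CUDA device they were
# saved from, which fails on machines where torch.cuda.is_available() is False.
# clip_vision_h_uc = torch.load(clip_vision_h_uc)['uc']

# After: map_location remaps every storage in the checkpoint to the CPU.
clip_vision_h_uc = torch.load(clip_vision_h_uc, map_location=torch.device('cpu'))['uc']
```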

edwios commented 9 months ago

On Apple Silicon, changing `cpu` to `mps` also seems to work.
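A sketch of that variant, with an availability check added so it falls back to CPU when MPS isn't present (the check is an assumption on my part, not part of the suggestion above):

```python
import torch

# clip_vision_h_uc holds the checkpoint path defined earlier in clipvision/__init__.py.
# Load onto MPS on Apple Silicon when available, otherwise fall back to CPU.
device = torch.device('mps') if torch.backends.mps.is_available() else torch.device('cpu')
clip_vision_h_uc = torch.load(clip_vision_h_uc, map_location=device)['uc']
```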

jackykyd commented 9 months ago

> [...] Modify this line to: `clip_vision_h_uc = torch.load(clip_vision_h_uc, map_location=torch.device('cpu'))['uc']` [...]

Still doesn't work. `TypeError: Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead.`
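For context, that TypeError comes from an MPS limitation: the backend has no float64 support, so any float64 tensor has to be cast to float32 before it is moved to the device. A standalone illustration (not the webui code path; requires an Apple Silicon Mac with the MPS backend):

```python
import torch

x = torch.zeros(4, dtype=torch.float64)
# x.to('mps')                        # raises TypeError: Cannot convert a MPS Tensor to float64 dtype ...
y = x.to(torch.float32).to('mps')    # casting to float32 first is accepted
```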

raisindetre commented 9 months ago

> Still doesn't work. `TypeError: Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead.`

Have you set `--no-half` in your start-up flags? Mine works with that set.
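For example, adding `--no-half` to the flags from the original report in webui-user.sh (the usual place for these on macOS; your existing flags may differ):

```sh
# stable-diffusion-webui/webui-user.sh
export COMMANDLINE_ARGS="--skip-torch-cuda-test --upcast-sampling --no-half --no-half-vae --use-cpu interrogate"
```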

emule commented 9 months ago

> [...] Modify this line to: `clip_vision_h_uc = torch.load(clip_vision_h_uc, map_location=torch.device('cpu'))['uc']` [...]

THANK YOU! Works here after this fix! (rx6650xt)

jackykyd commented 9 months ago

> Still doesn't work. `TypeError: Cannot convert a MPS Tensor to float64 dtype as the MPS framework doesn't support float64. Please use float32 instead.`
>
> Have you set `--no-half` in your start-up flags? Mine works with that set.

I tried it today. It works! Thx ;)

pphoto808 commented 8 months ago

The line change `clip_vision_h_uc = torch.load(clip_vision_h_uc, map_location=torch.device('cpu'))['uc']` together with `--no-half` fixed the problem for me on a Mac Studio M2 Ultra. All IP-Adapters functional!

Thanks!

skuznyuk commented 8 months ago

Setting `--no-half` along with that line change does fix the issue. HOWEVER, performance with `--no-half` set drops by about 25% (1:50 -> 2:26 for the same image), comparing the same image with the flag set vs. not set, ControlNet not enabled.