I suddenly started getting this error in my workflow when attempting to merge two models and use the result as input for KSampler. The problem occurs with all of the model merging nodes whenever the input models already have LoRAs loaded.
Requested to load SD1ClipModel
Loading 1 new model
Requested to load BaseModel
Loading 1 new model
ERROR:root:!!! Exception during processing !!!
ERROR:root:Traceback (most recent call last):
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\execution.py", line 153, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\execution.py", line 83, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\execution.py", line 76, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\nodes.py", line 1237, in sample
return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\nodes.py", line 1207, in common_ksampler
samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\sample.py", line 93, in sample
real_model, positive_copy, negative_copy, noise_mask, models = prepare_sampling(model, noise.shape, positive, negative, noise_mask)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\sample.py", line 86, in prepare_sampling
comfy.model_management.load_models_gpu([model] + models, comfy.model_management.batch_area_memory(noise_shape[0] * noise_shape[2] * noise_shape[3]) + inference_memory)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\model_management.py", line 406, in load_models_gpu
cur_loaded_model = loaded_model.model_load(lowvram_model_memory)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\model_management.py", line 289, in model_load
raise e
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\model_management.py", line 285, in model_load
self.real_model = self.model.patch_model(device_to=patch_model_to) #TODO: do something with loras and offloading to CPU
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\model_patcher.py", line 180, in patch_model
out_weight = self.calculate_weight(self.patches[key], temp_weight, key).to(weight.dtype)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\model_patcher.py", line 208, in calculate_weight
weight += alpha * comfy.model_management.cast_to_device(w1, weight.device, weight.dtype)
File "C:\Users\jason\Stable-Diffusion\web-ui\ComfyUI\comfy\model_management.py", line 515, in cast_to_device
return tensor.to(device).to(dtype)
NotImplementedError: Cannot copy out of meta tensor; no data!
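For reference, the final `NotImplementedError` is standard PyTorch behavior, independent of ComfyUI: a tensor on the `meta` device carries only shape and dtype metadata, with no actual storage, so any attempt to materialize it with `.to()` (which is what `cast_to_device` does in the last frame of the traceback) must fail. A minimal sketch reproducing just that underlying error:

```python
import torch

# A "meta" tensor records shape and dtype but allocates no storage.
meta_weight = torch.empty(2, 2, device="meta")

try:
    # Materializing it has no data to copy, mirroring the traceback's
    # tensor.to(device) call in cast_to_device.
    meta_weight.to("cpu")
    raised = False
except NotImplementedError:
    # Raises: "Cannot copy out of meta tensor; no data!"
    # (exact wording varies slightly between PyTorch versions)
    raised = True

print(raised)
```

This suggests one of the merged model's weight patches ends up referencing a meta tensor (i.e. a weight that was never actually loaded or was offloaded) by the time `patch_model` tries to apply the LoRA patches.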
Comfy works as expected if I remove the model merging nodes. It's worth mentioning that this workflow was working before, and I didn't change anything, so I don't know what could be causing this. Can anyone reproduce it?