I sometimes get a size mismatch error when trying to merge multiple SDXL LoRAs. I left the settings at their defaults (lambda, p, and scale values); the verbose output is below.
Error occurred when executing DARE Merge LoRA Stack:
The size of tensor a (32) must match the size of tensor b (16) at non-singleton dimension 0
File "E:\cunty\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "E:\cunty\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "E:\cunty\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "E:\cunty\custom_nodes\ComfyUI-DARE-LoRA-Merge\dare_nodes.py", line 125, in apply_lora_stack
weights[key] += lora_weights[key]
I should mention these are Pony SDXL LoRAs, so that may be the cause of the error.
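In case it helps: my guess (which could be wrong) is that the LoRAs in the stack were trained at different ranks, e.g. one at rank 32 and another at rank 16, so their per-key lora_down/lora_up tensors have different shapes and can't be added elementwise at `weights[key] += lora_weights[key]`. Below is a minimal sketch of what I mean, with made-up layer shapes and the usual alpha/rank scaling; it is not the node's actual code, just an illustration of the shape clash and of why composing each LoRA into its full weight delta first avoids it.

```python
# Sketch only: hypothetical tensors showing why rank-32 and rank-16 LoRAs
# can't be summed per-key, while their composed deltas can.
import torch

in_features, out_features = 640, 640

# Hypothetical LoRA weights for the same base layer, at different ranks.
lora_a = {
    "down": torch.randn(32, in_features),
    "up": torch.randn(out_features, 32),
    "alpha": 16.0,
}
lora_b = {
    "down": torch.randn(16, in_features),
    "up": torch.randn(out_features, 16),
    "alpha": 8.0,
}

def full_delta(lora):
    # Compose up @ down and scale by alpha / rank, as SDXL LoRAs typically do.
    rank = lora["down"].shape[0]
    return (lora["alpha"] / rank) * (lora["up"] @ lora["down"])

# Adding the raw low-rank tensors reproduces the same error:
# "The size of tensor a (32) must match the size of tensor b (16) at non-singleton dimension 0"
# bad = lora_a["down"] + lora_b["down"]

# Summing the composed deltas works, since both have the base weight's shape.
merged_delta = full_delta(lora_a) + full_delta(lora_b)
print(merged_delta.shape)  # torch.Size([640, 640])
```

If that is indeed the cause, the mismatch would only show up for certain LoRA combinations, which would explain why it happens "sometimes" rather than always.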