First of all, thank you for the wonderful custom node!
Adding LoRAs is a great feature, but unfortunately the `lora_scale` value does not seem to affect the output. See the attached image for reference:
I took a look at the code, and `cross_attention_kwargs={"scale": lora_scale}` does seem correct after some investigation, but I don't know enough to say whether something else is missing.
I thought xformers could be the issue, but the result is the same even after uninstalling it. As a sanity check, I would expect a fixed-seed sweep over the scale to produce visibly different images, as in the sketch below.
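This is just a sketch against plain diffusers, not this node's code; the model name, LoRA filename, and prompt are placeholders:

```python
# Fixed-seed scale sweep (sketch): if the scale is applied, the saved
# images should differ from each other. Model/LoRA paths are placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights(".", weight_name="my_lora.safetensors")  # placeholder

for scale in (0.0, 0.5, 1.0):
    image = pipe(
        "test prompt",
        generator=torch.Generator("cuda").manual_seed(0),  # same seed each run
        cross_attention_kwargs={"scale": scale},
    ).images[0]
    image.save(f"scale_{scale}.png")
```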
ComfyUI: 2258 (922e7c, 2024-06-15), Portable
torch: 2.3.1+cu121
transformers: 4.41.2