Closed elen07zz closed 1 month ago
Same
same here.
same
Same here. Also, even when a LoRA does load, it no longer changes the image. It worked fine yesterday, but I updated Forge today, so I guess something broke. The swap location seems to make a difference too, although I can't see a clear pattern yet.
I had to revert two commits back to get it going again.
git checkout 394da01959ae09acca361dc2be0e559ca26829d4
I tried the git checkout command to switch the repo back to its state from several hours ago:
git clone https://github.com/lllyasviel/stable-diffusion-webui-forge
cd ~/stable-diffusion-webui-forge
git checkout 6e6e5c21622a02f6978b0ed21e4a9557c8b49913 # commit at "do some profile on 3090", avoid "AssertionError: BNB bad weight"
Then the "Patching LoRAs" process goes fine:
[Memory Management] Estimated Remaining GPU Memory: 7657.46 MB
Patching LoRAs: 100%|███████████████████████████| 76/76 [00:06<00:00, 10.89it/s]
LoRA patching has taken 6.98 seconds
Same
Same
Made an alternative run.bat so people can hop back to that specific version: https://github.com/lllyasviel/stable-diffusion-webui-forge/issues/1216
Temp solution: edit the run.bat file and switch to commit 394da01959ae09acca361dc2be0e559ca26829d4.
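For anyone trying this, the modified launcher could look roughly like the sketch below. This is only an illustration: the `webui` folder name and the final launch line are assumptions about the one-click package layout, not the actual contents of the shipped run.bat, so adapt the paths and launch command to your own install.

```bat
@echo off
rem Sketch: pin the Forge repo to the known-good commit before launching.
rem The "webui" folder name is an assumption; match your install layout.
cd webui
git checkout 394da01959ae09acca361dc2be0e559ca26829d4
cd ..
rem ...then launch the UI the same way your original run.bat does.
```

To move back to the latest code later, run `git checkout main` (or whatever branch your install tracks) inside the repo folder.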
same problem
new update seems to have fixed it
Can confirm. All better now.
Using the latest commit 2f0555f7dc3f2d06b3a3cc238a4fa2b72e11e28d, I get this error when I use a LoRA with either of these two models: flux1-dev-bnb-nf4-v2 and flux-dev-Q8_0.gguf