Closed wkpark closed 7 months ago
- use `torch.mm()` instead of `torch.matmul()`
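For 2-D tensors the two calls return the same result, with `torch.mm()` skipping the broadcasting dispatch of `torch.matmul()`; a quick sketch (shapes are illustrative, standing in for the rebasin cost-matrix computation):

```python
import torch

# Two small 2-D matrices standing in for flattened model weights
# in the rebasin cost-matrix computation (shapes are illustrative).
a = torch.arange(6., dtype=torch.float32).reshape(2, 3)
b = torch.arange(12., dtype=torch.float32).reshape(3, 4)

# torch.mm() only accepts 2-D inputs, so it avoids the broadcasting
# machinery of torch.matmul(); for 2-D matrices both agree exactly.
out_mm = torch.mm(a, b)
out_matmul = torch.matmul(a, b)
```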
- use `lap` or `lapjv`, installed via `pip install lap` or `pip install lapjv` (currently, only `lap` works)
Originally, the `special_layers` for rebasin were `"P_bg358", "P_bg324", "P_bg337"` (https://github.com/ogkalu2/Merge-Stable-Diffusion-models-without-distortion/commit/c0dd03ec382899722f955fbb6aaa6d02bd5d390f#diff-ed17b6ad958b07d3f6300910b1c1e917c6995ec7770aff5fee28c95c3c5f0b2fL788); sd_meh also uses the same (https://github.com/s1dlx/meh/blob/main/sd_meh/rebasin.py#L2296).
After some testing, we can confirm that the permutation groups whose weights change after maximum weight matching are limited to a subset of the permutations, and an additional special layer `"P_bg371"` was found:

- `P_bg324`: `.out.` (unet)
- `P_bg358`: `.decoder.norm_out` (vae)
- `P_bg337`: `.encoder.norm_out` (vae)
- `P_bg371`: `.mlp.fc1`, `fc2` (textencoder)

Now we can add a "fast rebasin" mode.
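A "fast rebasin" mode could then restrict the matching pass to just these permutation groups. A minimal sketch, assuming a hypothetical `fast_rebasin_groups` helper and an iterable of rebasin group names (neither is from the actual code):

```python
# Permutation groups whose weights actually change after maximum
# weight matching, per the findings above (rebasin group ids).
SPECIAL_LAYERS = {
    "P_bg324": "unet (.out.)",
    "P_bg358": "vae (.decoder.norm_out)",
    "P_bg337": "vae (.encoder.norm_out)",
    "P_bg371": "textencoder (.mlp.fc1 / fc2)",
}

def fast_rebasin_groups(all_groups):
    """Keep only the permutation groups that need re-matching.

    Hypothetical helper: `all_groups` is assumed to be an iterable of
    rebasin group names; every group outside SPECIAL_LAYERS keeps the
    identity permutation, so the expensive matching step is skipped.
    """
    return [g for g in all_groups if g in SPECIAL_LAYERS]

groups = ["P_bg001", "P_bg324", "P_bg337", "P_bg358", "P_bg371"]
selected = fast_rebasin_groups(groups)
```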