I find it strange that when merging a LoRA into a ckpt, it is not merged into the ckpt specified in Checkpoint Original. Is that what Checkpoint Tuned is supposed to mean? I think this is a very stupid update for people who don't use Dreambooth. Wouldn't ckpt-A and ckpt-B be fine?
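For what it's worth, my mental model of "merging a LoRA into a ckpt" is simply adding the LoRA delta onto whichever base model's weights the tool loads, which is why it matters which checkpoint field it actually reads. Here is a minimal PyTorch sketch of that operation, assuming the usual rank-r down/up factorization with an alpha scale (the names `merge_lora_weight`, `lora_down`, `lora_up`, and `alpha` are mine for illustration, not this tool's actual API or key layout):

```python
import torch

def merge_lora_weight(w: torch.Tensor,
                      lora_down: torch.Tensor,
                      lora_up: torch.Tensor,
                      alpha: float,
                      scale: float = 1.0) -> torch.Tensor:
    """Return W' = W + scale * (alpha / rank) * (up @ down)."""
    rank = lora_down.shape[0]
    delta = (alpha / rank) * (lora_up @ lora_down)
    return w + scale * delta

# Hypothetical usage: merge one linear layer of the chosen base checkpoint.
w = torch.randn(320, 320)           # base weight taken from the ckpt
down = torch.randn(4, 320) * 0.01   # rank-4 LoRA down projection
up = torch.randn(320, 4) * 0.01     # rank-4 LoRA up projection
merged = merge_lora_weight(w, down, up, alpha=4.0, scale=0.8)
```

The point is that the math only involves one base checkpoint plus the LoRA, so it isn't obvious why the merge path would need an Original/Tuned pair at all, which is what makes the field naming confusing.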