Closed: Gus-Guo closed this issue 1 month ago
This does look like a bug, but it should not affect anything: during checkpoint loading in training we replace that value anyway. https://github.com/PixArt-alpha/PixArt-sigma/blob/ac7004b26f3fce9223ea8fdde1950508c772798c/diffusion/utils/checkpoint.py#L67
I see that the code at line 67 replaces y_embedder.y_embedding rather than pos_embed, while the code at line 57 actually deletes pos_embed when loading the ckpt. Does that affect inference?
Nope
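For context, here is why deleting `pos_embed` from the checkpoint is harmless. This is a minimal sketch, not the PixArt code: it assumes (as in DiT-style models) that `pos_embed` is a fixed buffer recomputed at model construction, so a non-strict `load_state_dict` simply reports it as missing and the model keeps its freshly built copy. `TinyDiT` is a hypothetical stand-in model.

```python
import torch
import torch.nn as nn

class TinyDiT(nn.Module):
    """Toy stand-in for the transformer (hypothetical, for illustration)."""
    def __init__(self):
        super().__init__()
        # Fixed positional embedding stored as a buffer, rebuilt at init.
        self.register_buffer("pos_embed", torch.arange(8, dtype=torch.float32).view(1, 8))
        self.proj = nn.Linear(8, 8)

# Save a state dict, then drop 'pos_embed' the way the training loader does
# (mirrors `del checkpoint['state_dict']['pos_embed']`).
model = TinyDiT()
sd = model.state_dict()
del sd["pos_embed"]

# strict=False tolerates the missing key; the fresh model simply keeps the
# pos_embed it constructed itself, so inference is unaffected.
fresh = TinyDiT()
result = fresh.load_state_dict(sd, strict=False)
print(result.missing_keys)                            # ['pos_embed']
print(torch.equal(fresh.pos_embed, model.pos_embed))  # True
```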
Why does the pixart-sigma-256x256 ckpt have no 'pos_embed', while the other ckpts do?