PixArt-alpha / PixArt-sigma

PixArt-Σ: Weak-to-Strong Training of Diffusion Transformer for 4K Text-to-Image Generation
https://pixart-alpha.github.io/PixArt-sigma-project/
GNU Affero General Public License v3.0

Why is the trained checkpoint bigger? #34

Open chongxian opened 2 months ago

chongxian commented 2 months ago

Hello, when I train a model following the PixArt Training Tutorial, the original checkpoint "PixArt-Sigma-XL-2-256x256.pth" is only 2.9 GB, but my training result is 4.19 GB. How do I use the trained result? Thanks for your reply.

ApolloRay commented 2 months ago

> Hello, when I train a model following the PixArt Training Tutorial, the original checkpoint "PixArt-Sigma-XL-2-256x256.pth" is only 2.9 GB, but my training result is 4.19 GB. How do I use the trained result?

Maybe it's fp16 vs. fp32?

lawrence-cj commented 2 months ago

Maybe the optimizer's parameters are also saved?
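One quick way to check is to load the trained `.pth` file and look at its top-level keys. A minimal sketch (the dummy checkpoint below uses illustrative key names, not the actual keys from the PixArt training script):

```python
import torch

# Build a dummy checkpoint the way training scripts commonly save them:
# weights plus optimizer and bookkeeping state (keys are illustrative).
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters())
ckpt = {
    "state_dict": model.state_dict(),
    "optimizer": optimizer.state_dict(),
    "epoch": 1,
}
torch.save(ckpt, "full_ckpt.pth")

# Inspecting the top-level keys reveals any extra optimizer/epoch entries
# that a weights-only release checkpoint would not contain.
loaded = torch.load("full_ckpt.pth", map_location="cpu")
print(sorted(loaded.keys()))  # ['epoch', 'optimizer', 'state_dict']
```

If the trained file shows keys beyond the model weights, that would explain the extra ~1.3 GB.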

chongxian commented 2 months ago

> Maybe the optimizer's parameters are also saved?

I followed the training command, but still got a 4.19 GB checkpoint.

chongxian commented 2 months ago

> Hello, when I train a model following the PixArt Training Tutorial, the original checkpoint "PixArt-Sigma-XL-2-256x256.pth" is only 2.9 GB, but my training result is 4.19 GB. How do I use the trained result?
>
> Maybe it's fp16 vs. fp32?

I trained in fp16.

lawrence-cj commented 2 months ago

> Maybe the optimizer's parameters are also saved?

No?

chongxian commented 2 months ago

> Maybe the optimizer's parameters are also saved?
>
> No?

I'm not quite sure. I only modified the data-file settings in PixArt_sigma_xl2_img256_internal.py. What should I do to get the right checkpoint?

chongxian commented 2 months ago

> Maybe the optimizer's parameters are also saved?
>
> No?

Yes, the checkpoint also includes the epoch, optimizer, and scheduler state.
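In that case the fix is to keep only the model weights and drop the optimizer/scheduler bookkeeping. A minimal sketch, assuming the checkpoint is a dict with a `"state_dict"` entry (the exact key names in the PixArt training script may differ):

```python
import os
import torch

# Recreate a "full" training checkpoint for demonstration.
model = torch.nn.Linear(4, 4)
opt = torch.optim.AdamW(model.parameters())
# One optimizer step so AdamW accumulates per-parameter state
# (exp_avg, exp_avg_sq) -- this is what bloats the saved file.
model(torch.randn(2, 4)).sum().backward()
opt.step()

full = {
    "state_dict": model.state_dict(),
    "optimizer": opt.state_dict(),
    "epoch": 1,
}
torch.save(full, "full.pth")

# Strip everything except the weights so the file matches a
# weights-only release checkpoint.
ckpt = torch.load("full.pth", map_location="cpu")
slim = {"state_dict": ckpt["state_dict"]}
torch.save(slim, "slim.pth")

print(os.path.getsize("slim.pth") < os.path.getsize("full.pth"))  # True
```

AdamW keeps two extra tensors per parameter, which is roughly consistent with a 2.9 GB weights file growing to 4.19 GB once optimizer state is saved alongside it.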

chongxian commented 2 months ago

> Maybe the optimizer's parameters are also saved?
>
> No?

But I can't load this checkpoint in ComfyUI; it reports an error about loading the state_dict for PixArtMS. It can load PixArt-Sigma-XL-2-512-MS.pth and the other release checkpoints, though.