UCSB-NLP-Chang / DiffSTE

MIT License

A question about training DiffSTE. #12

Open thebestYezhang opened 1 year ago

thebestYezhang commented 1 year ago

Thanks for your great work! I have a question. When I train DiffSTE on my own dataset, the progress bar only prints the loss for the first half of the total training steps; the loss for the remaining half is not shown. I carefully reviewed the training process but couldn't find the problem. Did you run into the same situation during training?

Question406 commented 1 year ago

Hi, actually, the training loss printed in the terminal is the loss for a single batch, not the average loss over the entire dataset. For the details of what is printed, please see this link.
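To illustrate the distinction (a minimal sketch with made-up loss values, not code from this repo): the number shown next to the progress bar is the current batch's loss, so it can jump around, while a dataset-wide average would be much smoother.

```python
# Hypothetical per-batch losses over four training steps.
batch_losses = [0.90, 0.45, 0.30, 0.60]

displayed = []  # what a per-batch progress bar would show at each step
for loss in batch_losses:
    displayed.append(loss)  # the raw batch loss, as printed in the terminal

# A running average over all batches seen so far is smoother,
# but that is NOT what the training progress bar reports.
running_avg = sum(batch_losses) / len(batch_losses)

print("last shown value:", displayed[-1])
print("average over batches:", running_avg)
```

So a noisy or non-monotone number in the bar is expected and does not by itself indicate a training problem.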

The progress bar is split in the middle of an epoch because of output printed during training, namely the do_classifier_free_guidance line; I guess this is some log information you print when you log generated images.
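To see why a stray print splits the bar, here is a minimal stdlib-only sketch (the `render` helper and its arguments are hypothetical, not part of DiffSTE): progress bars like tqdm redraw a single line with a carriage return (`\r`), so any interleaved newline-terminated output ends that line and the bar restarts below it.

```python
import io

def render(steps, log_at=None, logged_line="do_classifier_free_guidance"):
    """Simulate the terminal output of a carriage-return progress bar."""
    out = io.StringIO()
    for step in range(1, steps + 1):
        out.write(f"\rstep {step}/{steps}")  # redraw the same line
        if step == log_at:
            # Interleaved log output, like the do_classifier_free_guidance
            # line from image logging: it ends the bar's line, so the bar
            # continues on a fresh line afterwards.
            out.write("\n" + logged_line + "\n")
    out.write("\n")
    return out.getvalue()

clean = render(4)            # one continuous bar line
split = render(4, log_at=2)  # the bar is split in two by the log line
print(repr(clean))
print(repr(split))
```

This is why tqdm-based tools provide `tqdm.write()` for printing without breaking the bar; routing mid-epoch log messages through it (or through a logger) keeps the bar on one line.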

yang-chenyu104 commented 8 months ago

Could you describe the format of the training data? I want to train the model for text editing.

yang-chenyu104 commented 8 months ago

When I train the model, the log path is giving me trouble.