Open BbChip0103 opened 1 month ago
I found something more: regardless of the `--alpha_mask` option, the text_encoder was not loaded (it seems related to the transformers module). I solved this issue by pinning transformers==4.42.4.
- not working: `transformers>=4.43.0`
- working: `transformers<=4.42.4`
Sadly, the text_encoder loss curve is still not shown in TensorBoard. But judging by the size of the trained checkpoint, the text_encoder weights may be included.
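Rather than guessing from file size, one way to check is to list the tensor names saved in the checkpoint and look for text-encoder entries. This is only a sketch assuming a safetensors checkpoint; the `text_encoder` / `te_` key prefixes are assumptions and may differ depending on the trainer:

```python
# Hypothetical helper: given the key names stored in a checkpoint,
# return the ones that look like text-encoder weights.
# The "text_encoder" / "te_" prefixes are assumptions, not confirmed.
def text_encoder_keys(keys):
    return [k for k in keys if "text_encoder" in k or k.startswith("te_")]

# With a real safetensors file (uncomment if safetensors is installed):
# from safetensors import safe_open
# with safe_open("checkpoint.safetensors", framework="pt") as f:
#     print(text_encoder_keys(f.keys()))

# Quick demonstration with a mocked key list:
print(text_encoder_keys(["unet.down.0.weight", "text_encoder.embeddings.weight"]))
# → ['text_encoder.embeddings.weight']
```

If this returns an empty list, the checkpoint contains no text-encoder weights and the size difference comes from something else.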
Hi. I usually use the `--alpha_mask` option on the `dev` branch. If I use this option, the text_encoder learning curve disappears from TensorBoard.
Without the `--alpha_mask` option, text_encoder training seems to work well.
Am I doing something wrong?