yuanzhi-zhu / prolific_dreamer2d

Unofficial implementation of 2D ProlificDreamer

Why the loss term is always 1.0 in VSD? #4

Closed cwchenwang closed 1 year ago

cwchenwang commented 1 year ago

If I set loss_weight_type to none, the loss is always 1.0, which I find quite curious. Even after looking into the code, I still don't see why. [Train_loss_curve_20230704_0015]

yuanzhi-zhu commented 1 year ago

Hi @cwchenwang , can you provide more info about the other parameters?

yuanzhi-zhu commented 1 year ago

@cwchenwang sorry for the confusion, but currently the loss is a dummy value: it only reflects the weight of the loss at each $t$, see https://github.com/yuanzhi-zhu/prolific_dreamer2d/blob/b7a196f460e42c3474f0bbf453b0a70c55f51ed5/model_utils.py#L17-L23. When you set loss_weight_type to none, the weights default to 1, hence the loss you get.
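A minimal sketch of how a weighting schedule like this produces a constant reported "loss" (the function name and the schedule options here are assumptions for illustration, not the repository's actual code):

```python
import torch

def get_loss_weights(betas: torch.Tensor, loss_weight_type: str = "none") -> torch.Tensor:
    """Hypothetical per-timestep loss weights, sketched from the discussion above."""
    alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)
    if loss_weight_type == "none":
        # every timestep gets weight 1 -> the reported "loss" is constantly 1.0
        return torch.ones_like(betas)
    elif loss_weight_type == "snr":
        # signal-to-noise-ratio weighting, one common alternative
        return alphas_cumprod / (1.0 - alphas_cumprod)
    raise ValueError(f"unknown loss_weight_type: {loss_weight_type}")

betas = torch.linspace(1e-4, 0.02, 1000)
weights = get_loss_weights(betas, "none")  # all ones -> flat loss curve at 1.0
```

With "none", whatever scalar is logged per step is just the sampled timestep's weight, so the training curve is a flat line at 1.0.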

cwchenwang commented 1 year ago

Thanks for your explanation.

yuanzhi-zhu commented 1 year ago

Even this loss is not that informative. In the latest update, I found that for VSD it is much smaller than for SDS :]

cwchenwang commented 1 year ago

Shouldn't it always be 1.0? Why is there a difference between SDS and VSD?

yuanzhi-zhu commented 1 year ago

in the latest update you can print the "loss" as: https://github.com/yuanzhi-zhu/prolific_dreamer2d/blob/bd94535c633ab48800cdf050697f52993ef590b1/model_utils.py#L24-L25