YongfeiYan / Neural-Document-Modeling

PyTorch implementations of NVDM, GSM, NTM, NTMR

The topic diversity regularization penalty may have the wrong sign. #5

Closed pangzss closed 2 years ago

pangzss commented 2 years ago

Thanks for sharing your code, which has been very helpful to me. There seems to be a minor issue with the topic diversity regularization term, though.

In the GSM paper, the objective given in the appendix,

`J = L + lambda * (mean - var)`,

is a quantity to be maximized. Expressed as a loss to be minimized, the regularization term should therefore be

`lambda * (var - mean)`,

which matches the description in the paper:

> During training, the mean angle is encouraged to be larger while the variance is suppressed to be smaller, so that all of the topics will be pushed away from each other in the topic semantic space.
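
For concreteness, here is a minimal sketch of the penalty with the sign flipped so it can be added to a minimized loss. This is not the repository's exact code: the function name, the `topic_emb` argument, and the cosine-based angle computation are my assumptions about what the utility roughly does.

```python
import torch

def topic_diversity_penalty(topic_emb: torch.Tensor, eps: float = 1e-7) -> torch.Tensor:
    """Sign-corrected diversity penalty (sketch, not the repo's exact code).

    topic_emb: (K, D) matrix of topic vectors. The GSM objective adds
    lambda * (mean_angle - var_angle) to a quantity that is *maximized*,
    so as a term in a minimized loss the sign flips: var_angle - mean_angle.
    """
    normed = topic_emb / topic_emb.norm(dim=1, keepdim=True)  # unit-normalize rows
    cosines = normed @ normed.t()                             # pairwise cosine similarities
    angles = torch.acos(cosines.clamp(-1 + eps, 1 - eps))     # pairwise angles in radians
    # keep only off-diagonal entries (a topic's angle with itself is 0)
    mask = ~torch.eye(topic_emb.size(0), dtype=torch.bool, device=topic_emb.device)
    angles = angles[mask]
    # small variance and large mean angle => small penalty
    return angles.var() - angles.mean()
```

With this sign, a (hypothetical) training loss would read `loss = recon + kld + lam * topic_diversity_penalty(topic_emb)`, so minimizing the loss pushes topics apart, as the paper intends.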

However, the code does not seem to flip the sign of the result returned by this line:

https://github.com/YongfeiYan/Neural-Document-Modeling/blob/763972476f391872eec8de73472cf836f08ee054/models/utils.py#L181

and instead uses the result directly as the penalty term here:

https://github.com/YongfeiYan/Neural-Document-Modeling/blob/763972476f391872eec8de73472cf836f08ee054/models/NTM.py#L50

What do you think?

Looking forward to your reply. Thanks again.