thuiar / MMSA

MMSA is a unified framework for Multimodal Sentiment Analysis.

Question about forward lld (Gaussian prior) and entropy estimation in the MMILB module #74

Open yangmiemiemie1 opened 10 months ago

yangmiemiemie1 commented 10 months ago

`positive = -(mu - y)**2 / 2. / torch.exp(logvar)`

Is the "positive" vector (line 152, above) meant to be the log-density of $p(y|x) \sim \mathcal{N}(y \mid \mu_{\theta_1}(x), \sigma^2_{\theta_2}(x) I)$? Where are the $-(\ln\sigma + C)$ terms from the Normal probability density function? Why are they missing?
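For reference, here is a minimal sketch contrasting what that line computes with the full diagonal-Gaussian log-density. The helper names are illustrative, not from the repository; `mu`, `logvar`, and `y` follow the variable names in the snippet:

```python
import math
import torch

def positive_term(mu: torch.Tensor, logvar: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    # Quadratic term only, as in line 152 of the snippet:
    # -(y - mu)^2 / (2 * sigma^2), with sigma^2 = exp(logvar)
    return -(mu - y) ** 2 / 2. / torch.exp(logvar)

def full_log_density(mu: torch.Tensor, logvar: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    # Full per-dimension log N(y | mu, sigma^2):
    #   -(y - mu)^2 / (2 * sigma^2) - ln(sigma) - 0.5 * ln(2 * pi)
    # where ln(sigma) = 0.5 * logvar; the last two terms are the
    # "-(ln sigma + C)" pieces the question asks about.
    return positive_term(mu, logvar, y) - 0.5 * logvar - 0.5 * math.log(2 * math.pi)
```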

Columbine21 commented 8 months ago

Hi, this code is copied directly from the official implementation of the paper *Improving Multimodal Fusion with Hierarchical Mutual Information Maximization for Multimodal Sentiment Analysis*. Please refer to the official implementation here:

https://github.com/declare-lab/Multimodal-Infomax/blob/cd0774c5a712ca5f1a5497dbf27dde11cade7434/src/modules/encoders.py#L152

We also found some other errors in the symbols. For example, Eq. (3), $I(X;Y) = \mathbb{E}_{p(x,y)}\left[\log \frac{q(y|x)}{p(y)}\right] + \mathbb{E}_{p(\mathbf{y})}\left[\mathrm{KL}(p(y|x) \,\|\, q(y|x))\right]$, should be modified to $I(X;Y) = \mathbb{E}_{p(x,y)}\left[\log \frac{q(y|x)}{p(y)}\right] + \mathbb{E}_{p(\mathbf{x})}\left[\mathrm{KL}(p(y|x) \,\|\, q(y|x))\right]$ (the second expectation is over $p(\mathbf{x})$, not $p(\mathbf{y})$). As for your question about the $-(\ln\sigma + C)$ terms:

  1. The constant $C$ term does not matter, due to the weighted loss coefficient (a constant offset does not change the optimization).
  2. The $-\ln\sigma$ term is probably omitted here; I think it contributes little to model performance, but I am not sure. (See the decomposition sketched after this list.)
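For completeness, the standard decomposition of the Gaussian log-density (a textbook identity, not from the thread) makes explicit which terms line 152 keeps and which it drops:

$$\log \mathcal{N}\!\left(y \mid \mu, \sigma^{2}\right) = \underbrace{-\frac{(y-\mu)^{2}}{2\sigma^{2}}}_{\text{kept ("positive")}} \; \underbrace{-\,\ln\sigma}_{\text{dropped}} \; \underbrace{-\,\tfrac{1}{2}\ln 2\pi}_{\text{dropped constant } C}$$

The dropped constant has zero gradient, while $-\ln\sigma = -\tfrac{1}{2}\,\mathrm{logvar}$ does depend on the predicted variance; this is presumably the term described above as "probably ignored".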

(If you want to discuss the problem further, you can reach me by email: yzq21@mails.tsinghua.edu.cn)