gathierry / FastFlow

Apache License 2.0

Why is the loss negative? #18

Open cyj95 opened 1 year ago

cyj95 commented 1 year ago

[image]

cytotoxicity8 commented 1 year ago

The loss is theoretically the negative log-likelihood. The likelihood is computed as (1/sqrt(2π))^n · exp(−zᵀz / 2) · |det J|, so the negative log-likelihood is (n/2)·log(2π) + zᵀz / 2 − log|det J|. It can be negative for two reasons:

  1. The likelihood is a density, not a probability, so it can itself exceed 1, which makes its negative log negative.
  2. The constant term (n/2)·log(2π) is not included in the actual computation, so the reported loss is lower than the true negative log-likelihood. Refer to the loss in fastflow.py.
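The effect of dropping the constant can be checked numerically. This is a minimal sketch, not the repo's actual code: `nll_full` and `nll_training` are hypothetical helper names, and the log-determinant value is made up for illustration.

```python
import numpy as np

def nll_full(z, log_jac_det):
    # Exact negative log-likelihood under a standard normal base:
    # (n/2)*log(2*pi) + z^T z / 2 - log|det J|
    n = z.size
    return 0.5 * n * np.log(2 * np.pi) + 0.5 * np.sum(z ** 2) - log_jac_det

def nll_training(z, log_jac_det):
    # Loss with the constant term dropped, in the style of the
    # loss computed in fastflow.py: z^T z / 2 - log|det J|
    return 0.5 * np.sum(z ** 2) - log_jac_det

z = np.zeros(10)       # latent vector near the mode of the base density
log_jac_det = 3.0      # hypothetical log|det J| value for illustration

full = nll_full(z, log_jac_det)        # ~6.19: true NLL is positive here
train = nll_training(z, log_jac_det)   # -3.0: negative once the constant is gone

# The two differ by exactly the dropped constant (n/2)*log(2*pi).
assert np.isclose(full - train, 0.5 * z.size * np.log(2 * np.pi))
```

So a negative training loss does not indicate a bug: it only means zᵀz/2 has fallen below log|det J| plus the omitted constant.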