RayUCF / NF


Normalizing flows training with negative loss #1

Open RayUCF opened 1 year ago

RayUCF commented 1 year ago

@yarinbar Hi, I saw that you posted the question 'Negative -log likelihood loss when training Normalizing Flow'. Could you please let me know whether this problem was solved? I have the same issue: with the same code applied to different datasets, the loss is positive on one dataset but negative on another.

yarinbar commented 1 year ago

In my understanding, since the log likelihood is taken of a probability density rather than a probability mass, the value of p(x) may be higher than 1, which makes log p(x) positive and the negative log-likelihood loss negative.

For example, a uniform distribution on [0, 0.1] has density p(x) = 10, so -log p(x) ≈ -2.3.
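For illustration, here is a minimal sketch (an assumed standalone example using torch.distributions, not code from this repository) that reproduces this: the density of Uniform(0, 0.1) is 10 everywhere on its support, so the negative log-likelihood of any sample is about -2.3.

```python
# Assumed illustration (not from this repo): a continuous density can exceed 1,
# so -log p(x) can be negative.
import torch
from torch.distributions import Uniform

dist = Uniform(0.0, 0.1)        # density p(x) = 10 on [0, 0.1]
x = dist.sample((1000,))
nll = -dist.log_prob(x).mean()  # log p(x) = log(10) ≈ 2.303 for every sample
print(nll.item())               # ≈ -2.303: a perfectly valid negative "loss"
```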

RayUCF commented 1 year ago

Hello Yarin,

Thank you so much for the reply.

From my understanding, the loss function represents the KL divergence. Since it applies a negative natural logarithm, shouldn't it be positive?

If the loss heads toward negative infinity, it is hard to determine when to stop the training process.

Best, Ray
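A hedged sketch of how the two views fit together: the expected NLL equals KL(p_data ‖ p_model) plus the differential entropy of p_data, and that entropy term can be negative for continuous data, so the loss can go below zero even though the KL term itself never does. The snippet below (an assumed standalone example using torch.distributions, not code from this repository) illustrates this with two matching Gaussians.

```python
# Assumed illustration (not from this repo):
# expected NLL = KL(p_data || p_model) + H(p_data), and the differential
# entropy H(p_data) can be negative, so the loss can be negative even though KL >= 0.
import torch
from torch.distributions import Normal, kl_divergence

p_data = Normal(0.0, 0.1)    # narrow "true" data distribution
p_model = Normal(0.0, 0.1)   # a perfectly fitted model, so KL = 0

x = p_data.sample((100_000,))
nll = -p_model.log_prob(x).mean()      # Monte Carlo estimate of the training loss
kl = kl_divergence(p_data, p_model)    # exactly 0 here
h = p_data.entropy()                   # differential entropy ≈ -0.884 for sigma = 0.1

print(nll.item(), (kl + h).item())     # both ≈ -0.88: negative loss, non-negative KL
```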
