mathiesonlab / pg-gan


discriminator entropy regularisation confusion #3

Closed · grahamgower closed this 2 years ago

grahamgower commented 2 years ago

I think I found a small problem with the regularisation term in the discriminator loss function. The cross-entropy function being used is Keras' BinaryCrossentropy with from_logits=True. This means that for cross_entropy(a, b), a should be probabilities (or class labels) on [0, 1] and b should be logits on (-inf, inf). https://github.com/mathiesonlab/pg-gan/blob/b1a82ae331fb852ec486e43eb0efd5a16786237a/pg_gan.py#L288
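(For reference, a minimal illustration of the Keras convention being described, assuming a standalone TensorFlow setup; the tensors here are just example values, not values from pg-gan.)

```python
import tensorflow as tf

cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)

# First argument: labels/probabilities in [0, 1].
# Second argument: raw logits in (-inf, inf); sigmoid is applied internally.
labels = tf.constant([1.0, 0.0, 1.0])
logits = tf.constant([2.0, -1.5, 0.3])

loss = cross_entropy(labels, logits)
print(float(loss))
```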

In the discriminator loss function, this cross entropy function is used appropriately for the main loss term. https://github.com/mathiesonlab/pg-gan/blob/b1a82ae331fb852ec486e43eb0efd5a16786237a/pg_gan.py#L345-L347

But then for the regularisation term, both arguments are given the same value. I guess the first argument is wrong. https://github.com/mathiesonlab/pg-gan/blob/b1a82ae331fb852ec486e43eb0efd5a16786237a/pg_gan.py#L349-L354
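For what it's worth, here is a minimal sketch of what I imagine the corrected term would look like, assuming the intent is the binary entropy of the discriminator's output on real data. The function name and the real_output argument are illustrative, not the exact names in pg_gan.py:

```python
import tensorflow as tf

cross_entropy = tf.keras.losses.BinaryCrossentropy(from_logits=True)

def entropy_term(real_output):
    """Binary entropy of the discriminator's predictions.

    real_output contains raw logits. Converting them to probabilities
    with sigmoid and passing those as the label argument gives the
    cross-entropy of the predicted distribution with itself, i.e. its
    entropy, rather than cross-entropy of logits with logits.
    """
    probs = tf.sigmoid(real_output)          # logits -> probabilities
    return cross_entropy(probs, real_output)  # labels=probs, preds=logits

# hypothetical usage: combined with the main loss via the existing
# 0.001 constant, whose value may need re-tuning after this change
```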

Maybe this still has a regularising effect? I guess the constant 0.001 would need to be empirically rediscovered if this code were changed?

saramathieson commented 2 years ago

Thank you for letting me know about this! (and apologies for the delay, I missed this issue earlier) I have fixed the entropy calculation now. I'll need to do more testing to reevaluate the constant, but I don't think this regularization term is doing that much anyway.