inspirit opened this issue 6 months ago
Hi @inspirit, I am definitely not an expert on LFQ (on which there is quite some work going on in this repo, if I am not mistaken), but why is the negativity of the entropy_aux_loss an issue? I wondered the same at some point, but in the end the goal is to reach the minimum of the loss function, not necessarily a positive value.
Hey @MisterBourbaki, you're right, and as I pointed out when opening the issue, even with a negative aux_loss the training progresses as expected and the overall reconstruction error goes down. But we still need to carefully select the weight for the entropy part of aux_loss, since it affects both codebook usage and the overall reconstruction error. When I tune the entropy weight to increase codebook usage, it does not lead to lower reconstruction error as one would hope; maybe it just requires much longer training to benefit from the larger codebook usage... I wonder.
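To make the trade-off concrete, here is a minimal sketch of how one could log codebook utilisation against reconstruction error while sweeping entropy_loss_weight. The `codebook_usage` helper and the functions in the commented sweep are just illustrative, not anything from this repo:

```python
import torch

def codebook_usage(indices: torch.Tensor, codebook_size: int) -> float:
    """Fraction of the codebook actually hit by the collected LFQ indices."""
    return indices.unique().numel() / codebook_size

# sketch of a sweep: retrain (or fine-tune) with each weight, collect the LFQ
# indices over a validation set, then compare usage vs. reconstruction error
# for weight in (0.01, 0.1, 0.25):
#     model = train_autoencoder(entropy_loss_weight = weight)   # hypothetical training fn
#     usage = codebook_usage(collect_val_indices(model), codebook_size = 2 ** 16)
#     print(weight, usage, evaluate_recon_error(model))
```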
The reconstruction loss with VQ is a positive value, while LFQ returns a negative one.
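For what it's worth, the sign falls out of how the entropy penalty is defined in MAGVIT-v2, which (roughly, I haven't checked the exact weighting in this repo) is what LFQ implements here:

$$\mathcal{L}_{\text{entropy}} \;=\; \mathbb{E}\big[H(q(z \mid x))\big] \;-\; \gamma \, H\big(\mathbb{E}[q(z \mid x)]\big)$$

The first term pushes each token towards a confident code assignment, while the second (the batch/codebook entropy, scaled by diversity_gamma) is maximised to spread usage over many codes; since it enters with a minus sign, good codebook usage naturally drives the whole auxiliary term below zero.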
Have you found a good reconstruction loss weight there? I have the same issue. We are training a VQVAE-GAN, and because of the negative aux_loss the whole generative loss ends up negative.
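A negative total isn't a problem in itself; here is a toy sketch (made-up numbers and weighting, not from any particular config) of the bookkeeping, where what actually matters is watching the individual terms:

```python
import torch

# toy numbers purely to illustrate the bookkeeping, not real training values
recon_loss = torch.tensor(0.35)   # reconstruction / perceptual term, always >= 0
gan_loss   = torch.tensor(0.10)   # adversarial (generator) term
aux_loss   = torch.tensor(-1.80)  # LFQ entropy auxiliary loss, often negative

gen_loss = recon_loss + gan_loss + aux_loss   # the total can dip below zero
# the optimiser only follows the gradient, so a negative total is fine in itself;
# track recon_loss going down and codebook usage going up separately instead
print(gen_loss.item())
```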
Hi Phil,
I was experimenting with FSQ/LFQ for a 3D-motion autoencoder and was wondering how to work through LFQ's variety of options. With FSQ it's quite straightforward, since it does not have any losses and you just need to find suitable levels for the particular task. However, when I start training with LFQ, I immediately face a negative aux_loss, which is of course due to the entropy_aux_loss. I understand that we can combat it by decreasing diversity, but I don't like that idea; another approach would be lowering entropy_loss_weight, which I believe can lead to poor codebook utilisation...
So what are the options for tuning the LFQ parameters? By the way, even with a negative aux_loss it still seems to train fine; at least the reconstruction loss goes down.
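For reference, here is roughly how the two quantizers get set up, as a minimal sketch following this repo's README; the argument values are just starting points, and the sequence shapes are what I'd assume for motion data rather than anything prescribed:

```python
import torch
import torch.nn.functional as F
from vector_quantize_pytorch import FSQ, LFQ

# FSQ: no auxiliary loss, the only knob is the per-dimension levels
fsq = FSQ(levels = [8, 5, 5, 5])
x = torch.randn(1, 1024, 4)                         # feature dim must equal len(levels)
x_hat, indices = fsq(x)

# LFQ: the forward pass also returns the entropy auxiliary loss discussed above
lfq = LFQ(
    codebook_size = 65536,                          # must be a power of 2
    dim = 16,                                       # input feature dimension
    entropy_loss_weight = 0.1,                      # weight on the entropy term of the aux loss
    diversity_gamma = 1.                            # weight on the codebook-diversity (batch entropy) term
)
feats = torch.randn(1, 1024, 16)
quantized, indices, entropy_aux_loss = lfq(feats)   # entropy_aux_loss can come back negative

# one way to combine them; in practice the recon term comes from the decoder output
recon_loss = F.mse_loss(quantized, feats)
total_loss = recon_loss + entropy_aux_loss
```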