xesdiny · closed 4 months ago
Oops! I forgot to mention the weighting construction for the two terms (1, 2) in lookup_free_quantize.py:
```python
...
loss = (sample_minimization_weight * sample_entropy) - (
    batch_maximization_weight * avg_entropy
)
return sample_entropy, avg_entropy, loss
```
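For context, here is how I read that loss (a minimal PyTorch sketch of my understanding, not the repo's exact code; the softmax temperature and the default weights of 1.0 are my assumptions):

```python
import torch
import torch.nn.functional as F

def entropy_loss_sketch(logits, temperature=0.01,
                        sample_minimization_weight=1.0,
                        batch_maximization_weight=1.0):
    """Sketch of an LFQ-style entropy loss; affinities are over the last dim."""
    probs = F.softmax(logits / temperature, dim=-1)
    log_probs = F.log_softmax(logits / temperature, dim=-1)

    # Per-sample entropy: small when each token commits to a single code.
    sample_entropy = -(probs * log_probs).sum(dim=-1).mean()

    # Entropy of the batch-averaged code distribution: large when all codes are used.
    avg_probs = probs.reshape(-1, probs.shape[-1]).mean(dim=0)
    avg_entropy = -(avg_probs * torch.log(avg_probs + 1e-5)).sum()

    # Minimize per-sample entropy, maximize codebook-usage entropy.
    loss = (sample_minimization_weight * sample_entropy) - (
        batch_maximization_weight * avg_entropy
    )
    return sample_entropy, avg_entropy, loss
```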
I see that the LFQ path returns three loss values, but in VQLPIPSWithDiscriminator.py the two entropy terms are just printed. Is that because the entropies only serve as monitoring metrics (with the sign flipped on the average entropy to reflect the maximization term), rather than being optimized on their own?
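My guess at how the three returned values are consumed (a hypothetical training-step snippet reusing the sketch above; rec_loss and g_loss are placeholder names, not the actual VQLPIPSWithDiscriminator.py code):

```python
import torch

logits = torch.randn(8, 16, 2 ** 10)    # (batch, tokens, codebook_size), dummy values
rec_loss = torch.tensor(0.5)            # placeholder reconstruction loss
g_loss = torch.tensor(0.1)              # placeholder generator/GAN loss

sample_entropy, avg_entropy, entropy_aux_loss = entropy_loss_sketch(logits)

# Only the combined scalar would be backpropagated; the two entropies are
# detached and logged purely for monitoring.
total_loss = rec_loss + g_loss + entropy_aux_loss
log_dict = {
    "train/sample_entropy": sample_entropy.detach(),
    "train/avg_entropy": avg_entropy.detach(),
}
print(log_dict)
```

For reference, the docstring of the entropy loss function: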
""" Entropy loss of unnormalized logits
logits: Affinities are over the last dimension
https://github.com/google-research/magvit/blob/05e8cfd6559c47955793d70602d62a2f9b0bdef5/videogvt/train_lib/losses.py#L279 LANGUAGE MODEL BEATS DIFFUSION — TOKENIZER IS KEY TO VISUAL GENERATION (2024) """