FoundationVision / VAR

[NeurIPS 2024 Oral][GPT beats diffusion🔥] [scaling laws in visual generation📈] Official impl. of "Visual Autoregressive Modeling: Scalable Image Generation via Next-Scale Prediction". An *ultra-simple, user-friendly yet state-of-the-art* codebase for autoregressive image generation!

Question about the cross-entropy loss average? #53

Closed: Yheechou closed this issue 5 months ago

Yheechou commented 6 months ago

What level of cross-entropy loss during training indicates that the model has converged well?

YilanWang commented 6 months ago

I'm not the author, but you could try the entropy loss used in CodeFormer.

keyu-tian commented 6 months ago

@Yheechou cross entropy < 5.5 will look good. It's a 4096-category classification task.
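For context, random guessing over a 4096-way classification task gives a cross-entropy of ln(4096) ≈ 8.32 nats, so a loss below 5.5 is well under the chance level. Below is a minimal sketch of how such a per-token cross-entropy is computed in PyTorch; the shapes and variable names are illustrative assumptions, not taken from the VAR codebase:

```python
import torch
import torch.nn.functional as F

V = 4096      # codebook / class count, per the reply above
B, L = 2, 680  # hypothetical batch size and token sequence length

logits = torch.randn(B, L, V)          # model outputs: one 4096-way softmax per token
targets = torch.randint(0, V, (B, L))  # ground-truth codebook indices

# Mean cross-entropy in nats; random guessing gives ln(4096) ≈ 8.32,
# so a trained model should sit well below that (< 5.5 per the reply above).
loss = F.cross_entropy(logits.reshape(-1, V), targets.reshape(-1))
print(f"cross-entropy: {loss.item():.3f}")  # ≈ 8.32 for these random logits
```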

Yheechou commented 6 months ago

thanks

MiracleDance commented 5 months ago

@keyu-tian And what level of `Accm` would indicate good training?
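The thread doesn't define `Accm`; if it refers to the mean top-1 token accuracy printed in the training logs (an assumption), it would correspond to something like the sketch below, where the chance level is only 1/4096 ≈ 0.02%:

```python
import torch

V = 4096       # codebook size, per the reply above
B, L = 2, 680  # hypothetical batch size and token sequence length

logits = torch.randn(B, L, V)
targets = torch.randint(0, V, (B, L))

# Assumed meaning of Accm: fraction of tokens whose top-1 prediction
# matches the ground-truth codebook index, averaged over all tokens.
preds = logits.argmax(dim=-1)                 # (B, L) predicted indices
acc_mean = (preds == targets).float().mean()  # mean top-1 token accuracy
print(f"Accm: {acc_mean.item() * 100:.2f}%")  # ≈ 0.02% for random logits
```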