Open · rabeeh-karimi opened this issue 2 years ago
Hi,
Thanks for the questions. Re 1: top_p is in the code because I was curious about truncated sampling, similar to the truncation trick in the BigGAN paper. It helps sample quality a bit and could be an interesting sampling approach, but this experiment is not included in any of the results in the paper.
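To illustrate the idea, here is a minimal sketch of truncated noise sampling in the spirit of the BigGAN truncation trick: entries of the Gaussian noise whose magnitude falls outside the central top_p probability mass are resampled until everything lies inside. The function name and exact resampling scheme here are illustrative, not necessarily what the repo implements.

```python
import torch

def truncated_noise(shape, top_p=0.95, generator=None):
    """Sample standard normal noise, resampling entries whose magnitude
    exceeds the threshold containing the central top_p probability mass
    (a sketch of BigGAN-style truncated sampling; illustrative only)."""
    normal = torch.distributions.Normal(0.0, 1.0)
    # Threshold t such that P(|z| <= t) = top_p for z ~ N(0, 1).
    thresh = normal.icdf(torch.tensor(0.5 + top_p / 2.0)).item()
    noise = torch.randn(shape, generator=generator)
    while True:
        mask = noise.abs() > thresh
        if not mask.any():
            break
        # Redraw only the out-of-range entries.
        noise[mask] = torch.randn(int(mask.sum().item()), generator=generator)
    return noise
```

Lower top_p trades diversity for sample quality, which matches the "helps sample quality a bit" observation above.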
Re 2: technically this is not necessary, but it makes the implementation easier; it was adopted from the OpenAI codebase. You can turn it on or off, as long as you are consistent between training and inference.
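For reference, a minimal sketch of that kind of timestep rescaling (assuming the usual convention from the OpenAI diffusion codebase of mapping integer steps onto a fixed 0-1000 reference scale; the function name and flag here are illustrative):

```python
def scale_timesteps(t, num_timesteps, rescale=True):
    """Map an integer timestep t in [0, num_timesteps) onto a fixed
    [0, 1000) float scale, so the model sees the same timestep range
    regardless of how many diffusion steps are configured."""
    if rescale:
        return t * (1000.0 / num_timesteps)
    return float(t)
```

The point of consistency is that the model's timestep embedding is learned on whichever scale you feed it, so training and inference must agree.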
Hope this helps!
Hi, thanks for sharing the code. I wonder why there is top_p in the code (the part where you adjust the noise), and also why the timesteps are scaled here.
Are these necessary? Thanks!