tqch / ddpm-torch

Unofficial PyTorch Implementation of Denoising Diffusion Probabilistic Models (DDPM)
MIT License
200 stars 35 forks

DDIM sampler training parameters #10

Closed h-bouzid closed 1 year ago

h-bouzid commented 1 year ago

Hi, thanks for your great work in implementing the DDPM baseline. I'm curious about the changes to the number of steps and the beta start/end values when switching the sampler from DDPM to DDIM. Could you please provide the DDIM sampler training parameters? Thank you in advance.

tqch commented 1 year ago

Hi, regarding the parameters for the DDIM sampler, you can refer to the original DDIM paper. For faster inference, I would consider an $S$ of 10 or 20; for better generation quality, I would suggest an $S$ of no less than 50. You can specify $S$ by setting the subseq-size parameter. As for the time-skipping schedule, the paper recommends the linear schedule for most datasets and the quadratic schedule for CIFAR10. To use one, simply set skip-schedule to linear (or quadratic).
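For reference, here is a minimal sketch of how the linear and quadratic skip schedules from the DDIM paper pick a size-$S$ subsequence of the $T$ training timesteps. The function name and exact rounding are illustrative, not this repo's actual implementation:

```python
import numpy as np

def get_subsequence(num_timesteps, subseq_size, schedule="linear"):
    """Pick `subseq_size` inference timesteps out of `num_timesteps`
    training timesteps, per the DDIM paper's two skip schedules.

    linear:    tau_i is (approximately) evenly spaced
    quadratic: tau_i grows like i**2, concentrating steps near t = 0
    """
    if schedule == "linear":
        subseq = np.linspace(0, num_timesteps - 1, subseq_size)
    elif schedule == "quadratic":
        subseq = np.linspace(0, np.sqrt(num_timesteps - 1), subseq_size) ** 2
    else:
        raise ValueError(f"unknown schedule: {schedule}")
    # round to integers and drop duplicates introduced by rounding
    return np.unique(subseq.astype(int))

# e.g. S = 10 inference steps out of T = 1000 training steps
print(get_subsequence(1000, 10, "linear"))
print(get_subsequence(1000, 10, "quadratic"))
```

With the quadratic schedule, early (low-noise) timesteps are sampled more densely, which is why the paper prefers it for CIFAR10.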

h-bouzid commented 1 year ago

Thank you very much