junhsss / consistency-models

A Toolkit for OpenAI's Consistency Models.
https://arxiv.org/abs/2303.01469
MIT License

Timestep schedule for sampling #1

Closed · kimihailv closed 1 year ago

kimihailv commented 1 year ago

Hello. Could you please explain why you use this timestep schedule:

_timesteps = list(
    reversed(range(0, self.bins_max, self.bins_max // steps - 1))
)[1:]
_timesteps = [t + self.bins_max // ((steps - 1) * 2) for t in _timesteps]

Maybe it is from some paper?

junhsss commented 1 year ago

Hi @kimihailv! I improvised that part since there aren't many details about determining sampling schedules.

That's basically the same schedule used when training is almost finished, so picking evenly spaced points (in _timestep space) from it seems like a natural choice. Maybe the model is overfitted to those exact points, who knows? 😂 I think that's why continuous-time training, without discretization, is better.
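
To make that concrete, here is a standalone version of the snippet with hypothetical values (bins_max = 150 and steps = 5 are made up for illustration):

bins_max = 150  # hypothetical number of discretization bins
steps = 5       # hypothetical number of sampling steps

# Evenly spaced points over the final discretization, highest noise first;
# the [1:] slice drops the largest endpoint.
_timesteps = list(reversed(range(0, bins_max, bins_max // steps - 1)))[1:]

# Shift every point up by bins_max // ((steps - 1) * 2), so samples land
# between grid points rather than on them.
_timesteps = [t + bins_max // ((steps - 1) * 2) for t in _timesteps]

print(_timesteps)  # [134, 105, 76, 47, 18]

So for steps = 5 you get exactly five points, evenly spaced and nudged away from the grid endpoints.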

Practically speaking, though, you can freely choose whatever sampling schedule you want. There shouldn't be a noticeable quality difference.

kimihailv commented 1 year ago

@junhsss Thank you for such a detailed answer! By the way, it is interesting that the authors found the optimal timesteps with a greedy algorithm.

junhsss commented 1 year ago

@kimihailv Oh, I must have missed that part! Thanks for pointing that out.

It is indeed interesting. So we need to run far more than N sampling steps just to find a good N-step schedule... 😅 I'll implement it as soon as possible. Thanks again!
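
For anyone finding this later, here is my rough guess at the shape of that greedy search (not the paper's exact procedure; they reportedly pin the points down one at a time via ternary search, while this sketch just scans candidates linearly, and score is a hypothetical stand-in for the FID objective, which would itself require sampling):

def greedy_timesteps(bins_max, steps, score):
    # Greedily refine the intermediate timesteps one at a time, keeping
    # whichever candidate minimizes score (hypothetical: lower is better,
    # e.g. FID of N-step samples generated with that schedule).
    ts = sorted(range(0, bins_max, max(1, bins_max // steps)), reverse=True)[:steps]
    for i in range(1, steps):  # the first (highest-noise) point stays fixed
        lo = ts[i + 1] + 1 if i + 1 < steps else 0
        hi = ts[i - 1] - 1
        ts[i] = min(range(lo, hi + 1),
                    key=lambda t: score(ts[:i] + [t] + ts[i + 1:]))
    return ts

# Toy objective so the sketch runs end to end (purely illustrative; it just
# pulls the schedule toward a made-up target instead of measuring FID).
target = [120, 60, 30, 15, 7]
print(greedy_timesteps(150, 5, lambda ts: sum((a - b) ** 2 for a, b in zip(ts, target))))
# -> [120, 61, 31, 15, 7]

The expensive part is that every call to score means generating a full batch of samples, which is why the search costs far more than N sampling steps.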