Kinyugo / consistency_models

A mini-library for training consistency models.
https://arxiv.org/abs/2303.01469
MIT License

Obtain latent samples from the initial samples #2

Closed discordance closed 10 months ago

discordance commented 1 year ago

Hello!

Just trying to implement something similar here:

https://github.com/Kinyugo/consistency_models/blob/57812874a0e450bfba70727dd38864c5d1501644/consistency_models/consistency_models.py#L403

I don't really get how you can be sure it won't just be any random latent, because this process is stochastic:

# Obtain latent samples from the initial samples
a = a + sigmas[0] * torch.randn_like(a)
b = b + sigmas[0] * torch.randn_like(b)

I mean that if you have a batch of clean a, add sigmas[0] * torch.randn_like(a), and then denoise again, you won't recover a or anything similar to it. Am I missing something?

Kinyugo commented 11 months ago

Given the amount of noise that you add, the denoised output can still be quite close to the original sample, and this also yields good interpolation results.
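A toy sketch (not the library's code; the tensor shapes and sigma values here are made up for illustration) of the point being debated: how much of the original sample survives in the latent depends entirely on the noise scale. At a small sigma the latent is still strongly correlated with the clean sample, while at a large sigma (e.g. sigma_max ≈ 80 in typical EDM-style schedules) it is essentially random noise:

```python
# Illustrates how the noise scale controls how much of the clean sample
# survives in the latent a + sigma * noise. Shapes and sigmas are toy values.
import torch

torch.manual_seed(0)

a = torch.randn(4, 1024)  # a batch of "clean" samples (toy data)

for sigma in (0.1, 2.0, 80.0):  # hypothetical noise levels
    latent = a + sigma * torch.randn_like(a)
    # Cosine similarity between clean and noised batches: near 1 means the
    # latent still encodes the original; near 0 means noise dominates.
    cos = torch.nn.functional.cosine_similarity(a, latent, dim=1).mean()
    print(f"sigma={sigma:5.1f}  mean cosine similarity={cos:.3f}")
```

So whether the denoised result resembles the input comes down to which sigma the code uses as sigmas[0]: a small value keeps the latent (and hence the reconstruction) close to the original, while the maximum noise level effectively discards it.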