WASasquatch / PowerNoiseSuite

Power Noise Suite for ComfyUI is a latent-noise-oriented custom node for adding variation to your generations
GNU General Public License v3.0

normalize function doesn't actually normalize anything in blending functions #10

Open blepping opened 3 months ago

blepping commented 3 months ago

the normalize() in latent_utils.py never actually has an effect, because target_min and target_max are never passed, so it just normalizes the latent back to its own min/max.

it doesn't literally have no effect, since floating point math introduces small imprecisions, but i verified that the output passes torch.isclose(input, output, atol=1e-05, rtol=1e-05). the difference is enough to change seeds, but no actual normalization occurs in blend functions like

    # Simulates a brightening effect by adding tensor b to tensor a, scaled by t.
    'linear dodge': lambda a, b, t: normalize(a + b * t),
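
for reference, here's roughly what the current normalize() boils down to (a paraphrase of latent_utils.py, not a verbatim copy), plus a quick check that it just round-trips its input:

    import torch

    # paraphrased sketch of the current normalize() in latent_utils.py
    def normalize(latent, target_min=None, target_max=None):
        min_val, max_val = latent.min(), latent.max()
        # callers never pass target_min/target_max, so the defaults
        # make the rescale map the latent back onto its own range
        if target_min is None:
            target_min = min_val
        if target_max is None:
            target_max = max_val
        normalized = (latent - min_val) / (max_val - min_val)
        return normalized * (target_max - target_min) + target_min

    x = torch.randn(1, 4, 64, 64)
    print(torch.allclose(normalize(x), x, atol=1e-05, rtol=1e-05))  # True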

since there are two latents involved you could possibly do something like:

def normalize(latent, *, reference_latent=None, dim=(-3, -2, -1)):
    # with no reference there's nothing meaningful to rescale to
    if reference_latent is None:
        return latent
    # per-sample range of the latent being normalized
    min_val, max_val = (
        latent.amin(dim=dim, keepdim=True),
        latent.amax(dim=dim, keepdim=True),
    )
    # per-sample range of the reference to rescale onto
    target_min, target_max = (
        reference_latent.amin(dim=dim, keepdim=True),
        reference_latent.amax(dim=dim, keepdim=True),
    )

    # map to [0, 1], then onto the reference range; the clamp guards
    # against division by zero when a latent is constant
    normalized = (latent - min_val) / (max_val - min_val).clamp(min=1e-12)
    return normalized * (target_max - target_min) + target_min

and

    # Simulates a brightening effect by adding tensor b to tensor a, scaled by t.
    'linear dodge': lambda a, b, t: normalize(a + b * t, reference_latent=a),

to normalize it to the same scale as a. i'm not really sure what's reasonable here, since picking a as the reference seems arbitrary.
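
as a quick sanity check that the reference version actually lands on a's range (this reuses the normalize() sketch above; the shapes are just placeholders):

    import torch

    a = torch.randn(1, 4, 64, 64) * 2.0
    b = torch.randn(1, 4, 64, 64)
    out = normalize(a + b * 0.5, reference_latent=a)
    # per-sample min/max of the blended result now match a's range
    print(torch.allclose(out.amin(dim=(-3, -2, -1)), a.amin(dim=(-3, -2, -1))))  # True
    print(torch.allclose(out.amax(dim=(-3, -2, -1)), a.amax(dim=(-3, -2, -1))))  # True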