huggingface / diffusers

🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
https://huggingface.co/docs/diffusers
Apache License 2.0

Using a custom pipeline with from_single_file #9552

Open agarwalml opened 5 days ago

agarwalml commented 5 days ago

I'm trying to do the same thing as https://github.com/huggingface/diffusers/issues/3567 using from_single_file() (assuming this is the renamed from_ckpt()).

So far, this is what I have:

import torch
from diffusers import StableDiffusionPipeline, AutoencoderKL

# `checkpoint` is the path to a local .safetensors/.ckpt file
pipe = StableDiffusionPipeline.from_single_file(
    checkpoint,
    torch_dtype=torch.float16,
    variant="fp16",
    clip_skip=2,
)

# Load custom VAE for sketch and anime checkpoints
if "sketch" in checkpoint:
    vae = AutoencoderKL.from_single_file("models/vaes/klf8vae.safetensors", torch_dtype=torch.float16)
    pipe.vae = vae
    print(f"Loaded custom VAE for sketch")

# Load the custom LPW pipeline
from diffusers.utils import get_class_from_dynamic_module
LPWStableDiffusionPipeline = get_class_from_dynamic_module(
    "lpw_stable_diffusion", module_file="lpw_stable_diffusion.py"
)

# Filter out 'image_encoder' from components
components = {k: v for k, v in pipe.components.items() if k != 'image_encoder'}
pipe = LPWStableDiffusionPipeline(**components)

The file keeps getting reloaded every single iteration, which causes my server to shut down. Any clue how I can keep the .py file stable?

Originally posted by @agarwalml in https://github.com/huggingface/diffusers/issues/3567#issuecomment-2381028483

asomoza commented 5 days ago

Hi, diffusers doesn't load anything in a loop, so if the file is being reloaded multiple times it's probably because of something in your code and not in diffusers. From what I remember, the only thing that could happen is that the custom pipeline doesn't get loaded at all, but in no case will diffusers try to reload a file in a loop.
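
If you're building the pipeline inside your request handler or generation loop, the usual fix is to resolve the custom class and construct the pipeline once at startup and only reuse them per request. A rough sketch of that idea using the same calls from your snippet (the handler name, `checkpoint`, and the inference settings are just placeholders for your own setup):

import torch
from diffusers import StableDiffusionPipeline
from diffusers.utils import get_class_from_dynamic_module

# Resolve the dynamic module once, at import/startup time,
# so the lpw_stable_diffusion.py file is fetched and loaded a single time.
LPWStableDiffusionPipeline = get_class_from_dynamic_module(
    "lpw_stable_diffusion", module_file="lpw_stable_diffusion.py"
)

# Build the pipeline once as well and keep it around for the lifetime of the server.
base = StableDiffusionPipeline.from_single_file(checkpoint, torch_dtype=torch.float16)
components = {k: v for k, v in base.components.items() if k != "image_encoder"}
pipe = LPWStableDiffusionPipeline(**components).to("cuda")

def handle_request(prompt):
    # Only inference happens per request; nothing is downloaded or reloaded here.
    return pipe(prompt, num_inference_steps=25).images[0]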

Also, that pipeline is old and doesn't get any updates anymore. The same author wrote an excellent external library you can use instead; it works better with the current changes and you don't need a custom pipeline at all.