Closed: midcoastal closed this issue 9 months ago
Enter valid title.
My bad.
how was this model created? the original lcm-dreamshaper model can only be loaded in diffusers format, not safetensors, and all others should either be standard sd models with the lcm lora loaded on top, or with that same lora fused into the model. for those models, there is no need to trigger "special" lcm loader behavior.
if the model somehow started from the original lcm-dreamshaper, then yes, it would need a special loading handler that right now doesn't exist - in which case, that would be a feature request (and i'd need to discuss it with the diffusers team as it needs to be implemented upstream)
Ok. I guess where the confusion comes in is when we explicitly detect `LCM` in the model name and assign the Latent Consistency Model guess, which then causes the error when load is attempted. The curious part is that we actually throw an error if LCM is detected, returning the guess that leads to this class loading, if not using `backend.DIFFUSERS`. Hence my being confused...
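The flow described above can be sketched roughly like this (the function and backend names are illustrative assumptions, not sdnext's actual code):

```python
# Rough sketch of the detection flow described above (hypothetical names,
# not sdnext's actual implementation): a model whose filename contains
# "lcm" gets the Latent Consistency pipeline guess, and an error is
# raised when the backend is not DIFFUSERS.
def guess_pipeline(model_name: str, backend: str) -> str:
    if "lcm" in model_name.lower():
        if backend != "DIFFUSERS":
            # this is the error path described above
            raise ValueError("LCM models require backend.DIFFUSERS")
        return "LatentConsistencyModelPipeline"
    return "StableDiffusionPipeline"
```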
I do not know how the model was created, only that it is an LCM model, and I can confirm that it is not behaving as expected when loaded as a Stable Diffusion model, or as an XL model. So I am left to believe that it does indeed need to be loaded as LCM.
Diffusers does support LCM models, for both T2I and I2I, just not loading them this particular way (using `from_single_file`). I went searching for the class `LatentConsistencyModelPipeline` (which is returned from our own pipeline resolution).
The `diffusers` library has its own "AutoPipeline" implementation in `diffusers.pipelines.auto_pipeline`, in the classes `AutoPipelineForImage2Image` and `AutoPipelineForText2Image`. I haven't played with this at all yet, but looking at it, it seems like we should be using `from_pretrained` rather than `from_single_file`. But I am not certain.
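If the AutoPipeline route works the way it appears to, loading a diffusers-format LCM checkpoint might look like this. This is only a sketch assuming the standard `diffusers` API; `AutoPipelineForText2Image` resolves the concrete pipeline class from the model's own config, and the repo id in the comment is the original lcm-dreamshaper as an example.

```python
def load_lcm_t2i(repo_or_path: str):
    # Sketch only: AutoPipelineForText2Image picks the pipeline class from
    # the model's config, so a diffusers-format LCM folder should come back
    # as a LatentConsistencyModelPipeline. Import deferred so this file can
    # be read without diffusers installed.
    from diffusers import AutoPipelineForText2Image
    return AutoPipelineForText2Image.from_pretrained(repo_or_path)

# e.g. load_lcm_t2i("SimianLuo/LCM_Dreamshaper_v7")  # original lcm-dreamshaper
```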
Looking through the code some, I do see where `from_pretrained` is actually loading a `diffusers`-style directory. Interesting...
I have several models from CivitAI that are marked LCM. Curiously, none of them would trigger the LCM check, as they do not follow the `LCM[_-]` naming convention, so they get loaded as standard `StableDiffusion` models, which I guess, as you said, checks out?
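For illustration, here is how the `LCM[_-]` convention mentioned above behaves against typical names (the exact pattern in sdnext may differ; this regex is assumed from the convention quoted):

```python
import re

# Assumed form of the LCM[_-] name check quoted above; sdnext's actual
# pattern may differ.
LCM_NAME_RE = re.compile(r"LCM[_-]", re.IGNORECASE)

def looks_like_lcm(name: str) -> bool:
    return bool(LCM_NAME_RE.search(name))

# "LCM_Dreamshaper_v7" matches, but a CivitAI-style name that merely
# contains "lcm" (e.g. "dreamshaper-lcm") does not, so that model would
# load as a standard StableDiffusion model.
```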
yes, sdnext is using `from_pretrained` for diffusers-style models - that's what it's for.
for safetensors, there is only `from_single_file` - and it's implemented for some classes, but not for all. i've been working closely with the diffusers team on gradually adding support to more and more (e.g. if you remember, when sdxl first came out, sdnext didn't support loading it from safetensors).
regarding lcm - if there was only one type of lcm model, i would have pushed to add `from_single_file` to that class a long time ago. but as it is, that applies only to the original lcm, and that is 1% of usage; the rest are normal sd models with an lcm lora either loaded or merged on top. for that type of lcm, you load it as a normal sd model.
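The decision described above can be sketched as follows (a hypothetical helper with illustrative names, not sdnext's actual loader):

```python
# Hypothetical sketch of the loader choice described above; names are
# illustrative, not sdnext's actual code.
def choose_loader(is_diffusers_folder: bool, is_original_lcm: bool) -> str:
    if is_diffusers_folder:
        # diffusers-style directory: from_pretrained handles it,
        # including the original lcm-dreamshaper
        return "from_pretrained"
    if is_original_lcm:
        # safetensors of the original lcm: no single-file support yet
        raise NotImplementedError(
            "from_single_file is not implemented for LatentConsistencyModelPipeline"
        )
    # normal sd safetensors, including models with the lcm lora merged in
    return "from_single_file"
```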
as it is, i might disable the name-check for lcm safetensors models, as it doesn't really serve a purpose in its current state.
@vladmandic - see #2569
explicit load for lcm is removed as not needed.
Issue Description
I noticed that one of my LCM models was not loading, so I went in and adjusted the matching criteria for LCM to properly detect the model name, and when I loaded it, I got the following error:
Unsure what I did wrong; I see others here who look like they have gotten LCM to work correctly, but this seems like a bug? This is on current `dev` and a launch with `--reinstall`.
Version Platform Description
No response
Relevant log output
Backend
Diffusers
Branch
Dev
Model
LCM
Acknowledgements