Vchitect / LaVie

LaVie: High-Quality Video Generation with Cascaded Latent Diffusion Models
Apache License 2.0

Cannot load ../pretrained_models/stable-diffusion-v1-4 because encoder.conv_in.weight expected shape tensor(..., device='meta', size=(64, 3, 3, 3)), but got torch.Size([128, 3, 3, 3]). #51

Open foxyear-kyumin opened 9 months ago

foxyear-kyumin commented 9 months ago

Have I chosen the wrong model?

Freedomcls commented 9 months ago

Hello, I was wondering if you solved the problem.

maxin-cn commented 9 months ago

Hello, I was wondering if you solved the problem.

@Freedomcls Hi, could you please provide more details about this problem? Thanks~

delcompan commented 6 months ago

Same error as you:

```
  vae = AutoencoderKL.from_pretrained(sd_path, subfolder="vae", torch_dtype=torch.float16).to(device)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "G:\ImageAI\FreeNoise-LaVie\venv\Lib\site-packages\diffusers\models\modeling_utils.py", line 583, in from_pretrained
    raise ValueError(
ValueError: Cannot load <class 'diffusers.models.autoencoder_kl.AutoencoderKL'> from G:/ImageAI/FreeNoise-LaVie/pretrained_models/stable-diffusion-v1-4 because the following keys are missing:
encoder.mid_block.attentions.0.value.weight, decoder.mid_block.attentions.0.proj_attn.weight,
decoder.mid_block.attentions.0.key.bias, decoder.mid_block.attentions.0.query.bias,
decoder.mid_block.attentions.0.key.weight, encoder.mid_block.attentions.0.query.bias,
encoder.mid_block.attentions.0.proj_attn.weight, encoder.mid_block.attentions.0.proj_attn.bias,
decoder.mid_block.attentions.0.value.weight, decoder.mid_block.attentions.0.proj_attn.bias,
encoder.mid_block.attentions.0.key.weight, encoder.mid_block.attentions.0.key.bias,
decoder.mid_block.attentions.0.value.bias, decoder.mid_block.attentions.0.query.weight,
encoder.mid_block.attentions.0.value.bias, encoder.mid_block.attentions.0.query.weight.
Please make sure to pass low_cpu_mem_usage=False and device_map=None if you want to randomly initialize those weights or else make sure your checkpoint file is correct.
```
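For what it's worth, the missing keys in this traceback (`query`, `key`, `value`, `proj_attn` under `mid_block.attentions.0`) are the *old* diffusers attention-parameter names; newer diffusers releases renamed them (roughly `to_q`, `to_k`, `to_v`, `to_out.0`). So this usually points to a version mismatch between the installed diffusers library and the downloaded checkpoint, not a corrupted file. Below is a small diagnostic sketch (the mapping and helper are my own, not part of diffusers) that inspects a state dict's keys to tell which naming convention a local checkpoint uses, so you know whether to re-download the model or change the pinned diffusers version:

```python
# Hypothetical helper to diagnose the "missing keys" error above.
# Assumption: the checkpoint and the installed diffusers disagree on the
# attention-parameter naming convention (old: query/key/value/proj_attn,
# new: to_q/to_k/to_v/to_out.0).

OLD_NAMES = {"query", "key", "value", "proj_attn"}
NEW_NAMES = {"to_q", "to_k", "to_v"}

def attention_key_style(state_dict_keys):
    """Return 'old', 'new', or 'unknown' for a VAE state dict's attention keys."""
    attn_keys = [k for k in state_dict_keys if ".attentions." in k]
    for k in attn_keys:
        # e.g. "encoder.mid_block.attentions.0.query.weight" -> "query"
        param = k.rsplit(".", 2)[-2]
        if param in OLD_NAMES:
            return "old"
        if param in NEW_NAMES or ".to_out.0." in k:
            return "new"
    return "unknown"
```

You could call this on the keys of the checkpoint's `state_dict` (e.g. loaded with `torch.load` or `safetensors`) and compare against what your installed diffusers expects; if the styles differ, re-downloading the stable-diffusion-v1-4 weights with the diffusers version pinned in this repo's requirements should resolve the load error.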