ToTheBeginning / PuLID

[NeurIPS 2024] Official code for PuLID: Pure and Lightning ID Customization via Contrastive Alignment

KeyError 'img_in.weight._data' on run with --offload --fp8 #69

Closed: heinrichI closed this issue 2 months ago

heinrichI commented 2 months ago

Traceback (most recent call last):
  File "I:\flux\PuLID\app_flux.py", line 325, in <module>
    demo = create_demo(args, args.name, args.device, args.offload, args.aggressive_offload)
  File "I:\flux\PuLID\app_flux.py", line 197, in create_demo
    generator = FluxGenerator(model_name, device, offload, aggressive_offload, args)
  File "I:\flux\PuLID\app_flux.py", line 39, in __init__
    self.model, self.ae, self.t5, self.clip = get_models(
  File "I:\flux\PuLID\app_flux.py", line 25, in get_models
    model = load_flow_model_quintized(name, device="cpu" if offload else device)
  File "I:\flux\PuLID\flux\util.py", line 158, in load_flow_model_quintized
    requantize(model, sd, quantization_map, device=device)
KeyError: 'img_in.weight._data'

optimum-quanto==0.2.4 torch==2.4.1+cu121 torchvision==0.19.1+cu121

As ckpt_path I pass unet/flux1-dev-fp8.safetensors.

ToTheBeginning commented 2 months ago

We use the flux-fp8 checkpoint from https://huggingface.co/XLabs-AI/flux-dev-fp8 (it will be downloaded automatically); please check whether you are using the same model. We have also updated the readme to make this clearer.
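For context on why a different fp8 file fails here: optimum-quanto serializes quantized weights under `<param>._data` / `<param>._scale` keys and reloads them with `requantize()` plus a quantization map, so a checkpoint that stores plain fp8 tensors has no `img_in.weight._data` entry. The sketch below uses a toy module, not the Flux architecture; the real loading happens in flux/util.py's `load_flow_model_quintized`.

```python
# Illustrative sketch (toy model, not Flux): shows the "._data"/"._scale"
# key layout that requantize() expects, and why a plain fp8 state dict
# triggers KeyError: '...weight._data'.
import torch
from optimum.quanto import freeze, qfloat8, quantization_map, quantize, requantize

model = torch.nn.Sequential(torch.nn.Linear(8, 8))
quantize(model, weights=qfloat8)  # weights-only fp8 quantization
freeze(model)

sd = model.state_dict()
qmap = quantization_map(model)
print(sorted(sd))  # keys like ['0.bias', '0.weight._data', '0.weight._scale']

# Reloading succeeds only for a quanto-serialized state dict:
fresh = torch.nn.Sequential(torch.nn.Linear(8, 8))
requantize(fresh, sd, qmap, device=torch.device("cpu"))

# A generic fp8 checkpoint stores plain "weight" tensors instead, so the
# "._data" lookup fails with the KeyError reported in this issue.
```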

smthemex commented 2 months ago

I also encountered an error when loading the X-Labs model and its accompanying config file. Instead, I switched to other unet and unet config files, and it ran successfully. The following is a picture of the successful run: flux_pulid_fp8_12GVR

ToTheBeginning commented 2 months ago

> I also encountered an error when loading the X-Labs model and its accompanying config file. Instead, I switched to other unet and unet config files, and it ran successfully. The following is a picture of the successful run: flux_pulid_fp8_12GVR

This does not look like PuLID-FLUX. As far as we know, the ComfyUI port is still in development.

smthemex commented 2 months ago

> I also encountered an error when loading the X-Labs model and its accompanying config file. Instead, I switched to other unet and unet config files, and it ran successfully. The following is a picture of the successful run: flux_pulid_fp8_12GVR

> This does not look like PuLID-FLUX. As far as we know, the ComfyUI port is still in development.

It is working with PuLID-FLUX, and it runs in 12 GB of VRAM.

heinrichI commented 2 months ago

> flux-dev-fp8

Yes, I was passing a different flux-dev-fp8.safetensors, which weighed 16 GB. With the correct file it worked.
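For anyone hitting the same thing, a sketch of fetching the checkpoint the maintainers point to (the filename is an assumption based on the XLabs-AI/flux-dev-fp8 model page; PuLID normally downloads it automatically when no local path is given):

```python
# Sketch: manually fetching the quanto-compatible fp8 checkpoint.
# The filename is assumed from the XLabs-AI/flux-dev-fp8 repo listing.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="XLabs-AI/flux-dev-fp8",
    filename="flux-dev-fp8.safetensors",
)
print(ckpt_path)  # pass this path as ckpt_path instead of a generic fp8 file
```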