cloneofsimo / lora

Using Low-rank adaptation to quickly fine-tune diffusion models.
https://arxiv.org/abs/2106.09685
Apache License 2.0

AssertionError: Cached latents not supported for inpainting #225

Open pf67 opened 1 year ago

pf67 commented 1 year ago

When I try to run inpainting training with LoRA PTI, the following error occurs:

```
Traceback (most recent call last):
  File "/root/miniconda3/envs/lora-train1/bin/lora_pti", line 8, in <module>
    sys.exit(main())
  File "/root/miniconda3/envs/lora-train1/lib/python3.10/site-packages/lora_diffusion/cli_lora_pti.py", line 1040, in main
    fire.Fire(train)
  File "/root/miniconda3/envs/lora-train1/lib/python3.10/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/root/miniconda3/envs/lora-train1/lib/python3.10/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/root/miniconda3/envs/lora-train1/lib/python3.10/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/root/miniconda3/envs/lora-train1/lib/python3.10/site-packages/lora_diffusion/cli_lora_pti.py", line 856, in train
    assert not cached_latents, "Cached latents not supported for inpainting"
AssertionError: Cached latents not supported for inpainting
```
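For context, the assertion fires inside `train()` in `cli_lora_pti.py` (line 856 in the traceback). A minimal sketch of the guard, assuming a `train_inpainting` flag gates the inpainting path; only `cached_latents` and the assertion text are confirmed by the traceback:

```python
def train(
    # ... other training arguments elided ...
    train_inpainting: bool = False,  # assumed name of the inpainting-mode flag
    cached_latents: bool = True,     # encode instance images once, reuse latents each step
):
    if train_inpainting:
        # Inpainting must recompute masks and masked-image latents per batch,
        # so a one-time latent cache cannot be reused.
        assert not cached_latents, "Cached latents not supported for inpainting"
```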

nafiturgut commented 1 year ago

You can add `--cached_latents=False` to the `lora_pti` CLI command to disable the use of cached latents. This argument must be `False` for inpainting. A sketch of the invocation is below.
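For example (a sketch only: the model name, data, and output paths are placeholders, and `--train_inpainting` is assumed to be the flag that enables the inpainting code path):

```bash
lora_pti \
  --pretrained_model_name_or_path="runwayml/stable-diffusion-inpainting" \
  --instance_data_dir="./data/data_captioned" \
  --output_dir="./output" \
  --train_inpainting=True \
  --cached_latents=False
```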

hudscsc commented 1 year ago

May I ask what `./data/data_captioned` looks like?