I trained the pre-optimized LoRAs with train_preoptimized_liloras.py.
I used them to train the hypernetwork with train_hyperdreambooth.py.
I passed an image as input to generate the weights with hypernetwork_gen_weigth.py.
At this point, I would like to test the result by running inference_test.py, but I get this error:
Traceback (most recent call last):
  File "/home/wizard/bendai/research/dawnai/hyper_dreambooth/inference_test.py", line 19, in <module>
    pipe.unet.load_attn_procs(model_path)
  File "/home/wizard/mambaforge/envs/hyper/lib/python3.9/site-packages/diffusers/loaders.py", line 297, in load_attn_procs
    state_dict = safetensors.torch.load_file(model_file, device="cpu")
  File "/home/wizard/mambaforge/envs/hyper/lib/python3.9/site-packages/safetensors/torch.py", line 259, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
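
For reference, the failing part of inference_test.py boils down to a call like this (a minimal sketch; the base model ID and model_path are placeholders on my side):

    import torch
    from diffusers import StableDiffusionPipeline

    base_model = "runwayml/stable-diffusion-v1-5"  # placeholder base model
    model_path = "output/generated_weights"        # placeholder: where hypernetwork_gen_weigth.py wrote its output

    pipe = StableDiffusionPipeline.from_pretrained(base_model, torch_dtype=torch.float16)
    pipe.unet.load_attn_procs(model_path)  # this is the call (line 19) that raises HeaderTooLarge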
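In case it helps: as far as I understand, a safetensors file starts with an 8-byte little-endian integer giving the size of a JSON header, and HeaderTooLarge means that number is implausibly big, i.e. the file is probably not in safetensors format at all. Here is a quick sanity check on the generated file (a sketch; the path is a placeholder):

    import json
    import struct

    model_file = "output/generated_weights/pytorch_lora_weights.safetensors"  # placeholder path

    with open(model_file, "rb") as f:
        # safetensors layout: 8-byte little-endian header size, then a JSON header
        (header_len,) = struct.unpack("<Q", f.read(8))
        print("declared header length:", header_len)
        # an absurdly large value here is what triggers HeaderTooLarge
        header = json.loads(f.read(header_len))
        print("first tensor keys:", list(header)[:5])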
Any idea? Thank you!