advimman / lama

🦙 LaMa Image Inpainting, Resolution-robust Large Mask Inpainting with Fourier Convolutions, WACV 2022
https://advimman.github.io/lama-project/
Apache License 2.0

Cannot fine tune CelebaHq model checkpoint #268

Open aman-autophoto-ai opened 10 months ago

aman-autophoto-ai commented 10 months ago

Hello, I am unable to load the CelebaHQ model.ckpt for fine-tuning. It shows the warning below even after passing the "+trainer.kwargs.resume_from_checkpoint=LaMa_models/lama-celeba-hq/lama-fourier/models/best.ckpt" flag to use the pretrained checkpoint.

UserWarning: No checkpoint file exists at resume_from_checkpoint. Start from scratch

Kindly help me resolve this issue.

Best regards, Aman

aman-autophoto-ai commented 10 months ago

I found a solution to the above problem, but now it shows another error:

(screenshot: lama_error_model_loading)

aman-autophoto-ai commented 10 months ago

@windj007 Hello, I am currently working on a project where I need to fine-tune my model, and I believe that using the LaMa inpainting CelebaHQ model weights as a starting point could greatly benefit my research. I kindly request that you share the complete CelebaHQ model weight files (both generator and discriminator) with me.

Looking forward to your help and support.

Best regards, Aman Gupta

HappyYuji commented 10 months ago

I found a solution to the above problem, but now it shows another error:

(screenshot: lama_error_model_loading)

Hi! How did you solve this problem? I have the same error message now.

amangupta2303 commented 10 months ago

@HappyYuji

Hello, I addressed the "UserWarning: No checkpoint file exists at resume_from_checkpoint. Start from scratch" warning by using the absolute path to the weight files.
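
A quick way to see why a relative checkpoint path can silently fall back to "start from scratch" is to check whether the path still resolves from the process's working directory; Hydra-style launchers like LaMa's typically switch to a per-run output folder, so a path that is valid from the repo root may not exist at runtime. A minimal sketch with placeholder paths:

```python
# Minimal sketch (placeholder paths): check the checkpoint path the trainer will see.
# If the launcher changes its working directory (Hydra-style run dirs), a relative path
# that works from the repo root can fail at runtime, triggering the
# "No checkpoint file exists at resume_from_checkpoint" warning.
import os

relative_ckpt = "LaMa_models/lama-celeba-hq/lama-fourier/models/best.ckpt"
absolute_ckpt = os.path.abspath(relative_ckpt)  # resolve once, before launching

print("relative path exists here:", os.path.exists(relative_ckpt))
print("absolute path to pass to resume_from_checkpoint:", absolute_ckpt)
```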

Regarding the error shown in the image, I discovered that the pretrained LaMa checkpoint does not include discriminator weights; the .ckpt file contains only the generator weights. This means we cannot fine-tune the pretrained model on our own dataset: the "best.ckpt" weights are intended solely for inference.
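
If it helps anyone verify this on their own download, the checkpoint's state dict can be inspected directly. The sketch below assumes the Lightning-style layout of the released files (a top-level "state_dict" whose keys are prefixed with the module name, e.g. "generator." and "discriminator.") and uses a placeholder path:

```python
# Minimal sketch (placeholder path, assumed key layout): list which top-level modules
# a LaMa checkpoint contains. If no key starts with "discriminator.", the file holds
# only generator weights and cannot resume adversarial fine-tuning.
import torch

ckpt = torch.load(
    "LaMa_models/lama-celeba-hq/lama-fourier/models/best.ckpt",  # placeholder path
    map_location="cpu",
)
state_dict = ckpt.get("state_dict", ckpt)  # Lightning checkpoints nest weights here

modules = sorted({key.split(".")[0] for key in state_dict})
print("top-level modules:", modules)
print("has discriminator weights:",
      any(key.startswith("discriminator.") for key in state_dict))
```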

Furthermore, I believe the authors have only provided a complete checkpoint (both discriminator and generator weights) for the Big LaMa variant.

HappyYuji commented 10 months ago

@aman-autophoto-ai Thank you! Then I, too, hope that the authors will provide a complete checkpoint. Have a nice day.

aman-autophoto-ai commented 10 months ago

+1