Closed · codingperks closed this issue 1 year ago
Yeah, it’s expected. If you look at the average loss curve, you should see a small decrease over time, but only a very small one.
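To make that small trend visible, a common trick is to smooth the raw per-step losses with an exponential moving average (this is what TensorBoard's smoothing slider does). A minimal sketch, assuming you have your logged losses as a plain list — the simulated `raw` series below is illustrative, not real training output:

```python
import random

def ema_smooth(losses, alpha=0.98):
    # Exponential moving average: higher alpha = heavier smoothing.
    smoothed, avg = [], losses[0]
    for x in losses:
        avg = alpha * avg + (1 - alpha) * x
        smoothed.append(avg)
    return smoothed

# Simulate a noisy, slowly decreasing loss, similar in shape to a
# diffusion fine-tune: per-step noise dwarfs the underlying trend,
# so the raw curve looks flat to the eye.
random.seed(0)
raw = [1.0 - 0.0005 * step + random.gauss(0, 0.3) for step in range(1000)]
smoothed = ema_smooth(raw)
# The averaged curve reveals the small downward trend the raw curve hides.
```

The raw diffusion loss is high-variance by construction (each step samples a random timestep and random noise), so only the averaged curve is informative.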
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Describe the bug
Hi there,
I'm currently fine-tuning SD1.5 on my machine using the diffusers LoRA script (detailed here) on the Pokemon BLIP dataset.
However, my (unsmoothed/smoothed) loss curve looks like this:
[screenshot: unsmoothed and smoothed loss curves]
Are these kinds of loss curves (which do not seem to decrease) par for the course with LoRA fine-tuning? What kind of loss curves should we expect under ideal conditions/parameters?
Thank you in advance for the help!
Reproduction
https://huggingface.co/docs/diffusers/training/lora
Logs
No response
System Info
Nvidia RTX 3080
diffusers version: 0.18.0.dev0
Who can help?
@williamberman @patrickvonplaten @saya