Thanks for your great work! I am trying to train the model on the LAION-2B dataset.
My training details are: base_size=4096, learning_rate=1e-4, AdamW optimizer.
After 130,000 iterations the loss curve is as follows; it seems to have flattened...
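For reference, here is a minimal sketch of the optimizer setup described above, assuming PyTorch (the actual model, data loader, and scheduler are omitted; `model` below is just a stand-in module):

```python
import torch

# Stand-in module; the real text-to-image model is omitted here.
model = torch.nn.Linear(8, 8)

# AdamW with learning_rate=1e-4, as used in my run.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

print(optimizer.param_groups[0]["lr"])  # 1e-4
```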
And the visualized images are:
The prompts are:
- A tiger in a lab coat with a 1980s Miami vibe, turning a well oiled science content machine, digital art
- A group photo
- a fish on a bike
- star citizen aurora
Did you encounter a similar situation during training? What is the lowest loss you were able to achieve? I would appreciate any suggestions!