Closed ylhua closed 12 months ago
Hi! Please note that the over-saturation happens after 5000 steps, which is the t annealing start step. In our experiments, some cases did behave worse after t annealing and we're trying to find out why. Could you try training without t annealing by setting `system.guidance.anneal_start_step=null`?
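For example, the override can be passed on the command line. This is a sketch assuming the usual threestudio workflow; the script name, config path, and prompt key here are assumptions, so adjust them to your setup:

```shell
# Train without t annealing (anneal_start_step=null disables it)
python launch.py --config configs/prolificdreamer.yaml --train --gpu 0 \
  system.prompt_processor.prompt="a jellyfish" \
  system.guidance.anneal_start_step=null
```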
Hi! You could also consider using a black background. In my experience, a black background tends to be more stable and avoids issues of over-saturation. Here are my results:
Before:
After:
I modified the background config to:

```yaml
background_type: "solid-color-background"
background:
  n_output_dims: 3
  color: [0.0, 0.0, 0.0]
```
In addition, please note that there are some floaters present. You may consider increasing the weight of sparse loss to eliminate them.
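For reference, a minimal sketch of such an override, assuming a threestudio-style loss section (the exact key name, e.g. `lambda_sparsity`, may differ in your config):

```yaml
system:
  loss:
    # Raise the sparsity loss weight to suppress floaters
    # (20.0 is the value tried later in this thread)
    lambda_sparsity: 20.0
```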
Thanks for your help. This is the result after setting `system.guidance.anneal_start_step=null`, using a black background, and increasing the weight of the sparse loss to 20. It seems better than before, but still shows over-saturation. This is just one case; in fact, many cases do not produce good results compared to Magic3D, which is more stable.
@ylhua The result you showed does not seem over-saturated to me. However, using a black background does introduce an artifact: part of the foreground object is painted in the background color (black), which makes the tentacles "blob-ish" and loses detail.
Hi, guys. @bennyguo @ylhua I have discovered another significant issue, which is the opaque loss. In the default configuration, `lambda_opaque` is set to 1000 after 10000 iterations. However, this setting is not appropriate for translucent objects like jellyfish. Here is the result before 10000 iterations, which looks fine to me.
Perhaps you can consider setting the background to black and `lambda_opaque` to zero for better generation.
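Combining both suggestions might look like the sketch below. This assumes a threestudio-style config; the key names are from memory of the default configs and may differ in your version, and a plain `0.0` replaces whatever schedule the default uses for `lambda_opaque`:

```yaml
system:
  # Black background: more stable, avoids over-saturation
  background_type: "solid-color-background"
  background:
    n_output_dims: 3
    color: [0.0, 0.0, 0.0]
  loss:
    # Disable the opaque loss entirely for translucent objects
    lambda_opaque: 0.0
```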
@bennyguo @DSaurus Thanks for all your help. I will have a try!
Thanks for your great job! I'm getting some bad results when using the ProlificDreamer model. Below is an example using "a jellyfish" as the text prompt. It went well in the first 15000 steps, but gradually became chaotic. Do you have any ideas about the reason? And here is the training loss.