sairajk / easi-tex

[SIGGRAPH 2024] "EASI-Tex: Edge-Aware Mesh Texturing from Single Image", ACM Transactions on Graphics.
https://sairajk.github.io/easi-tex/

[TESTING] Result for a somewhat complicated mesh #2

Closed MrForExample closed 2 months ago

MrForExample commented 2 months ago

As expected, since this texture generation pipeline has no way to ensure consistent texture generation across different views, and it does not try to keep the lighting conditions the same between views, the result is rather messy and unusable:

https://github.com/sairajk/easi-tex/assets/62230687/07b341b9-a8d8-4d52-9c2d-fff0abcc0071

Last year, I tried a similar workflow, but in addition to Canny ControlNet & IP-Adapter, I used AnimateDiff as a means to ensure consistent texture generation across views, and a Brightness ControlNet to keep the lighting conditions the same across views.
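The Brightness ControlNet step conditions each view's generation on a shared luminance target. As a minimal NumPy sketch of that idea (a toy illustration of equalizing mean brightness across rendered views, not the actual ControlNet or the commenter's pipeline; the function name and signature are hypothetical):

```python
import numpy as np

def equalize_view_brightness(views, target_mean=None):
    """Rescale each view so all views share the same mean brightness.

    Toy stand-in for building a shared lighting/conditioning target;
    a real Brightness ControlNet would consume such a target map.

    views: list of float arrays in [0, 1], shape (H, W) or (H, W, 3).
    """
    means = [v.mean() for v in views]
    if target_mean is None:
        # Default to the average brightness over all views.
        target_mean = float(np.mean(means))
    out = []
    for v, m in zip(views, means):
        # Scale each view toward the common target, clamped to [0, 1].
        scaled = np.clip(v * (target_mean / max(m, 1e-8)), 0.0, 1.0)
        out.append(scaled)
    return out
```

After this step every view has the same mean luminance (up to clipping), so the per-view conditioning no longer encodes view-dependent lighting.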

The result is better, but it still takes a long time to generate, and the same parameters do not always give good results.

https://github.com/sairajk/easi-tex/assets/62230687/e0fa5f21-b679-46bc-a84c-67bc2eb6a821

sairajk commented 2 months ago

Interesting, thank you for sharing.

bdcms1 commented 4 weeks ago

@MrForExample Super awesome work, can you please share how to use AnimateDiff to eliminate the texture consistency issue?