AiuniAI / Unique3D

[NeurIPS 2024] Unique3D: High-Quality and Efficient 3D Mesh Generation from a Single Image
https://wukailu.github.io/Unique3D/
MIT License

How did you render the normal map? #113

Open bnawras opened 3 weeks ago

bnawras commented 3 weeks ago

Hello, thank you for your great work! I'm trying to replace the diffusion-generated data with data rendered in Blender, and the normal maps are the tricky part. Could you share how you rendered them? A rendering script would be ideal, but details on the compositing settings, pre/post-processing, the normal space (world, object, or camera), and the axis alignment would also be very helpful.
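For context, here is roughly what I'm doing on the Blender side right now. It's only a minimal sketch: the material setup and function names are my own, and the camera-space/axis convention at the end is exactly the part I'm not sure matches yours. Every mesh gets an emission material that outputs the shading normal transformed into camera space and remapped from [-1, 1] to [0, 1], and the scene is rendered with tone mapping disabled.

```python
import bpy

def make_normal_material():
    """Emission material that visualizes camera-space normals as RGB (my own setup, not from the repo)."""
    mat = bpy.data.materials.new("normal_vis")
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()

    geom = nodes.new("ShaderNodeNewGeometry")         # world-space shading normal
    to_cam = nodes.new("ShaderNodeVectorTransform")   # world -> camera space
    to_cam.vector_type = "NORMAL"
    to_cam.convert_from = "WORLD"
    to_cam.convert_to = "CAMERA"
    remap = nodes.new("ShaderNodeVectorMath")         # n * 0.5 + 0.5, so RGB lands in [0, 1]
    remap.operation = "MULTIPLY_ADD"
    remap.inputs[1].default_value = (0.5, 0.5, 0.5)
    remap.inputs[2].default_value = (0.5, 0.5, 0.5)
    emit = nodes.new("ShaderNodeEmission")
    out = nodes.new("ShaderNodeOutputMaterial")

    links.new(geom.outputs["Normal"], to_cam.inputs["Vector"])
    links.new(to_cam.outputs["Vector"], remap.inputs[0])
    links.new(remap.outputs["Vector"], emit.inputs["Color"])
    links.new(emit.outputs["Emission"], out.inputs["Surface"])
    return mat

def render_normal_map(filepath):
    scene = bpy.context.scene
    scene.render.engine = "CYCLES"
    scene.render.film_transparent = True              # keep the background out of the normal map
    scene.view_settings.view_transform = "Standard"    # no Filmic/AgX tone mapping on the colors
    scene.render.image_settings.file_format = "PNG"
    scene.render.filepath = filepath

    mat = make_normal_material()
    for obj in scene.objects:
        if obj.type == "MESH":
            obj.data.materials.clear()
            obj.data.materials.append(mat)

    bpy.ops.render.render(write_still=True)
```

The remapping itself seems standard; what I can't pin down is the axis convention, i.e. whether your normal maps expect Blender's camera space (camera looking down -Z, Y up) as-is or with the X/Y channels flipped, which is what I've been varying in the example below.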

P.S. Here’s an example of the generated and rendered normal maps. I’ve experimented with axis and object orientation, but I couldn’t find a configuration that produces the same normal map.