Hello, thank you for your great work! I'm trying to replace the diffusion-generated data with data rendered in Blender, and the normal map part is proving tricky. Could you share how you rendered your normal maps? A rendering script would be ideal, but details on the compositor setup, pre/post-processing, normal space (world vs. camera), and axis alignment would also be very helpful.
P.S. Here’s an example of the generated and rendered normal maps. I’ve experimented with axis and object orientation, but I couldn’t find a configuration that produces the same normal map.
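For reference, here is roughly the setup I have been trying. It's only a minimal sketch with bpy: the view-layer name, the output path, and the assumption that the Normal pass comes out in world space (and therefore needs a world-to-camera rotation before remapping) are my guesses, not anything taken from your pipeline.

```python
import bpy
import numpy as np

scene = bpy.context.scene
view_layer = scene.view_layers["ViewLayer"]   # layer name is an assumption
view_layer.use_pass_normal = True             # enable the Normal render pass

# Write raw passes to multilayer EXR so normal values stay in [-1, 1]
scene.render.image_settings.file_format = "OPEN_EXR_MULTILAYER"
scene.render.image_settings.color_depth = "32"
scene.render.filepath = "/tmp/normals"        # output path is a placeholder
bpy.ops.render.render(write_still=True)

# Post-processing: rotate the (assumed) world-space normals into camera
# space, then remap them from [-1, 1] to [0, 1] for saving as an image.
cam = scene.camera
R = np.array(cam.matrix_world.to_3x3().inverted())  # world -> camera rotation

def world_to_camera_rgb(normals_world):
    """normals_world: (H, W, 3) array read back from the EXR Normal pass."""
    n_cam = normals_world @ R.T
    n_cam /= np.linalg.norm(n_cam, axis=-1, keepdims=True) + 1e-8
    return 0.5 * (n_cam + 1.0)
```

If your pipeline instead does the [-1, 1] to [0, 1] remap in the compositor, or flips any axes (e.g. the Y-up vs. Y-down convention), that could explain the mismatch I'm seeing.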