half-potato / nmf

Our method takes as input a collection of images (100 in our experiments) with known cameras, and outputs the volumetric density and normals, materials (BRDFs), and far-field illumination (environment map) of the scene.
https://half-potato.gitlab.io/posts/nmf/
MIT License

Where can I get surface normal images of the Blender dataset? #14

Open K-nowing opened 10 months ago

K-nowing commented 10 months ago

Hi, where can I get surface normal images of the Blender dataset? The surface normal values in the data accessible from the official GitHub page seem to be inaccurate. Thank you so much!

[image]

The predicted normal from the paper.

[image]

The GT normal from the data.

half-potato commented 10 months ago

I believe these need to be mapped from [-1, 1] to [0, 1] by dividing by 2 and adding 0.5.
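That mapping is a simple affine rescale; a minimal sketch (the function names here are illustrative, not from the repo):

```python
import numpy as np

def normal_to_rgb(n):
    """Map unit normals in [-1, 1] to displayable colors in [0, 1]."""
    return n / 2.0 + 0.5

def rgb_to_normal(c):
    """Inverse mapping: colors in [0, 1] back to normals in [-1, 1]."""
    return c * 2.0 - 1.0

# e.g. a normal pointing along +z renders as a light blue pixel
n = np.array([0.0, 0.0, 1.0])
print(normal_to_rgb(n))  # [0.5 0.5 1. ]
```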

K-nowing commented 10 months ago

The predicted normal image is already mapped from [-1, 1] to [0, 1].

half-potato commented 10 months ago

Check the ground truth.

K-nowing commented 10 months ago

The ground truth is a PNG file, so its values range from 0 to 255. Even after I rescale them from [0, 255] to [-1, 1], it still does not work. Am I missing something? Thank you for your reply!
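For reference, the decoding step described above (8-bit PNG values back to normal vectors) would look roughly like this; one sanity check, assuming the map was exported correctly, is that the decoded vectors should have unit length:

```python
import numpy as np

def decode_normal_png(img_uint8):
    """Convert 8-bit normal-map pixels ([0, 255]) to vectors in [-1, 1]."""
    return img_uint8.astype(np.float32) / 255.0 * 2.0 - 1.0

# a pixel (0, 128, 255) decodes to roughly (-1, 0, 1)
px = np.array([0, 128, 255], dtype=np.uint8)
print(decode_normal_png(px))
```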

K-nowing commented 10 months ago

+) According to this line, the suffix of the normal image files should be "_normal.png". However, the suffix of the normal images from the official GitHub page is "_normal_0000.png".

half-potato commented 9 months ago

Did you get this working?

K-nowing commented 9 months ago

No, it doesn't work... The GT normal images that I have seem to be different from the ones you have.

half-potato commented 9 months ago

This can happen if the normals are clipped before being exported, so the negative components are lost and the exported values no longer cover the correct range. In that case, the ground-truth normals themselves have the wrong range.
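One way to spot this, sketched below under the assumption that valid GT normals decode to unit vectors: if components were clipped to [0, 1] before export, the decoded vectors come out noticeably shorter than 1 (the function and threshold here are hypothetical, not part of the repo):

```python
import numpy as np

def looks_clipped(normals, tol=0.05):
    """Heuristic check: correctly exported normals decode to unit vectors.
    If negative components were clipped to zero before export, most decoded
    vectors will have length well below 1."""
    lengths = np.linalg.norm(normals, axis=-1)
    valid = lengths > 0.1          # skip empty/background pixels
    off_unit = np.abs(lengths[valid] - 1.0) > tol
    return np.mean(off_unit) > 0.5  # majority of pixels off unit length

# e.g. a normal like (-0.693, 0.4, 0.6) whose x-component was clipped to 0
clipped = np.array([[0.0, 0.4, 0.6]])
print(looks_clipped(clipped))  # True: length ≈ 0.72, far from 1
```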