pengHTYX / Era3D


About training data #46

Open · lzt02 opened this issue 3 hours ago

lzt02 commented 3 hours ago

Hello, I want to use my own dataset for training. Could you share some normal maps from your training data?

pengHTYX commented 3 hours ago

Sorry, we cannot share the full dataset because it is quite large (nearly 2 TB). You can render the normal maps yourself with the released Blender code and the filtered .glb list.
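
For reference, a minimal sketch of driving such a rendering pass over the filtered list. The script name `render_normals.py`, the list filename, and the flags are placeholders, not the released script's actual interface; substitute the real entry point and arguments from the repo.

```python
import subprocess
from pathlib import Path

GLB_LIST = Path("filtered_glb_list.txt")   # assumed name of the filtered .glb list
OUTPUT_DIR = Path("renderings")

for glb_path in GLB_LIST.read_text().splitlines():
    if not glb_path.strip():
        continue
    # Run Blender headless with the rendering script; everything after "--"
    # is passed to the script itself (placeholder arguments shown here).
    subprocess.run(
        [
            "blender", "--background",
            "--python", "render_normals.py", "--",
            "--object", glb_path,
            "--output_dir", str(OUTPUT_DIR),
        ],
        check=True,
    )
```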

lzt02 commented 2 hours ago

Thank you for your reply. Could you share the six views of just one object? I noticed that `load_normal` makes some adjustments to the normals, and I'm not sure which coordinate system's normal maps should be used as input when training the model.

pengHTYX commented 2 hours ago

Please find the Google Drive link for the rendering sample. https://drive.google.com/file/d/1_CYdkAZu6ISM62aLyFjKxsF34xjENlbk/view?usp=sharing

lzt02 commented 2 hours ago

Thanks for your help! Also, I noticed that `worldNormal2camNormal` is called in `load_normal`. I would like to ask whether the normals input to the model are in camera space. If so, are the model's output normals also in camera space?

pengHTYX commented 2 hours ago

No, the normals input to the model is in world space. `worldNormal2camNormal` is called to transform the rendered world coordinates into the training world frame relative to the input view, because we randomly select one view as the input view during training rather than always using the rendered front view.
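
In other words, the transform amounts to rotating each pixel's world-space normal by the world-to-view rotation of the randomly selected input view. A minimal sketch, with an illustrative function name and signature rather than the repo's exact `worldNormal2camNormal`:

```python
import numpy as np

def world_normal_to_view_frame(normal_map: np.ndarray, rot_w2v: np.ndarray) -> np.ndarray:
    """Rotate per-pixel world-space normals into the frame of the chosen input view.

    normal_map: (H, W, 3) array of unit normals rendered in world space.
    rot_w2v:    (3, 3) rotation taking world coordinates to the input view's frame.
    """
    # Apply the rotation to every pixel: n_view = R @ n_world
    rotated = np.einsum("ij,hwj->hwi", rot_w2v, normal_map)
    # Re-normalize to guard against numerical drift
    norm = np.linalg.norm(rotated, axis=-1, keepdims=True)
    return rotated / np.clip(norm, 1e-8, None)
```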

lzt02 commented 43 minutes ago

I understand, thanks for your answer!