lzt02 opened 1 month ago
Sorry about that; the dataset is quite large, nearly 2 TB. You can render the normal maps yourself using the released Blender code and the filtered .glb list.
Thank you for your reply. Could you share the six views of just one object? I noticed that `load_normal` makes some adjustments to the normals, and I'm not sure which coordinate system's normal maps should be used as input when training the model.
Please find the Google Drive link for the rendering sample. https://drive.google.com/file/d/1_CYdkAZu6ISM62aLyFjKxsF34xjENlbk/view?usp=sharing
Thanks for your help! I also noticed that `worldNormal2camNormal` is called inside `load_normal`. Are the normals input to the model in camera space? If so, are the model's output normals also in camera space?
No, the normals input to the model are in world space. `worldNormal2camNormal` is called to transform the rendering world coordinates into the training world coordinates relative to the input view, because during training we randomly select one view as the input view rather than always using the rendered front view.
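For anyone trying to reproduce this, the core operation is just rotating per-pixel unit normals by a 3x3 rotation. The sketch below is a minimal illustration of that transform; the function name `world_to_view_normal`, the `(H, W, 3)` layout, and the row-vector convention are my own assumptions, not the repo's actual `worldNormal2camNormal` implementation.

```python
import numpy as np

def world_to_view_normal(normals: np.ndarray, rotation: np.ndarray) -> np.ndarray:
    """Rotate per-pixel unit normals into another frame.

    normals:  (H, W, 3) unit normals in the source (world) frame.
    rotation: (3, 3) rotation from the source frame to the target frame
              (e.g. the frame anchored at the randomly chosen input view).
    Normals transform with the rotation only; translation does not apply
    to direction vectors.
    """
    h, w, _ = normals.shape
    rotated = normals.reshape(-1, 3) @ rotation.T  # apply R to each normal
    return rotated.reshape(h, w, 3)
```

With the identity rotation the normals are unchanged; with a 90-degree rotation about z, a normal along +x maps to +y, which is an easy sanity check that the convention (R applied as `R @ n`) matches your renderer.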
I understand, thanks for your answer!
Hello, I want to use my own dataset for training. Could you share some normal maps from your training data?