bharat-b7 / MultiGarmentNetwork

Repo for "Multi-Garment Net: Learning to Dress 3D People from Images, ICCV'19"

dress_smpl #34

Closed electronicliujiang closed 4 years ago

electronicliujiang commented 4 years ago

Hello, thank you for your work and outstanding contributions! The code can currently only dress the SMPL body; can it be extended to dress a personalized body, as shown in Figure 1 of the paper? How would I modify the code for that? Also, the paper mentions 356 body scans, but the released dataset seems smaller (dataset 1: just over 90 body scans; dataset 2: only garments, without texture maps). Will more data be released, or is only part of it available for permission reasons? Looking forward to your reply!

electronicliujiang commented 4 years ago

Question 2: How were scan_tex.jpg and registered_tex obtained? With an RGBD camera, or in some other way? I currently have .obj files with per-vertex colors; how do I get a texture map from them? The texture map I produced with Blender does not align, and I don't know why.

bharat-b7 commented 4 years ago

> Hello, thank you for your work and outstanding contributions! The code can currently only dress the SMPL body; can it be extended to dress a personalized body, as shown in Figure 1 of the paper? How would I modify the code for that? Also, the paper mentions 356 body scans, but the released dataset seems smaller (dataset 1: just over 90 body scans; dataset 2: only garments, without texture maps). Will more data be released, or is only part of it available for permission reasons? Looking forward to your reply!

Hi, we do not have permission to release the textures for part-2 of the dataset. Moreover, MGN does not require textures for training.

bharat-b7 commented 4 years ago

> Question 2: How were scan_tex.jpg and registered_tex obtained? With an RGBD camera, or in some other way? I currently have .obj files with per-vertex colors; how do I get a texture map from them? The texture map I produced with Blender does not align, and I don't know why.

The texture maps for MGN dataset were obtained using a 3D scanner.
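On the related question of getting a texture map from an .obj with per-vertex colors: one generic approach (not the authors' pipeline) is to rasterize each triangle's vertex colors into UV space with barycentric interpolation. The sketch below is purely illustrative — the tiny quad mesh, the `bake_vertex_colors` helper, and the v-axis convention are all assumptions; note that many tools (Blender included) flip the v axis of the texture, which is a common cause of misaligned bakes.

```python
import numpy as np

def bake_vertex_colors(uvs, faces, vertex_colors, size=64):
    """Rasterize per-vertex colors into a (size x size) UV texture.

    uvs:           (V, 2) per-vertex UV coordinates in [0, 1]
    faces:         (F, 3) vertex indices per triangle
    vertex_colors: (V, 3) RGB color per vertex
    """
    tex = np.zeros((size, size, 3), dtype=np.float32)
    # Pixel centers in UV space; row index maps to v, column index to u here
    # (no v flip -- adjust for your tool's convention).
    xs = (np.arange(size) + 0.5) / size
    px, py = np.meshgrid(xs, xs)
    pts = np.stack([px.ravel(), py.ravel()], axis=1)  # (size*size, 2)
    for f in faces:
        a, b, c = uvs[f]
        # Barycentric coordinates of every pixel w.r.t. triangle (a, b, c).
        v0, v1 = b - a, c - a
        v2 = pts - a
        d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
        d20, d21 = v2 @ v0, v2 @ v1
        denom = d00 * d11 - d01 * d01
        w1 = (d11 * d20 - d01 * d21) / denom
        w2 = (d00 * d21 - d01 * d20) / denom
        w0 = 1.0 - w1 - w2
        inside = (w0 >= 0) & (w1 >= 0) & (w2 >= 0)
        # Interpolate the three vertex colors and write pixels inside the triangle.
        colors = (w0[:, None] * vertex_colors[f[0]]
                  + w1[:, None] * vertex_colors[f[1]]
                  + w2[:, None] * vertex_colors[f[2]])
        tex.reshape(-1, 3)[inside] = colors[inside]
    return tex

# Toy example: a unit quad (two triangles) with one color per corner.
uvs = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
faces = np.array([[0, 1, 2], [0, 2, 3]])
colors = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 0.]])
tex = bake_vertex_colors(uvs, faces, colors, size=32)
```

A real mesh would also need per-face UV indices (seams duplicate vertices in UV space), which this sketch omits.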

electronicliujiang commented 4 years ago

Question 3: Thank you for your answer! One more question: how do you obtain the smpl_registered.obj file? Could you provide detailed steps?

bharat-b7 commented 4 years ago

We used the following works for SMPL registration:
1) Learning to Reconstruct People in Clothing from a Single RGB Camera, CVPR'19. Alldieck et al.
2) 360-Degree Textures of People in Clothing from a Single Image, 3DV'19. Lazova et al.

electronicliujiang commented 4 years ago

Thank you for your reply! The first paper reconstructs the 3D model from eight pictures taken from different viewpoints. Did you use the open-source Octopus code? Do I also need to obtain smpl_registered.obj from eight pictures, or is there a simpler way? As far as I know, the second paper is not open source yet; do I need to reimplement it from the paper, or are there open-source alternatives?

bharat-b7 commented 4 years ago

Hi, the dataset is not generated with Octopus (the image-based method); I used the scan-registration code they used to preprocess their scans to generate training data. Unfortunately, neither of these works has released the registration code publicly, and I don't think they will in the near future.
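Since neither registration codebase is public, the core idea can at least be sketched generically: deform a parametric template and minimize a nearest-neighbour distance to the scan points. Everything below is a toy stand-in under stated assumptions — a real pipeline would deform `SMPL(pose, betas)` rather than a scale-plus-translation model, use a KD-tree for correspondences, and add pose/shape regularizers, none of which is shown here.

```python
import numpy as np
from scipy.optimize import minimize

def chamfer(a, b):
    """Mean squared distance from each point in a to its nearest neighbour in b
    (brute force; real registration code would use a KD-tree)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).mean()

# Toy stand-in for a parametric body template: a real pipeline would
# evaluate SMPL(pose, betas); here the parameters are just scale + translation.
def deform_template(template, params):
    scale, t = params[0], params[1:4]
    return scale * template + t

def fit_template(template, scan_points):
    """Fit the template parameters by minimizing the template-to-scan distance."""
    def energy(params):
        return chamfer(deform_template(template, params), scan_points)
    x0 = np.array([1.0, 0.0, 0.0, 0.0])  # identity initialization
    res = minimize(energy, x0, method="Nelder-Mead")
    return res.x

# Synthetic demo: the "scan" is a scaled, shifted copy of the template.
rng = np.random.default_rng(0)
template = rng.normal(size=(50, 3))
scan = 1.5 * template + np.array([0.2, -0.1, 0.3])
params = fit_template(template, scan)
```

The optimizer should move the template substantially closer to the scan than the identity initialization; the registration papers cited above add non-rigid per-vertex deformation on top of this basic fitting loop.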