Liuzhengli opened 5 months ago
Hello author, I want to test my own dataset with the trained model: the final input is a picture and the output is a three-dimensional rotating image. I haven't found this test code. Could you tell me where to find it?
Hello, it would take a long time to restore the original checkpoint, but here are our previous experimental records; I hope they help!
According to https://github.com/ShichenLiu/SoftRas/blob/b3150cf5ebaf0ee96276dbb18125add69b8f5cd1/examples/recon/models.py#L98, SoftRas adopts a viewing half-angle of 15°, i.e. FOV = 15 × 2 = 30°, so please change the FOV of kaolin's renderer from `camera_fovy = np.arctan(1.0 / 2.5) * 2` to `camera_fovy = 30 * np.pi / 180`.
The sphere template range is (-0.5, 0.5) on ShapeNet rather than (-1, 1) on CUB, so please scale it with `vertices_init = vertices_init * 0.5`.
Our evaluation code is borrowed from https://github.com/ShichenLiu/SoftRas/blob/b3150cf5ebaf0ee96276dbb18125add69b8f5cd1/examples/recon/models.py#L139
Please use it to convert the reconstructed mesh to voxels and evaluate with 3D IoU.
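As a hedged illustration of the metric itself (not the SoftRas mesh-to-voxel conversion, which lives at the link above), 3D IoU over two voxel occupancy grids can be computed like this; the function name and threshold are assumptions for demonstration:

```python
import numpy as np

def voxel_iou(pred: np.ndarray, gt: np.ndarray, threshold: float = 0.5) -> float:
    """3D IoU between two voxel occupancy grids of the same shape.

    pred, gt: arrays of occupancy values (probabilities or {0, 1}).
    """
    p = pred >= threshold
    g = gt >= threshold
    intersection = np.logical_and(p, g).sum()
    union = np.logical_or(p, g).sum()
    return float(intersection) / max(float(union), 1.0)

# Toy example: two overlapping blocks inside a 4x4x4 grid.
a = np.zeros((4, 4, 4)); a[:2] = 1.0   # occupies slices 0-1 (32 voxels)
b = np.zeros((4, 4, 4)); b[1:3] = 1.0  # occupies slices 1-2 (32 voxels)
print(voxel_iou(a, b))  # intersection = 16 voxels, union = 48 -> 1/3
```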
I apologize for the confusion earlier. I have trained a model on my own dataset and now want to test it: the process is to input an original image and generate its three-dimensional rotating image to see the effect. This is the content from the docs folder on your GitHub, but I haven't found the relevant code.
Hello, the test code for evaluating FID and the visualization code are both in train.py. The following code renders the 360° effect: https://github.com/dvlab-research/SMR/blob/924be9de70d2ccadd19cfddf688cc68f797fa4bd/train.py#L697
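The linked train.py code renders the reconstructed mesh from a sweep of viewpoints around the object. A minimal sketch of such a camera sweep (camera-position math only; the actual kaolin render call, view count, and camera distance are assumptions here, not SMR's exact settings):

```python
import numpy as np

def camera_positions_360(num_views: int = 24,
                         distance: float = 2.5,
                         elevation_deg: float = 0.0) -> np.ndarray:
    """Camera positions evenly spaced in azimuth around the y-axis,
    all at the same distance from (and looking at) the origin."""
    elev = np.deg2rad(elevation_deg)
    positions = []
    for azim in np.linspace(0.0, 2 * np.pi, num_views, endpoint=False):
        x = distance * np.cos(elev) * np.sin(azim)
        y = distance * np.sin(elev)
        z = distance * np.cos(elev) * np.cos(azim)
        positions.append((x, y, z))
    return np.array(positions)

cams = camera_positions_360(num_views=24)
print(cams.shape)                                       # (24, 3)
print(bool(np.allclose(np.linalg.norm(cams, axis=1), 2.5)))  # True: all on the camera orbit
```

Rendering the mesh once per position and concatenating the frames yields the rotating-object animations shown in the docs folder.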
Also, please contact me via yihouxiang@gmail.com.
How were the contents of this docs folder produced? Isn't it just a matter of inputting an image and directly getting the result from the trained model? Or are these animated images screenshots taken during the training process?
Thank you very much, author.