amazon-science / indoor-scene-generation-eai


How to test the results of the model of DeepSynth? #13

Open YeolYao opened 2 years ago

YeolYao commented 2 years ago

Hello, I have used train_continue.ipynb, train_location.ipynb and train_rotation.ipynb to train the network models, but I don't know how to test the results of the model. I tried running batch_synth.py in deep-synth-master to test, but it doesn't seem to use the data we've processed. I would be very grateful for any help.

yizhouzhao commented 2 years ago

Thanks for your interest in this work. Actually, we were unable to run the original batch_synth on the SUNCG dataset (due to license issues). What we did instead was to convert 3D-Front into a similar format for training and synthesis, and then run user studies on those results.

YeolYao commented 2 years ago

Thanks for your reply. Could you explain how to synthesize and test once 3D-Front has been converted into that training format? I can only find the ipynb files for training.

yizhouzhao commented 2 years ago

https://github.com/yizhouzhao/indoor-scene-generation-eai/blob/main/Issue/Process%20for%203D-SLN.ipynb

I haven't found time to clean up my code for this part, but this is the notebook I used to preprocess 3D-Front into the deep-synth trainable format.
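For anyone reading along: the core of that preprocessing step is walking a 3D-FRONT scene JSON and collecting object categories and transforms per room, which is the information a deep-synth-style training format needs. Below is a minimal, hypothetical sketch of that traversal; the key names (`furniture`, `scene`, `room`, `children`, `ref`, `pos`, `rot`) follow the public 3D-FRONT release but are assumptions here, not code from the linked notebook, so check them against your local copy of the dataset.

```python
# Sketch: collect per-room furniture placements from a 3D-FRONT-style
# scene dict. Key names are assumed from the public 3D-FRONT release.

def extract_placements(scene):
    """Return {room_type: [(category, pos, rot), ...]} for one scene dict."""
    # Furniture entries are listed once at the top level, keyed by "uid";
    # rooms reference them via "ref" in their children.
    furniture = {f["uid"]: f.get("category", "unknown")
                 for f in scene.get("furniture", [])}
    rooms = {}
    for room in scene.get("scene", {}).get("room", []):
        placements = []
        for child in room.get("children", []):
            ref = child.get("ref")
            if ref in furniture:  # skip walls/floors, which live under "mesh"
                placements.append((furniture[ref],
                                   child.get("pos"),
                                   child.get("rot")))
        rooms[room.get("type", "unknown")] = placements
    return rooms

# Tiny hand-made scene in the same shape, so the sketch runs standalone.
example = {
    "furniture": [{"uid": "f1", "category": "bed"}],
    "scene": {"room": [{"type": "Bedroom",
                        "children": [{"ref": "f1",
                                      "pos": [0, 0, 0],
                                      "rot": [0, 0, 0, 1]}]}]},
}

print(extract_placements(example))
# → {'Bedroom': [('bed', [0, 0, 0], [0, 0, 0, 1])]}
```

In the real pipeline you would then rasterize these placements into the top-down category/location/rotation maps that the training notebooks consume; this sketch only shows the JSON traversal.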

yizhouzhao commented 2 years ago

I have cleaned up a tutorial on parsing a 3D-Front scene into Maya. Perhaps this will help.

https://github.com/yizhouzhao/indoor-scene-generation-eai/blob/main/Issue/Load3DFront2Maya-Tutorial.ipynb

YeolYao commented 2 years ago

Thank you for your help. I will try again soon.

shanqiiu commented 11 months ago

> Thank you for your help. I will try again soon.

Hello, how did the attempt turn out?