yunishi3 / 3D-FCR-alphaGAN

Fully Convolutional Refined Auto Encoding Generative Adversarial Networks for 3D Multi Object Scenes

Exporting Latent Interpolation #3

Open dollarbilll opened 4 years ago

dollarbilll commented 4 years ago

Hi!

Is there a particular way you went about exporting this Latent Interpolation? Is it possible to do so in 3D?

I've been able to export .obj files without problems, but I'm not sure how to export the interpolation. I'm assuming one would have to export an .obj for each frame of an interpolation's trajectory through the latent space, but I'm unclear on how to target that trajectory and its specific frames.

Any thoughts? Thank you, thank you, thank you!

yunishi3 commented 4 years ago

I am really, really sorry for the super late response. First, you need to run the "evaluate_recons" mode of main.py as follows:

$ python main.py --mode evaluate_recons --conf_epoch 10000

The conf_epoch is the number of the checkpoint file you can download. If you use your own training results, change it to match your checkpoint number. After execution you will get the reconstructed voxels as .npy files and the decoded latent space as decode_z.npy.
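For example, you could inspect those outputs with something like the snippet below. Only decode_z.npy is named above; the reconstruction file name "scene_0.npy" is just a placeholder, so use whatever names appear in your output directory.

```python
import numpy as np

# Latent codes saved by the evaluate_recons mode (one row per reconstructed scene).
z = np.load("decode_z.npy")
print("latent codes:", z.shape)

# One reconstructed voxel grid; the file name here is only a placeholder --
# check your output directory for the actual names.
recon = np.load("scene_0.npy")
print("voxel grid:", recon.shape, "occupied cells:", np.count_nonzero(recon))
```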

Then you can get the interpolation voxels as .npy files using the "evaluate_interpolate" mode with the following command:

$ python main.py --mode evaluate_interpolate --conf_epoch 10000

This mode generates interpolation files for randomly chosen reconstructed voxels, and the number of interpolation files is 30. If you would like to target a specific scene or change the number of interpolation files, please modify the code in evaluate.py. Sorry for the inconvenience.
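The basic idea of the interpolation is just a straight line between two latent codes. Here is a rough sketch, assuming you pick two specific scenes out of decode_z.npy; the indices, the frame count, and the output file name are arbitrary, and each row of the resulting trajectory still has to be passed through the decoder in evaluate.py to get voxels.

```python
import numpy as np

z = np.load("decode_z.npy")       # latent codes saved by evaluate_recons
z_start, z_end = z[0], z[1]       # choose two specific scenes instead of random ones
num_frames = 30                   # change this to get more/fewer interpolation steps

# Build the trajectory: linear interpolation between the two codes in latent space.
steps = np.linspace(0.0, 1.0, num_frames)[:, None]
trajectory = (1.0 - steps) * z_start + steps * z_end   # shape: (num_frames, latent_dim)

# Each row of this array would be fed through the decoder in evaluate.py
# to produce one voxel frame of the interpolation.
np.save("interp_trajectory.npy", trajectory)
```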

Also, if you would like to make a GIF like the one at the top of my README.md, you need to visualize the .npy files using the visualization code in ./eval and then assemble the GIF however you like!!
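If it helps, here is one possible way to turn per-frame voxel .npy files into a GIF with matplotlib and imageio instead of the ./eval code. It assumes binary occupancy grids and an interp_*.npy naming scheme, which are just assumptions; the repo's semantic voxels would need per-class coloring as in ./eval.

```python
import glob
import imageio.v2 as imageio
import matplotlib.pyplot as plt
import numpy as np

frames = []
for path in sorted(glob.glob("interp_*.npy")):
    vox = np.load(path)
    fig = plt.figure(figsize=(4, 4))
    ax = fig.add_subplot(111, projection="3d")
    ax.voxels(vox > 0.5)          # render occupied cells of one interpolation frame
    ax.set_axis_off()
    fig.canvas.draw()
    # Grab the rendered frame as an RGBA array before closing the figure.
    frames.append(np.asarray(fig.canvas.buffer_rgba()).copy())
    plt.close(fig)

imageio.mimsave("interpolation.gif", frames, duration=0.1)   # ~10 fps GIF
```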

Thanks!