Hi,
In step 4 you render the scenes based on the boxes.npz files you create in steps 1-3. Have you tried step 4 already? It should also be possible to render from the checkpoint without the full dataset, but I never tried it and it might need some code adjustments. Currently the code assumes that the dataset is complete with labels and images. So if you don't want to adjust the code, the easiest approach is to also render the full dataset in step 4.
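If you want to sanity-check that assumption, something like the following should work. This is a minimal sketch; the paths and file extensions are placeholders, not the exact layout the scripts produce:

```python
import os

# Assumed layout: rendered images next to a "labels" subfolder.
# Adjust both paths and extensions to your actual setup.
images_dir = "/path/to/data/3dfront/processed/bedrooms_without_lamps_full"
labels_dir = os.path.join(images_dir, "labels")

image_stems = {os.path.splitext(f)[0] for f in os.listdir(images_dir)
               if f.endswith(".png")}
label_stems = {os.path.splitext(f)[0] for f in os.listdir(labels_dir)
               if f.endswith(".npz")}

# Any mismatch here would violate the "complete dataset" assumption.
print("images without labels:", sorted(image_stems - label_stems)[:10])
print("labels without images:", sorted(label_stems - image_stems)[:10])
```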
Thank you so much for replying!
After running steps 1-3, I got the following folders with .npz files:
- /path/to/data/3dfront/processed/bedrooms_without_lamps (from preprocess.py)
- /path/to/data/3dfront/processed/bedrooms_without_lamps_full_labels (from create_camera_position.sh)
- /path/to/data/3dfront/processed/bedrooms_without_lamps_full_labels_vertices and /path/to/data/3dfront/processed/bedrooms_without_lamps_full/labels (from create_norm_labels.sh)
I suppose the last folder is the one I should zip together with the generated images to create the dataset. However, when giving the model this dataset with the provided checkpoint, I get distorted views of the scene, such as this:
[screenshot of distorted render]
Are there any other preprocessing steps I should be aware of?
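For reference, this is roughly how I built the zip. The folder names inside the archive are my guess at what the training code expects, not something I found documented:

```python
import zipfile
from pathlib import Path

# Assumed locations of the step 1-3 outputs and the rendered images;
# adjust to your setup.
labels_dir = Path("/path/to/data/3dfront/processed/bedrooms_without_lamps_full/labels")
images_dir = Path("/path/to/data/3dfront/processed/bedrooms_without_lamps_full/images")
out_zip = Path("/path/to/data/3dfront/bedrooms_without_lamps.zip")

with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_STORED) as zf:
    for f in sorted(labels_dir.rglob("*.npz")):
        zf.write(f, f"labels/{f.relative_to(labels_dir)}")
    for f in sorted(images_dir.rglob("*.png")):
        zf.write(f, f"images/{f.relative_to(images_dir)}")
    print(f"wrote {len(zf.namelist())} files to {out_zip}")
```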
Do all results look like this? Does it follow the layout conditioning? Do you have video results? Also, did you check whether step 4 gives the correct outputs?
Using /path/to/data/3dfront/processed/bedrooms_without_lamps_full/labels is correct and should be enough. But it's difficult to judge what went wrong without more details.
Please reopen if the issue has not been solved!
Hi, thank you so much for your great work and open code. I'm trying to recreate your 3D-FRONT dataset following your code at: https://github.com/sherwinbahmani/threed_front_rendering/blob/main/create_dataset.sh However, all of the first 3 steps seem to create boxes.npz files; can you describe which is the correct data for the trained generate.py to produce the results shown in the demo? Thank you very much!
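For context, this is the quick script I used to compare the .npz files each step produces. Just a sketch; the keys printed depend on what each script actually writes:

```python
import numpy as np
from pathlib import Path

# Point this at each of the processed output folders in turn to compare them.
root = Path("/path/to/data/3dfront/processed/bedrooms_without_lamps")

for npz_path in sorted(root.rglob("*.npz"))[:3]:  # first few files only
    data = np.load(npz_path, allow_pickle=True)
    print(npz_path)
    for key in data.files:
        print(f"  {key}: shape={np.asarray(data[key]).shape}")
```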