yccyenchicheng / AutoSDF


Question about dataset #1

Closed yuchenrao closed 2 years ago

yuchenrao commented 2 years ago

Hi,

Thanks a lot for releasing your great work! I would like to use this as one of my baselines.

I have checked the dataset from DISN and noticed that their sdf.h5 has shape (32^3, 4), which is different from yours: (1, 64, 64, 64). Is it possible to release your ShapeNet dataset as well? Or could you describe the data pre-processing in your README, so that we can generate the data ourselves?
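
For reference, a minimal way to inspect the DISN file and compare the two layouts (the file path is a placeholder):

```python
# Minimal sketch: print every dataset stored in a DISN sdf.h5 file to compare
# its layout against the (1, 64, 64, 64) grid used in this repo.
import h5py
import numpy as np

with h5py.File("sdf.h5", "r") as f:                # placeholder path
    for key in f.keys():
        arr = np.asarray(f[key])
        print(key, arr.shape, arr.dtype)
```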

Thanks a lot!

Best, Yuchen

yccyenchicheng commented 2 years ago

Hi,

Thank you! Unfortunately, I cannot release our processed ShapeNet dataset, as there might be some license issues. However, following the preprocessing scripts from DISN should give you the (1, 64, 64, 64) SDF for a given mesh. Have you tried their preprocessing scripts?
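
For readers who only need a dense SDF grid from a mesh and don't mind deviating from the paper's exact preprocessing, a hedged alternative using the third-party mesh_to_sdf package might look like this (the resolution and file paths are placeholders, not the repo's settings):

```python
# Illustrative alternative, NOT the authors' preprocessing: the third-party
# mesh_to_sdf package can also produce a dense SDF grid from a mesh.
# Resolution, normalization, and file paths below are placeholders of mine.
import numpy as np
import trimesh
from mesh_to_sdf import mesh_to_voxels

mesh = trimesh.load("model.obj", force="mesh")     # placeholder path
sdf = mesh_to_voxels(mesh, voxel_resolution=64)    # dense grid, roughly (64, 64, 64)
sdf = sdf[None].astype(np.float32)                 # add a channel dim -> (1, 64, 64, 64)
np.save("model_sdf_64.npy", sdf)
```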

Best, Yen-Chi

yuchenrao commented 2 years ago

Hi Yen-Chi,

Thanks a lot for your reply! I haven't tried DISN's scripts yet, but I tried your preprocessing script, and I think the output should be fine. Actually, I'm now wondering whether my issue comes from how the VQ-VAE works: it doesn't pass a one-sample overfitting test, but if I train on my whole dataset (>7,000 shapes), the IoU looks reasonable to me.

BTW, I have another question about using your work as my shape completion baseline. During training, I need to use my GT SDFs to train the PVQ-VAE, extract the codes, and then use the GT SDFs to train rand_tf. During testing, I need to take my test SDFs (incomplete), extract the codes first, and then use them as inputs to rand_tf. Does that sound reasonable?

The part that confuses me is that, since I don't need multi-modal prediction, I am wondering whether I can just use the PVQ-VAE as the baseline model.

Best, Yuchen

yccyenchicheng commented 2 years ago

Hi Yuchen,

Thank you for the detailed reply! I'm not exactly sure how you did the one-sample overfitting test. Did you just train the VQ-VAE to fit one sample?

For shape completion, that's correct and exactly how we do it! You also need to decide which known part of the shape you wish to condition on; this is handled by passing a gen_order to rand_tf. For instance, if a chair (in 2D) is given by indices [0, 1, 2, 3] (top-to-bottom), then to condition on the bottom of the chair, rand_tf needs gen_order=[2, 3] as input. You do not need to worry about the ordering itself, since rand_tf can handle random orders.
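
In case it helps, here is a tiny NumPy sketch of that 2D example, building gen_order from a mask of the observed region; the variable names are illustrative only and not the repo's actual API:

```python
# Toy version of the 2D chair example above: a 2x2 code grid flattened
# top-to-bottom into indices [0, 1, 2, 3]. Variable names are illustrative
# only and not the repo's actual API.
import numpy as np

known_mask = np.array([[False, False],   # top half: unobserved
                       [True,  True]])   # bottom half: observed / conditioned on

gen_order = np.flatnonzero(known_mask.ravel())
print(gen_order)                         # [2 3] -> passed to rand_tf as gen_order
```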

Best regards, Yen-Chi

MrZ19 commented 2 years ago

Hi Yen-Chi, thank you for releasing your great work; it inspired me a lot! I would like to train your model myself.

However, I have run into the same problem as Yuchen. Specifically, I downloaded the SDF dataset provided by DISN, which is stored with shape (32^3, 4); this is different from yours: (1, 64, 64, 64).

I have checked DISN's script (./preprocessing/create_point_sdf_grid.py), where the "num_sample" parameter is set to 32^3 = 32768. Did you change this parameter when generating your dataset? I also checked your preprocessing script (./preprocess/process_one_mesh.py), where this parameter is set to 65^3.

Could you please provide a more detailed description of the data generation? Thanks a lot!

Best regards, Zhiyuan

yuchenrao commented 2 years ago

Hi Yen-Chi,

Sorry, I just saw your reply. Thanks a lot for your example! I have figured out how to implement your work as my baseline. Thank you very much for your great work again!

Best, Yuchen

Silverster98 commented 2 years ago

Hi, have you solved the problem? Can you provide more details about how to convert the sdf.h5 provided by DISN?

yuchenrao commented 2 years ago

> Hi, have you solved the problem? Can you provide more details about how to convert the sdf.h5 provided by DISN?

Hi, I used my own dataset, so I didn't solve this.

yccyenchicheng commented 2 years ago

Hi,

@MrZ19, @yuchenrao

Sorry for the late reply. I indeed set num_sample=64**3. You can change that number when preprocessing ShapeNet.
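
A quick sanity-check sketch of what that implies for the data layout (assuming the preprocessing writes num_sample = res**3 rows of (x, y, z, sdf) in row-major grid order, which is my reading and not verified against the code):

```python
# Hedged sketch: if the preprocessing writes num_sample = res**3 rows of
# (x, y, z, sdf) in row-major grid order, the dense grid can be recovered as
# below. The row-major layout is my assumption, not verified against the code.
import numpy as np

num_sample = 64 ** 3                                   # 262144, vs. DISN's default 32**3
samples = np.zeros((num_sample, 4), dtype=np.float32)  # stand-in for the loaded h5 array

res = round(num_sample ** (1.0 / 3.0))                 # 64
sdf_grid = samples[:, 3].reshape(1, res, res, res)     # (1, 64, 64, 64)
print(sdf_grid.shape)
```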

Thank you!

Silverster98 commented 2 years ago

I think the preprocessing script provided by the author is enough. I have tried DISN's scripts, but I didn't get valid data.

Yishun99 commented 2 years ago

Hi, @yccyenchicheng

Thanks a lot for releasing your great work!

I noticed you set expand_rate=1.3, which differs from DISN's 1.2.

Is it necessary to set expand_rate=1.3 to reproduce the results in the paper?
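
For context, an expansion factor like this usually pads the object's bounding box before the SDF grid is sampled; the sketch below illustrates that reading, which is an assumption on my part rather than something confirmed in this thread:

```python
# Hedged illustration only: an expansion factor like this usually pads the
# object's bounding box before the SDF grid is sampled. Whether expand_rate
# is used exactly this way here is my assumption, not confirmed in this thread.
import trimesh

expand_rate = 1.3                              # vs. DISN's default of 1.2
mesh = trimesh.load("model.obj", force="mesh") # placeholder path
lo, hi = mesh.bounds                           # axis-aligned bounding box corners
center = (lo + hi) / 2.0
half = 0.5 * (hi - lo).max() * expand_rate     # half-extent of the expanded sampling cube
print("SDF grid spans", center - half, "to", center + half)
```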

Thanks a lot!

JTT94 commented 2 years ago

Has anyone actually managed to get this to work?

RunpeiDong commented 1 year ago

Hi @MrZ19,

Did modifying the resolution parameter in DISN's preprocessing code work for you? Did you get reasonable SDF data?

Best, Runpei