google-research / google-research


[implicit_pdf] Paper results not reproducible #1059

Open shubham-goel opened 2 years ago

shubham-goel commented 2 years ago

Hi,

Can you please share the training command (with hyper-parameters) for reproducing numbers in Table 1 of the main paper?

I'm unable to reproduce the reported results for implicit_pdf on the SYMSOL1 dataset. The paper (arXiv version, Section S8) specifies the hyper-parameters for reproducing the results; the corresponding training command is:

python -m implicit_pdf.train --symsol_shapes symsol1 \
 --number_fourier_components 3 \
 --batch_size 128 \
 --number_training_iterations 100000 \
 --head_network_specs 256 --head_network_specs 256 --head_network_specs 256 --head_network_specs 256 
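For context on what --number_fourier_components controls: as I understand the paper, the query rotation is flattened and passed through a sinusoidal positional encoding before the MLP head. A minimal numpy sketch of that idea (the function name and exact frequency/ordering conventions below are my assumptions, not necessarily what the repo implements):

```python
import numpy as np

def positional_encoding(rotation, num_frequencies=3):
    """Encode a flattened 3x3 rotation matrix with sinusoids.

    Sketch only: the actual implicit_pdf code may order or scale
    the frequency bands differently.
    """
    x = rotation.reshape(-1)                      # 9 entries, each in [-1, 1]
    freqs = 2.0 ** np.arange(num_frequencies)     # 1, 2, 4, ...
    angles = x[:, None] * freqs[None, :] * np.pi  # shape (9, num_frequencies)
    # sin and cos of every (entry, frequency) pair
    return np.concatenate([np.sin(angles).ravel(), np.cos(angles).ravel()])

encoded = positional_encoding(np.eye(3), num_frequencies=3)
print(encoded.shape)  # 9 entries * 3 frequencies * 2 (sin, cos) -> (54,)
```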

However, the trained model seems to overfit: gt_log_likelihood starts decreasing for cyl and cone after ~3k iterations. Please see the uploaded TensorBoard logs. Reducing the depth of the MLP network to the default 2 layers didn't help either.
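One workaround for this kind of overfitting (not something the training script provides out of the box, as far as I can tell) is to evaluate gt_log_likelihood periodically and keep the checkpoint from the best evaluation, stopping after it stalls. A minimal patience-based sketch, with made-up evaluation values standing in for the logged gt_log_likelihood:

```python
def early_stop_index(eval_log_likelihoods, patience=3):
    """Return the index of the best evaluation, stopping the scan after
    `patience` consecutive non-improving evaluations."""
    best_idx, best_val, bad = 0, float("-inf"), 0
    for i, val in enumerate(eval_log_likelihoods):
        if val > best_val:
            best_idx, best_val, bad = i, val, 0  # new best: reset patience
        else:
            bad += 1
            if bad >= patience:  # likelihood has stalled/degraded
                break
    return best_idx

# Hypothetical curve: likelihood peaks at eval 3, then degrades (overfitting)
print(early_stop_index([-2.0, -1.0, -0.5, -0.4, -0.6, -0.7, -0.9]))  # 3
```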

Thanks and Regards, Shubham

saeedmaroof commented 1 year ago

Hi.

I am facing this problem too; I cannot reproduce the results reported in the main paper. First I tried the configs in this link, but the maximum accuracy on the training dataset was about 35%. After that, I tried changing hyper-parameters such as the learning rate and the number of iterations; the best accuracy I reached was 83% on the training set and 89% on the validation set, with:

--how_many_training_steps 200,200,200,200,200,200,200,200,200,200,200,200,200,200,200,200,200,200 --learning_rate 2e-2,1e-2,1e-3,1e-4,1e-5,1e-6,2e-2,1e-2,1e-3,1e-4,1e-5,1e-6,2e-2,1e-2,1e-3,1e-4,1e-5,1e-6
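Reading those comma-separated flags as a staged schedule (run N_k steps at rate lr_k, in order — my interpretation, not necessarily how the script parses them), the mapping from a global step to the active learning rate can be sketched as:

```python
import itertools

def lr_at_step(step, how_many_training_steps, learning_rate):
    """Return the learning rate active at `step` for a staged schedule
    given as comma-separated flag strings (interpretation assumed)."""
    steps = [int(s) for s in how_many_training_steps.split(",")]
    rates = [float(r) for r in learning_rate.split(",")]
    assert len(steps) == len(rates), "flags must have equal stage counts"
    boundaries = list(itertools.accumulate(steps))  # cumulative end of each stage
    for boundary, rate in zip(boundaries, rates):
        if step < boundary:
            return rate
    return rates[-1]  # past the last stage: keep the final rate

print(lr_at_step(250, "200,200,200", "2e-2,1e-2,1e-3"))  # 0.01
```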