@cravisjan97 You should re-train the network with a higher weight on the KL-divergence term. In this repo, we train the network with a small KL-divergence weight because we aim for better reconstructions. If you have further questions, please feel free to contact us.
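For reference, here is a minimal NumPy sketch of how a weighted VAE objective is typically composed; the function and argument names are illustrative and are not the ones used in train_stacknewvae.py:

```python
import numpy as np

def vae_loss(x, x_recon, mu, logvar, kl_weight=1.0):
    """Weighted VAE objective: reconstruction + kl_weight * KL(q(z|x) || N(0, I)).

    A larger kl_weight pulls the encoder's posterior toward the standard-normal
    prior, so codes sampled from the prior decode into more varied shapes; a
    very small weight favors reconstruction quality but can leave the prior
    mismatched with the learned latent distribution.
    """
    recon = np.mean(np.sum((x - x_recon) ** 2, axis=-1))  # reconstruction term
    kl = -0.5 * np.mean(np.sum(1 + logvar - mu**2 - np.exp(logvar), axis=-1))  # analytic Gaussian KL
    return recon + kl_weight * kl
```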
@tommaoer Thanks for the response. How can I change the KL-divergence weight (is it the l3 parameter?) in train_stacknewvae.py? Are there any arguments that need to be set when running train_stacknewvae.py? Do you have any suggestions for the other training hyperparameters?
I am trying to randomly generate a chair model. I ran test_stacknewvae.py (with model.random_gen()) and then ran GetOptimizedObj.m with use_struct=1 (use_struct=0 gave very bad results; I don't know why). All 200 generated chair models were very similar (most of them were identical). How do I resolve this issue?
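For context, random generation in a VAE usually means decoding latent codes drawn from the standard-normal prior. A rough sketch of that step, assuming a generic `decoder` callable rather than the actual random_gen() implementation, is:

```python
import numpy as np

def random_generate(decoder, latent_dim, n_samples=200, seed=0):
    """Decode latent codes sampled from the N(0, I) prior.

    If training never pushed the encoder's posterior toward this prior
    (e.g. the KL weight was very small), most prior samples fall in a
    narrow region of latent space and the decoded shapes look nearly
    identical, which matches the behavior described above.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_samples, latent_dim))  # samples from the prior
    return [decoder(z_i) for z_i in z]
```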