MoritzWag / Representation-Learning

Representation Learning of Image Data with VAE.

Tuning Models with Hyperband #48

Open MoritzWag opened 4 years ago

MoritzWag commented 4 years ago

We want to tune all models with Hyperband using a fixed time budget that is the same for every model. This makes the results directly comparable across models. The planned search spaces and budgets are listed in the table below, followed by a sketch of how the tuning could be set up.

| Model | Parameters | Budget | Done |
| --- | --- | --- | --- |
| GaussianVae | None | None | |
| BetaVae | beta: [low=1, high=20, step=2]; max_capacity: [low=1, high=50, step=5] | 50h | 16.08, 21.00h |
| DIPVae | lambda_dig: [low=1, high=20, step=2]; lambda_offdig: [low=1, high=30, step=2] | 50h | 20.8, 0.00h |
| GaussMixVae | temperature: [low=1, high=40, step=10]; anneal_rate: [low=0.1, high=0.5, step=0.1]; cont_weight: [1, 2, 3]; cat_weight: [1, 2, 3] | ??? | tbd |
| InfoVae | to be discussed | ??? | tbd |
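
A minimal sketch of one way to wire this up, using Optuna's `HyperbandPruner` and a shared wall-clock `timeout` as the fixed budget. The BetaVae search space is taken from the table above; the training loop is a dummy placeholder (the repo's actual model and training code are not assumed here):

```python
import optuna


def objective(trial: optuna.Trial) -> float:
    # BetaVae search space from the table above.
    beta = trial.suggest_float("beta", 1, 20, step=2)
    max_capacity = trial.suggest_float("max_capacity", 1, 50, step=5)

    # Placeholder for the actual training loop: here the real BetaVae would be
    # trained and its validation loss reported after each epoch so Hyperband
    # can prune unpromising trials early. The formula below is a dummy stand-in.
    val_loss = float("inf")
    for epoch in range(100):
        val_loss = (beta - 4) ** 2 + (max_capacity - 25) ** 2 + 1.0 / (epoch + 1)
        trial.report(val_loss, step=epoch)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return val_loss


study = optuna.create_study(
    direction="minimize",
    pruner=optuna.pruners.HyperbandPruner(
        min_resource=1, max_resource=100, reduction_factor=3
    ),
)
# Identical wall-clock budget for every model family, e.g. 50h as in the table.
study.optimize(objective, timeout=50 * 3600)
print(study.best_params)
```

Using the same `timeout` for each model family keeps the comparison fair regardless of how many trials each search space needs.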
MoritzWag commented 4 years ago
Results

| Model | Results |
| --- | --- |
| GaussianVae | None |
| BetaVae | beta = 20; max_capacity = 20 |
| DIPVae | lambda_dig = 0.0001; lambda_offdig = 7e-5 |
| GaussMixVae | temperature = 1; anneal_rate = 0.5; cont_weight = 1; cat_weight = 2 |
| InfoVae | alpha = -14; latent_var = 1; reg_weight = 50 |