
DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization #60


nabenabe0928 commented 1 year ago

DEHB: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization

Main points

Performance over time, comparing the following optimizers:

  1. Random search
  2. HB
  3. TPE
  4. BOHB
  5. SMAC
  6. RE
  7. DE
  8. DEHB

The benchmark datasets are:

  1. Counting Ones (toy function)
  2. Surrogate benchmarks from the BOHB paper (probably ParamNet?)
  3. BNN benchmarks from the BOHB paper
  4. RL benchmark from the BOHB paper
  5. NAS benchmarks (NB101, NB201, NB1shot1, HPOlib)

Ablation study

They ablate some of DEHB's own parameters.

Scalability test

The number of parallel workers $n \in \{1, 2, 4, 8, 16, 32, 64\}$.
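Presumably this measures how wall-clock time shrinks as function evaluations are spread over $n$ workers. A minimal, generic sketch of such a sweep, not the paper's actual distributed setup (the toy objective `evaluate` and the helper `run_with_n_workers` are my own placeholders):

```python
import time
from concurrent.futures import ProcessPoolExecutor


def evaluate(config):
    """Toy objective standing in for one (config, budget) evaluation."""
    time.sleep(0.01)  # simulate training cost
    return sum(x * x for x in config)


def run_with_n_workers(configs, n):
    """Evaluate all configs with n parallel workers; return best loss and wall time."""
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=n) as pool:
        losses = list(pool.map(evaluate, configs))
    return min(losses), time.perf_counter() - start


if __name__ == "__main__":
    configs = [(i * 0.1, 1.0 - i * 0.1) for i in range(64)]
    for n in (1, 2, 4, 8, 16, 32, 64):
        best, wall = run_with_n_workers(configs, n)
        print(f"n={n:2d} workers: best={best:.3f}, wall={wall:.2f}s")
```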

Average rank over time

I am not sure whether the expected runtime of each benchmark is comparable, but in any case, they averaged the ranks over NB101, NB201, HPOlib, NB1shot1, the OpenML surrogates, and the RL bench.
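For concreteness, a small sketch of how such a rank curve can be computed, assuming all trajectories are already aligned on a common time grid (the paper's exact time interpolation may differ, and `average_rank_over_time` is my own name):

```python
import numpy as np
from scipy.stats import rankdata


def average_rank_over_time(trajectories):
    """trajectories: list over benchmarks of (num_methods, T) arrays holding
    each method's best-so-far loss at T common time steps."""
    # Rank the methods at every time step (rank 1 = lowest loss so far),
    # then average the rank curves over the benchmarks.
    ranks = [rankdata(traj, axis=0) for traj in trajectories]
    return np.mean(ranks, axis=0)  # shape: (num_methods, T)


# Usage with random monotone trajectories: 8 methods on 6 benchmarks.
rng = np.random.default_rng(0)
trajs = [np.minimum.accumulate(rng.random((8, 100)), axis=1) for _ in range(6)]
print(average_rank_over_time(trajs).shape)  # (8, 100)
```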

Confusing points

Section 4

The algorithm was really hard to understand from the text alone. In short, each subpopulation is (1) inherited from the previous SH bracket at exactly the same budget, and (2) augmented with individuals promoted from the pruned lower-budget subpopulation. The first point is exactly what evolutionary algorithms do in general. The second point is described in Fig. 2 (the extra individuals are called the parent pool) and Fig. 3 (the right figure shows that the parent pool is used on top of the inherited population).
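To pin down my reading of Section 4, here is a minimal sketch of the two mechanisms. This is my interpretation, not the paper's pseudocode: it omits crossover, selection, and the precise promotion rule, and the names `evolve_at_budget` / `top_k` are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# One subpopulation per budget. These persist across SH brackets, which is
# point (1): the next bracket inherits the previous bracket's population
# at exactly the same budget.
subpop = {b: rng.random((10, 2)) for b in (1, 3, 9, 27)}


def de_rand1(pool, F=0.5):
    # Classic DE rand/1 mutation: v = a + F * (b - c).
    a, b, c = pool[rng.choice(len(pool), size=3, replace=False)]
    return a + F * (b - c)


def evolve_at_budget(budget, lower_budget=None, lower_losses=None, top_k=3):
    pop = subpop[budget]
    parent_pool = pop
    if lower_budget is not None:
        # Point (2), the "parent pool": augment with the top-k individuals
        # promoted from the pruned lower-budget subpopulation
        # (Fig. 2 / Fig. 3, right).
        promoted = subpop[lower_budget][np.argsort(lower_losses)[:top_k]]
        parent_pool = np.vstack([pop, promoted])
    # DE parents are drawn from the augmented pool, so low-budget
    # information flows upward without restarting the search.
    subpop[budget] = np.array([de_rand1(parent_pool) for _ in range(len(pop))])


# Usage: evolve the budget-3 subpopulation using budget-1 survivors.
evolve_at_budget(3, lower_budget=1, lower_losses=rng.random(10))
```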