Closed slyviacassell closed 4 years ago
Hi, BPE can be found here: https://github.com/zhengxiawu/rethinking_performance_estimation_in_NAS/blob/master/param_setting.py. As for MIP, we found it may not be necessary to release it, since: 1) the random forest algorithm is readily available in scikit-learn; 2) https://github.com/automl/fanova also provides a similar method for hyperparameter analysis; 3) finding the best BPE in NAS is extremely time-consuming (more than 3,000 GPU hours).
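Since the reply points to scikit-learn's random forest as a substitute for the unreleased MIP code, here is a minimal sketch of that idea: fitting a random forest on hyperparameter configurations and reading off impurity-based importances. The data and the hyperparameter names (`epochs`, `batch_size`, `lr`) are synthetic placeholders, not taken from the paper's experiments.

```python
# Hedged sketch: random-forest-based hyperparameter importance with
# scikit-learn. Each row of X stands for one BPE hyperparameter
# configuration; y stands for its observed performance. Both are
# synthetic here, constructed so that the first column dominates.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

X = rng.uniform(size=(200, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Impurity-based importances sum to 1; a larger value means the
# hyperparameter explains more of the variance in performance.
for name, imp in zip(["epochs", "batch_size", "lr"], forest.feature_importances_):
    print(f"{name}: {imp:.3f}")
```

On real data one would replace `X` and `y` with the logged configurations and their measured ranking correlations or accuracies.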
We have also released the MIP selection in https://github.com/zhengxiawu/rethinking_performance_estimation_in_NAS/blob/master/experimantal_stastics.xlsx
Well, I am just confused by the description of parameter pruning in the paper, so I wanted to figure out what happens by reading the original code. My questions are as follows. 1) Why does Eq. 9 set $$ \Theta_i $$ twice? As far as I can tell, the description can be formatted as follows: . So the description may not match Eq. 9. 2) If I have misunderstood, would you mind giving more details about Eq. 9? For instance, what does the $$ \cdot $$ in Eq. 9 denote? Would you mind helping me figure out these problems? Thank you very much!
Sorry for the misunderstanding. In Eq. 9 it should be changed to "$$ \Theta_i = \beta_i,\ i = \arg\min I_\Theta $$", and the $$ \cdot $$ is a full stop.
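The corrected Eq. 9 amounts to a single pruning step: find the hyperparameter with the smallest importance score and reset it to its default value. A minimal sketch, assuming illustrative names (`theta`, `beta`, `importance`) and made-up values rather than anything from the released code:

```python
# Hedged sketch of the corrected Eq. 9: Theta_i = beta_i, i = argmin I_Theta.
# The least important hyperparameter is reverted to its default value.
import numpy as np

theta = np.array([0.7, 0.3, 0.9])       # current hyperparameter values Theta
beta = np.array([0.5, 0.5, 0.5])        # default values beta
importance = np.array([0.6, 0.1, 0.3])  # importance scores I_Theta

i = int(np.argmin(importance))  # i = argmin I_Theta
theta[i] = beta[i]              # Theta_i = beta_i

print(theta)  # index 1 (smallest importance) reverts to its default 0.5
```

Repeating this step prunes one hyperparameter per iteration of the analysis.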
Well, that makes sense. Thank you for your patient responses! Perhaps the open-access version needs a minor revision.
Hi, after reading the repository, I could not find the code for BPE and MIP. Would you mind open-sourcing them? Thanks a lot.