IBM / analog-nas

Analog AI Neural Architecture Search (analog-nas) is a modular and flexible framework to facilitate implementation of Analog-aware Neural Architecture Search.
https://analog-nas.readthedocs.io/
Apache License 2.0

Some code issues and implementation not match the paper. #3

Open blyucs opened 1 year ago

blyucs commented 1 year ago

Thank you for your excellent work. The code in this repository appears to be outdated and does not match the paper. Could you please confirm? I have encountered the following issues and would appreciate your assistance in resolving them:

  1. Reference before assignment in `xgboost.py`: the predictors are constructed before the paths they load from are assigned:

         self.ranker = self.get_ranker()
         self.avm_predictor = self.get_avm_predictor()
         self.std_predictor = self.get_std_predictor()
         self.ranker_path = ranker_path
         self.avm_predictor_path = avm_predictor_path
         self.std_predictor_path = std_predictor_path
  2. The ranker xgboost model (with weights loaded) always outputs 0.5 regardless of the architecture config; could you please check this again? As a result, the EA optimization toward the highest ranking does not work.
  3. The EA algorithm (`ea_optimized.EAOptimizer`) does not match Algorithm 1 described in the paper. For example, there is no code selecting the top-50 population, and no "union" or "crossover" step.
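Regarding issue 1, a minimal sketch of the intended initialization order (class and method names here are hypothetical, not taken from the repository): the path attributes must be set before the getters that load from them are called.

```python
# Hypothetical sketch of the fix for issue 1: assign the model paths first,
# then construct the predictors that read those paths. The loader bodies are
# placeholders, not the repository's actual xgboost-loading code.
class SurrogatePredictor:
    def __init__(self, ranker_path, avm_predictor_path, std_predictor_path):
        # Set the paths first ...
        self.ranker_path = ranker_path
        self.avm_predictor_path = avm_predictor_path
        self.std_predictor_path = std_predictor_path
        # ... then build the models, which reference the paths above.
        self.ranker = self.get_ranker()
        self.avm_predictor = self.get_avm_predictor()
        self.std_predictor = self.get_std_predictor()

    # Placeholder loaders: each one reads its corresponding path attribute,
    # which is why the assignment order in __init__ matters.
    def get_ranker(self):
        return f"ranker<{self.ranker_path}>"

    def get_avm_predictor(self):
        return f"avm<{self.avm_predictor_path}>"

    def get_std_predictor(self):
        return f"std<{self.std_predictor_path}>"
```

With the original ordering, `get_ranker()` would touch `self.ranker_path` before it exists and raise `AttributeError`.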
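For issue 3, a generic sketch of the evolutionary loop the report refers to (top-k selection, crossover, mutation, and a union of parents and offspring). The bit-vector encoding and fitness function are placeholders, not the paper's actual search space or surrogate predictor.

```python
import random

# Generic evolutionary-algorithm sketch with the steps the issue says are
# missing: keep the top_k population, apply crossover, and take the union of
# parents and offspring. Assumes individuals are fixed-length 0/1 lists.
def evolve(fitness, init_pop, generations=10, top_k=50, mutation_rate=0.1):
    population = list(init_pop)
    for _ in range(generations):
        # Selection: keep the top_k individuals by fitness.
        population.sort(key=fitness, reverse=True)
        parents = population[:top_k]
        offspring = []
        for _ in range(len(parents)):
            # Crossover: one-point recombination of two distinct parents.
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]
            # Mutation: flip each bit with small probability.
            child = [g ^ 1 if random.random() < mutation_rate else g
                     for g in child]
            offspring.append(child)
        # Union: the next generation is parents plus their offspring.
        population = parents + offspring
    return max(population, key=fitness)
```

Because the selected parents survive each generation unchanged, the best fitness in the population is non-decreasing over generations.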
IHIaadj commented 1 year ago

Hi @blyucs Thank you for raising this issue.