automl / NASLib

NASLib is a Neural Architecture Search (NAS) library for facilitating NAS research for the community by providing interfaces to several state-of-the-art NAS search spaces and optimizers.
Apache License 2.0
513 stars 117 forks

Missing Layers in Search Space for NASBench-201 #86

Closed · NeoChaos12 closed this issue 2 years ago

NeoChaos12 commented 2 years ago

Hi everyone,

I was comparing the search-space code in the original repo (as per the changelog) with the NASLib implementation, since we were observing consistently worse performance (down by 5-6 percentage points) when trying to re-create the performance of the top-10 or even top-100 architectures. I noticed that the NASLib implementation here, compared to the corresponding original code here, is missing a BatchNorm + ReLU before the global adaptive pooling layer. Interestingly, I did not see any mention of these layers in the paper on arXiv, but introducing them by making the following change to the above-mentioned NASLib code instantly re-created the expected top-10 model performance:

# post-processing
self.edges[edge_names["postproc"]].set('op', ops.Sequential(
    nn.BatchNorm2d(channels[-1]),
    nn.ReLU(inplace=False),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(channels[-1], self.num_classes)
))
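To see the corrected head in isolation, here is a minimal, self-contained PyTorch sketch of the same post-processing stack, with BatchNorm + ReLU placed before the global adaptive pooling. The channel and class counts below are illustrative placeholders, not values taken from NASLib:

```python
import torch
import torch.nn as nn

# Hypothetical sizes chosen only for this example.
channels, num_classes = 64, 10

# Post-processing head with BatchNorm + ReLU *before* global pooling,
# mirroring the fix described above.
postproc = nn.Sequential(
    nn.BatchNorm2d(channels),
    nn.ReLU(inplace=False),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(channels, num_classes),
)

# Dummy feature map of shape (batch, channels, height, width).
x = torch.randn(2, channels, 8, 8)
logits = postproc(x)
print(logits.shape)  # torch.Size([2, 10])
```

Without the BatchNorm + ReLU, the pooled features come straight from the last cell's output, which is what caused the performance gap reported here.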
Neonkraft commented 2 years ago

Thank you for pointing this out! PR has been raised with this fix.

https://github.com/automl/NASLib/pull/87

arberzela commented 2 years ago

Merged to Develop