automl / NASLib

NASLib is a Neural Architecture Search (NAS) library that facilitates NAS research by providing interfaces to several state-of-the-art NAS search spaces and optimizers.

How to initialize a NASBench-301 model for training? #169

Closed akhauriyash closed 1 year ago

akhauriyash commented 1 year ago

For a hash, say

from naslib.search_spaces import nasbench301

# First architecture
hash = '([(0,6),(1,2),(1,4),(0,6),(0,6),(1,3),(4,0),(3,6)],[(0,6),(1,2),(1,4),(0,6),(0,6),(1,3),(4,0),(3,6)])'
nbg = nasbench301.graph.NasBench301SearchSpace()
nbg.set_spec(eval(hash))
nbg.prepare_evaluation()
nbg.parse()
print(nbg.adj)

# Second architecture (a different spec)
nbg = nasbench301.graph.NasBench301SearchSpace()
hash = '([(1,3),(0,5),(0,4),(2,4),(1,4),(2,0),(2,5),(1,1)],[(1,3),(0,5),(0,4),(2,4),(1,4),(2,0),(2,5),(1,1)])'
nbg.set_spec(eval(hash))
nbg.prepare_evaluation()
nbg.parse()
print(nbg.adj)

The adjacency matrix and other properties of the model 'nbg' do not change between the two specs.

How can I initialize a NASBench-301 model with NASLib such that it can be trained? Also, is it possible to extract a pure PyTorch model from the nbg graph?

abhash-er commented 1 year ago

Hi @akhauriyash,

When we initialize the search space, we initialize a super graph. When we call set_spec(hash), we update the edge data (from which the operations are built), and that edge data is what the forward pass of the architecture uses. You can look at search_spaces/core/graph.py for a detailed view.
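
To make that flow concrete, here is a minimal sketch that reuses the first spec from the snippet in the question; the dummy input shape (a CIFAR-10-like batch) is an assumption, not something prescribed by NASLib:

import torch
from naslib.search_spaces import nasbench301

# Build the super graph, then write a concrete architecture into its edge data.
graph = nasbench301.graph.NasBench301SearchSpace()
spec = ([(0, 6), (1, 2), (1, 4), (0, 6), (0, 6), (1, 3), (4, 0), (3, 6)],
        [(0, 6), (1, 2), (1, 4), (0, 6), (0, 6), (1, 3), (4, 0), (3, 6)])
graph.set_spec(spec)  # updates the edge data from which the operations are built
graph.parse()         # instantiates the PyTorch modules on the edges

# The parsed graph can now run a forward pass.
logits = graph(torch.randn(2, 3, 32, 32))  # assumed CIFAR-10 input shape
print(logits.shape)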

Currently, we don't provide support for extracting a pure PyTorch model from a NASBench graph. But the NASBench graph itself extends a PyTorch module, so you can train it the same way you would train any other PyTorch model: you can run a forward pass and a backward pass through the network directly. If you are interested in how the model looks, you can call convert_naslib_to_genotype to see what the Genotype of the normal and reduction cells looks like.
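
As a minimal sketch of such a training step, assuming the parsed graph from the previous snippet and a hypothetical CIFAR-10-style mini-batch standing in for a real DataLoader:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(graph.parameters(), lr=0.025, momentum=0.9)

# Hypothetical mini-batch; substitute batches from your own DataLoader.
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))

optimizer.zero_grad()
loss = criterion(graph(x), y)  # forward pass through the parsed graph
loss.backward()                # backward pass, as for any other PyTorch model
optimizer.step()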

Thanks, Abhash

akhauriyash commented 1 year ago

Thank you for your response! I appreciate it.