google-research / nasbench

NASBench: A Neural Architecture Search Dataset and Benchmark
Apache License 2.0

Could we get the trained model weights for all the ~5M models? #5

Open goodboyanush opened 5 years ago

goodboyanush commented 5 years ago

Are you planning to make them available, at least on a request basis?

It could help use cases such as retraining models or extracting intermediate outputs to analyze the nature of the search space.

chrisying commented 5 years ago

Hi goodboyanush,

Originally we wanted to release all trained model weights but the total file size is well over a petabyte and we haven't found an effective way to serve this data to researchers.

Is there some specific use-case you are looking for? It may be possible to regenerate the checkpoints by retraining a single model using the open-sourced code.
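For reference, here is a rough sketch of what regenerating a single checkpoint with the open-sourced code might look like. The ModelSpec(matrix, ops) format follows the repository's documentation; the build_config() and train_and_evaluate() calls are assumptions about the entry points in nasbench/lib/config.py and nasbench/lib/evaluate.py and may differ in name or signature.

```python
# Rough sketch: retrain one architecture with the open-sourced code.
# ModelSpec(matrix, ops) is the documented architecture encoding;
# build_config() and train_and_evaluate() are assumed entry points in
# nasbench/lib and may not match the exact API.
from nasbench.lib import config as nasbench_config
from nasbench.lib import evaluate, model_spec

# Example 7-vertex cell: upper-triangular adjacency matrix, <= 9 edges.
matrix = [[0, 1, 1, 0, 0, 0, 1],   # input vertex
          [0, 0, 0, 1, 0, 0, 0],
          [0, 0, 0, 0, 1, 0, 0],
          [0, 0, 0, 0, 0, 1, 0],
          [0, 0, 0, 0, 0, 0, 1],
          [0, 0, 0, 0, 0, 0, 1],
          [0, 0, 0, 0, 0, 0, 0]]   # output vertex
ops = ['input', 'conv3x3-bn-relu', 'conv1x1-bn-relu', 'maxpool3x3',
       'conv3x3-bn-relu', 'conv3x3-bn-relu', 'output']

spec = model_spec.ModelSpec(matrix, ops)
config = nasbench_config.build_config()  # default NASBench hyperparameters

# Assumed call: trains the single model and writes checkpoints to model_dir.
metrics = evaluate.train_and_evaluate(spec, config, model_dir='/tmp/retrain')
```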

coallaoh commented 5 years ago

Hi Chris,

Thanks for open-sourcing the interesting code base!

Maybe a more realistic request is to open-source at least the training code, because that alone is non-trivial. That code would consist of two parts: (1) going from ModelSpec(matrix, ops) to the full TensorFlow code that builds the corresponding graph, and (2) running the optimization steps with the pre-defined hyperparameters. Writing that code and making it 100% identical to your implementation would be quite a bit of work, if not impossible. Since you already have code for this, can you open-source it? (Or is it already there and I missed it?)

coallaoh commented 5 years ago

Never mind my comment above - I found that code in the repository:
https://github.com/google-research/nasbench/blob/master/nasbench/lib/evaluate.py
https://github.com/google-research/nasbench/blob/master/nasbench/lib/model_builder.py
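For anyone else landing here, a rough sketch of how those two files appear to fit together, covering the two parts mentioned above. The function names build_config and build_model_fn are read from the file layout and are assumptions; check the linked files for the exact signatures.

```python
# Sketch of the two-part pipeline, assuming build_config() and
# build_model_fn() exist with roughly these signatures.
from nasbench.lib import config as nasbench_config
from nasbench.lib import model_builder, model_spec

config = nasbench_config.build_config()  # fixed NASBench hyperparameters

# (1) ModelSpec(matrix, ops) -> TensorFlow graph: model_builder.py turns the
#     spec into an Estimator-style model_fn that constructs the network.
spec = model_spec.ModelSpec(
    matrix=[[0, 1, 1],
            [0, 0, 1],
            [0, 0, 0]],
    ops=['input', 'conv3x3-bn-relu', 'output'])
model_fn = model_builder.build_model_fn(spec, config, num_train_images=40000)

# (2) Optimization with the pre-defined hyperparameters: evaluate.py wraps
#     model_fn in a TPUEstimator and runs the fixed training schedule.
```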