Zero-Cost Proxy for NAS Project with MASE
This tutorial shows how to search for neural architectures with zero-cost proxies on the CIFAR-10 and ImageNet-16-120 datasets using the MASE framework.
Note: the Zero-Cost Proxy project was completed by Group 2 of the Advanced Deep Learning Systems course at Imperial College London.
Search for Neural Architectures with Zero-Cost Proxies
We load the NAS-Bench-201 benchmark API and use the MASE framework to search for neural architectures with zero-cost proxies. Architectures, together with their true post-training accuracies, are selected from NAS-Bench-201 and searched with zero-cost proxies as objectives.
Requirements
Before initiating the project, please ensure the following prerequisites are satisfied:
Python (same version as MASE)
PyTorch (same version as MASE)
NAS-Bench-201 API
An environment capable of running PyTorch models, such as CUDA for GPU acceleration
Sample search Config
Here is what the search section of configs/examples/search_zero_cost.toml looks like:
[search.search_space]
# the search space name defined in MASE
# should be set as "graph/software/zero_cost" for zero-cost search
name = "graph/software/zero_cost"
[search.search_space.setup]
by = "name"
[search.search_space.seed.default.config]
# the only choice "NA" is used to indicate that layers are not quantized by default
name = ["NA"]
### search config for vision
[search.search_space.nas_zero_cost.config]
## dataset to search on (choose from [cifar10, cifar10-valid, cifar100, ImageNet16-120]; cifar10 is the default)
dataset = ['ImageNet16-120']
name = ['infer.tiny']
C = [16]
N = [5]
## following are the option choices for each node in the cell
op_0_0 = [0]
op_1_0 = [0,1,2,3,4]
op_2_0 = [0,1,2,3,4]
op_2_1 = [0,1,2,3,4]
op_3_0 = [0,1,2,3,4]
op_3_1 = [0,1,2,3,4]
op_3_2 = [0,1,2,3,4]
number_classes = [10]
[search.strategy]
## use optuna as the optimization algorithm
name = "optuna"
## should be set to false because zero-cost NAS requires a mini-batch of training data
eval_mode = false
[search.sw_runner]
## set to "zero_cost" to call the newly-defined zc runner that can return both the zero-cost proxies and the true accuracy values
name = "zero_cost"
[search.strategy.sw_runner.zero_cost]
# metric can be chosen from
# "grad_norm", "snip", "grasp", "fisher", "plain", "l2_norm", "naswot", "naswot_relu", "tenas", "zico"
# modify this list to select a subset of the proxies.
metrics = ["grad_norm", "snip", "grasp", "fisher", "plain", "l2_norm", "naswot", "naswot_relu", "tenas", "zico"]
data_loader = "val_dataloader"
num_samples = 512
[search.strategy.hw_runner.average_bitwidth]
compare_to = 32 # compare to FP32
[search.strategy.setup]
n_jobs = 1
n_trials = 100
timeout = 20000
sampler = "tpe"
# sum_scaled_metrics should be false for multi-objective optimization
sum_scaled_metrics = false # multi objective
# direction needs to be commented out for multi-objective optimization
# direction = "maximize"
[search.strategy.metrics]
grad_norm.scale = 0.0
grad_norm.direction = "maximize"
snip.scale = 1.0
snip.direction = "maximize"
grasp.scale = 0.0
grasp.direction = "maximize"
fisher.scale = 0.0
fisher.direction = "maximize"
plain.scale = 1.0
plain.direction = "maximize"
l2_norm.scale = 0.0
l2_norm.direction = "minimize"
naswot.scale = 1.0
naswot.direction = "maximize"
naswot_relu.scale = 1.0 # number 3
naswot_relu.direction = "maximize"
t_cet.scale = 1.0
t_cet.direction = "maximize"
tenas.scale = 1.0
tenas.direction = "maximize"
zen.scale = 1.0
zen.direction = "maximize"
zico.scale = 0.0
zico.direction = "maximize"
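Each op_i_j entry above indexes the operation placed on the edge from node j to node i of the NAS-Bench-201 cell. As an illustration, the chosen indices can be decoded into a NAS-Bench-201 architecture string; note that the index-to-operation ordering below is an assumed convention for this sketch, so check the MASE search-space code for the authoritative mapping.

```python
# Decode per-edge op indices into a NAS-Bench-201 architecture string.
# NOTE: this index -> operation ordering is an assumption for illustration.
OPS = ["none", "skip_connect", "nor_conv_1x1", "nor_conv_3x3", "avg_pool_3x3"]

def to_arch_str(choice):
    """choice maps 'op_{to}_{from}' -> op index, e.g. {'op_1_0': 3, ...}."""
    nodes = []
    for to_node in (1, 2, 3):  # node 0 is the cell input and has no incoming edge
        edges = [
            f"{OPS[choice[f'op_{to_node}_{from_node}']]}~{from_node}"
            for from_node in range(to_node)
        ]
        nodes.append("|" + "|".join(edges) + "|")
    return "+".join(nodes)

choice = {"op_1_0": 3, "op_2_0": 1, "op_2_1": 2, "op_3_0": 0, "op_3_1": 4, "op_3_2": 1}
arch = to_arch_str(choice)
# -> |nor_conv_3x3~0|+|skip_connect~0|nor_conv_1x1~1|+|none~0|avg_pool_3x3~1|skip_connect~2|
```

This string format is what the NAS-Bench-201 API uses to look up an architecture's pre-computed training statistics.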
NAS-Bench-201 Dataset Requirement
First, download the NAS-Bench-201 .pth file. This file contains the dataset of pre-evaluated architectures used by the zero-cost proxy evaluation process.
Download the file from the official NAS-Bench-201 repository or an alternative provided source. After downloading, place the .pth file in your project directory under third_party/NAS-Bench-201-v1_1-096897.pth or adjust the configuration to reflect the file's location accurately.
Alternatively, the NAS-Bench-201 .pth file can be downloaded from the following link:
Google Drive NAS-Bench
Configuration Check: Ensure that the configs/examples/search_zero_cost.toml configuration file points to the correct NAS-Bench-201 .pth file location and adjust other settings as necessary.
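Since a missing or misplaced .pth file is the most common setup failure, a small helper that checks the path before loading the benchmark can fail early with a clear message. The nas_201_api import and constructor below follow the NAS-Bench-201 API package; the helper itself is only an illustrative sketch.

```python
from pathlib import Path

DEFAULT_PTH = "third_party/NAS-Bench-201-v1_1-096897.pth"

def load_nasbench(pth_path=DEFAULT_PTH):
    """Load the NAS-Bench-201 API, failing early if the .pth file is missing."""
    p = Path(pth_path)
    if not p.is_file():
        raise FileNotFoundError(f"NAS-Bench-201 file not found: {p}")
    # Deferred import: the path check above works even before the package is installed.
    from nas_201_api import NASBench201API
    return NASBench201API(str(p))
```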
Project Execution
Execute the project with the following command from the root directory:
This command triggers the search procedure that employs zero-cost proxies to evaluate and rank neural network architectures. The process includes the model and dataset initialization, search space construction, and the execution of the zero-cost proxy search strategy, followed by the logging and saving of results.
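The strategy's trial loop can be summarised as: sample an architecture, run the zero_cost runner to obtain the proxy metrics, apply the per-metric scale from the config, and return either one summed scalar or a tuple of objectives depending on sum_scaled_metrics. The sketch below is a stand-in, with random numbers replacing real proxy measurements and random sampling replacing Optuna's TPE; the metric names mirror the config above.

```python
import random

# Per-metric settings mirroring [search.strategy.metrics] in the config above.
METRICS = {
    "snip":   {"scale": 1.0, "direction": "maximize"},
    "plain":  {"scale": 1.0, "direction": "maximize"},
    "naswot": {"scale": 1.0, "direction": "maximize"},
}

def evaluate_proxies(arch_id):
    # Stand-in for the zero_cost runner: real code would push a mini-batch
    # through the sampled architecture and compute each proxy.
    rng = random.Random(arch_id)
    return {name: rng.random() for name in METRICS}

def objectives(arch_id, sum_scaled_metrics=False):
    """Return one scalar (single-objective) or a tuple (multi-objective)."""
    raw = evaluate_proxies(arch_id)
    scaled = {m: cfg["scale"] * raw[m] for m, cfg in METRICS.items()}
    if sum_scaled_metrics:
        return sum(scaled.values())
    return tuple(scaled.values())  # one objective per metric, as in the config

# Random sampling as a stand-in for Optuna's TPE sampler:
trials = [(arch_id, objectives(arch_id)) for arch_id in range(20)]
best_by_snip = max(trials, key=lambda t: t[1][0])
```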
Expected Outputs
Upon successful completion, the project generates:
Predicted accuracy rankings of neural network architectures.
True accuracy values as evaluated on the NAS-Bench-201 dataset.
Log files and visualizations, depending on your setup and configurations.
These outputs are crucial for assessing the performance of various architectures and the predictive accuracy of zero-cost proxies.
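To make the proxy idea concrete, here is a simplified NumPy sketch of one listed metric, naswot: it scores a network by how differently its ReLU units activate across a mini-batch, via the log-determinant of an activation-overlap kernel. The sketch assumes the binary activation patterns have already been extracted (the real runner hooks the network's ReLU layers); the random codes stand in for real activations.

```python
import numpy as np

def naswot_score(binary_codes):
    """NASWOT-style score from ReLU activation patterns.

    binary_codes: (batch, units) 0/1 matrix, one row per input, where each
    entry records whether a ReLU unit was active for that input.
    """
    c = binary_codes.astype(float)
    n_units = c.shape[1]
    hamming = (c[:, None, :] != c[None, :, :]).sum(-1)  # pairwise distances
    K = n_units - hamming        # similar activation patterns -> large entry
    sign, logdet = np.linalg.slogdet(K)
    return logdet                # higher -> inputs are better separated

rng = np.random.default_rng(0)
codes = rng.integers(0, 2, size=(8, 64))  # stand-in for real activations
score = naswot_score(codes)
```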
In addition, a weight series containing the estimated importance of each proxy is also returned:
[7.48018255e+00 -2.42037272e-01 -3.68762718e+00]
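These weights come from regressing the true accuracies on the proxy scores. Below is a hedged sketch with synthetic data: three proxies and a made-up ground-truth weight vector chosen only to match the shape of the output above, fitted by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(1)
n_archs, n_proxies = 100, 3
proxy_scores = rng.normal(size=(n_archs, n_proxies))   # stand-in proxy values
true_weights = np.array([7.5, -0.24, -3.7])            # synthetic ground truth
accuracy = proxy_scores @ true_weights + rng.normal(scale=0.1, size=n_archs)

# Least-squares fit: one importance weight per proxy.
weights, *_ = np.linalg.lstsq(proxy_scores, accuracy, rcond=None)
```

With real data, proxy_scores would hold one row of proxy measurements per evaluated architecture and accuracy would hold the NAS-Bench-201 post-training accuracies.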
Conclusion
We completed all the basic elements and one of the extensions of the project assignment.
[X] Design a search space for zero-cost proxy search using option choices for each node in the cell that constructs the architecture, rather than a single architecture index.
[X] Implement a zero-cost proxy search strategy using TPE.
[X] Evaluate the performance of zero-cost proxies on CIFAR-10.
[X] Estimate an ensemble for the proxy combination using linear regression.
[X] Broaden the search to larger datasets such as CIFAR-100 and ImageNet-16-120.
The explanatory documentation is under the machop/sphinx_docs/source/modules/documentation/tutorials/actions/search directory.
When we open a new pull request, the current CI error is that "nas-bench-201" is not found. It is one of our environment requirements, but it seems we could not add it to the GitHub checks. @Aaron-Zhao123