HeYDwane3 opened this issue 3 years ago
Did you encounter this kind of error when searching? An error is reported when the configured ofa version is ofa 0.1.0-202012082159. I then tried ofa 0.0.4-2012082155, but the same error still occurs.
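One thing worth checking (an assumption on my part: that you are installing the `ofa` package from PyPI): its timestamped releases use a `.post` suffix in the version string rather than a hyphen, so a sketch of pinning an exact version might look like:

```shell
# Assumption: the supernet weights come from the `ofa` package on PyPI,
# whose dated releases use a ".postYYYYMMDDHHMM" suffix, not a "-".
pip install ofa==0.1.0.post202012082159

# If that fails, ask the index which versions actually exist
# (requires a recent pip; `index versions` is still marked experimental):
pip index versions ofa
```

If the version string you pass does not match one the index has, pip reports a "No matching distribution" error rather than the one in your traceback, which would at least narrow down where things go wrong.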
```
Traceback (most recent call last):
  File "msunas.py", line 8, in
```
I noticed that after searching we get a .config and a .inherited file, while validation needs a .config and a .init file. So is fine-tuning with the .inherited file the only way to obtain the .init file?
The accuracy reported during search is predicted, right? And to get the real test results we have to fine-tune for 450 epochs, right? You call it fine-tuning, but it looks more like retraining.
Since we already have the .inherited file, how can we use it directly to run a real test on the ImageNet validation set?
Quote from the paper: "An alternative approach to solve the bi-level NAS problem, i.e., simultaneously optimizing the architecture and learn the optimal model weights."
After working with the source code and reading the paper again, I feel this repo is not an exact implementation of the paper. The trained weights of candidates are dropped after computing their KPIs; the result is a list of architecture codes plus their KPIs, so the chosen candidate has to be retrained. Sure, you could easily save a candidate's weights and continue training, but does it make sense to update the supernet's weights the way gradient-based algorithms do?
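To make the point concrete, here is a minimal self-contained sketch of the search loop as I understand it from the code. Everything here (the architecture encoding, the fake KPI) is a toy stand-in, not the repo's actual API; the part that mirrors the repo is that the evaluated weights go out of scope and only (architecture, KPI) pairs survive into the search output:

```python
import random

random.seed(0)

def sample_architecture():
    # Hypothetical encoding: an architecture as a list of layer choices.
    return [random.randint(0, 3) for _ in range(5)]

def evaluate(arch):
    # Stand-in for the real evaluation, which fine-tunes weights
    # inherited from the supernet and reports a (predicted) accuracy.
    weights = {"w": [random.random() for _ in arch]}  # inherited + tuned weights
    kpi = sum(arch) / (4 * len(arch))                 # fake score in [0, 1]
    # NOTE: `weights` are discarded here -- only (arch, kpi) survive,
    # which is why validating a candidate later requires re-creating
    # its weights (the long "fine-tune" that amounts to retraining).
    return kpi

archive = []  # the search output: architecture codes + KPIs, no weights
for _ in range(10):
    arch = sample_architecture()
    archive.append((arch, evaluate(arch)))

best_arch, best_kpi = max(archive, key=lambda t: t[1])
print(best_arch, round(best_kpi, 3))
```

Saving `weights` alongside each archive entry would avoid the retraining step at the cost of storage, but, as asked above, it is a separate question whether those per-candidate weights should ever be folded back into the supernet the way gradient-based methods do.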