Closed joe8767 closed 2 years ago
This is caused by variance in supernet training. Since this supernet differs from the one from which the pre-trained final compressed model was exported, the results may be slightly different. You could train the supernet with a different seed to see whether you get a better result. A model with computation < 3e9 MACs (3 GMACs) and FID < 70 is a reasonable result.
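The acceptance criterion above can be sketched as a simple check. This is an illustrative snippet, not part of the repo; the `macs` and `fid` values are hypothetical measurements for a searched model:

```python
def meets_criterion(macs: float, fid: float,
                    macs_budget: float = 3e9, fid_target: float = 70.0) -> bool:
    """Return True if a searched model satisfies the suggested criterion:
    computation under the MACs budget and FID below the target."""
    return macs < macs_budget and fid < fid_target

# Example: a 2.6 GMACs model with FID 65 passes; the same model with FID 75 does not.
print(meets_criterion(2.6e9, 65.0))  # True
print(meets_criterion(2.6e9, 75.0))  # False
```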
Got it. Many thanks for your quick reply.
Thanks for sharing the excellent research.
Searching from the pretrained supernet, I was able to get a configuration:
But I'm a little confused about how to set the budget for evolution search. I noticed that in the CycleGAN horse2zebra experiment, the evolution-search command uses a budget of 3e9 MACs, which is larger than the reported value (around 2.6e9). After changing the budget to 2.6e9, the obtained configuration is
evolution_search.sh
Are there any suggestions for obtaining compressed models with the fast pipeline whose FID and MACs are similar to those reported in README.md?
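The effect of the budget setting can be illustrated with a minimal sketch (the candidate list and its fields are hypothetical, not the repo's actual data structures): candidates whose estimated MACs exceed the budget are discarded before FID evaluation, so a looser budget such as 3e9 admits configurations that a tighter 2.6e9 budget would reject.

```python
# Hypothetical candidate configurations with pre-estimated MACs.
candidates = [
    {"channels": [16, 16, 32], "macs": 2.5e9},
    {"channels": [24, 24, 48], "macs": 2.9e9},
    {"channels": [32, 32, 64], "macs": 3.4e9},
]

def within_budget(cands, budget):
    """Keep only candidates whose estimated MACs fit the budget."""
    return [c for c in cands if c["macs"] <= budget]

print(len(within_budget(candidates, 3e9)))    # 2 candidates survive
print(len(within_budget(candidates, 2.6e9)))  # 1 candidate survives
```

Note that a tighter budget shrinks the search space, which can change which configuration the evolution search converges to.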