berisfu opened this issue 5 years ago
Hi, ENAS builds a DAG to achieve weight sharing. The DAG is created in this file. https://github.com/melodyguan/enas/blob/master/src/cifar10/general_child.py
I have also looked at the Keras implementation of ENAS at https://github.com/shibuiwilliam/ENAS-Keras/blob/master/ENAS.py. It uses CNC.set_weight_to_layer(save_to_disk=self.save_to_disk) to save layer weights and CNC.fetch_layer_weight(save_to_disk=self.save_to_disk) to load them; that is how it realizes weight sharing and makes the search efficient. I am pretty new to TensorFlow. I have read the official code, but I cannot find which part of it saves and loads the weights.
Can anybody help me? Thanks a lot!
@berisfu It's where reuse=reuse comes in. Within the variable scope named "child", the train, validation, and test graphs all use the same parameters.
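To make the reuse mechanism concrete: TF1-style variable scopes keep a global store keyed by scope/name, and get_variable returns the already-created variable when reuse is requested instead of allocating a new one. Below is a minimal pure-Python sketch of that idea (the names and the store are illustrative, not the actual TensorFlow or ENAS internals):

```python
# Minimal sketch of TF1-style variable sharing via a name-keyed store.
# When reuse=True, get_variable returns the existing variable object,
# so graphs built under the same scope share one set of parameters.

_variable_store = {}

def get_variable(scope, name, initializer, reuse=False):
    """Create a variable under scope/name, or return the existing one if reuse=True."""
    full_name = f"{scope}/{name}"
    if full_name in _variable_store:
        if not reuse:
            raise ValueError(f"Variable {full_name} already exists; pass reuse=True")
        return _variable_store[full_name]
    var = initializer()          # only runs on first creation
    _variable_store[full_name] = var
    return var

# Building the "train" graph creates the weight.
w_train = get_variable("child", "conv_w", lambda: [0.1, 0.2])
# Building the "valid"/"test" graphs with reuse=True returns the same object;
# the second initializer is never called.
w_valid = get_variable("child", "conv_w", lambda: [9.9], reuse=True)
assert w_train is w_valid  # weight sharing: one set of parameters for all graphs
```

This is why there is no explicit save/load step in the official code: all child models are subgraphs built inside the same "child" scope, so they read and update the same variables in memory.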
I have read the ENAS paper, which says weight sharing is what makes the search efficient. Can anyone tell me which part of the code realizes the weight sharing?