Hi, we are working on neural architecture search and are very interested in ENAS. We have downloaded your code and run it, and we are now trying to convert it to a C++ version. Do you have any suggestions for us, considering that the C++ version is more complicated?
In addition, as far as we know, the TensorFlow C++ API does not include the Optimizer class, and we found an MNIST tutorial that applies gradients to each trainable variable manually: https://github.com/rockzhuang/tensorflow/blob/master/tensorflow/examples/cc/cnn/mnist/mnist.cc. However, in ENAS the nodes that are used are not fixed; they are chosen by the controller every epoch, so the set of variables that gradients must be applied to changes from step to step. How can we implement the backward pass correctly in this setting?
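To make the question concrete, here is a rough sketch of the manual-gradient pattern we mean, written against the TF C++ API as we understand it and loosely following that tutorial (the toy linear model and names such as `train_w` are only illustrative, not taken from your code):

```cpp
#include "tensorflow/cc/client/client_session.h"
#include "tensorflow/cc/framework/gradients.h"
#include "tensorflow/cc/ops/standard_ops.h"

using namespace tensorflow;
using namespace tensorflow::ops;

int main() {
  Scope root = Scope::NewRootScope();

  // A toy model: y = x * w, with a single trainable variable w.
  auto x = Placeholder(root, DT_FLOAT);
  auto w = Variable(root, {2, 1}, DT_FLOAT);
  auto assign_w = Assign(root, w, RandomNormal(root, {2, 1}, DT_FLOAT));
  auto y = MatMul(root, x, w);
  auto loss = ReduceMean(root, Square(root, y), {0, 1});

  // With no Optimizer class in the C++ API, gradients are built
  // symbolically and applied to each variable by hand.
  std::vector<Output> grads;
  TF_CHECK_OK(AddSymbolicGradients(root, {loss}, {w}, &grads));
  auto train_w = ApplyGradientDescent(root, w, Cast(root, 0.01, DT_FLOAT),
                                      grads[0]);

  ClientSession session(root);
  TF_CHECK_OK(session.Run({assign_w}, nullptr));  // initialize w

  // One training step: feed a batch for x, fetch the loss and the update op.
  Tensor x_batch(DT_FLOAT, TensorShape({4, 2}));
  x_batch.flat<float>().setRandom();
  std::vector<Tensor> outputs;
  TF_CHECK_OK(session.Run({{x, x_batch}}, {loss, train_w}, &outputs));
  return 0;
}
```

This is straightforward when the trainable variables are fixed, but in ENAS the controller changes which operations (and hence which variables) are active each epoch, and we are unsure how to handle that with this pattern.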
Looking forward to your reply. Thank you very much!