hubert0527 opened this issue 6 years ago
Hi,
Thanks for the great work and for sharing the implementation with the community! I recently noticed that this part of the code looks a bit odd.
During micro-search for CIFAR-10, you first determine which previous layers are NOT picked during the current `_enas_layer` construction: https://github.com/melodyguan/enas/blob/d1a90ac915301198f2a30ce136e9040e6c4235ff/src/cifar10/micro_child.py#L660
Then you count how many layers are NOT picked and save that value to `num_outs`: https://github.com/melodyguan/enas/blob/d1a90ac915301198f2a30ce136e9040e6c4235ff/src/cifar10/micro_child.py#L663
The naming of `num_outs` itself seems odd, and the value is then used to reshape the output tensor, which looks logically questionable: https://github.com/melodyguan/enas/blob/d1a90ac915301198f2a30ce136e9040e6c4235ff/src/cifar10/micro_child.py#L674
Could you clarify why `num_outs` is the number of previous layers that are NOT picked? My sincere apologies if I have misunderstood anything.
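To make the question concrete, here is my understanding of the pattern as a minimal NumPy sketch (not the actual TensorFlow code; the shapes, the example `picked_indices`, and variable names other than `num_outs` are my assumptions). It counts the previous layers that no node picked and uses that count to explain the reshape of the concatenated output:

```python
import numpy as np

# Hypothetical shapes for illustration only
N, C, H, W = 2, 4, 8, 8
layers = [np.random.rand(N, C, H, W) for _ in range(5)]

# Mark which previous layers were picked as inputs by some node
# (picked_indices is a made-up example, not from the repo)
picked_indices = [0, 1, 0, 2]
used = np.zeros(len(layers), dtype=bool)
for i in picked_indices:
    used[i] = True

# num_outs = number of previous layers that were NOT picked
num_outs = int((~used).sum())

# The cell output appears to be the concatenation of those unused
# ("loose end") layers, which would explain why num_outs shows up
# in the reshape of the output tensor
out = np.concatenate([l for l, u in zip(layers, used) if not u], axis=1)
assert out.shape == (N, num_outs * C, H, W)
```

Under this reading, `num_outs` would be the number of tensors being concatenated, so the channel dimension of the output is `num_outs * C` — but I may be missing something.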
Thanks!