This patch fixes an FPGA layer-conversion bug: previously, a pool layer could be merged with the preceding convolution layer even when that convolution layer was not the pool layer's inbound layer.
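A minimal sketch of the corrected merge guard; the helper names (`should_merge`, `inbound_layer_of`) are hypothetical stand-ins for the converter's internals, not the actual patch code:

```python
def should_merge(pool_layer, conv_layer, inbound_layer_of):
    # Merge only when the convolution layer actually feeds the pool layer,
    # not merely when it appears earlier in the layer list.
    return inbound_layer_of(pool_layer) is conv_layer
```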
This patch also fixes ZeroPadding parsing for Keras models: a ZeroPadding layer now stores its padding info in a pad_map dict, and any node that uses that layer as its inbound layer reads the padding info from pad_map. Sequential models behave as before.
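A rough illustration of the pad_map idea; `pad_map` itself comes from the patch, but the helper names and the `model.get_config()`-based lookup are assumptions, and the exact `inbound_nodes` format depends on the Keras version:

```python
def build_pad_map(model_config):
    """Record each ZeroPadding layer's padding, keyed by layer name."""
    pad_map = {}
    for layer in model_config['layers']:
        if layer['class_name'].startswith('ZeroPadding'):
            pad_map[layer['name']] = layer['config']['padding']
    return pad_map

def padding_for(layer_entry, pad_map):
    """Return padding info if one of this layer's inbound layers is a ZeroPadding layer."""
    for node in layer_entry.get('inbound_nodes', []):
        for inbound in node:  # inbound is [layer_name, node_index, tensor_index, kwargs]
            if inbound[0] in pad_map:
                return pad_map[inbound[0]]
    return None  # no ZeroPadding feeding this layer; Sequential path is unchanged
```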