dvorotnev closed this issue 3 years ago
These seem to be correct conversions to me. I guess you expected those ops to be mapped 1-to-1, but the problem is that by the time they are saved to the TF protobuf they are already separated into parts, so the converter converts those parts. Furthermore, in the case of batch_norm, constant folding merges the constants.
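For illustration, a quick check along these lines (assuming TF 1.x graph mode; the shapes and constant values are arbitrary) shows that only primitive ops ever reach the GraphDef:

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[1, 8, 8, 16], name='input')
mean = tf.constant(0.0, shape=[16])
variance = tf.constant(1.0, shape=[16])
offset = tf.constant(0.0, shape=[16])
scale = tf.constant(1.0, shape=[16])

y = tf.nn.batch_normalization(x, mean, variance, offset, scale, variance_epsilon=1e-3)
z = tf.clip_by_value(y, 0.0, 6.0)

# No single batch_normalization or clip_by_value node is recorded; the graph
# holds primitive ops such as Rsqrt/Mul/Sub/Add and Minimum/Maximum instead.
print(sorted({op.type for op in tf.get_default_graph().get_operations()}))
```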
Okay, thanks for the answer!
I am trying to save a neural network from TF and convert it to NNEF using the nnef-tools converter, but the layers tf.nn.batch_normalization and tf.clip_by_value are not converted according to operation_mapping.md. I am using these commands to convert the network:
I also tried adding the --optimize flag, but the result didn't change. Below are two simple examples to reproduce these conversion bugs:

tf.nn.batch_normalization

A simple python example:
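A minimal sketch of such a script (assuming TF 1.x graph mode; the input shape, parameter values, and output file name are illustrative placeholders):

```python
import tensorflow as tf

# A tiny graph containing a single batch_normalization layer.
x = tf.placeholder(tf.float32, shape=[1, 8, 8, 16], name='input')
mean = tf.constant(0.5, shape=[16])
variance = tf.constant(2.0, shape=[16])
offset = tf.constant(0.1, shape=[16])
scale = tf.constant(1.5, shape=[16])
y = tf.nn.batch_normalization(x, mean, variance, offset, scale,
                              variance_epsilon=1e-3, name='output')

# Serialize the GraphDef so it can be passed to the nnef-tools converter.
tf.train.write_graph(tf.get_default_graph().as_graph_def(), '.',
                     'batch_norm.pb', as_text=False)
```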
A conversion result:

tf.clip_by_value

A simple python example:
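Again a minimal sketch (TF 1.x graph mode; the shape, clip bounds, and file name are illustrative):

```python
import tensorflow as tf

# A tiny graph containing a single clip_by_value layer.
x = tf.placeholder(tf.float32, shape=[1, 8, 8, 16], name='input')
y = tf.clip_by_value(x, 0.0, 6.0, name='output')

# Serialize the GraphDef so it can be passed to the nnef-tools converter.
tf.train.write_graph(tf.get_default_graph().as_graph_def(), '.',
                     'clip.pb', as_text=False)
```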
A conversion result: