Closed zhiwei-dong closed 1 year ago
Thank you for this great job. I'm trying to add a QAT scheme with `keras_model` in converter.py, but I encounter this exception:
```python
keras_model = keras_builder(model_proto, input_node_names, output_node_names, native_groupconv)

# start qat
#! add by dongz
import keras
import tensorflow as tf
import tensorflow_model_optimization as tfmot

quantize_model = tfmot.quantization.keras.quantize_model(keras_model)
```
```
Error: Shape must be rank 4 but is rank 5 for '{{node depthwise_conv2d/depthwise}} = DepthwiseConv2dNative[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], explicit_paddings=[], padding="SAME", strides=[1, 1, 1, 1]](Placeholder, depthwise_conv2d/depthwise/ReadVariableOp)' with input shapes: [1,1,80,80,24], [3,3,24,1]
```
It seems like the keras builder adds a batch dimension, and the quantizer adds one too. So how can I make the keras builder return a model whose input shape does not include a fixed batch dimension?
Hello, I'm not familiar with `tensorflow_model_optimization`, so I'm afraid I can't help you.
OK, thx
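For readers hitting the same rank-5 error, below is a minimal, untested sketch of one possible workaround: strip the batch axis from the ONNX input shape before creating the Keras `Input`, so that `tfmot.quantization.keras.quantize_model` sees the usual rank-4 `(batch, H, W, C)` signature. The `(1, 80, 80, 24)` shape and the depthwise convolution are taken from the error message above; the layer wiring is a hypothetical stand-in for whatever `keras_builder` actually produces.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Assumed ONNX input shape, taken from the error message (batch included, NHWC).
onnx_input_shape = (1, 80, 80, 24)

# Drop the leading batch axis; Keras adds its own (None) batch dimension.
feature_shape = onnx_input_shape[1:]  # -> (80, 80, 24)

# Hypothetical stand-in for the graph that keras_builder would emit.
inputs = tf.keras.Input(shape=feature_shape)
x = tf.keras.layers.DepthwiseConv2D(kernel_size=3, padding="same")(inputs)
keras_model = tf.keras.Model(inputs, x)

# With a rank-4 input signature, quantize_model no longer produces
# a rank-5 tensor at the depthwise convolution.
qat_model = tfmot.quantization.keras.quantize_model(keras_model)
qat_model.summary()
```

If `keras_builder` bakes the batch dimension into its `Input(shape=...)` call, passing the shape without the batch axis there (or setting `batch_size=1` explicitly) might have the same effect, but that depends on how the rest of the converter consumes the model.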