suriyadeepan opened 4 years ago
# Bounding-box regression head on a frozen MobileNetV2 backbone
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import Dense, Dropout, Flatten, Input

basemodel = MobileNetV2(weights="imagenet", include_top=False,
                        input_tensor=Input(shape=(224, 224, 3)))
basemodel.trainable = False  # freeze the backbone

# Regression head: four sigmoid outputs give box coordinates in [0, 1]
flatten = basemodel.output
bboxHead = Flatten(name="flatten")(flatten)
bboxHead = Dense(128, activation="relu")(bboxHead)
bboxHead = Dropout(0.1)(bboxHead)
bboxHead = Dense(4, activation="sigmoid")(bboxHead)
Instead of the softmax used in create_model(), I use a sigmoid so the outputs land in [0, 1]. However, when I call quantize_model to quantize the model, it raises the following error:
ValueError: Only some Keras activations under `tf.keras.activations` are supported. For other activations, use `Quantizer` directly, and update layer config using `QuantizeConfig`.
I cannot figure out how to resolve this at all. I also tried quantize_annotate_layer to quantize only the Dense layers, but that failed as well.
Any ideas?
Thanks
I've resolved this issue by modifying the model_optimization library.
Let's see if this shows up!