tensorflow / model-optimization

A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
https://www.tensorflow.org/model_optimization
Apache License 2.0

Support Quantizing a tf.keras Model inside another tf.keras Model #957

Open plooney opened 2 years ago

plooney commented 2 years ago

Right now, if I do something like:

Quantize functional model:

```python
import tensorflow as tf
# Assuming quantize_model comes from tensorflow-model-optimization
from tensorflow_model_optimization.quantization.keras import quantize_model

inputs = tf.keras.Input((3,))
out = tf.keras.layers.Dense(2)(inputs)

# Build a Sequential submodel and nest it inside a functional model.
seq = tf.keras.Sequential()
seq.add(tf.keras.layers.Dense(2))
model = tf.keras.Model(inputs, seq(inputs))

quantized_model = quantize_model(model)
```

I get:

```
Quantizing a tf.keras Model inside another tf.keras Model is not supported.
```
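
For comparison (not part of the original report), here is a minimal sketch of the same layers built as a single flat functional model with no nested `tf.keras.Model`; the flattened structure is my assumption about an equivalent model, while `quantize_model` itself is the standard tfmot API:

```python
import tensorflow as tf
from tensorflow_model_optimization.quantization.keras import quantize_model

# Equivalent model with the Sequential submodel's Dense layer applied
# directly in the outer functional model, so nothing is nested.
inputs = tf.keras.Input((3,))
outputs = tf.keras.layers.Dense(2)(inputs)
flat_model = tf.keras.Model(inputs, outputs)

# A flat functional model of built-in layers is a supported case,
# so the error above appears specific to the nesting.
quantized_model = quantize_model(flat_model)
```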