Status: Open. edge7 opened this issue 1 year ago.
Please note the above error only appears if the output is set to a graph model. If I remove the following option:
--output_format=tfjs_graph_model
it works; however, a graph model is preferred, since it is optimized for inference.
Hi,
allow me to link this one to that one, as they are strictly related to each other.
Hi @edge7, this is a known TensorFlow error, tracked at https://github.com/tensorflow/tensorflow/issues/31668. The fix seems to be to set trainable to False on the batch normalization layers before you export to SavedModel. Can you give it a try? Thanks.
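A minimal sketch of that suggestion, assuming a Keras model (the toy architecture below is illustrative, not the reporter's pix2pix model):

```python
import tensorflow as tf

# Toy stand-in model; substitute your own trained model here.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

# Freeze every BatchNormalization layer before exporting, as suggested
# in the linked TensorFlow issue. A frozen BN layer runs in inference
# mode (using its moving mean/variance) when the model is exported.
for layer in model.layers:
    if isinstance(layer, tf.keras.layers.BatchNormalization):
        layer.trainable = False

# ...then export as before, e.g.:
# tf.saved_model.save(model, "pix_pix_256_16_frozen")
```

After exporting the frozen model, the same tensorflowjs_converter command should be re-run on the new SavedModel directory.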
Yes, that would work, but then I'd need to call the model with training=True, which significantly lowers performance.
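For context, the training call-time flag and the trainable attribute interact here: Keras documents a special case where a BatchNormalization layer with trainable=False runs in inference mode (moving statistics) even when called with training=True. A minimal sketch of the difference (layer and array names are illustrative):

```python
import numpy as np
import tensorflow as tf

bn = tf.keras.layers.BatchNormalization()
x = np.random.randn(32, 3).astype("float32")

# training=True normalizes with the current batch's statistics;
# training=False uses the layer's moving mean/variance instead.
y_train = bn(x, training=True)
y_infer = bn(x, training=False)

# Documented Keras special case: once trainable=False, BatchNormalization
# behaves as in inference mode even if called with training=True.
bn.trainable = False
y_frozen = bn(x, training=True)
```

This is why freezing the layers changes the exported graph: the batch-dependent code path is no longer baked in.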
System information
Linux ed7-laptop 5.14.0-1027-oem #30-Ubuntu SMP Mon Mar 7 15:00:29 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Describe the current behavior
I am trying to convert a tf_saved_model like this:
tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --signature_name=serving_default --saved_model_tags=serve pix_pix_256_16 pix_pix_256_16_web
Describe the expected behavior
It gets converted.

Standalone code to reproduce the issue
Run the above command using this model.

Other info / logs
Please note the following is the model summary; as you can see, I have some batch_normalization and dropout layers (both called with training=True):