tensorflow / tensorflow

An Open Source Machine Learning Framework for Everyone
https://tensorflow.org
Apache License 2.0

support SyncBatchNormalization gradient in loaded saved_model #50005

Closed LukeWood closed 3 years ago

LukeWood commented 3 years ago

System information

Describe the feature and the current behavior/state. Loading the saved_model included with the SimCLR repository using tf.saved_model.load yields the error message:

W0602 17:01:50.030123 3770209 function_deserialization.py:573] Importing a function (__inference_sync_batch_normalization_2_layer_call_
and_return_conditional_losses_29971) with ops with unsaved custom gradients. Will likely fail if a gradient is requested.

This is caused by tf.keras.layers.experimental.SyncBatchNormalization.

Will this change the current API? How? It won't.

Who will benefit from this feature? Anyone attempting to use tf.keras.layers.experimental.SyncBatchNormalization from a saved_model.

Any other info. I'm interested in contributing this, as it would help me with some research I am performing. I am implementing functions that measure the robustness of a given saved_model for Neural Structured Learning. To do this, I'd like to perform Projected Gradient Descent on a given SavedModel, which requires the gradient to be available after loading.
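The Projected Gradient Descent use case can be sketched as follows. This is a hypothetical minimal example (the toy model, paths, shapes, and epsilon are all assumptions, not from the issue); a plain `tf.Module` stands in for the SimCLR SavedModel, since the point is only that `tf.saved_model.load` must expose gradients through the restored graph:

```python
import tensorflow as tf

# Toy stand-in for the SimCLR SavedModel (assumed shapes: 8 -> 2).
class Toy(tf.Module):
    def __init__(self):
        self.w = tf.Variable(tf.random.normal([8, 2]))

    @tf.function(input_signature=[tf.TensorSpec([None, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)

tf.saved_model.save(Toy(), "/tmp/toy_saved_model")
loaded = tf.saved_model.load("/tmp/toy_saved_model")

# One PGD step: perturb the input along the sign of the loss gradient,
# then project back into an epsilon-ball around the original input.
x = tf.random.normal([1, 8])
eps = 0.1
with tf.GradientTape() as tape:
    tape.watch(x)
    loss = tf.reduce_sum(loaded(x))
# For a graph containing SyncBatchNormalization, this gradient request is
# what fails, because the layer's custom gradient is not serialized.
grad = tape.gradient(loss, x)
x_adv = tf.clip_by_value(x + eps * tf.sign(grad), x - eps, x + eps)
```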

jvishnuvardhan commented 3 years ago

@LukeWood Can you please share a standalone code to demonstrate what is not possible with the current saved_model?

Generally, when you use Keras layers, it is better to save the model with tf.keras.models.save_model and load it with tf.keras.models.load_model. Did you try saving it as a Keras model and loading it that way? Thanks!
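The suggested Keras-native round trip might look like the sketch below (paths and model architecture are assumptions). For this issue the normalization layer would be tf.keras.layers.experimental.SyncBatchNormalization (newer releases expose the same behaviour as `tf.keras.layers.BatchNormalization(synchronized=True)`); plain `BatchNormalization` is used here only so the sketch runs on any TF 2.x version:

```python
import tensorflow as tf

# A small model with a batch-normalization layer. Substitute
# SyncBatchNormalization here to reproduce the issue's setup.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(2),
])

# Keras-native save/load: the restored object is a full Keras model,
# so layer implementations (and their gradients) are reconstructed
# from the layer classes rather than from serialized graph functions.
model.save("/tmp/bn_model.h5")
restored = tf.keras.models.load_model("/tmp/bn_model.h5")

x = tf.random.normal([1, 8])
with tf.GradientTape() as tape:
    tape.watch(x)
    loss = tf.reduce_sum(restored(x, training=False))
grad = tape.gradient(loss, x)  # available, unlike the tf.saved_model.load path
```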

google-ml-butler[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you.