Open abbyDC opened 2 years ago
Hi @abbyDC, can you take a look at the workaround proposed in this link and see if it helps resolve your issue? You can also refer to the TFMA Evaluator. Hope this helps. Thanks!
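For reference, wiring the Evaluator with an `EvalConfig` looks roughly like this (a minimal sketch; `example_gen`, `trainer`, and the `"label"` key are placeholders, not taken from this issue):

```python
import tensorflow_model_analysis as tfma
from tfx.components import Evaluator

# Minimal sketch: example_gen and trainer are assumed to be components
# defined earlier in the pipeline; "label" is a placeholder label key.
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key="label")],
    metrics_specs=[
        tfma.MetricsSpec(
            metrics=[tfma.MetricConfig(class_name="ExampleCount")])
    ],
    slicing_specs=[tfma.SlicingSpec()],
)

evaluator = Evaluator(
    examples=example_gen.outputs["examples"],
    model=trainer.outputs["model"],
    eval_config=eval_config,
)
```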
Hi @pindinagesh! The link you attached doesn't show anything on my end when I click on it. May I ask for a working link so I can take a look at it? Thanks! :)
Sorry for the inconvenience; I have updated it again. Could you please check it?
Hi, yup, the link works now. I'll take a look at the post first to check which of the workarounds I have already tried.
Hi @abbyDC
Could you please tell us the status of this issue?
Hello! Upon further investigation and experimentation, the problem still looks the same for me. Here is what I've tried, similar to the issue above:
```python
import tensorflow as tf


def _get_tf_examples_serving_signature(model, tf_transform_output):
  """Returns a serving signature that accepts `tensorflow.Example`."""

  @tf.function(input_signature=[
      tf.TensorSpec(shape=[None], dtype=tf.string, name="examples")])
  def serve_tf_examples_fn(serialized_tf_example):
    """Returns the output to be used in the serving signature."""
    transformed_specs = tf_transform_output.transformed_feature_spec()
    transformed_features = tf.io.parse_example(serialized_tf_example,
                                               transformed_specs)
    # Densify the sparse features produced by parsing before feeding the model.
    transformed_features["audio"] = tf.sparse.to_dense(
        transformed_features["audio"])
    transformed_features["target_phones"] = tf.sparse.to_dense(
        transformed_features["target_phones"])
    audio = transformed_features["audio"]
    labels = transformed_features["target_phones"]
    outputs = model((audio, labels))
    return outputs

  return serve_tf_examples_fn


# `default_signature` is defined elsewhere in the module.
signatures = {
    "serving_default": default_signature,
    "serving_evaluator": _get_tf_examples_serving_signature(
        model, tf_transform_output),
}
```
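These signatures would then presumably be attached at export time, along these lines (`serving_model_dir` is a placeholder for the Trainer's output path):

```python
# Hypothetical export; serving_model_dir would come from FnArgs in a
# typical TFX run_fn.
model.save(serving_model_dir, save_format="tf", signatures=signatures)
```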
System information
You can obtain the TensorFlow Model Analysis version with:

```shell
python -c "import tensorflow_model_analysis as tfma; print(tfma.version.VERSION)"
```
Describe the problem
I have a custom layer named MultiHeadAttention, and when I run the TFX pipeline it shows a warning that the name conflicts with the default MultiHeadAttention layer and that I should rename the layer to something else. When I rename it to CustomMultiHeadAttention, the TFX pipeline suddenly breaks, specifically in the Evaluator component. When I don't change anything else in the code except reverting the name back to MultiHeadAttention, the Evaluator component runs okay, but then I have problems when exporting, saving, and loading the model. What is the cause of this, or is it a bug in TFMA/TFX?
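For illustration, here is a minimal sketch of how such a renamed layer might be registered so Keras can deserialize it when the Evaluator reloads the SavedModel (the wrapper design and the `package` name are assumptions, not the actual layer from this pipeline):

```python
import tensorflow as tf


# Hypothetical sketch: registering the renamed layer under an explicit package
# avoids the name collision with the built-in tf.keras.layers.MultiHeadAttention
# and lets Keras deserialize it when the SavedModel is reloaded.
@tf.keras.utils.register_keras_serializable(package="MyModel")
class CustomMultiHeadAttention(tf.keras.layers.Layer):
  """Thin wrapper around the built-in attention layer (assumed design)."""

  def __init__(self, num_heads, key_dim, **kwargs):
    super().__init__(**kwargs)
    self.num_heads = num_heads
    self.key_dim = key_dim
    self.attention = tf.keras.layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=key_dim)

  def call(self, query, value):
    return self.attention(query, value)

  def get_config(self):
    # Record constructor arguments so the layer can be rebuilt on load.
    config = super().get_config()
    config.update({"num_heads": self.num_heads, "key_dim": self.key_dim})
    return config
```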
Source code / logs
Attachments in the original issue (images):
- Error when changing the custom layer name from MultiHeadAttention to CustomMultiHeadAttention
- eval_config.py
- code snippet for the Evaluator component in the TFX pipeline
- multi-head attention layer declaration snippet