Closed sidharths closed 2 years ago
@sidharths,
Please create a separate signature function for just the preprocessing, then update the EvalConfig to specify that function under preprocessing_function_names, leaving the main signature function for inference, as mentioned in this issue. Kindly let us know if it works.
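For reference, the `preprocessing_function_names` field lives on `model_specs` in the TFMA EvalConfig. A minimal sketch of what such an entry might look like (the signature names `serving_default` and `transform_features` and the label key `mask` are placeholders, not taken from this issue):

```proto
model_specs {
  # Main signature used for inference (assumed name).
  signature_name: "serving_default"
  # Separate signature that applies only the Transform preprocessing;
  # "transform_features" is a placeholder name.
  preprocessing_function_names: ["transform_features"]
  # Assumed label key for a segmentation mask.
  label_key: "mask"
}
```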
Closing this due to inactivity. Please take a look at the answers provided above; feel free to reopen and post your comments if you still have queries on this. Thank you!
version info
I am building a TFX pipeline for an image segmentation model. In the Transform component I save the image and mask with int64 dtype. Thereafter the Trainer works fine, and I save the model in SavedModel format with the following signature.
The Evaluator component fails at runtime with an error saying it received an int32 tensor where it expected an int64 tensor.
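This kind of mismatch typically appears when a downstream component re-parses the examples and ends up with a different default integer dtype than the one the signature declares. A minimal sketch of the failing check and the explicit cast that resolves it, using NumPy as a stand-in (this is not the actual TFMA internals):

```python
import numpy as np

# Dtype declared by the model signature (assumed int64, as in this issue).
expected = np.dtype(np.int64)

# A mask tensor that arrives as int32, e.g. re-parsed with a default dtype.
mask = np.array([[0, 1], [1, 0]], dtype=np.int32)
print(mask.dtype == expected)  # False: the mismatch the Evaluator reports

# Explicitly casting to the signature's dtype satisfies the check.
mask64 = mask.astype(np.int64)
print(mask64.dtype == expected)  # True
```

In a TF preprocessing signature the equivalent fix would be an explicit `tf.cast(..., tf.int64)` before returning the features.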
Running context.run(evaluator) in the notebook gives the following traceback. I can see the error originates from TFMA inside the Evaluator, but I am not sure why there is a type mismatch. I have verified that the output from the Transform component is indeed int64, and training runs as expected and gives good results under that assumption.