@chaitanya-basava
This is a current limitation in DJL. We filter out String tensors in the output (https://github.com/deepjavalibrary/djl/blob/master/engines/tensorflow/tensorflow-engine/src/main/java/ai/djl/tensorflow/engine/TfSymbolBlock.java#L120). Since we have added String tensor support, we should be able to support String tensor outputs as well. Can you share your model or point to a similar model, so we can add a test for this type of model?
@frankfliu thanks for the response. I won't be able to share the exact model file, so I have created this dummy model, which has similar input and output signatures, except that it returns the query string as corrected_query and score=0.0.
Hope this will be helpful.
Wanted to clarify one more thing: would there be similar behaviour (string outputs getting filtered out) when using other model types like ONNX and PyTorch as well?
@chaitanya-basava
You can try the nightly snapshot build once this PR is merged. The ONNX and PyTorch engines should not have this issue. This TensorFlow behavior is more like a bug (we forgot to remove this limit when we added String tensor support).
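Once the fix is in a snapshot, something along these lines should expose both outputs (a rough sketch only; the exact string-tensor helpers such as NDManager.create(String[]) and NDArray.toStringArray(), and the output order, may differ by DJL version):

```java
import java.nio.file.Paths;

import ai.djl.inference.Predictor;
import ai.djl.ndarray.NDArray;
import ai.djl.ndarray.NDList;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;
import ai.djl.translate.NoopTranslator;

public class SpellCorrectExample {
    public static void main(String[] args) throws Exception {
        Criteria<NDList, NDList> criteria = Criteria.builder()
                .setTypes(NDList.class, NDList.class)
                .optModelPath(Paths.get("/path/to/saved_model")) // placeholder path
                .optEngine("TensorFlow")
                .optTranslator(new NoopTranslator())
                .build();
        try (ZooModel<NDList, NDList> model = criteria.loadModel();
                Predictor<NDList, NDList> predictor = model.newPredictor()) {
            // Assumes NDManager.create(String[]) is available for building a string tensor.
            NDList input = new NDList(
                    model.getNDManager().create(new String[] {"helo wrld"}));
            NDList output = predictor.predict(input);
            // Output order is assumed here; check the names on the returned NDArrays.
            NDArray correctedQuery = output.get(0);
            NDArray score = output.get(1);
            System.out.println(String.join(",", correctedQuery.toStringArray()));
            System.out.println(score.toFloatArray()[0]);
        }
    }
}
```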
Hi, I am testing out a TensorFlow model which takes a string as input and returns the spell-corrected version of this string and its confidence score. The signature_def returned for the model using saved_model_cli is as follows.

But when loading the model in DJL, the loaded model only identifies score as a model output. corrected_query is not returned when checking with model.describeOutput().keys(), and only score is returned by the model prediction call. Any support or documentation around solving this issue would be really helpful.
Thanks