Closed: harishankar-gopalan closed this issue 1 year ago
For tfjs, the unsupported ops may actually be part of the training half of the model and not be needed for inference. We were able to convert the model with --skip_op_check and then perform inference using the resulting model.
tensorflowjs_converter --skip_op_check --input_format=tf_saved_model models/saved_model models/model-tfjs
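If you need to script this conversion, the invocation above can be built and run from Python. This is a minimal sketch, not part of the thread: the helper name is made up, the paths are the ones from the command above, and actually running the converter assumes the tensorflowjs package is installed.

```python
import subprocess


def tfjs_convert_cmd(saved_model_dir, out_dir, skip_op_check=True):
    """Build the tensorflowjs_converter invocation from the comment above.

    --skip_op_check bypasses the unsupported-op check, which helps when
    the offending ops belong to the training half of the graph and are
    not needed at inference time.
    """
    cmd = ["tensorflowjs_converter"]
    if skip_op_check:
        cmd.append("--skip_op_check")
    cmd += ["--input_format=tf_saved_model", saved_model_dir, out_dir]
    return cmd


# Requires the tensorflowjs package; uncomment to actually convert:
# subprocess.run(tfjs_convert_cmd("models/saved_model", "models/model-tfjs"),
#                check=True)
```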
Hi @NilSet, thanks for the information. I will check this and get back.
Hi @NilSet, I can confirm that I was able to convert the model using the --skip_op_check option and also load the model using loadGraphModel from the TFJS SDK. Thanks for the help.
@harishankar-gopalan could you share the ONNX model?
Has anyone been able to convert the default model to either ONNX or TFJS? I have tried both, but each attempt fails with unsupported operators. I am attaching the output I get for each of them.
Command line for TFJS:
tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model <path-to-default-saved-model> <path-to-new-folder>
Output for the above:
Command line for ONNX:
python -m tf2onnx.convert --saved-model <path-to-model-directory> --output guesslang.onnx --opset 11 --verbose
Output for the above:
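For completeness, the tf2onnx invocation above can be wrapped the same way. Again a sketch only: the helper name is hypothetical, the placeholder path is left as in the command above, and running it assumes tf2onnx is installed.

```python
import subprocess
import sys


def tf2onnx_cmd(model_dir, output="guesslang.onnx", opset=11):
    """Build the tf2onnx.convert invocation from the command above."""
    return [sys.executable, "-m", "tf2onnx.convert",
            "--saved-model", model_dir,
            "--output", output,
            "--opset", str(opset),
            "--verbose"]


# Requires the tf2onnx package; uncomment to actually convert:
# subprocess.run(tf2onnx_cmd("<path-to-model-directory>"), check=True)
```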
Environment: