onnx / onnxmltools

ONNXMLTools enables conversion of models to ONNX
https://onnx.ai
Apache License 2.0

Unsupported shape calculation for operator textClassifier, CoreML -> ONNX #378

Open Siberi0 opened 4 years ago

Siberi0 commented 4 years ago

Hi all, I am totally new to Machine Learning. I have started working on a project in which I've been asked to add some ML features. The first and simplest task I've completed is text recognition and classification. I followed the tutorial linked below and ran and tested the app successfully. I then applied the little knowledge I'd gained to our project. The feature is the same; the only difference is the articles used to train the model. Again, everything ran smoothly on iOS. A colleague needs to do the same on Android, and since I had already trained a model, we decided to try to reuse it there. So here I am, trying to convert a .mlmodel to something Android-compatible (probably TensorFlow Lite) using onnxmltools. Since I am a newbie in ML and Python, please excuse me if I say something wrong.

Problem description

As the title states, the error `Unsupported shape calculation for operator textClassifier` is printed to the console when trying to convert a .mlmodel to .onnx.

Trace

```
Traceback (most recent call last):
  File "convertML.py", line 14, in <module>
    onnx_model = onnxmltools.convert_coreml(coreml_model)
  File "/usr/local/lib/python3.7/site-packages/onnxmltools/convert/main.py", line 20, in convert_coreml
    custom_conversion_functions, custom_shape_calculators)
  File "/usr/local/lib/python3.7/site-packages/onnxmltools/convert/coreml/convert.py", line 60, in convert
    topology = parse_coreml(spec, initial_types, target_opset, custom_conversion_functions, custom_shape_calculators)
  File "/usr/local/lib/python3.7/site-packages/onnxmltools/convert/coreml/_parse.py", line 467, in parse_coreml
    topology.compile()
  File "/usr/local/lib/python3.7/site-packages/onnxconverter_common/topology.py", line 680, in compile
    self._infer_all_types()
  File "/usr/local/lib/python3.7/site-packages/onnxconverter_common/topology.py", line 556, in _infer_all_types
    operator.infer_types()
  File "/usr/local/lib/python3.7/site-packages/onnxconverter_common/topology.py", line 108, in infer_types
    registration.get_shape_calculator(self.type)(self)
  File "/usr/local/lib/python3.7/site-packages/onnxconverter_common/registration.py", line 67, in get_shape_calculator
    raise ValueError('Unsupported shape calculation for operator %s' % operator_name)
ValueError: Unsupported shape calculation for operator textClassifier
```

To Reproduce

```python
import onnxmltools
import coremltools

# Update the input name and path for your CoreML model
input_coreml_model = 'NewsClassifier.mlmodel'

# Change this path to the output name and path for the ONNX model
output_onnx_model = 'NewsClassifier.onnx'

# Load your CoreML model
coreml_model = coremltools.utils.load_spec(input_coreml_model)

# Convert the CoreML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model)

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, output_onnx_model)
```

NewsClassifier.mlmodel.zip

I have attached the CoreML model, created by following this tutorial: https://heartbeat.fritz.ai/text-classification-on-ios-using-create-ml-f71d7191404a

System environment:

Additional context

Again, I am a newcomer to ML, so please excuse my lack of knowledge on the subject.

jiafatom commented 4 years ago

textClassifier in your model is a custom op, so you need to specify it explicitly via the custom_conversion_functions parameter when calling convert_coreml.
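In practice that means passing convert_coreml a dict that maps the operator name from the error to your own callables, along with a matching entry in custom_shape_calculators (the traceback fails in get_shape_calculator, so both hooks are involved). A minimal sketch, not a working converter: the function names and bodies are placeholders, the dict key is taken from the error message, the shape-calculator signature is inferred from the traceback, and the (scope, operator, container) converter signature is an assumption based on how onnxconverter-common converters are typically written.

```python
import coremltools
import onnxmltools

# Load the CoreML spec as in the original script
coreml_model = coremltools.utils.load_spec('NewsClassifier.mlmodel')

# Placeholder shape calculator: the traceback shows it is called with the
# operator object, so it would declare operator.outputs[...].type here.
def text_classifier_shape_calculator(operator):
    # TODO: declare the output type(s) produced by textClassifier
    pass

# Placeholder converter: (scope, operator, container) is an assumed signature;
# the body would have to emit ONNX nodes equivalent to textClassifier.
def text_classifier_converter(scope, operator, container):
    # TODO: add equivalent ONNX nodes, e.g. via container.add_node(...)
    pass

onnx_model = onnxmltools.convert_coreml(
    coreml_model,
    custom_shape_calculators={'textClassifier': text_classifier_shape_calculator},
    custom_conversion_functions={'textClassifier': text_classifier_converter},
)
onnxmltools.utils.save_model(onnx_model, 'NewsClassifier.onnx')
```

This only wires in the hooks; the two callables still have to be implemented, and that is the hard part for an opaque Create ML text classifier, since there is no ready-made ONNX equivalent of that layer.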

Siberi0 commented 4 years ago

> textClassifier in your model is a custom op, so you need to specify it explicitly via the custom_conversion_functions parameter when calling convert_coreml.

Hello and many thanks for your answer. I have searched but found no documentation on how to properly call that function. Could you please point me in the right direction?

okpatil4u commented 1 year ago

Hello there, any update on this issue?