Janus289 opened this issue 1 year ago
Thanks for the report. Apparently the pooler_output output has an incorrect shape in the converted model. This is likely a bug in the exporter. As a workaround, you can try removing this output from the model.
Something like this (not tested, so it may have bugs):

```python
# Remove the last output (pooler_output) from the model spec, then re-save.
del mlmodel._spec.description.output[-1]
mlmodel.save(...)
```
Hi,
I tried that workaround and am now getting the error: RuntimeWarning: You will not be able to run predict() on this Core ML model. Underlying exception message was: Error compiling model: "compiler error: Encountered an error while compiling a neural network model: validator error: Model and main function must have same number of outputs."
Ah, too bad. One thing that might work is to create your own CoreMLConfig object and override the function that creates the output definitions to remove the pooler_output, and then use that config to do the conversion.
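A minimal sketch of that idea. Note that the class names and the shape of the `outputs` property below are illustrative stand-ins, not the real exporters API; check the actual CoreMLConfig source for the method that builds the output definitions before subclassing:

```python
from collections import OrderedDict

# Hypothetical stand-in for the exporters CoreMLConfig base class.
class BaseCoreMLConfig:
    @property
    def outputs(self):
        # The real config builds this from the model's task/architecture.
        return OrderedDict(
            last_hidden_state={"description": "Sequence of hidden states"},
            pooler_output={"description": "Pooled CLS embedding"},
        )

class NoPoolerCoreMLConfig(BaseCoreMLConfig):
    @property
    def outputs(self):
        # Start from the parent's output definitions and drop the broken one.
        outs = super().outputs
        outs.pop("pooler_output", None)
        return outs

print(list(NoPoolerCoreMLConfig().outputs))  # ['last_hidden_state']
```

Passing a config like this to the conversion call should produce a model that never declares pooler_output, avoiding the description/function output mismatch the compiler complains about.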
When trying to export the Hugging Face models Deeppavlov/rubert-base-cased and ckiplab/bert-base-chinese-ner using the command line, it fails with the output
It runs correctly with --model=distilbert-base-uncased. Using Python 3.9.13, coremltools 6.1, torch 1.12.1.
A .mlpackage file is created, but I can't use it because I can't call predict() on it.