- [X] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
```python
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForQuestionAnswering, ORTOptimizer

# Load a model from transformers and export it to ONNX
ort_model = ORTModelForQuestionAnswering.from_pretrained(MODEL_PATH, from_transformers=True)
tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)

ONNX_PATH = f"{MODEL_PATH}/onnx"

# Save the ONNX model and tokenizer
ort_model.save_pretrained(ONNX_PATH)
tokenizer.save_pretrained(ONNX_PATH)

optimizer = ORTOptimizer.from_pretrained(ONNX_PATH)  # <- This fails
```
Expected behavior
Loading the ORT model for the optimizer with `ORTOptimizer.from_pretrained(ONNX_PATH)` fails with:

```
ValueError: Unrecognized model in ... Should have a model_type key in its config.json, or contain one of the following strings in its name: ...
```

The `config.json` generated when exporting to ONNX doesn't have a `model_type` key. Unlike the `config.json` of the 🤗 Transformers model, which has `"model_type": "xlm-roberta"`, it only seems to contain a `"name": "XLMRoberta"` key. But the optimizer and quantizer classes expect a `model_type` key when loading the `ort_model`.

I was expecting the `.save_pretrained` method of an `ORTModel` class to also save the `model_type` key from the 🤗 Transformers model's `config.json`.
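Until this is fixed, a possible manual workaround (this is my own sketch, not an official `optimum` API) is to patch the exported `config.json` by copying the missing `model_type` value over from the source 🤗 Transformers config. The temporary directory and the simulated exported config below are stand-ins for the `ONNX_PATH` directory from the repro above:

```python
import json
import os
import tempfile

# Stand-in for the ONNX_PATH export directory from the repro above.
onnx_dir = tempfile.mkdtemp()
config_path = os.path.join(onnx_dir, "config.json")

# Simulate the config.json the ONNX export produces: it has a "name"
# key but no "model_type" key.
with open(config_path, "w") as f:
    json.dump({"name": "XLMRoberta"}, f)

# Workaround: add the "model_type" key, taking the value from the
# source 🤗 Transformers model's config ("xlm-roberta" in my case).
with open(config_path) as f:
    config = json.load(f)

config.setdefault("model_type", "xlm-roberta")

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)
```

After patching, `ORTOptimizer.from_pretrained(ONNX_PATH)` should be able to resolve the architecture, but the proper fix would be for `ORTModel.save_pretrained` to write the key in the first place.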
System Info
Who can help?
@lewtun @michaelbenayoun