Describe the bug
The transformers benchmark script calls torch.onnx.export with the keyword argument example_outputs, but that argument was removed from torch.onnx.export (deprecated in PyTorch 1.10, removed in 1.11), so exporting the model fails with a TypeError.
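A minimal standalone script shows the same failure outside the benchmark. This is only an illustration: the tiny Linear model and the output file name are placeholders, and it assumes PyTorch 1.11 or newer.

```python
# Reproduces the error independently of the benchmark.
# Assumes torch >= 1.11, where example_outputs was removed from torch.onnx.export.
import torch

model = torch.nn.Linear(4, 2)
dummy_input = torch.randn(1, 4)

try:
    torch.onnx.export(
        model,
        dummy_input,
        "linear.onnx",
        example_outputs=model(dummy_input),  # only accepted (deprecated) on torch <= 1.10
    )
except TypeError as e:
    print(e)  # -> export() got an unexpected keyword argument 'example_outputs'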
Urgency
Not urgent.
System information
OS Platform and Distribution: Linux Ubuntu 20.04
ONNX Runtime installed from (source or binary): binary
ONNX Runtime version: 1.11.1
Python version: 3.7
CUDA/cuDNN version: 11.6
GPU model and memory:
To Reproduce
Run the transformers benchmark script with the arguments shown under Additional context below.
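The exact command line was not captured; reconstructed from the argument dump below, the invocation was roughly python -m onnxruntime.transformers.benchmark --engines onnxruntime --models bert-base-cased --batch_sizes 1 --sequence_lengths 4 8 16 32 64 128 256 --test_times 100 --num_threads 64 --use_gpu (flag spellings are an assumption based on the Namespace output, not the recorded command).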
Expected behavior
Benchmark runs and reports results
Additional context
Output:
Arguments: Namespace(batch_sizes=[1], cache_dir='./cache_models', detail_csv=None, disable_ort_io_binding=False, engines=['onnxruntime'], force_num_layers=None, fusion_csv=None, input_counts=[1], model_class=None, model_source='pt', models=['bert-base-cased'], num_threads=[64], onnx_dir='./onnx_models', optimizer_info=<OptimizerInfo.BYSCRIPT: 'by_script'>, overwrite=False, precision=<Precision.FLOAT32: 'fp32'>, provider=None, result_csv=None, sequence_lengths=[4, 8, 16, 32, 64, 128, 256], test_times=100, use_gpu=True, validate_onnx=False, verbose=False)
Model class name: AutoModel
Some weights of the model checkpoint at bert-base-cased were not used when initializing BertModel: ['cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.bias']
- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Exporting ONNX model to ./onnx_models/bert_base_cased_1.onnx
Exception
Traceback (most recent call last):
File "/net/storage149/mnt/md0/ccyang/miniconda/envs/mini-nlp/lib/python3.7/site-packages/onnxruntime/transformers/benchmark.py", line 587, in main
args.model_source)
File "/net/storage149/mnt/md0/ccyang/miniconda/envs/mini-nlp/lib/python3.7/site-packages/onnxruntime/transformers/benchmark.py", line 108, in run_onnxruntime
validate_onnx, use_raw_attention_mask, overwrite, model_fusion_statistics)
File "/net/storage149/mnt/md0/ccyang/miniconda/envs/mini-nlp/lib/python3.7/site-packages/onnxruntime/transformers/onnx_exporter.py", line 396, in export_onnx_model_from_pt
use_external_data_format=use_external_data_format)
TypeError: export() got an unexpected keyword argument 'example_outputs'
No any result avaiable.
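The failure comes from onnx_exporter.py still forwarding example_outputs to torch.onnx.export. Below is a minimal sketch of a version-tolerant workaround, not the actual onnxruntime fix; the helper name and keyword handling are illustrative only.

```python
# Illustrative sketch only: pass example_outputs to torch.onnx.export only when
# the installed PyTorch still accepts it (deprecated in 1.10, removed in 1.11).
import inspect
import torch

def export_with_optional_example_outputs(model, dummy_inputs, output_path,
                                         example_outputs=None, **export_kwargs):
    export_params = inspect.signature(torch.onnx.export).parameters
    if example_outputs is not None and "example_outputs" in export_params:
        # Older torch (<= 1.10) still accepts the deprecated argument.
        torch.onnx.export(model, dummy_inputs, output_path,
                          example_outputs=example_outputs, **export_kwargs)
    else:
        # torch >= 1.11: the argument is gone; outputs are inferred during tracing.
        torch.onnx.export(model, dummy_inputs, output_path, **export_kwargs)
```

Alternatively, pinning torch to a version earlier than 1.11, or upgrading to an onnxruntime release whose benchmark scripts no longer pass example_outputs, should avoid the error.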