Open nabsabraham opened 2 years ago
LayerNormalization will be included in official ONNX 1.12: https://github.com/onnx/onnx/pull/4076. Even for now, IIUC, LayerNormalization is an existing contrib op in ONNX Runtime, so a model containing this LayerNorm op should be runnable. Does this warning/error block your inference? If so, I would suggest raising this issue in the ONNX Runtime repo so the runtime experts can take a closer look.
Hey @nabsabraham, were you able to solve the issue?
I think at least the latest torch-nightly should have covered LayerNorm conversion: https://github.com/pytorch/pytorch/pull/84293.
Ask a Question
Question
I have a bert-base model trained with some linear layers on top and adapter layers in the backbone. I export the model to ONNX like so:
However, when I try to run an inference session, I see this warning/error pop up:
Can someone suggest a custom-op solution for this? Do I even need one? My understanding is that, in the absence of the operator, values will be replaced with constants; what are the implications of that? I can run a sample through the model, but I'm worried the warnings point to a long-term issue.