onnx / tutorials

Tutorials for creating and using ONNX models

LayerNorm Op missing? #267

Open nabsabraham opened 2 years ago

nabsabraham commented 2 years ago

Ask a Question

Question

I have a bert-base model trained with some linear layers on top and adapter layers in the backbone. I export the model to ONNX like so:

torch.onnx.export(
    model,
    (ids, mask),
    "model.onnx",
    opset_version=10,
    input_names=["ids", "mask"],
    output_names=["output"],
    export_params=True,
    dynamic_axes={
        "ids": {0: "batch_size"},
        "mask": {0: "batch_size"},
        "output": {0: "batch_size"},
    },
)

However, when I try to run an inference session, I see this warning/error pop up:

Execution will fail if ORT does not have a specialized kernel for this op
2022-06-01 03:27:22.058423804 [W:onnxruntime:, graph.cc:2676 InitFunctionBodyForNode] Function body initialization failed for node 'LayerNormalization_token_28' optype LayerNormalization. Error message /onnxruntime_src/onnxruntime/core/graph/function.cc:788 onnxruntime::FunctionImpl::FunctionImpl(onnxruntime::Graph&, const NodeIndex&, const onnx::FunctionProto&, const std::unordered_map<std::basic_string<char>, const onnx::FunctionProto*>&, std::vector<std::unique_ptr<onnxruntime::Function> >&, const onnxruntime::logging::Logger&, bool) status.IsOK() was false. Resolve subgraph failed:Node (0x5a5b2e0) Op (Flatten) [ShapeInferenceError] Invalid value(-1) for attribute 'axis'
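For context, a minimal sketch of the kind of inference-session call that produces the warning above (the dummy inputs, sequence length, and int64 dtype here are assumptions, not from the original post):

import numpy as np
import onnxruntime as ort

# Creating the session is where ORT resolves the graph and emits the warning.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Dummy BERT-style inputs matching the exported input names.
ids = np.random.randint(0, 30522, size=(1, 128), dtype=np.int64)
mask = np.ones((1, 128), dtype=np.int64)

outputs = sess.run(["output"], {"ids": ids, "mask": mask})
print(outputs[0].shape)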

Can someone suggest a custom op solution to this? Do I actually need a solution at all? My understanding is that, in the absence of the operator, values will be replaced with constants - what are the implications of this? I can run a sample through this model, but I'm worried that the warnings will lead to a long-term issue.

jcwchen commented 2 years ago

LayerNormalization will be included in the official ONNX 1.12: https://github.com/onnx/onnx/pull/4076. Even for now, IIUC, LayerNormalization already exists as a contrib op in ONNX Runtime, so a model containing this LayerNorm op should still be runnable. Does this warning/error block your inference? If so, I would suggest raising this issue in the ONNX Runtime repo so the runtime experts can take a closer look.
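One quick way to see how the exported LayerNorm is represented is to inspect the model's opset imports and node domains (a sketch; "model.onnx" refers to the file exported in the snippet above):

import onnx

model = onnx.load("model.onnx")

# The official LayerNormalization op lives in the default "ai.onnx" domain from opset 17
# (ONNX 1.12); older exports fall back to a function body or the com.microsoft contrib kernel.
for imp in model.opset_import:
    print(imp.domain or "ai.onnx", imp.version)

# List any LayerNormalization nodes and the domain they resolve against.
for node in model.graph.node:
    if node.op_type == "LayerNormalization":
        print(node.name, node.domain or "ai.onnx")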

sanjay23singh commented 1 year ago

Hey @nabsabraham, were you able to solve the issue?

jcwchen commented 1 year ago

I think at least the latest torch-nightly should have covered LayerNorm conversion: https://github.com/pytorch/pytorch/pull/84293.
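With a sufficiently recent PyTorch (per the linked PR) and onnx >= 1.12, re-exporting at opset 17 should emit LayerNormalization as an official op instead of relying on the contrib/function fallback. A hedged sketch, reusing the variable names from the original export snippet:

import torch

torch.onnx.export(
    model,
    (ids, mask),
    "model_opset17.onnx",
    opset_version=17,  # LayerNormalization is an official ONNX op from opset 17
    input_names=["ids", "mask"],
    output_names=["output"],
    export_params=True,
    dynamic_axes={
        "ids": {0: "batch_size"},
        "mask": {0: "batch_size"},
        "output": {0: "batch_size"},
    },
)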