apache / tvm

Open deep learning compiler stack for cpu, gpu and specialized accelerators
https://tvm.apache.org/
Apache License 2.0

[Bug] [Relay][ONNX] Scaler rather than Scale: a bug about Scaler in the onnx frontend #16381

Open jikechao opened 7 months ago

jikechao commented 7 months ago

Expected behavior

The `Scaler` operator is supported in the TVM ONNX frontend, but the converter map registers it under the wrong key name, `Scale`. I cannot find an operator named `Scale` in the official ONNX documentation or in the ONNX Runtime opset, so I believe `Scale` is a typo for `Scaler`. https://github.com/apache/tvm/blob/a1a1a7ca033f83315d02411d4b2c475ddacc795e/python/tvm/relay/frontend/onnx.py#L6531
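
As a quick cross-check (my own addition for illustration, not part of the original report), the snippet below queries the ONNX operator schema registry: `Scaler` is defined in the `ai.onnx.ml` domain, while no operator named `Scale` appears to be registered at all.

```python
# Cross-check against the ONNX schema registry (illustrative sketch):
# "Scaler" lives in the ai.onnx.ml domain; "Scale" is expected to be absent.
import onnx.defs

def has_op(op_type, domain=""):
    try:
        onnx.defs.get_schema(op_type, domain=domain)
        return True
    except Exception:  # get_schema raises when the op/domain pair is unknown
        return False

print(has_op("Scaler", domain="ai.onnx.ml"))  # expected: True
print(has_op("Scale"))                        # expected: False
print(has_op("Scale", domain="ai.onnx.ml"))   # expected: False
```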

Actual behavior

```
Traceback (most recent call last):
  File "bug_scaler.py", line 7, in <module>
    irmod, params = relay.frontend.from_onnx(model, {'input': (1, 2, 3, 4)}, freeze_params=True)
  File "/home/shenqingchao/software/tvm/python/tvm/relay/frontend/onnx.py", line 7225, in from_onnx
    mod, params = g.from_onnx(graph, opset)
  File "/home/shenqingchao/software/tvm/python/tvm/relay/frontend/onnx.py", line 6843, in from_onnx
    self._check_for_unsupported_ops(graph)
  File "/home/shenqingchao/software/tvm/python/tvm/relay/frontend/onnx.py", line 6936, in _check_for_unsupported_ops
    raise tvm.error.OpNotImplemented(msg)
tvm.error.OpNotImplemented: The following operators are not supported for frontend ONNX: Scaler
```
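
For context on why this error is raised even though a converter exists, the sketch below is a simplified paraphrase (my own approximation, not the exact TVM source) of the kind of check `_check_for_unsupported_ops` performs: each node's `op_type` is looked up in the converter map, so a converter registered under the key `Scale` is never found for nodes whose `op_type` is `Scaler`.

```python
# Simplified approximation (not the actual TVM implementation) of an
# unsupported-operator check: any node op_type missing from the converter
# map is reported, which is why "Scaler" is flagged when the map only
# contains the key "Scale".
import tvm

def check_for_unsupported_ops(graph, convert_map):
    unsupported = sorted({node.op_type for node in graph.node if node.op_type not in convert_map})
    if unsupported:
        msg = "The following operators are not supported for frontend ONNX: "
        raise tvm.error.OpNotImplemented(msg + ", ".join(unsupported))
```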

Steps to reproduce

Step 1. Download the ONNX model here (`scaler_model.onnx`).

Step 2. Run the following script:

```python
import onnx
import tvm
from tvm import relay

# Loading the Scaler model triggers the unsupported-operator error shown above.
model = onnx.load("scaler_model.onnx")
irmod, params = relay.frontend.from_onnx(model, {'input': (1, 2, 3, 4)}, freeze_params=True)
```
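
If the download link above is unavailable, the script below builds a minimal stand-in model with a single `Scaler` node (my own construction, not the exact model attached to the issue; the shapes and the list-valued `scale`/`offset` attributes are assumptions) that is sufficient to reproduce the unsupported-operator error.

```python
# Minimal stand-in for scaler_model.onnx (assumed shapes and attribute values,
# not the exact model from the issue): a single ai.onnx.ml Scaler node with
# list-valued "scale" and "offset" attributes.
import onnx
from onnx import TensorProto, helper

node = helper.make_node(
    "Scaler",
    inputs=["input"],
    outputs=["output"],
    domain="ai.onnx.ml",
    scale=[0.5, 2.0, 1.0, 4.0],   # list of floats, one per feature (assumed)
    offset=[0.0, 1.0, 2.0, 3.0],  # list of floats, one per feature (assumed)
)
graph = helper.make_graph(
    [node],
    "scaler_graph",
    [helper.make_tensor_value_info("input", TensorProto.FLOAT, [1, 2, 3, 4])],
    [helper.make_tensor_value_info("output", TensorProto.FLOAT, [1, 2, 3, 4])],
)
model = helper.make_model(
    graph,
    opset_imports=[helper.make_opsetid("", 13), helper.make_opsetid("ai.onnx.ml", 2)],
)
onnx.save(model, "scaler_model.onnx")
```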

cc @KJlaccHoeUM9l @shingjan

jikechao commented 7 months ago

If `Scale` is indeed a typo, the conversion function has a second bug: the `scale` attribute of the `Scaler` operator is a list of floats, while TVM treats it as a single float, so the conversion crashes on the ONNX model given above. First of all, I need to confirm whether `Scale` is a typo; if so, I can provide patches to fix both bugs.
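
For reference, here is a rough sketch of how the attribute handling could look once the key is renamed; this is my own illustration of one possible fix (the helper name `convert_scaler` is hypothetical), not the actual TVM patch: `scale` and `offset` are treated as 1-D float lists and broadcast over the input, following the ONNX definition `Y = (X - offset) * scale`.

```python
# Rough sketch of one possible fix (hypothetical helper, not the actual TVM patch):
# ONNX Scaler computes (X - offset) * scale with list-valued attributes, so both
# are materialized as 1-D constants and broadcast against the input tensor.
import numpy as np
from tvm import relay

def convert_scaler(inputs, attr):
    scale = np.asarray(attr.get("scale", [1.0]), dtype="float32")
    offset = np.asarray(attr.get("offset", [0.0]), dtype="float32")
    return relay.multiply(
        relay.subtract(inputs[0], relay.const(offset)),
        relay.const(scale),
    )
```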