apache / tvm

Open deep learning compiler stack for cpu, gpu and specialized accelerators
https://tvm.apache.org/
Apache License 2.0

OpNotImplemented: The following operators are not supported for frontend ONNX: Softplus #7176

Closed FelixFu520 closed 3 years ago

FelixFu520 commented 3 years ago

When I convert an ONNX model to a TVM library, I get this error:

---------------------------------------------------------------------------
OpNotImplemented                          Traceback (most recent call last)
<ipython-input-2-4fbc9b21ba7b> in <module>
     19 shape_dict = {input_name: img.shape}
     20 # Use the Relay ONNX frontend to read the exported ONNX model
---> 21 sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)
     22 
     23 # Here TVM builds the optimized model

~/tvm/python/tvm/relay/frontend/onnx.py in from_onnx(model, shape, dtype, opset, freeze_params)
   2746     # Use the graph proto as a scope so that ops can access other nodes if needed.
   2747     with g:
-> 2748         mod, params = g.from_onnx(graph, opset, freeze_params)
   2749     return mod, params

~/tvm/python/tvm/relay/frontend/onnx.py in from_onnx(self, graph, opset, freeze_params, get_output_expr)
   2527             msg = "The following operators are not supported for frontend ONNX: "
   2528             msg += ", ".join(unsupported_ops)
-> 2529             raise tvm.error.OpNotImplemented(msg)
   2530         # construct nodes, nodes are stored as directed acyclic graph
   2531         for node in graph.node:

OpNotImplemented: The following operators are not supported for frontend ONNX: Softplus
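For context, the frontend walks the graph and collects every `op_type` that has no registered converter before raising. A simplified sketch of that check (the names here are illustrative, not TVM's actual internals):

```python
# Simplified model of the unsupported-op check in relay/frontend/onnx.py.
# The real converter map is much larger; this one is a stand-in.
convert_map = {"Conv", "Relu", "Sigmoid", "Softsign"}  # supported op_types

def find_unsupported(graph_op_types):
    """Return op_types with no registered converter, preserving first-seen order."""
    seen = set()
    unsupported = []
    for op in graph_op_types:
        if op not in convert_map and op not in seen:
            seen.add(op)
            unsupported.append(op)
    return unsupported

# A YOLOv4-style graph containing a Softplus node triggers the error message.
missing = find_unsupported(["Conv", "Relu", "Softplus", "Conv", "Softplus"])
if missing:
    msg = "The following operators are not supported for frontend ONNX: " + ", ".join(missing)
```

So the error means exactly one thing: no entry for `Softplus` exists in the frontend's converter map at the version being run.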

The code is below:

# Import the ONNX model and convert it to a *.so shared library
import onnx
import time
import tvm
import numpy as np
import tvm.relay as relay
from PIL import Image

# Start by loading the .onnx model
onnx_model = onnx.load('../models/yolov4.onnx')  # load the model

img = Image.open("street.jpg")
img = img.resize((416, 416))
img = np.array(img, dtype=np.float32)
img /= 255.0
img = img.transpose((2, 0, 1))
img = np.expand_dims(img, axis=0)

# Test on the PC's CPU first, so use the LLVM target for export
# target = tvm.target.create('llvm') # x86
target = tvm.target.Target('llvm') # x86
# target = tvm.target.arm_cpu("rasp3b") # raspi
# target = 'llvm'

input_name = "input_0"  # NOTE: this must match the input name in the previously exported ONNX model; here the id is 0
shape_dict = {input_name: img.shape}
# Use the Relay ONNX frontend to read the exported ONNX model
sym, params = relay.frontend.from_onnx(onnx_model, shape_dict)

# Here TVM builds the optimized model
with relay.build_config(opt_level=2):
    graph, lib, params = relay.build_module.build(sym, target, params=params)

dtype = 'float32'
from tvm.contrib import graph_runtime

# Export the shared library we need; the path can be anything you like
print("Output model files")
libpath = "../models/yolov4_pc.so"
lib.export_library(libpath)

# Export the network's graph structure, saved as a JSON file
graph_json_path = "../models/yolov4_pc.json"
with open(graph_json_path, 'w') as fo:
    fo.write(graph)

# Export the network's weight parameters
param_path = "../models/yolov4_pc.params"
with open(param_path, 'wb') as fo:
    fo.write(relay.save_param_dict(params))
# ------------- Model export finished -------------
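(Aside: the preprocessing above can be sanity-checked without TVM; a minimal NumPy sketch, using random pixel data in place of the real image:)

```python
import numpy as np

# Stand-in for the PIL image after resize: 416x416 RGB, HWC layout.
img = np.random.randint(0, 256, size=(416, 416, 3)).astype(np.float32)
img /= 255.0                       # scale pixel values to [0, 1]
img = img.transpose((2, 0, 1))     # HWC -> CHW
img = np.expand_dims(img, axis=0)  # CHW -> NCHW (add batch dimension)
# shape_dict maps the input name to this shape: (1, 3, 416, 416)
```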

What should I do?

junrushao commented 3 years ago

Looks like softplus is not implemented yet, so a PR is more than welcome :-)
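For reference, ONNX `Softplus` computes y = ln(exp(x) + 1), which is exactly the expression a frontend converter would lower to (as the patch later in this thread shows). A minimal NumPy sketch of the math; the numerically stable variant is just an illustration, not TVM code:

```python
import numpy as np

def softplus(x):
    # ONNX Softplus: y = ln(exp(x) + 1), the naive form a converter emits.
    return np.log(np.exp(x) + 1.0)

def softplus_stable(x):
    # Equivalent form that avoids overflow for large |x|:
    # max(x, 0) + log1p(exp(-|x|))
    return np.maximum(x, 0.0) + np.log1p(np.exp(-np.abs(x)))

x = np.linspace(-5.0, 5.0, 11)
assert np.allclose(softplus(x), softplus_stable(x))
```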

junrushao commented 3 years ago

Thanks for your interest in TVM. Please create a thread on the discuss forum (https://discuss.tvm.apache.org/), since this is a usage question rather than a bug. Thanks!

insop commented 3 years ago

Softplus was added on 12/10/2020 in https://github.com/apache/tvm/pull/7089.

@FelixFu520 , you might want to pull the latest main and retry.

@junrushao1994 @jwfromm

However, I see that a SoftPlus converter (note the capital P) was already in. According to the ONNX spec, the operator is Softplus, not SoftPlus, so I am not sure we need to keep both (Softplus and SoftPlus).

I have a branch that removes SoftPlus; let me know and I can create a PR. https://github.com/insop/incubator-tvm/commit/1e944644680188f31ada93a7c4ec7de797a1a0e1.patch

From 1e944644680188f31ada93a7c4ec7de797a1a0e1 Mon Sep 17 00:00:00 2001
From: Insop Song <x@y.z>
Date: Thu, 31 Dec 2020 18:53:33 -0800
Subject: [PATCH] Remove seemingly invalid SoftPlus

- `Softplus` was added on 12/10/2020 in https://github.com/apache/tvm/pull/7089
- However, a `SoftPlus` converter (note the capital P) was already in.
According to the [ONNX spec](https://github.com/onnx/onnx/blob/master/docs/Operators.md), it is `Softplus`, not `SoftPlus`.
---
 python/tvm/relay/frontend/onnx.py          | 9 ---------
 tests/python/frontend/onnx/test_forward.py | 1 -
 2 files changed, 10 deletions(-)

diff --git a/python/tvm/relay/frontend/onnx.py b/python/tvm/relay/frontend/onnx.py
index 6122c81d321..1c544d30971 100644
--- a/python/tvm/relay/frontend/onnx.py
+++ b/python/tvm/relay/frontend/onnx.py
@@ -932,14 +932,6 @@ def _impl_v1(cls, inputs, attr, params):
         return _op.tanh(_expr.const(beta) * inputs[0]) * _expr.const(alpha)

-class SoftPlus(OnnxOpConverter):
-    """Operator converter for SoftPlus."""
-
-    @classmethod
-    def _impl_v1(cls, inputs, attr, params):
-        return _op.log(_op.exp(inputs[0]) + _expr.const(1.0))
-
-
 class Softsign(OnnxOpConverter):
     """Operator converter for Softsign."""

@@ -2661,7 +2653,6 @@ def _get_convert_map(opset):
         "OneHot": OneHot.get_converter(opset),
         # 'Hardmax'
         "Softsign": Softsign.get_converter(opset),
-        "SoftPlus": SoftPlus.get_converter(opset),
         "Gemm": Gemm.get_converter(opset),
         "MatMul": MatMul.get_converter(opset),
         "Mod": Mod.get_converter(opset),
diff --git a/tests/python/frontend/onnx/test_forward.py b/tests/python/frontend/onnx/test_forward.py
index 33dd048896b..3d95a9a83ee 100644
--- a/tests/python/frontend/onnx/test_forward.py
+++ b/tests/python/frontend/onnx/test_forward.py
@@ -1983,7 +1983,6 @@ def verify_single_ops(op, x, out_np, rtol=1e-5, atol=1e-5):
     verify_single_ops("Tanh", x, np.tanh(x))
     verify_single_ops("Sigmoid", x, 1 / (1 + np.exp(-x)))
     verify_single_ops("Softsign", x, x / (1 + np.abs(x)))
-    verify_single_ops("SoftPlus", x, np.log(1 + np.exp(x)))

 @tvm.testing.uses_gpu
jwfromm commented 3 years ago

Thanks for catching that, I actually totally missed that we had a SoftPlus operator. I agree its silly to have both. Thanks for removing the redundant one!

insop commented 3 years ago

> Thanks for catching that, I actually totally missed that we had a SoftPlus operator. I agree its silly to have both. Thanks for removing the redundant one!

@jwfromm PR created: https://github.com/apache/tvm/pull/7189