shenweichen / DeepCTR-Torch

【PyTorch】Easy-to-use, Modular and Extendible package of deep-learning based CTR models.
https://deepctr-torch.readthedocs.io/en/latest/index.html
Apache License 2.0

How to export DeepCTR model to onnx ? #83

Open batrlatom opened 4 years ago

batrlatom commented 4 years ago

Please refer to the FAQ in doc and search for the related issues before you ask the question.

Describe the question (问题描述): I would like to export the model to the ONNX format so I can use it, for example, with Apache PredictionIO.

Additional context: I tried with:

    import numpy as np
    import torch

    dummy_input = np.array([np.array([[0]]), np.array([[0]])])
    pred_ans = model.predict(dummy_input, 100)  # predict handles the numpy input
    print(pred_ans)
    torch.onnx.export(model, (dummy_input), "deepCTR.onnx")  # raises the error below

predict works well, I am getting [[0.0065056]], but with onnx.export I get an error with this trace:

RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type numpy.ndarray

Operating environment (运行环境):

zanshuxun commented 2 years ago

Please transform the input data from numpy.ndarray to a list, then try again.
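
For reference, the error message says torch.onnx.export only accepts tensors (or tuples/lists of them) as example inputs, so the dummy input also has to become a torch.Tensor at some point. Below is a minimal sketch, assuming the model's forward takes a single concatenated feature tensor (as DeepCTR-Torch models typically do); the shape, dtype, input/output names and opset version here are assumptions, not values from this thread:

    import torch

    # Assumption: forward(self, X) expects one float tensor of shape
    # (batch_size, n_feature_columns); two columns mirror the dummy_input above.
    dummy_tensor = torch.zeros(1, 2, dtype=torch.float32)

    model.eval()
    torch.onnx.export(
        model,                        # trained DeepCTR-Torch model
        (dummy_tensor,),              # example inputs must be torch.Tensor, wrapped in a tuple
        "deepCTR.onnx",
        input_names=["features"],     # assumed name
        output_names=["prediction"],  # assumed name
        opset_version=11,
    )

Whether the exported graph reproduces model.predict exactly would still need to be checked, e.g. by running the ONNX file with onnxruntime on the same input.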