microsoft / onnxjs

ONNX.js: run ONNX models using JavaScript

What data type does N11onnxruntime11NonOnnxTypeIxEE refer to? #189

Open micah5 opened 4 years ago

micah5 commented 4 years ago

I am getting the error:

(node:40885) UnhandledPromiseRejectionWarning: Error: Failed to run the model: Unexpected input data type. Actual: (N11onnxruntime11NonOnnxTypeIiEE) , expected: (N11onnxruntime11NonOnnxTypeIxEE)
    at OnnxRuntimeInferenceSession.<anonymous> (/Users/user/git/words/node_modules/onnxjs-node/lib/inference-session-override.js:106:59)
    at step (/Users/user/git/words/node_modules/onnxjs-node/lib/inference-session-override.js:32:23)
    at Object.next (/Users/user/git/words/node_modules/onnxjs-node/lib/inference-session-override.js:13:53)
    at /Users/user/git/words/node_modules/onnxjs-node/lib/inference-session-override.js:7:71
    at new Promise (<anonymous>)
    at __awaiter (/Users/user/git/words/node_modules/onnxjs-node/lib/inference-session-override.js:3:12)
    at OnnxRuntimeInferenceSession.run (/Users/user/git/words/node_modules/onnxjs-node/lib/inference-session-override.js:77:16)
    at main (/Users/user/git/words/test_script.js:13:35)
    at process.runNextTicks [as _tickCallback] (internal/process/task_queues.js:62:5)

Looking at the API docs, there are four data types for tensors ('bool' | 'float32' | 'int32' | 'string'), but none of them corresponds to N11onnxruntime11NonOnnxTypeIxEE.

I dug around in the codebase a bit but got lost; any help is appreciated.
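For anyone hitting the same error: the names in the message are Itanium-ABI-mangled C++ types (what `c++filt` would demangle). `N11onnxruntime11NonOnnxTypeIiEE` is `onnxruntime::NonOnnxType<int>` and `N11onnxruntime11NonOnnxTypeIxEE` is `onnxruntime::NonOnnxType<long long>` — i.e. the model declares an int64 input, but an int32 tensor was fed. A small sketch decoding the template-argument letter (the type codes `b`/`f`/`i`/`x` are from the Itanium C++ ABI; the string slicing is specific to these `NonOnnxType<T>` names, not a general demangler):

```python
# Itanium C++ ABI builtin-type codes (subset relevant here).
BUILTIN = {
    "b": "bool",
    "f": "float (float32)",
    "i": "int (int32)",
    "x": "long long (int64)",
}

def template_arg(mangled: str) -> str:
    # "N11onnxruntime11NonOnnxTypeIxEE" -> the code "x" sits between "I" and "EE".
    code = mangled.split("I")[-1].rstrip("E")
    return BUILTIN.get(code, "unknown")

print(template_arg("N11onnxruntime11NonOnnxTypeIiEE"))  # int (int32) - what was passed
print(template_arg("N11onnxruntime11NonOnnxTypeIxEE"))  # long long (int64) - what was expected
```

So the mismatch is int32 (actual) vs. int64 (expected), and int64 is not among the four tensor types ONNX.js exposes.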

micah5 commented 4 years ago

For some further context, here is how I am exporting the model in Python:

# Export the model
torch.onnx.export(model,               # model being run
                  x,                         # model input (or a tuple for multiple inputs)
                  "super_resolution.onnx",   # where to save the model (can be a file or file-like object)
                  export_params=True,        # store the trained parameter weights inside the model file
                  opset_version=10,          # the ONNX version to export the model to
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names = ['input'],   # the model's input names
                  output_names = ['output'], # the model's output names
                  dynamic_axes={'input' : {1 : 'num_tokens'},    # variable length axes
                                'output' : {1 : 'num_tokens'}})

where x is a PyTorch tensor (note that torch.tensor with Python integer data defaults to dtype torch.int64):

tensor([[   67, 13513,  5128,  1379,   428,   318,   257,  1332,  6827]])

and I'm loading it in JS like so:

import { Tensor, InferenceSession } from "onnxjs-node";
const session = new InferenceSession();

async function main() {
  await session.loadModel("super_resolution.onnx");
  const inputs = [
    new Tensor([67, 13513, 5128, 1379, 428, 318, 257, 1332, 6827], "int32")
  ];
  const outputMap = await session.run(inputs);
}

main();

It fails on the run() call.
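Not a confirmed fix, but a sketch of the usual direction for this mismatch: since ONNX.js offers no 'int64' tensor type, the model would need to be exported (or edited) to accept int32 input instead. The token ids in the example fit comfortably in int32, so narrowing loses nothing — a quick check in Python (numpy assumed available; the variable names are illustrative, not from the repo):

```python
import numpy as np

# The token ids from the example above; PyTorch exports these as int64 by default.
tokens = np.array([67, 13513, 5128, 1379, 428, 318, 257, 1332, 6827], dtype=np.int64)

# Confirm every value is representable as int32 before narrowing.
assert tokens.min() >= -2**31 and tokens.max() < 2**31
tokens32 = tokens.astype(np.int32)

# Round-trip check: no information was lost in the downcast.
assert (tokens32.astype(np.int64) == tokens).all()
```

With the graph's input changed to int32, the `new Tensor([...], "int32")` call in the JS snippet would match what the runtime expects.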