micah5 opened 4 years ago
For some further context, here is how I am creating the model in python:
```python
# Export the model
torch.onnx.export(model,                     # model being run
                  x,                         # model input (or a tuple for multiple inputs)
                  "super_resolution.onnx",   # where to save the model (can be a file or file-like object)
                  export_params=True,        # store the trained parameter weights inside the model file
                  opset_version=10,          # the ONNX version to export the model to
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names=['input'],     # the model's input names
                  output_names=['output'],   # the model's output names
                  dynamic_axes={'input': {1: 'num_tokens'},    # variable length axes
                                'output': {1: 'num_tokens'}})
```
where `x` is a PyTorch tensor:

```
tensor([[   67, 13513,  5128,  1379,   428,   318,   257,  1332,  6827]])
```
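(Aside, assuming the usual PyTorch defaults: a tensor built from plain Python ints gets dtype `torch.int64`, so that is the element type the exported graph ends up expecting. The token ids themselves fit comfortably in 32 bits, which a quick stand-alone check confirms:)

```python
# The token ids from the example input above. torch.tensor(...) stores
# plain Python ints as int64 by default, and that dtype is baked into
# the exported ONNX graph's input signature.
tokens = [67, 13513, 5128, 1379, 428, 318, 257, 1332, 6827]

INT32_MAX = 2**31 - 1
# Every id is far below the int32 limit, so a down-cast would be
# lossless for this input.
assert all(0 <= t <= INT32_MAX for t in tokens)
print(max(tokens))
```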
and I'm loading it in JS like so:

```js
import { Tensor, InferenceSession } from "onnxjs-node";

const session = new InferenceSession();

async function main() {
  await session.loadModel("super_resolution.onnx");
  const inputs = [
    new Tensor([67, 13513, 5128, 1379, 428, 318, 257, 1332, 6827], "int32")
  ];
  const outputMap = await session.run(inputs);
}

main();
```
It fails on `run()`. Looking at the API docs, it says there are four data types for tensors (`'bool' | 'float32' | 'int32' | 'string'`); however, none of them correspond to the

```
N11onnxruntime11NonOnnxTypeIxEE
```

in the error message. I dug around in the codebase a bit but got lost; any help is appreciated.
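That string looks like an Itanium-ABI-mangled C++ type name (the kind `c++filt` decodes), where the trailing `x` is the code for `long long`. A minimal decoder for just this narrow `N <name> <name> I <builtin> E E` shape, as a sketch (real demanglers handle far more):

```python
# Itanium C++ ABI builtin-type codes (subset); 'x' is long long, i.e. int64.
BUILTIN = {"x": "long long", "y": "unsigned long long",
           "i": "int", "j": "unsigned int", "f": "float", "b": "bool"}

def demangle_simple(m):
    # Only handles the shape N <len>name... I <one builtin> E E seen here.
    assert m.startswith("N") and m.endswith("EE")
    body = m[1:-2]
    parts, i = [], 0
    while i < len(body) and body[i].isdigit():
        j = i
        while j < len(body) and body[j].isdigit():
            j += 1
        n = int(body[i:j])           # length-prefixed name component
        parts.append(body[j:j + n])
        i = j + n
    assert body[i] == "I"            # template argument list begins
    arg = BUILTIN[body[i + 1]]
    return "::".join(parts) + f"<{arg}>"

print(demangle_simple("N11onnxruntime11NonOnnxTypeIxEE"))
```

So the runtime is complaining about `onnxruntime::NonOnnxType<long long>`: the exported graph wants int64 input, which onnxjs's four tensor types don't cover. If that reading is right, a plausible (untested) workaround would be exporting the model with an int32 input, since the token ids fit in 32 bits.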