csaroff opened this issue 5 years ago
onnxjs-node
cannot work in browser environments. It does not work with worker-loader either, as the latter specifically targets running in a web worker.
Supporting Shape
is a known issue, as referenced in #84
The operator Shape is now available in version 0.1.8.
I still see this error using version 0.1.8.
The shape.ts file hit the master branch on Sep 1 but 0.1.8 was released on August 28, so maybe it just missed the cut to make it into 0.1.8?
Mine is 0.1.8 too! I am not using Node.js, but React.
What backend are you using? It's only implemented in the CPU backend.
It failed on all backends at first. The problem is solved now.
I do not understand your message, but it's a good thing if it works.
Hi, I am using backendHint="cpu" with onnxjs 0.1.8 but get the same error:
TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v9
at Object.e.resolveOperator (onnx.min.js:1)
at t.resolve (onnx.min.js:14)
at e.initializeOps (onnx.min.js:14)
at onnx.min.js:14
at t.event (onnx.min.js:1)
at e.initialize (onnx.min.js:14)
at e.<anonymous> (onnx.min.js:14)
at onnx.min.js:14
at Object.next (onnx.min.js:14)
at a (onnx.min.js:14)
@JonathanSum how did you solve the problem? @28Smiles any advice on how to solve it?
Repeating what I said above, the Shape operation was added after the latest version (v0.1.8) was released. We won't be able to use the Shape operation until a new version is released.
You can verify this by downloading the snapshot of the code for that release and seeing that it does not contain the file lib/ops/Shape.ts
Can this be solved or not?
I am having the same error using the code below, which worked fine with onnxruntime
in Python and C++.
<script src="https://cdn.jsdelivr.net/npm/onnxjs@0.1.8/dist/onnx.min.js"></script>
<!-- Code that consume ONNX.js -->
<script>
  // create a session
  const myOnnxSession = new onnx.InferenceSession();
  // load the ONNX model file
  myOnnxSession.loadModel("./200_0.19.onnx").then(() => {
    // generate model input
    // const inferenceInputs = getInputs();
    // console.log(inferenceInputs)
    // execute the model
    // myOnnxSession.run(inferenceInputs).then((output) => {
    //   // consume the output
    //   const outputTensor = output.values().next().value;
    //   console.log(`model output tensor: ${outputTensor.data}.`);
    // });
  });
</script>
I built onnxjs from the master branch of this repository, but am still getting the same error:
Uncaught (in promise) TypeError: cannot resolve operator 'SVMClassifier' with opsets: ai.onnx v9, ai.onnx.ml v
I built onnxjs from the master branch, and it seems SVMClassifier is not a supported op for the wasm backend (a list of supported operators is mentioned).
I also have the same issue: "Uncaught (in promise) TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v9". Should I modify the model I am using so it does not emit "Shape"? I feel it would work if I could rewrite "Shape" to "Reshape" in the ONNX model, but is this possible? If not, should I retrain?
Mine is solved. It was not very difficult. The problem is related to ONNX itself.
Solution 1 (it should solve most people's problem, but I didn't use it): ONNX does not support all of PyTorch's features. Sometimes you have to downgrade ONNX to an older version to get a certain feature; on the other hand, you may find that the older version doesn't support another important feature. But trust me, you will find a better way to solve it, which is what I did when using React.
Solution 2: my solution, which involved digging into ONNX itself. I suggest you talk to the ONNX developers to solve your issue.
It's not that difficult to solve. Here is mine, which is solved and runs perfectly 🤣
If you have an interesting project, feel free to contact me. There may be a possibility that I can solve it, if I have free time that day, if it is running in TensorFlow, Keras (TF2), or PyTorch.
@JonathanSum Congratulations. Is there a ported SVMClassifier op that can be easily integrated with wasm_ops?
What the heck, guys: "I solved this" with no further details, or "Here are 2 solutions: one I don't know if it works, and the other is to talk to the ONNX coder."
I was hoping for an actual solution to this...
This is not necessarily a webpack issue (as far as I know)... there are multiple reasons why this error might occur. The most common is using a dynamic reshaping operation in your PyTorch model definition in Python, like x.view(-1, x.size(0)) or x.reshape(x.shape[0], ...).
What solved the issue for me was making sure the Python code uses explicit numbers (1, 2, 256, 512, ...), not x.size(0).
To do this, you might need to log every reshaping step, print the shapes, and then use those values to make the reshaping explicit and convert the model to ONNX again.
Hope this finally helps 😪
I'm getting TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v10
with a MobileNet model from the ONNX model zoo:
https://github.com/onnx/models/blob/master/vision/classification/mobilenet/model/mobilenetv2-7.onnx
using onnxjs@0.1.8
I've been trying to add onnxjs support in VoTT, but I've been hitting some issues. I have a model that I was able to successfully run in node.js, but when I try to run it on the frontend, I get
I tried converting the model to use a lower opset, but I faced similar errors with the converted models.
So I thought that I'd be clever and try a workaround and run the model on the backend using onnxjs-node, sending the results back to the frontend using IPC.
After adding a little code into src/electron/main.ts
and updating my webpack config, I'm seeing a couple of new errors.
So then I ran
npm install --save-dev node-loader worker-loader
and tried rerunning webpack. And this is where I'm stuck. I see that you chose not to include lib/worker in your tsconfig file, but I couldn't figure out why. So I tried rebuilding the project, removing the
"exclude": ["lib/worker"],
line from the tsconfig.json, but that just gives me a whole mess of TypeScript errors. I'm pretty much out of ideas. I checked in a minimally reproducible example on this branch. Here's a diff of the changes. Any help/insight here would be great!