microsoft / onnxjs

ONNX.js: run ONNX models using JavaScript

Error loading onnxjs from webpack #165

Open csaroff opened 5 years ago

csaroff commented 5 years ago

I've been trying to add onnxjs support in VoTT, but I've been hitting some issues. I have a model that I was able to successfully run in node.js, but when I try to run it on the frontend, I get

TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v9

I tried converting the model to use a lower opset, but I faced similar errors with the converted models.

So I thought I'd be clever and try a workaround: run the model on the backend using onnxjs-node, sending the results back to the frontend over IPC.

After adding a little code into src/electron/main.ts

import { Tensor, InferenceSession } from "onnxjs-node";
const inferenceSession = new InferenceSession();

and updating my webpack config, I'm seeing a couple of new errors.

 σ csaroff:~/fiddle/VoTT$ node_modules/.bin/webpack-cli --config config/webpack.dev.js
...
WARNING in ./node_modules/onnxjs-node/bin/napi-v3 sync ^\.\/.*\.node$
Module not found: Error: Can't resolve 'node-loader' in '/Users/csaroff/fiddle/VoTT'
 @ ./node_modules/onnxjs-node/bin/napi-v3 sync ^\.\/.*\.node$
 @ ./node_modules/onnxjs-node/lib/binding.js
 @ ./node_modules/onnxjs-node/lib/inference-session-override.js
 @ ./node_modules/onnxjs-node/lib/index.js
 @ ./src/electron/main.ts

ERROR in ./node_modules/onnxjs/lib/wasm-binding.js
Module not found: Error: Can't resolve 'worker-loader' in '/Users/csaroff/fiddle/VoTT/node_modules/onnxjs/lib'
 @ ./node_modules/onnxjs/lib/wasm-binding.js 101:29-96
 @ ./node_modules/onnxjs/lib/backends/backend-wasm.js
 @ ./node_modules/onnxjs/lib/api/onnx-impl.js
 @ ./node_modules/onnxjs/lib/api/index.js
 @ ./node_modules/onnxjs-node/lib/index.js
 @ ./src/electron/main.ts

So then I ran npm install --save-dev node-loader worker-loader and tried rerunning webpack.

σ csaroff:~/fiddle/VoTT$ node_modules/.bin/webpack-cli --config config/webpack.dev.js
...
ERROR in ./node_modules/onnxjs/lib/wasm-binding.js
Module not found: Error: Can't resolve './worker/worker-main' in '/Users/csaroff/fiddle/VoTT/node_modules/onnxjs/lib'
 @ ./node_modules/onnxjs/lib/wasm-binding.js 101:29-96
 @ ./node_modules/onnxjs/lib/backends/backend-wasm.js
 @ ./node_modules/onnxjs/lib/api/onnx-impl.js
 @ ./node_modules/onnxjs/lib/api/index.js
 @ ./node_modules/onnxjs-node/lib/index.js
 @ ./src/electron/main.ts

And this is where I'm stuck. I see that you chose not to include lib/worker in your tsconfig file, but I couldn't figure out why. So I tried rebuilding the project after removing the "exclude": ["lib/worker"] line from tsconfig.json, but that just gives me a whole mess of TypeScript errors.

σ csaroff:~/fiddle/onnxjs$ npm run build

> onnxjs@0.1.7 build /Users/csaroff/fiddle/onnxjs
> tsc && node tools/build --build-wasm --build-bundle

node_modules/typescript/lib/lib.dom.d.ts:25:1 - error TS6200: Definitions of the following identifiers conflict with those in another file: EventListenerOrEventListenerObject, ImportExportKind, TableKind, BlobPart, HeadersInit, BodyInit, RequestInfo, DOMHighResTimeStamp, CanvasImageSource, OffscreenRenderingContext, MessageEventSource, ImageBitmapSource, TimerHandler, PerformanceEntryList, VibratePattern, AlgorithmIdentifier, HashAlgorithmIdentifier, BigInteger, NamedCurve, GLenum, GLboolean, GLbitfield, GLint, GLsizei, GLintptr, GLsizeiptr, GLuint, GLfloat, GLclampf, TexImageSource, Float32List, Int32List, BufferSource, DOMTimeStamp, FormDataEntryValue, IDBValidKey, Transferable, BinaryType, CanvasDirection, CanvasFillRule, CanvasLineCap, CanvasLineJoin, CanvasTextAlign, CanvasTextBaseline, ClientTypes, EndingType, IDBCursorDirection, IDBRequestReadyState, IDBTransactionMode, ImageSmoothingQuality, KeyFormat, KeyType, KeyUsage, NotificationDirection, NotificationPermission, OffscreenRenderingContextId, PermissionName, PermissionState, PushEncryptionKeyName, PushPermissionState, ReferrerPolicy, RequestCache, RequestCredentials, RequestDestination, RequestMode, RequestRedirect, ResponseType, ServiceWorkerState, ServiceWorkerUpdateViaCache, VisibilityState, WebGLPowerPreference, WorkerType, XMLHttpRequestResponseType

25 interface Account {
   ~~~~~~~~~

  node_modules/typescript/lib/lib.webworker.d.ts:25:1
    25 interface AddEventListenerOptions extends EventListenerOptions {
       ~~~~~~~~~
    Conflicts are in this file.
...

I'm pretty much out of ideas. I checked in a minimally reproducible example on this branch. Here's a diff of the changes. Any help/insight here would be great!
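For reference, the two "Module not found" errors above are webpack asking for loader wiring like the following. For an Electron main-process bundle, though, it is often simpler to mark onnxjs-node as an external so its native .node binding is never bundled at all. This is a sketch, assuming webpack 4 (which VoTT used at the time); the file layout is hypothetical:

```javascript
// Sketch of a webpack config for the Electron main-process bundle
// (assumes webpack 4; entry path taken from the errors above).
module.exports = {
  target: "electron-main", // main process: Node APIs and native addons allowed
  entry: "./src/electron/main.ts",
  module: {
    rules: [
      // Only needed if native .node addons end up inside the bundle:
      { test: /\.node$/, use: "node-loader" },
    ],
  },
  externals: {
    // Simpler alternative: don't bundle onnxjs-node at all;
    // let it be require()'d from node_modules at runtime.
    "onnxjs-node": "commonjs onnxjs-node",
  },
};
```

Note this only addresses the bundling errors, not the worker-loader issue inside onnxjs itself, which fs-eire explains below.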

fs-eire commented 5 years ago

onnxjs-node cannot work in browser environments. It does not work with worker-loader either, as the latter specifically targets code running in a web worker.

Supporting Shape is a known issue, tracked in #84.
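Running inference in the Electron main process and shipping results to the renderer over IPC, as csaroff planned above, does sidestep the browser limitation. One wrinkle is that tensor outputs are typed arrays, which have not always survived Electron's IPC serialization intact, so flattening them to plain objects first is a conservative choice. A minimal sketch (the helper and channel name are hypothetical, not part of any API):

```javascript
// Hypothetical helper: flatten an onnxjs-style output map (name -> tensor)
// into plain objects that are safe to send over Electron IPC / structured clone.
function serializeOutputs(outputMap) {
  const result = {};
  for (const [name, tensor] of outputMap) {
    result[name] = {
      dims: Array.from(tensor.dims), // e.g. [1, 1000]
      data: Array.from(tensor.data), // Float32Array -> plain number[]
    };
  }
  return result;
}

// In main.ts you would then wire up something like:
//   ipcMain.handle("run-model", async (_event, input) => {
//     const outputs = await inferenceSession.run([input]);
//     return serializeOutputs(outputs);
//   });
```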

28Smiles commented 4 years ago

The operator Shape is now available in version 0.1.8.

kbrose commented 3 years ago

I still see this error using version 0.1.8.

kbrose commented 3 years ago

The shape.ts file hit the master branch on Sep 1 but 0.1.8 was released on August 28, so maybe it just missed the cut to make it into 0.1.8?

JonathanSum commented 3 years ago

(screenshot of the error) Mine is 0.1.8 too! I am not using Node.js, but React.

28Smiles commented 3 years ago

What backend are you using? It's only implemented in the cpu backend.

JonathanSum commented 3 years ago

It failed on all backends, but the problem is solved now.

28Smiles commented 3 years ago

I don't quite understand your message, but it's good that it works.

Andredance commented 3 years ago

Hi, I am using backendHint="cpu" with onnxjs 0.1.8 but get the same error:

TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v9
    at Object.e.resolveOperator (onnx.min.js:1)
    at t.resolve (onnx.min.js:14)
    at e.initializeOps (onnx.min.js:14)
    at onnx.min.js:14
    at t.event (onnx.min.js:1)
    at e.initialize (onnx.min.js:14)
    at e.<anonymous> (onnx.min.js:14)
    at onnx.min.js:14
    at Object.next (onnx.min.js:14)
    at a (onnx.min.js:14)

@JonathanSum how did you solve the problem? @28Smiles do you have any advice on how to solve it?

kbrose commented 3 years ago

Repeating what I said above, the Shape operation was added after the latest version (v0.1.8) was released. We won't be able to use the Shape operation until a new version is released.

You can verify this by downloading the source snapshot for that release and seeing that it does not contain the file lib/ops/shape.ts.

EJShim commented 3 years ago

Can this be solved or not?

I am having the same error with the code below, using a model that worked fine in onnxruntime in Python and C++.

 <script src="https://cdn.jsdelivr.net/npm/onnxjs@0.1.8/dist/onnx.min.js"></script>
 <!-- Code that consumes ONNX.js -->
 <script>
   // create a session
   const myOnnxSession = new onnx.InferenceSession();
   // load the ONNX model file
   myOnnxSession.loadModel("./200_0.19.onnx").then(() => {
     // generate model input
     // const inferenceInputs = getInputs();

     // console.log(inferenceInputs)
     // execute the model
     // myOnnxSession.run(inferenceInputs).then((output) => {
     //   // consume the output
     //   const outputTensor = output.values().next().value;
     //   console.log(`model output tensor: ${outputTensor.data}.`);
     // });
   });
 </script>

I built onnxjs using master branch of this repository, but still getting same error.
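As an aside, the commented-out getInputs() in the snippet above usually amounts to packing canvas pixels into a Float32Array. A hedged sketch of what such a helper could look like (the function name, layout, and [0, 1] scaling are assumptions about this particular model, not onnxjs API):

```javascript
// Hypothetical getInputs()-style helper: pack RGBA pixel data (as returned by
// CanvasRenderingContext2D.getImageData().data) into a Float32Array laid out
// in NCHW order (all R values, then all G, then all B), scaled to [0, 1].
function rgbaToNchw(pixels, width, height) {
  const plane = width * height;
  const out = new Float32Array(3 * plane);
  for (let i = 0; i < plane; i++) {
    out[i] = pixels[4 * i] / 255;                 // red plane
    out[plane + i] = pixels[4 * i + 1] / 255;     // green plane
    out[2 * plane + i] = pixels[4 * i + 2] / 255; // blue plane
  }
  return out; // wrap with: new onnx.Tensor(out, "float32", [1, 3, height, width])
}
```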

dhirajnitk commented 3 years ago

Uncaught (in promise) TypeError: cannot resolve operator 'SVMClassifier' with opsets: ai.onnx v9, ai.onnx.ml v

I built onnxjs from the master branch, and it seems SVMClassifier is not among the supported ops for wasm (the list of supported operators is documented).

AiueoABC commented 3 years ago

I also have the same issue: "Uncaught (in promise) TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v9". Should I modify the model I'm using so it doesn't use Shape? I feel it would be fine if I could rewrite "Shape" to "Reshape" in the ONNX model, but is that possible? If not, do I have to retrain?

JonathanSum commented 3 years ago

Mine is solved. It was not very difficult to solve; the problem is related to ONNX itself.

Solution 1 (it should solve most people's problem, but I didn't use it): ONNX does not support every PyTorch feature. Sometimes you have to downgrade to an older ONNX version to get a certain feature, only to find the older version doesn't support another feature you need. But trust me, you will find a way around it, which is what I did when using React.

Solution 2: my own solution, which involved digging into ONNX itself. I suggest you talk to the ONNX developers to solve your issue.

It was not that difficult to solve. Here is mine, solved and running perfectly 🤣 (screenshot of the output)

If you have an interesting project, feel free to contact me. If I have free time, I may be able to help if it runs in TensorFlow, Keras (TF2), or PyTorch.

dhirajnitk commented 3 years ago

@JonathanSum Congratulations. Is there a ported SVMClassifier op that could be easily integrated with the wasm ops?

codingdudecom commented 3 years ago

What the heck, guys: "I solved this" with no further details, or "here are two solutions: one I don't know works, and the other is to talk to the ONNX developers."

I was hoping for an actual solution to this...

braindotai commented 3 years ago

This is not necessarily a webpack issue (as far as I know); there are multiple reasons why this error might occur. The most common is using dynamic reshaping operations in your PyTorch model definition in Python, like x.view(-1, x.size(0)) or x.reshape(x.shape[0], ...).

What solved the issue for me was making sure the Python code uses explicit numbers (1, 2, 256, 512, ...) rather than values like x.size(0).

To do this, you might need to log every reshaping step and print the shapes, then use those values to make each reshape explicit and convert the model to ONNX again.

Hope this finally helps 😪

hyperparameters commented 3 years ago

I'm getting TypeError: cannot resolve operator 'Shape' with opsets: ai.onnx v10 with a MobileNet model from the ONNX model zoo:

https://github.com/onnx/models/blob/master/vision/classification/mobilenet/model/mobilenetv2-7.onnx

using onnxjs@0.1.8