BruceDai / webnnt

Apache License 2.0

BKM #38

Open BruceDai opened 4 years ago

BruceDai commented 4 years ago

So far, our Web NN project supports ops from models in three formats: TFLite, ONNX, and OpenVINO. The ops we support, and those we are adding, are driven by use cases and demos that demonstrate hardware-accelerated neural network inference on the Web platform.

If you want to try your own model with our Web NN API, please check the supported-ops table first to see whether your model is fully covered. If all of your model's ops are supported, great, you can move on to the next step. Sometimes, however, you may find that some ops in your model are not yet supported by the current Web NN project. In that case, please open an issue on this GitHub repo describing the required ops in detail; we will first evaluate whether the requested ops are reasonable and meaningful. We will then offer suggestions and encourage you to implement those ops yourself, and a PR to our repo is very much appreciated.

What you should know

The Web NN polyfill API is modeled on the NN API and exposed to JavaScript. It has two backends, WASM and WebGL: the WASM backend uses nn_ops.js, which was compiled from TensorFlow branch r1.15, and the WebGL backend calls the TensorFlow.js Core API.
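The two-backend design described above can be sketched as a simple op registry that routes each op name to either a WASM or a WebGL implementation. Note that every name below (`OpRegistry`, `register`, `run`) is hypothetical and only illustrates the dispatch pattern; it is not the polyfill's actual code.

```javascript
// Minimal sketch of a two-backend op dispatch, assuming a registry keyed
// by backend name and op name. The real polyfill would route 'wasm' ops
// into nn_ops.js and 'webgl' ops into TensorFlow.js Core.
class OpRegistry {
  constructor() {
    this.backends = { wasm: {}, webgl: {} };
  }
  register(backend, opName, fn) {
    this.backends[backend][opName] = fn;
  }
  run(backend, opName, ...args) {
    const fn = this.backends[backend][opName];
    if (!fn) throw new Error(`${opName} not implemented on ${backend} backend`);
    return fn(...args);
  }
}

const registry = new OpRegistry();
// Plain-JS stand-ins for what would be nn_ops.js / TF.js kernel calls.
registry.register('wasm', 'RELU', (x) => x.map((v) => Math.max(0, v)));
registry.register('webgl', 'RELU', (x) => x.map((v) => Math.max(0, v)));

console.log(registry.run('wasm', 'RELU', [-1, 0, 2])); // → [ 0, 0, 2 ]
```

An op that is only implemented on one backend would throw on the other, which is essentially what "this op is not supported" means at runtime.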

Implementing a new op

To add and implement a new op in the Web NN polyfill API, you can follow our current workflow:

  1. Update src/nn/Enums.js and docs/api.md, as in PR#960

  2. Implement the new op for the WASM backend

```js
this.<NEWOPNAME> = OperationCode.<NEWOPNAME>;
```
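Concretely, steps 1 and 2 amount to adding a new entry to the operation enum and mirroring it on the model object so that it can be passed when building a graph. The sketch below is purely illustrative: `OperationCode`, `Model`, `addOperation`, and the op name `NEW_OP` are assumed names, not the polyfill's real identifiers or enum values.

```javascript
// Hypothetical sketch of steps 1-2 under assumed names.
// Step 1: the enum in src/nn/Enums.js would gain a new entry.
const OperationCode = Object.freeze({
  ADD: 0,
  MUL: 1,
  NEW_OP: 2, // illustrative value for the new op
});

// Step 2: the model exposes the code, mirroring
// `this.<NEWOPNAME> = OperationCode.<NEWOPNAME>;` from the text.
class Model {
  constructor() {
    this.NEW_OP = OperationCode.NEW_OP;
    this.operations = [];
  }
  // Record an operation by code with its input/output operand indices.
  addOperation(code, inputs, outputs) {
    this.operations.push({ code, inputs, outputs });
  }
}

const m = new Model();
m.addOperation(m.NEW_OP, [0], [1]);
console.log(m.operations[0].code); // → 2
```

The WASM backend would then map this code to the corresponding kernel in nn_ops.js, and the WebGL backend to a TensorFlow.js Core call.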