-
-
TensorFlow, the TensorFlow logo and any related marks are trademarks of Google Inc.
https://www.tensorflow.org/extras/tensorflow_brand_guidelines.pdf
ONNX is a community project created by Faceboo…
-
Test Env:
Chromium Version: nightly build 75.0.3739.0 (34a8a0d)
Platform: macOS/Linux/Android/Windows
Expected Result:
The directory listing of error messages in the console log should display right…
-
### Test Env
Chromium Version: nightly build 65.0.3324.0 (revision 3d99cdd)
Operating System: macOS
### Expected behavior
The test should pass.
### Actual behavior
An error happened as foll…
-
https://github.com/intel/webml-polyfill/blob/master/docs/native_mapping.md
For NNAPI | MPS | BNNS | clDNN | ONNX
-
**Test Env:**
Chromium Version: nightly build 75.0.3739.0 (1a2331d)
Platform: Linux/Windows/macOS
**Expected Result:**
The DeepLab (without Atrous) performance should be improved.
**Actual Res…
-
Hi folks, awesome work on reaching native performance via the web.
Coming to my question: to use WebML, can we just use the polyfill? It doesn't seem to be working, though. (Should we build the webml …
-
**Chromium build version:** https://github.com/otcshare/chromium-src/commit/f456c6d52199f5c81560e6671a7b9a496d160e6b
**Actual Result:**
The MKLDNN backend doesn't work on macOS:
```
$ ./f456c6d…
-
As the title says, is there any sample code showing how to
1. load model
2. inference
3. measure time
I want to compare the performance between this proposal (webml) and onnx.js
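For the timing part of such a comparison, here is a minimal sketch of a backend-agnostic harness. `runInference` is a hypothetical stand-in for a single forward pass of whichever backend is under test (the WebML polyfill or ONNX.js); the actual model-loading and inference calls of those libraries are not shown here and would need to be plugged in.

```javascript
// Minimal timing harness: measure the average latency of an async
// inference function. `runInference` is a hypothetical placeholder for
// one forward pass of the backend being benchmarked.
async function measureAverageMs(runInference, { warmup = 5, iterations = 50 } = {}) {
  // Warm-up runs let JIT compilation and backend initialization settle
  // so they don't distort the measured average.
  for (let i = 0; i < warmup; i++) {
    await runInference();
  }
  const start = performance.now();
  for (let i = 0; i < iterations; i++) {
    await runInference();
  }
  const elapsed = performance.now() - start;
  return elapsed / iterations; // average milliseconds per inference
}

// Self-contained example: a fake ~10 ms "inference" so the harness runs
// without any ML backend installed.
const fakeInference = () => new Promise((resolve) => setTimeout(resolve, 10));

measureAverageMs(fakeInference, { warmup: 2, iterations: 10 }).then((ms) => {
  console.log(`average inference time: ${ms.toFixed(2)} ms`);
});
```

Averaging over many iterations (after warm-up) matters here because single-run timings in the browser are noisy; the same harness can then be pointed at both backends with identical inputs for a fair comparison.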
-
According to [Apple* Machine Learning on Intel® Processor Graphics](https://software.intel.com/en-us/articles/apple-machine-learning-on-intel-processor-graphics), the MobileNet inference performance i…