microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License

Runtime for web (browser) #3290

Open no-1ne opened 4 years ago

no-1ne commented 4 years ago

Hello folks, thank you for ONNX. There is Android NN API support in preview, but why no love for the web?

There was ONNX.js (https://github.com/microsoft/onnxjs), but it appears abandoned. It would be awesome to see ONNX running on the edge via the browser; please consider maintaining it.

Thank you, stay safe and healthy

pranavsharma commented 4 years ago

We're working on JavaScript bindings for onnxruntime. Stay tuned.

pranavsharma commented 4 years ago

> We're working on JavaScript bindings for onnxruntime. Stay tuned.

Javascript bindings were released in preview mode as part of 1.3 release. See here https://github.com/microsoft/onnxruntime/tree/master/nodejs.
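For readers landing here, a minimal sketch of what the Node.js bindings look like in use. This assumes the current `onnxruntime-node` npm package naming and a local `model.onnx` file; the package name, model path, input name, and shape are all illustrative assumptions, not details from this thread.

```javascript
// Sketch only: assumes the onnxruntime-node npm package is installed
// and a model.onnx file exists locally. Names/shapes are placeholders.
const ort = require('onnxruntime-node');

async function main() {
  // Load the model into an inference session.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Build an input tensor; the shape and dtype depend on your model.
  const data = Float32Array.from({ length: 1 * 3 * 224 * 224 }, () => Math.random());
  const input = new ort.Tensor('float32', data, [1, 3, 224, 224]);

  // The keys of the feeds object must match the model's input names.
  const results = await session.run({ input: input });
  console.log(Object.keys(results));
}

main();
```

The same `InferenceSession` / `Tensor` shape of API is what later carried over to the browser package.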

RicCu commented 4 years ago

The node bindings are a great addition, but will there be support for browsers as well? Maybe bringing back onnx.js?

pranavsharma commented 4 years ago

> The node bindings are a great addition, but will there be support for browsers as well? Maybe bringing back onnx.js?

We're still evaluating. cc @faxu

stale[bot] commented 4 years ago

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

no-1ne commented 4 years ago

Bot, please keep it open. The issue is still being evaluated.

stale[bot] commented 3 years ago

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

no-1ne commented 3 years ago

Having a privacy-friendly runtime that can run inference without sending data elsewhere would be great to have, and there is already ONNX.js (although unmaintained).

For everyone else looking for a privacy-friendly alternative, there are OpenCV.js, TensorFlow.js, and WebML:
- https://docs.opencv.org/master/d5/d86/tutorial_dnn_javascript.html
- https://github.com/tensorflow/tfjs
- https://github.com/intel/webml-polyfill

EJShim commented 3 years ago

Are there any precision or compatibility problems when converting my Torch model -> ONNX -> (tf.js or cv.js)? Converting into ONNX is fine, but converting again into TF or CV makes me nervous.
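For context, the first hop of that pipeline is the standard `torch.onnx.export` call. A sketch, assuming PyTorch and torchvision are installed; the MobileNetV2 model, file name, and shapes are placeholders, not details from this thread.

```python
# Sketch of the Torch -> ONNX step; model and shapes are illustrative.
import torch
import torchvision

model = torchvision.models.mobilenet_v2(pretrained=True).eval()
dummy = torch.randn(1, 3, 224, 224)

# Pin an explicit opset: opset mismatches are a common source of the
# compatibility problems that surface when re-converting to other runtimes.
torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["input"], output_names=["output"],
    opset_version=11,
)
```

Checking numerical parity between the Torch and ONNX outputs on the same dummy input is a cheap way to localize precision drift to one conversion step.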

no-1ne commented 3 years ago

@EJShim please try with this https://github.com/daquexian/onnx-simplifier
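The suggested tool is invoked from the command line; a sketch assuming Python and pip are available, with placeholder file names.

```shell
# Install and run onnx-simplifier; input/output paths are placeholders.
pip install onnx-simplifier
python3 -m onnxsim model.onnx model_simplified.onnx
```

It folds constant subgraphs and removes redundant nodes, which often helps downstream converters that choke on exporter-generated graph patterns.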

EJShim commented 3 years ago

@startupgurukul Thank you. I recently succeeded in converting my MobileNet-UNet Torch model -> ONNX -> TF -> tf.js, and it predicts perfectly in the browser, but inference got much slower (0.06s -> 15.0s). I am going to try that simplifier before converting to the TF model.

aohan237 commented 3 years ago

Vote for this feature.

Update

I had to use TensorFlow.js to get this working in the meantime.

faxu commented 3 years ago

CC @hanbitmyths

EmergentOrder commented 3 years ago

See https://github.com/microsoft/onnxruntime/pull/6478, which enables compiling ONNX Runtime into WebAssembly. It also mentions a forthcoming "ONNX Runtime Web" in TypeScript. Very nice, and it looks to be a clear step up from the current state of ONNX.js.

EmergentOrder commented 3 years ago

ONNX Runtime Web has been merged: https://github.com/microsoft/onnxruntime/pull/7394

fs-eire commented 3 years ago

ONNX Runtime Web will be released in v1.8.0. We have finished the feature work and are currently working on documentation and examples.
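For anyone following this thread, a minimal sketch of what browser-side usage looks like with the `onnxruntime-web` package. The model path, input name, shape, and execution-provider choice are illustrative assumptions, not details from this thread.

```javascript
// Sketch only: assumes a bundler resolving the onnxruntime-web package
// and a model.onnx served alongside the page. Names are placeholders.
import * as ort from 'onnxruntime-web';

async function run() {
  // WebAssembly is the portable backend; 'webgl' can be requested instead.
  const session = await ort.InferenceSession.create('./model.onnx', {
    executionProviders: ['wasm'],
  });

  // Shape and input name must match the model.
  const data = Float32Array.from({ length: 1 * 3 * 224 * 224 }, () => 0);
  const input = new ort.Tensor('float32', data, [1, 3, 224, 224]);

  const results = await session.run({ input });
  console.log(Object.keys(results));
}

run();
```

Because inference runs entirely in the browser, no input data leaves the client, which addresses the privacy concern raised earlier in this thread.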