microsoft / onnxjs

ONNX.js: run ONNX models using JavaScript

webgl seems not to work (or be used) #268

Open weingaunity opened 3 years ago

weingaunity commented 3 years ago

Hi,

I'm using onnx.js v0.1.8 to run MobileNetV2 exported from PyTorch. I had to make some modifications because of unsupported operators created by an adaptive average pooling 2d layer (see #265), but now it works and produces the same output as PyTorch when given the same image.

New issue: webgl seems not to work. A TensorFlow.js version of MobileNetV2 achieved approx. 10 fps inference.

onnx.js - backend - cpu: approx. 0.5 fps
onnx.js - backend - wasm: approx. 0.9 fps
onnx.js - backend - webgl: approx. 0.5 fps

... so I think webgl is not used there.

I used the following code to activate webgl.

What is wrong?

Thx. Klaus

I downloaded all the files from the CDN and placed them in the same directory as index.html:

// Create a session with the WebGL backend hint
var mdl = "./pytorch_mobilenetv2.onnx";
const myOnnxSession = new onnx.InferenceSession({ backendHint: 'webgl' });
// Load the model asynchronously; inference is done later via session.run()
myOnnxSession.loadModel(mdl).then(() => {});
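
For reference, a minimal sketch of how one could time a single inference once the model is loaded. The 1x3x224x224 NCHW shape and the dummy input data are assumptions for MobileNetV2; session.run takes an array of onnx.Tensor objects and resolves to a Map of output tensors:

myOnnxSession.loadModel(mdl).then(() => {
  // Dummy input; real code would feed preprocessed image data here
  const data = new Float32Array(1 * 3 * 224 * 224);
  const inputTensor = new onnx.Tensor(data, 'float32', [1, 3, 224, 224]);
  const t0 = performance.now();
  return myOnnxSession.run([inputTensor]).then((outputMap) => {
    console.log('inference took', (performance.now() - t0).toFixed(1), 'ms');
    console.log(outputMap.values().next().value.data); // first output tensor
  });
});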
weingaunity commented 3 years ago

When using webgl I get the following warnings, but as mentioned, there is no speed improvement compared to cpu.

WebGL warning: drawArraysInstanced: Using format enabled by implicitly enabled extension: EXT_float_blend. For maximal portability enable it explicitly. 31 webgl-context.ts:191:12
WebGL: No further warnings will be reported for this WebGL context. (already reported 32 warnings)

smilemakc commented 3 years ago

@weingaunity are you running on a MacBook?

weingaunity commented 3 years ago

@weingaunity are you running on a MacBook?

No, I'm using a laptop with Linux. My solution was to convert the ONNX file to a TensorFlow.js model; based on that, I'm now using the following lines:

pip install tensorflowjs
# don't use: pip install onnx_tf
pip install git+https://github.com/onnx/onnx-tensorflow.git
import tensorflow as tf
import tensorflowjs as tfjs
import onnx
from onnx_tf.backend import prepare
from tensorflow.python.tools.import_pb_to_tensorboard import import_to_tensorboard

# Load the ONNX model and convert it to a TensorFlow representation
onnx_model = onnx.load("output/pytorch_mobilenetv2.onnx")
tf_rep = prepare(onnx_model)  # returns a TensorflowRep object representing the ONNX model
# Export the graph as a TensorFlow SavedModel
tf_rep.export_graph("output/pytorch_mobilenetv2.pb")
# Optional: inspect the exported graph in TensorBoard
#import_to_tensorboard("output/pytorch_mobilenetv2.pb", "tb_log", "")
# Sanity check that the SavedModel loads, then convert it for TensorFlow.js
mdl2 = tf.saved_model.load("output/pytorch_mobilenetv2.pb")
tfjs.converters.convert_tf_saved_model("output/pytorch_mobilenetv2.pb", "output/js")
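
As a usage note, the converted model can then be loaded in the browser with TensorFlow.js. A minimal sketch, assuming the contents of output/js are served next to index.html and the tfjs script is already included; the input shape/layout here is an assumption, since the graph exported from PyTorch via onnx-tf may expect NCHW input:

// Load the converted graph model and run a dummy prediction
tf.loadGraphModel('./js/model.json').then((model) => {
  const input = tf.zeros([1, 3, 224, 224]); // dummy input; adjust shape/layout to the exported graph
  const output = model.predict(input);
  output.print();
});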