intel / webml-polyfill

Deprecated, the Web Neural Network Polyfill project has been moved to https://github.com/webmachinelearning/webnn-polyfill
Apache License 2.0

[Test] add ONNX models tests by reusing their test data sets #300

Closed huningxin closed 5 years ago

huningxin commented 5 years ago

Some ONNX models ship with test data sets. It would be good to leverage those data sets to test the WebML API implementation.

@pinzhenx already has some initial work to load and use the data sets. @BruceDai, please work with @pinzhenx to create test cases based on them. Thanks!
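For context, an ONNX model package with test data typically contains the model file plus one or more `test_data_set_N` directories holding serialized `TensorProto` files (`input_0.pb`, `output_0.pb`, ...). A minimal sketch of enumerating those data sets (the `list_test_cases` helper name is mine, not part of this repo):

```python
import glob
import os

def list_test_cases(model_dir):
    """Enumerate the test_data_set_* directories that ship with an ONNX
    model, returning (input .pb paths, output .pb paths) per data set."""
    cases = []
    for data_set in sorted(glob.glob(os.path.join(model_dir, 'test_data_set_*'))):
        inputs = sorted(glob.glob(os.path.join(data_set, 'input_*.pb')))
        outputs = sorted(glob.glob(os.path.join(data_set, 'output_*.pb')))
        cases.append((inputs, outputs))
    return cases
```

Each `.pb` file can then be parsed into an `onnx.TensorProto` and converted to a NumPy array with `onnx.numpy_helper.to_array`, as in the script below.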

ibelem commented 5 years ago

ONNX end-to-end test

BruceDai commented 5 years ago

This test case is at the model level; I will submit a PR soon.

BruceDai commented 5 years ago

Currently I find that the output data from the WASM backend is within 1e-3 of the reference output data. I also ran native Python code with the TensorFlow backend; its output data is likewise within 1.5e-3 of the reference output data.

# Native Python 3 code:

import glob
import os

import numpy as np
import onnx
from onnx import numpy_helper
from onnx_tf.backend import prepare

onnx_model = onnx.load("path/squeezenet1.1/squeezenet1.1.onnx")
data_dir = 'path/squeezenet1.1/test_data_set_0'

# Load the serialized input tensors (input_*.pb).
inputs = []
inputs_num = len(glob.glob(os.path.join(data_dir, 'input_*.pb')))
for i in range(inputs_num):
    input_file = os.path.join(data_dir, 'input_{}.pb'.format(i))
    tensor = onnx.TensorProto()
    with open(input_file, 'rb') as f:
        tensor.ParseFromString(f.read())
    inputs.append(numpy_helper.to_array(tensor))

# Load the serialized reference output tensors (output_*.pb).
ref_outputs = []
ref_outputs_num = len(glob.glob(os.path.join(data_dir, 'output_*.pb')))
for i in range(ref_outputs_num):
    output_file = os.path.join(data_dir, 'output_{}.pb'.format(i))
    tensor = onnx.TensorProto()
    with open(output_file, 'rb') as f:
        tensor.ParseFromString(f.read())
    ref_outputs.append(numpy_helper.to_array(tensor))

# Run the model with the onnx-tf (TensorFlow) backend.
outputs = list(prepare(onnx_model).run(inputs[0]))

# Compare the results with reference outputs.
for ref_o, o in zip(ref_outputs, outputs):
    np.testing.assert_almost_equal(ref_o, o, decimal=3)  # with decimal=3 the check passes

#help(np.testing.assert_almost_equal)
#assert_almost_equal(actual, desired, decimal=7, err_msg='', verbose=True)
#    Raises an AssertionError if two items are not equal up to desired
#    precision.
#    
#    The test verifies that the elements of ``actual`` and ``desired`` satisfy
#    
#        ``abs(desired-actual) < 1.5 * 10**(-decimal)``
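To make that threshold concrete, a quick self-contained check (the values are illustrative, not taken from the SqueezeNet run): `decimal=3` tolerates an absolute error up to 1.5e-3, so an error of 9e-4 passes, while `decimal=4` tightens the bound to 1.5e-4 and the same error fails.

```python
import numpy as np

ref = np.array([0.1000, 0.2000])
out = np.array([0.1009, 0.2009])  # absolute error of 9e-4

# decimal=3 allows abs(ref - out) < 1.5e-3, so this passes silently.
np.testing.assert_almost_equal(ref, out, decimal=3)

# decimal=4 allows only abs(ref - out) < 1.5e-4, so this raises.
try:
    np.testing.assert_almost_equal(ref, out, decimal=4)
except AssertionError:
    print("decimal=4 fails for the same outputs")
```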
huningxin commented 5 years ago

Thanks @BruceDai! I would suggest filing an issue with ONNX and getting feedback from there.