nod-ai / SHARK-TestSuite

Temporary home of a test suite we are evaluating

Pow test fails if exponent is 0.333 #264

Closed umangyadav closed 3 weeks ago

umangyadav commented 3 weeks ago

To reproduce, change the exponent value from 2 to 0.333 here: https://github.com/nod-ai/SHARK-TestSuite/blob/6b93f7b084ffde34bb144e44489eb6b50a9ddfc6/e2eshark/onnx/operators/Pow/model.py#L34

Gold reference[output[0]]:
tensor([0.8759, 1.1186,    nan, 0.6297,    nan,    nan, 1.1319, 0.4828, 1.3948,
        1.1382,    nan,    nan])

Inference Output[output[0]]:
tensor([ 0.8759,  1.1186, -0.7707,  0.6297, -0.9900, -0.7919,  1.1319,  0.4828,
         1.3948,  1.1382, -0.7178, -1.0410])

Element-wise difference[output[0]]:
tensor([ True,  True, False,  True, False, False,  True,  True,  True,  True,
        False, False])

Percentage element-wise match[0]:58.33%
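For reference, a standalone sketch of the same failure mode, assuming nothing beyond the onnx package (the opset version and input values below are illustrative, not the test suite's): the ONNX reference evaluator returns NaN wherever the base is negative.

```python
# Minimal sketch: Pow with a fractional exponent over a mixed-sign base,
# run through the ONNX reference evaluator (values/opset are illustrative).
import numpy as np
from onnx import TensorProto, helper
from onnx.reference import ReferenceEvaluator

node = helper.make_node("Pow", inputs=["x", "y"], outputs=["z"])
graph = helper.make_graph(
    [node],
    "pow_repro",
    inputs=[
        helper.make_tensor_value_info("x", TensorProto.FLOAT, [4]),
        helper.make_tensor_value_info("y", TensorProto.FLOAT, []),
    ],
    outputs=[helper.make_tensor_value_info("z", TensorProto.FLOAT, [4])],
)
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 17)])

x = np.array([0.5, -0.5, 1.5, -1.5], dtype=np.float32)
y = np.array(0.333, dtype=np.float32)
print(ReferenceEvaluator(model).run(None, {"x": x, "y": y})[0])
# -> [0.79...  nan  1.14...  nan]: negative bases come back as NaN
```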
umangyadav commented 3 weeks ago

@zjgarvey @vivekkhandelwal1

zjgarvey commented 3 weeks ago

What was the input tensor for that test-run?

umangyadav commented 3 weeks ago

Not sure. I just ran it with the modified exponent: https://github.com/nod-ai/SHARK-TestSuite/blob/6b93f7b084ffde34bb144e44489eb6b50a9ddfc6/e2eshark/onnx/operators/Pow/model.py

zjgarvey commented 3 weeks ago

@umangyadav It looks to me like the issue arises when the input values are negative. ONNX apparently refuses to compute a fractional power of a negative number, for good reason: in general the result is non-unique and complex-valued.

The results computed by our path seem to be consistent with base**exp from Python. I'm not sure whether this warrants a fix, but you could resolve the issue by specifying that the base consists of positive integers in the tests that rely on fractional-exponent Pow ops.
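To illustrate the non-uniqueness point (example values mine, not from the test): a negative number has a real cube root, but the principal value that exp/log-based definitions produce is complex, so there is no single obviously-correct real answer, and NumPy (which the ONNX reference implementation builds on) returns NaN instead.

```python
import cmath

import numpy as np

# The real cube root of -8 is -2, but the principal value of (-8)**(1/3)
# computed via exp(log(-8)/3) is complex:
print((-8) ** (1 / 3))                # (1.0...+1.732...j) in Python 3
print(cmath.exp(cmath.log(-8) / 3))   # same principal root, ~1 + 1.732j

# NumPy sidesteps the ambiguity and returns NaN for a negative base with a
# fractional exponent:
print(np.power(np.float32(-8.0), np.float32(1 / 3)))   # nan (with a warning)
```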

zjgarvey commented 3 weeks ago

I suppose this leads me to some partial concern:

With lp pooling, does it compute Sum(abs(x)^p for x in kernel window)^(1/p) or without the absolute value?

umangyadav commented 3 weeks ago

> I suppose this leads me to some partial concern:
>
> With lp pooling, does it compute Sum(abs(x)^p for x in kernel window)^(1/p) or without the absolute value?

https://github.com/onnx/onnx/blob/375c161c67855fea9612c15b83ebff40fca838a4/onnx/reference/ops/op_lp_pool.py#L31

Looks like it does use abs. I'll change our lowering to use abs as well.
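For illustration, a tiny sketch of that formula over a single kernel window (the helper name and values are made up, not the reference code):

```python
import numpy as np

def lp_pool_window(window, p):
    # ONNX LpPool over one kernel window: (sum(|x|^p))^(1/p).
    # The abs keeps the inner pow away from negative bases entirely.
    return np.sum(np.abs(window) ** p) ** (1.0 / p)

window = np.array([0.5, -1.2, 0.8], dtype=np.float32)
print(lp_pool_window(window, p=2))   # ~1.53, no NaN despite the negative entry
# Without the abs, negative values would eventually feed a fractional pow
# (either x ** p or the final ** (1/p)), which is the NaN case from the
# Pow test above.
```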

umangyadav commented 3 weeks ago

Updated TorchOnnxToTorch in https://github.com/llvm/torch-mlir/pull/3449/files to add an abs op before computing the Lp norm, which avoids this issue.

All tests pass now. Thanks @zjgarvey