Closed — umangyadav closed this issue 3 weeks ago
@zjgarvey @vivekkhandelwal1
What was the input tensor for that test-run?
Not sure. I just ran with a modified exponent: https://github.com/nod-ai/SHARK-TestSuite/blob/6b93f7b084ffde34bb144e44489eb6b50a9ddfc6/e2eshark/onnx/operators/Pow/model.py
@umangyadav It looks to me like the issue results when the input values are negative. ONNX apparently refuses to compute a decimal power of a negative number (for good reason, since the results would be possibly non-unique and complex-valued in general).
The results computed by our path seem to be consistent with `base**exp` from Python. I'm not sure whether this warrants a fix or not, but you could resolve the issue by specifying that the base consists of positive values in the tests that rely on fractional-exponent pow ops.
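A minimal sketch of the discrepancy described above (plain Python, not the actual test code): Python's `**` operator returns a complex result for a negative base with a fractional exponent, while `math.pow` refuses outright, which matches ONNX declining to define this case over the reals.

```python
import math

base, exp = -8.0, 0.333

# Python's ** promotes to complex for a negative base with a
# non-integer exponent, so the result is well-defined but complex.
print(base ** exp)

# math.pow works purely over the reals and raises instead.
try:
    math.pow(base, exp)
except ValueError as e:
    print("math.pow:", e)  # math domain error
```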
I suppose this leads me to some partial concern:
With lp pooling, does it compute `Sum(abs(x)^p for x in kernel window)^(1/p)`, or the same without the absolute value?
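The LpPool computation in question can be sketched in a few lines of plain Python (this is an illustration, not the torch-mlir lowering). Without `abs()`, a negative element in the window combined with the fractional outer exponent `1/p` would hit exactly the undefined-pow case discussed above.

```python
def lp_pool_1d(xs, kernel, p):
    """Sliding-window Lp pooling: Sum(abs(x)^p for x in window)^(1/p)."""
    out = []
    for i in range(len(xs) - kernel + 1):
        window = xs[i:i + kernel]
        # abs() keeps the base non-negative, so the fractional
        # root (1/p) stays well-defined over the reals.
        out.append(sum(abs(x) ** p for x in window) ** (1.0 / p))
    return out

print(lp_pool_1d([1.0, -2.0, 3.0, -4.0], kernel=2, p=3))
```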
Looks like it does abs.
I'll change it to use abs.
https://github.com/llvm/torch-mlir/pull/3449/files
Updated TorchOnnxToOnnx to add an abs op before computing the lp norm, to avoid raising negative values to a fractional power.
All tests pass now. Thanks @zjgarvey
To reproduce, change the exponent value from 2 to 0.333 here: https://github.com/nod-ai/SHARK-TestSuite/blob/6b93f7b084ffde34bb144e44489eb6b50a9ddfc6/e2eshark/onnx/operators/Pow/model.py#L34