microsoft / onnxscript

ONNX Script enables developers to naturally author ONNX functions and models using a subset of Python.
https://onnxscript.ai/
MIT License

[torchlib] Unregister stft, var, var_mean, std, std_mean #1867

Closed: justinchuby closed this 1 week ago

justinchuby commented 1 week ago

Following https://github.com/pytorch/pytorch/pull/136153, we remove the `stft`, `var`, `var_mean`, `std`, and `std_mean` ops. They were never actually exercised, because these ops were always decomposed before reaching torchlib.

codecov[bot] commented 1 week ago

:x: 12 Tests Failed:

| Tests completed | Failed | Passed | Skipped |
| --- | --- | --- | --- |
| 13926 | 12 | 13914 | 1623 |
View the top 3 failed tests by shortest run time:

**onnxscript.backend.onnx_export_test.TestOnnxBackEnd** `test_export2python_produces_correct_onnx_script_model_1001_test_scatter_without_axis`

Stack trace (0.005s run time):

```
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\importlib\__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_scatter_without_axis'

The above exception was the direct cause of the following exception:

.nox\test\lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_scatter_without_axis' (e=No module named 'tests.onnx_backend_test_code.test_scatter_without_axis') (file: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_scatter_without_axis.py', absolute path: 'D:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_scatter_without_axis.py', current folder: D:\a\onnxscript\onnxscript
E   ---- CONTENT --
```

Generated test file content:

```python
import numpy
from onnx import TensorProto
from onnx.helper import make_tensor
from onnxscript import script, external_tensor
from onnxscript.values import Opset
from onnxscript.onnx_types import FLOAT, INT64
from onnxscript.onnx_opset import opset10

@script()
def bck_test_scatter_without_axis(data: FLOAT[3,3], indices: INT64[2,3], updates: FLOAT[2,3]) -> (FLOAT[3,3]):
    y = opset10.Scatter(data, indices, updates)
    return y
```
**tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU** `test_output_match_opinfo__erfc_cpu_float16`

Stack trace (0.172s run time):

```
.../function_libs/torch_lib/ops_test.py:252: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Tensor-likes are not close!
E
E   Mismatched elements: 1 / 20 (5.0%)
E   Greatest absolute difference: 0.00023031234741210938 at index (2,) (up to 0.0002 allowed)
E   Greatest relative difference: 0.892578125 at index (2,) (up to 0.01 allowed)
```
**tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU** `test_output_match_opinfo__div_mode_floor_rounding_cpu_float16`

Stack trace (0.18s run time):

```
Unexpected success
```
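The `erfc` float16 failure above is a tolerance mismatch from `torch.testing.assert_close`, which flags an element when its absolute error exceeds `atol + rtol * |expected|` (the same rule as `numpy.isclose`). A minimal sketch of that check in NumPy, assuming the tolerances shown in the log (`atol=2e-4`, `rtol=1e-2`) and hypothetical values chosen to reproduce the reported mismatch near zero:

```python
import numpy as np

def mismatched(actual, expected, rtol=1e-2, atol=2e-4):
    """Return a boolean mask of elements that fail an assert_close-style check.

    An element mismatches when |actual - expected| > atol + rtol * |expected|.
    """
    actual = np.asarray(actual, dtype=np.float64)
    expected = np.asarray(expected, dtype=np.float64)
    return np.abs(actual - expected) > atol + rtol * np.abs(expected)

# Hypothetical values: the log's absolute difference (~2.3e-4) is tiny, but
# when the expected value is itself near zero, the relative error blows up
# and the absolute error alone already exceeds atol.
print(mismatched([2.6e-4 + 2.3031234741210938e-4], [2.6e-4]))   # fails
print(mismatched([1.0002], [1.0]))                              # passes
```

This illustrates why float16 ops near zero are prone to flaky tolerance failures: the `rtol` term contributes almost nothing to the allowed error, so the check degenerates to a bare `atol` comparison.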

To compare individual test run times against the main branch, go to the Test Analytics Dashboard.