apache / beam

Apache Beam is a unified programming model for Batch and Streaming data processing.
https://beam.apache.org/
Apache License 2.0

[Failing Test]: Onnx inference unit tests are failing. #31254

Open tvalentyn opened 2 months ago

tvalentyn commented 2 months ago

What happened?

Due to test configuration issues, these tests were not part of the tox unit test suite. Enabling them causes the failures below (see the sketch after the list for a likely failure mode):

FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_large_model - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_sets_env_vars - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
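
Most of these failures are the same `AttributeError` on `_batching_kwargs`, which is the pattern you get when a handler subclass overrides `__init__` without chaining to the base initializer that sets that attribute. The snippet below is a minimal, self-contained sketch of that failure mode; the class names are hypothetical and do not reproduce Beam's actual `ModelHandler` code:

```python
# Hypothetical illustration of the AttributeError pattern above; not Beam code.


class BaseHandler:
  def __init__(self, *, max_batch_size=None):
    # The base initializer is what creates _batching_kwargs.
    self._batching_kwargs = {}
    if max_batch_size is not None:
      self._batching_kwargs['max_batch_size'] = max_batch_size

  def batch_elements_kwargs(self):
    return self._batching_kwargs


class BrokenTestHandler(BaseHandler):
  def __init__(self, model_uri):
    # Bug: super().__init__() is never called, so _batching_kwargs is
    # never set on the instance.
    self.model_uri = model_uri


class FixedTestHandler(BaseHandler):
  def __init__(self, model_uri):
    # Chaining to the parent ensures _batching_kwargs exists.
    super().__init__()
    self.model_uri = model_uri


if __name__ == '__main__':
  try:
    BrokenTestHandler('gs://bucket/model.onnx').batch_elements_kwargs()
  except AttributeError as e:
    print(f'reproduced: {e}')  # ... object has no attribute '_batching_kwargs'
  print(FixedTestHandler('gs://bucket/model.onnx').batch_elements_kwargs())
```

If the real `TestOnnxModelHandler` overrides `__init__`, chaining to the parent (or setting the same attributes) would be the analogous fix; the `AssertionError: True is not false` failures likely need a separate look.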

Issue Failure

Failure: Test is continually failing

Issue Priority

Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)

Issue Components

tvalentyn commented 2 months ago

We should also enable onnx in the dependency compatibility test suite: https://github.com/apache/beam/issues/25796. Beam supports protobuf 3, so we should still be able to test onnx even if it doesn't support protobuf 4.
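
One way to express that constraint would be a dedicated test extra that pins protobuf below 4 alongside onnx. This is only an illustrative setup.py-style sketch; the package name, extra name, and pins are assumptions, not Beam's actual packaging or tox configuration:

```python
# Illustrative only: a hypothetical extras_require layout that keeps protobuf
# on a 3.x release for the onnx test environment, on the assumption that onnx
# does not yet support protobuf 4.
from setuptools import setup

setup(
    name='example-onnx-test-env',  # hypothetical package name
    version='0.0.1',
    extras_require={
        # Install with: pip install '.[onnx-test]'
        'onnx-test': [
            'onnx',
            'onnxruntime',
            'protobuf<4',
        ],
    },
)
```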