What happened?
Due to test configuration issues, these tests were not part of the tox unit test suite. Enabling them causes failures:
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_large_model - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_model_handler_sets_env_vars - AssertionError: True is not false
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxPytorchRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxTensorflowRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_invalid_input_type - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_gcs_model - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
FAILED apache_beam/ml/inference/onnx_inference_test.py::OnnxSklearnRunInferencePipelineTest::test_pipeline_local_model_simple - AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'
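The repeated `AttributeError: 'TestOnnxModelHandler' object has no attribute '_batching_kwargs'` is the classic symptom of a subclass overriding `__init__` without calling the base initializer, so base-class attributes that framework code later reads are never set. A minimal sketch of the failure mode and its fix (class names here are hypothetical, not Beam's actual classes):

```python
class BaseModelHandler:
    def __init__(self):
        # The base initializer sets attributes that shared code later reads.
        self._batching_kwargs = {}

    def batch_elements_kwargs(self):
        return self._batching_kwargs


class BrokenHandler(BaseModelHandler):
    def __init__(self, model_uri):
        # Bug: super().__init__() is never called, so _batching_kwargs
        # is never created on the instance.
        self.model_uri = model_uri


class FixedHandler(BaseModelHandler):
    def __init__(self, model_uri):
        super().__init__()  # base attributes initialized first
        self.model_uri = model_uri


try:
    BrokenHandler("gs://bucket/model.onnx").batch_elements_kwargs()
except AttributeError as e:
    print("broken:", e)

print("fixed:", FixedHandler("gs://bucket/model.onnx").batch_elements_kwargs())
```

If the test's `TestOnnxModelHandler` subclass (or the handler it wraps) took this shape, adding the missing `super().__init__()` call would resolve the `_batching_kwargs` failures; the `AssertionError: True is not false` failures in the large-model and env-var tests would need separate investigation.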
Issue Failure
Failure: Test is continually failing
Issue Priority
Priority: 1 (unhealthy code / failing or flaky postcommit so we cannot be sure the product is healthy)
We should also enable onnx in the dependency compat test suite: https://github.com/apache/beam/issues/25796. Beam supports protobuf 3, so we should still be able to test onnx even though onnx doesn't support protobuf 4.
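The protobuf constraint for such a test environment can be sketched as a tox `deps` entry (the environment name and version bounds below are illustrative, not Beam's actual configuration):

```ini
[testenv:py39-onnx]
deps =
    # Keep protobuf on 3.x: Beam supports protobuf 3, but onnx
    # does not yet support protobuf 4.
    protobuf>=3.20.3,<4
    onnx
    onnxruntime
```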
Issue Components