apache / beam

Apache Beam is a unified programming model for Batch and Streaming data processing.
https://beam.apache.org/
Apache License 2.0

The PostCommit Python job is flaky #30513

Closed: github-actions[bot] closed this issue 1 week ago

github-actions[bot] commented 6 months ago

The PostCommit Python job is failing over 50% of the time. Please visit https://github.com/apache/beam/actions/workflows/beam_PostCommit_Python.yml?query=is%3Afailure+branch%3Amaster to see the logs.

shunping commented 6 months ago

It first failed on https://github.com/apache/beam/actions/runs/8210266873.

The failed task is :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch.

Traceback:

INFO:apache_beam.utils.subprocess_server:Starting service with ('java' '-jar' '/runner/_work/beam/beam/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.56.0-SNAPSHOT.jar' '--spark-master-url' 'local[4]' '--artifacts-dir' '/tmp/beam-temp8q8022zi/artifactsg6e8usou' '--job-port' '56313' '--artifact-port' '0' '--expansion-port' '0')
INFO:apache_beam.utils.subprocess_server:Error: A JNI error has occurred, please check your installation and try again
INFO:apache_beam.utils.subprocess_server:Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/beam/vendor/grpc/v1p60p1/io/grpc/BindableService
INFO:apache_beam.utils.subprocess_server:   at java.lang.ClassLoader.defineClass1(Native Method)
INFO:apache_beam.utils.subprocess_server:   at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
INFO:apache_beam.utils.subprocess_server:   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
INFO:apache_beam.utils.subprocess_server:   at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
INFO:apache_beam.utils.subprocess_server:   at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
INFO:apache_beam.utils.subprocess_server:   at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
INFO:apache_beam.utils.subprocess_server:   at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
INFO:apache_beam.utils.subprocess_server:   at java.security.AccessController.doPrivileged(Native Method)
INFO:apache_beam.utils.subprocess_server:   at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
INFO:apache_beam.utils.subprocess_server:   at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
INFO:apache_beam.utils.subprocess_server:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
INFO:apache_beam.utils.subprocess_server:   at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
INFO:apache_beam.utils.subprocess_server:   at java.lang.Class.getDeclaredMethods0(Native Method)
INFO:apache_beam.utils.subprocess_server:   at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
INFO:apache_beam.utils.subprocess_server:   at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
INFO:apache_beam.utils.subprocess_server:   at java.lang.Class.getMethod0(Class.java:3018)
INFO:apache_beam.utils.subprocess_server:   at java.lang.Class.getMethod(Class.java:1784)
INFO:apache_beam.utils.subprocess_server:   at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:670)
INFO:apache_beam.utils.subprocess_server:   at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:652)
INFO:apache_beam.utils.subprocess_server:Caused by: java.lang.ClassNotFoundException: org.apache.beam.vendor.grpc.v1p60p1.io.grpc.BindableService
INFO:apache_beam.utils.subprocess_server:   at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
INFO:apache_beam.utils.subprocess_server:   at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
INFO:apache_beam.utils.subprocess_server:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
INFO:apache_beam.utils.subprocess_server:   at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
INFO:apache_beam.utils.subprocess_server:   ... 19 more
ERROR:apache_beam.utils.subprocess_server:Started job service with ('java', '-jar', '/runner/_work/beam/beam/runners/spark/3/job-server/build/libs/beam-runners-spark-3-job-server-2.56.0-SNAPSHOT.jar', '--spark-master-url', 'local[4]', '--artifacts-dir', '/tmp/beam-temp8q8022zi/artifactsg6e8usou', '--job-port', '56313', '--artifact-port', '0', '--expansion-port', '0')
ERROR:apache_beam.utils.subprocess_server:Error bringing up service
Traceback (most recent call last):
  File "/runner/_work/beam/beam/sdks/python/apache_beam/utils/subprocess_server.py", line 175, in start
    raise RuntimeError(
RuntimeError: Service failed to start up with error 1
Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/runner/_work/beam/beam/sdks/python/apache_beam/examples/wordcount.py", line 111, in <module>
    run()
  File "/runner/_work/beam/beam/sdks/python/apache_beam/examples/wordcount.py", line 106, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "/runner/_work/beam/beam/sdks/python/apache_beam/pipeline.py", line 612, in __exit__
    self.result = self.run()
  File "/runner/_work/beam/beam/sdks/python/apache_beam/pipeline.py", line 586, in run
    return self.runner.run_pipeline(self, self._options)
  File "/runner/_work/beam/beam/sdks/python/apache_beam/runners/runner.py", line 192, in run_pipeline
    return self.run_portable_pipeline(
  File "/runner/_work/beam/beam/sdks/python/apache_beam/runners/portability/portable_runner.py", line 381, in run_portable_pipeline
    job_service_handle = self.create_job_service(options)
  File "/runner/_work/beam/beam/sdks/python/apache_beam/runners/portability/portable_runner.py", line 296, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "/runner/_work/beam/beam/sdks/python/apache_beam/runners/portability/job_server.py", line 81, in start
    self._endpoint = self._job_server.start()
  File "/runner/_work/beam/beam/sdks/python/apache_beam/runners/portability/job_server.py", line 110, in start
    return self._server.start()
  File "/runner/_work/beam/beam/sdks/python/apache_beam/utils/subprocess_server.py", line 175, in start
    raise RuntimeError(
RuntimeError: Service failed to start up with error 1
> Task :sdks:python:test-suites:portable:py38:portableWordCountSparkRunnerBatch FAILED
shunping commented 6 months ago

Adding the owner of the commit at which the post-commit job first failed: @damccorm

damccorm commented 6 months ago

I think we can pretty comfortably rule out that change; it was to the YAML SDK, which is unrelated to portableWordCountSparkRunnerBatch. Note that this job runs on a schedule, not per commit, though none of the commits in that scheduled window look particularly harmful.

shunping commented 6 months ago

I see. It was red for the last two weeks and flaky before that too.

kennknowles commented 4 months ago

Permared right now

damccorm commented 4 months ago

Only sorta: each component job is actually not permared; e.g. there are 2 successes here: https://github.com/apache/beam/actions/runs/8873798546

The whole workflow is permared just because our flake percentage is so high.

kennknowles commented 4 months ago

Yea, let's work out how to get top-level signal.

Abacn commented 4 months ago

The lowest and highest Python versions (3.8, 3.11) are running more tests than (3.9, 3.10); it could be those extra tests or tasks that are permared.

kennknowles commented 4 months ago

It could make sense to find a way to get a separate top-level signal for each Python version, assuming we can use software engineering to share everything necessary so they don't get out of sync.

Abacn commented 4 months ago

Yeah, we used to have this on Jenkins, where each Python PostCommit had its own task.
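A rough sketch of how that separate per-version signal could look on GitHub Actions: one reusable workflow invoked once per Python version, so each version reports its own green/red status while the steps stay defined in a single shared file and cannot drift. The file names and inputs below are assumptions for illustration, not existing Beam workflows.

# .github/workflows/beam_PostCommit_Python_Versioned.yml (hypothetical)
name: beam_PostCommit_Python_Versioned
on:
  schedule:
    - cron: '0 */6 * * *'
  workflow_dispatch:
jobs:
  # One job per Python version; each reuses the same shared workflow so the
  # test steps stay identical, but each surfaces its own top-level status.
  py39:
    uses: ./.github/workflows/beam_PostCommit_Python_Reusable.yml
    with:
      python_version: '3.9'
  py312:
    uses: ./.github/workflows/beam_PostCommit_Python_Reusable.yml
    with:
      python_version: '3.12'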

liferoad commented 3 months ago

The Vertex AI package version issue (we do not import this directly, so it should be fine):


../../build/gradleenv/-1734967050/lib/python3.9/site-packages/vertexai/preview/developer/__init__.py:33
/runner/_work/beam/beam/build/gradleenv/-1734967050/lib/python3.9/site-packages/vertexai/preview/developer/__init__.py:33: DeprecationWarning:
After May 30, 2024, importing any code below will result in an error.
Please verify that you are explicitly pinning to a version of `google-cloud-aiplatform`
(e.g., google-cloud-aiplatform==[1.32.0, 1.49.0]) if you need to continue using this
library.

  from vertexai.preview import (
      init,
      remote,
      VertexModel,
      register,
      from_pretrained,
      developer,
      hyperparameter_tuning,
      tabular_models,
  )

liferoad commented 3 months ago

A new flaky test in py39, related to https://github.com/apache/beam/issues/29617:

https://ge.apache.org/s/hb7syztoolfhu/console-log?page=17


=================================== FAILURES ===================================
_______________ BigQueryQueryToTableIT.test_big_query_legacy_sql _______________
[gw3] linux -- Python 3.9.19 /runner/_work/beam/beam/build/gradleenv/1398941893/bin/python3.9

self = <apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT testMethod=test_big_query_legacy_sql>

    @pytest.mark.it_postcommit
    def test_big_query_legacy_sql(self):
      verify_query = DIALECT_OUTPUT_VERIFY_QUERY % self.output_table
      expected_checksum = test_utils.compute_hash(DIALECT_OUTPUT_EXPECTED)
      pipeline_verifiers = [
          PipelineStateMatcher(),
          BigqueryMatcher(
              project=self.project,
              query=verify_query,
              checksum=expected_checksum)
      ]

      extra_opts = {
          'query': LEGACY_QUERY,
          'output': self.output_table,
          'output_schema': DIALECT_OUTPUT_SCHEMA,
          'use_standard_sql': False,
          'wait_until_finish_duration': WAIT_UNTIL_FINISH_DURATION_MS,
          'on_success_matcher': all_of(*pipeline_verifiers),
      }
      options = self.test_pipeline.get_full_options_as_args(**extra_opts)
>     big_query_query_to_table_pipeline.run_bq_pipeline(options)

apache_beam/io/gcp/big_query_query_to_table_it_test.py:178:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/io/gcp/big_query_query_to_table_pipeline.py:103: in run_bq_pipeline
    result = p.run()
apache_beam/testing/test_pipeline.py:115: in run
    result = super().run(
apache_beam/pipeline.py:560: in run
    return Pipeline.from_runner_api(
apache_beam/pipeline.py:587: in run
    return self.runner.run_pipeline(self, self._options)
apache_beam/runners/direct/test_direct_runner.py:42: in run_pipeline
    self.result = super().run_pipeline(pipeline, options)
apache_beam/runners/direct/direct_runner.py:117: in run_pipeline
    from apache_beam.runners.portability.fn_api_runner import fn_runner
apache_beam/runners/portability/fn_api_runner/__init__.py:18: in <module>
    from apache_beam.runners.portability.fn_api_runner.fn_runner import FnApiRunner
apache_beam/runners/portability/fn_api_runner/fn_runner.py:68: in <module>
    from apache_beam.runners.portability.fn_api_runner import execution
apache_beam/runners/portability/fn_api_runner/execution.py:62: in <module>
    from apache_beam.runners.portability.fn_api_runner import translations
apache_beam/runners/portability/fn_api_runner/translations.py:55: in <module>
    from apache_beam.runners.worker import bundle_processor
apache_beam/runners/worker/bundle_processor.py:69: in <module>
    from apache_beam.runners.worker import operations
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   ???
E   KeyError: '__pyx_vtable__'

apache_beam/runners/worker/operations.py:1: KeyError

liferoad commented 3 months ago

Last three runs are green now.


Close this for now.

shunping commented 3 months ago

Great. Thanks @liferoad

github-actions[bot] commented 3 months ago

Reopening since the workflow is still flaky

liferoad commented 3 months ago

New error:


==================================== ERRORS ====================================
________________ ERROR at setup of ReadTests.test_native_source ________________
[gw5] linux -- Python 3.9.19 /runner/_work/beam/beam/build/gradleenv/1398941893/bin/python3.9

self = <apache_beam.io.gcp.bigquery_tools.BigQueryWrapper object at 0x7f248f59baf0>
project_id = 'apache-beam-testing'
dataset_id = 'python_read_table_17178042710ffd3b', location = None
labels = None

    @retry.with_exponential_backoff(
        num_retries=MAX_RETRIES,
        retry_filter=retry.retry_on_server_errors_and_timeout_filter)
    def get_or_create_dataset(
        self, project_id, dataset_id, location=None, labels=None):
      # Check if dataset already exists otherwise create it
      try:
>       dataset = self.client.datasets.Get(
            bigquery.BigqueryDatasetsGetRequest(
                projectId=project_id, datasetId=dataset_id))

apache_beam/io/gcp/bigquery_tools.py:809:

kennknowles commented 2 months ago

I looked at a couple flakes and could not discern if they represented anything that should be release blocking, so I am moving this to the next release milestone.

liferoad commented 2 months ago

Green for the last two days.

github-actions[bot] commented 2 months ago

Reopening since the workflow is still flaky

liferoad commented 2 months ago

_______ ERROR collecting apache_beam/runners/worker/log_handler_test.py ________
apache_beam/runners/worker/log_handler_test.py:34: in <module>
    from apache_beam.runners.worker import bundle_processor
apache_beam/runners/worker/bundle_processor.py:69: in <module>
    from apache_beam.runners.worker import operations
apache_beam/runners/worker/operations.py:1: in init apache_beam.runners.worker.operations
    ???
E   KeyError: '__pyx_vtable__'
________ ERROR collecting apache_beam/runners/worker/opcounters_test.py ________
apache_beam/runners/worker/opcounters_test.py:27: in <module>
    from apache_beam.runners.worker import opcounters
apache_beam/runners/worker/opcounters.py:1: in init apache_beam.runners.worker.opcounters
    ???
E   ValueError: apache_beam.utils.counters.Counter size changed, may indicate binary incompatibility. Expected 56 from C header, got 32 from PyObject

https://ge.apache.org/s/w6kem3hrdnwii/console-log/task/:sdks:python:test-suites:direct:py38:tensorflowInferenceTest?anchor=1334&page=2


=========================== short test summary info ============================
ERROR apache_beam/dataframe/transforms_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/dataframe/transforms_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/render_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/render_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/trivial_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/trivial_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/dataflow/dataflow_job_service_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/dataflow/dataflow_job_service_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/interactive/interactive_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/interactive/interactive_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/interactive/utils_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/interactive/utils_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/flink_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/flink_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/flink_uber_jar_job_server_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/flink_uber_jar_job_server_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/local_job_service_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/local_job_service_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/portable_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/portable_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/samza_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/samza_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/spark_java_job_server_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/spark_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/spark_java_job_server_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/spark_uber_jar_job_server_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/spark_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/spark_uber_jar_job_server_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/fn_api_runner/fn_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/fn_api_runner/fn_runner_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/fn_api_runner/translations_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/fn_api_runner/translations_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/portability/fn_api_runner/trigger_manager_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/bundle_processor_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/log_handler_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/opcounters_test.py - ValueError: apache_beam.utils.counters.Counter size changed, may indicate binary incompatibility. Expected 56 from C header, got 32 from PyObject
ERROR apache_beam/runners/portability/fn_api_runner/trigger_manager_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/bundle_processor_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/log_handler_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/opcounters_test.py - ValueError: apache_beam.utils.counters.Counter size changed, may indicate binary incompatibility. Expected 56 from C header, got 32 from PyObject
ERROR apache_beam/runners/worker/sdk_worker_main_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/sdk_worker_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/sideinputs_test.py - ValueError: apache_beam.utils.counters.Counter size changed, may indicate binary incompatibility. Expected 56 from C header, got 32 from PyObject
ERROR apache_beam/runners/worker/sdk_worker_main_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/sdk_worker_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/runners/worker/sideinputs_test.py - ValueError: apache_beam.utils.counters.Counter size changed, may indicate binary incompatibility. Expected 56 from C header, got 32 from PyObject
ERROR apache_beam/testing/load_tests/microbenchmarks_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/transforms/combinefn_lifecycle_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/testing/load_tests/microbenchmarks_test.py - KeyError: '__pyx_vtable__'
ERROR apache_beam/transforms/combinefn_lifecycle_test.py - KeyError: '__pyx_vtable__'
jrmccluskey commented 2 months ago

No Cython issues in recent runs, just a number of flakes for tests with external connections (GCSIO, RRIO) that aren't consistent across Python versions or runs.

Abacn commented 3 weeks ago

Currently the Python 3.12 Dataflow test suite has two tests failing consistently:

apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification 

apache_beam/ml/inference/sklearn_inference_it_test.py::SklearnInference::test_sklearn_mnist_classification_large_model

Error:

 subprocess.CalledProcessError: Command '['/runner/_work/beam/beam/build/gradleenv/2050596100/bin/python3.12', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', '/tmp/tmpoq1ebvgy/tmp_requirements.txt', '--exists-action', 'i', '--no-deps', '--implementation', 'cp', '--abi', 'cp312', '--platform', 'manylinux2014_x86_64']' returned non-zero exit status 1.

Error compiling Cython file:

sklearn/utils/_vector_sentinel.pyx:31:9: Previous declaration is here

sklearn cannot be installed from source using Cython.

This happened as early as https://github.com/apache/beam/commits/5b2bfe96f83a5631c3a8d5c3b92a0f695ffe2d7d

Abacn commented 3 weeks ago

We need to bump the sklearn requirement here: https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/inference/sklearn_examples_requirements.txt
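For reference, a minimal sketch of what that bump might look like. The exact version floor is an assumption; the intent is simply to require a scikit-learn release that publishes prebuilt cp312 wheels, so pip downloads a wheel instead of attempting the failing Cython source build above.

# sdks/python/apache_beam/examples/inference/sklearn_examples_requirements.txt
# Assumed floor: a release new enough to ship Python 3.12 wheels.
scikit-learn>=1.3.2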

github-actions[bot] commented 3 weeks ago

Reopening since the workflow is still flaky

github-actions[bot] commented 1 week ago

Reopening since the workflow is still flaky

liferoad commented 1 week ago

2024-08-30T07:28:39.6571287Z     if setup_options.setup_file is not None:
2024-08-30T07:28:39.6571763Z       if not os.path.isfile(setup_options.setup_file):
2024-08-30T07:28:39.6572227Z >       raise RuntimeError(
2024-08-30T07:28:39.6572923Z             'The file %s cannot be found. It was specified in the '
2024-08-30T07:28:39.6573578Z             '--setup_file command line option.' % setup_options.setup_file)
2024-08-30T07:28:39.6574970Z E   RuntimeError: The file /runner/_work/beam/beam/sdks/python/apache_beam/examples/complete/juliaset/src/setup.py cannot be found. It was specified in the --setup_file command line option.

https://productionresultssa6.blob.core.windows.net/actions-results/9f18d66f-dabf-46e8-8b29-ae50d075f3dd/workflow-job-run-912db29d-d57b-5850-6efb-b125ca814b95/logs/job/job-logs.txt?rsct=text%2Fplain&se=2024-08-30T14%3A06%3A43Z&sig=aqESnfP68oo0sF7TUtpq%2BNFgdgfCbq8Ey3q%2BFMLZtvI%3D&ske=2024-08-31T00%3A21%3A54Z&skoid=ca7593d4-ee42-46cd-af88-8b886a2f84eb&sks=b&skt=2024-08-30T12%3A21%3A54Z&sktid=398a6654-997b-47e9-b12b-9515b896b4de&skv=2024-05-04&sp=r&spr=https&sr=b&st=2024-08-30T13%3A56%3A38Z&sv=2024-05-04

tvalentyn commented 1 week ago

Currently failing test:

gradlew :sdks:python:test-suites:portable:py312:portableLocalRunnerJuliaSetWithSetupPy