pytorch-bot[bot] opened 1 year ago
Another case of trunk flakiness has been found here. Please verify that the issue was opened after this instance and that the platforms list includes all of [linux]; otherwise the disable bot might not be working as expected.
Stack trace from the last trunk failure log:
distributed/test_distributed_spawn.py::TestDistBackendWithSpawn::test_ddp_apply_optim_in_backward_ignored_params <- ../../../../opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/distributed/distributed_test.py
INFO:numba.cuda.cudadrv.driver:init
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] Could not retrieve traceback for timed out process: 0
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] Process 1 timed out with traceback:
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR]
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] Thread 0x00007fc1e8fbc700 (most recent call first):
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] <no Python frame>
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR]
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] Thread 0x00007fc2118ba700 (most recent call first):
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] <no Python frame>
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR]
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] Current thread 0x00007fc2135fe700 (most recent call first):
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/common_distributed.py", line 620 in _event_listener
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/threading.py", line 953 in run
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/threading.py", line 1016 in _bootstrap_inner
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/threading.py", line 973 in _bootstrap
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR]
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] Thread 0x00007fc2a3c41080 (most recent call first):
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/distributed/utils.py", line 265 in _verify_param_shape_across_processes
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/nn/parallel/distributed.py", line 797 in __init__
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/distributed/distributed_test.py", line 5111 in test_ddp_apply_optim_in_backward_ignored_params
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/common_distributed.py", line 174 in wrapper
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/common_utils.py", line 2363 in wrapper
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/common_distributed.py", line 543 in wrapper
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/common_distributed.py", line 657 in run_test
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/distributed/distributed_test.py", line 590 in _run
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/multiprocessing/process.py", line 108 in run
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/multiprocessing/process.py", line 314 in _bootstrap
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/multiprocessing/spawn.py", line 129 in _main
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "/opt/conda/envs/py_3.10/lib/python3.10/multiprocessing/spawn.py", line 116 in spawn_main
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR] File "<string>", line 1 in <module>
Error: 8-01 02:19:37,634] torch.testing._internal.common_distributed: [ERROR]
('RERUN', {'yellow': True}) [305.2724s] [100%]
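The tracebacks above show rank 1 blocked inside `_verify_param_shape_across_processes`, a collective that runs during `DistributedDataParallel.__init__`, while no Python frame could be retrieved for rank 0; the two ranks deadlocked at DDP construction until the test harness timed the processes out. For orientation, here is a minimal, hypothetical sketch of the pattern the test exercises (not the actual test body; `_apply_optimizer_in_backward` and `_set_params_and_buffers_to_ignore_for_model` are private PyTorch APIs whose signatures may differ across versions, and the ignored parameter names are illustrative):

```python
# Hypothetical sketch: optimizer applied in backward + DDP-ignored params.
# DDP.__init__ runs _verify_param_shape_across_processes, a blocking
# collective, so every rank must reach the DDP(...) call together; if one
# rank stalls earlier, the others hang there, matching the log above.
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.distributed.optim import _apply_optimizer_in_backward
from torch.nn.parallel import DistributedDataParallel as DDP

def run(rank: int, world_size: int) -> None:
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = nn.Sequential(nn.Linear(10, 10), nn.Linear(10, 10))
    # Run SGD steps inside backward for every parameter.
    _apply_optimizer_in_backward(torch.optim.SGD, model.parameters(), {"lr": 0.01})
    # Tell DDP to skip gradient synchronization for the first layer
    # (parameter names here are illustrative).
    DDP._set_params_and_buffers_to_ignore_for_model(model, ["0.weight", "0.bias"])
    ddp_model = DDP(model)  # <- the blocking collective that hung in the log

    loss = ddp_model(torch.randn(4, 10)).sum()
    loss.backward()  # non-ignored params are updated during backward
    dist.destroy_process_group()

if __name__ == "__main__":
    torch.multiprocessing.spawn(run, args=(2,), nprocs=2)
```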
Resolving the issue because the test is no longer flaky after 10260 reruns without any failures, and the issue hasn't been updated in 14 days. Please reopen the issue to re-disable the test if you think this is a false positive.
Another case of trunk flakiness has been found here. Reopening the issue to disable the test. Please verify that the platforms list includes all of [linux].
Platforms: linux
This test was disabled because it is failing in CI. See recent examples and the most recent trunk workflow logs.
Over the past 3 hours, it has been determined flaky in 2 workflow(s) with 2 failures and 2 successes.
Debugging instructions (after clicking on the recent samples link): DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers, so CI will be green, but the logs will be harder to parse. To find relevant log snippets, grep the workflow logs for the test name:

test_ddp_apply_optim_in_backward_ignored_params

Test file path: distributed/test_distributed_spawn.py
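For a local reproduction attempt, the spawn test file is typically driven through environment variables; the sketch below is a hedged guess that assumes the `BACKEND` and `WORLD_SIZE` variables read by torch's distributed test harness and ordinary unittest-style test selection:

```python
# Hypothetical local repro driver; BACKEND/WORLD_SIZE are the environment
# variables the distributed test harness expects, and the positional
# argument is standard unittest test selection. Exact knobs may differ
# across PyTorch versions.
import os
import subprocess
import sys

env = dict(os.environ, BACKEND="gloo", WORLD_SIZE="2")
subprocess.run(
    [
        sys.executable,
        "test/distributed/test_distributed_spawn.py",
        "TestDistBackendWithSpawn.test_ddp_apply_optim_in_backward_ignored_params",
    ],
    env=env,
    check=True,
)
```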
cc @mrshenli @pritamdamania87 @zhaojuanmao @satgera @rohan-varma @gqchen @aazzolini @osalpekar @jiayisuse @H-Huang @kwen2501 @awgu @penguinwu