pytorch-bot[bot] closed this issue 1 month ago.
Resolving the issue because the test is not flaky anymore after 2850 reruns without any failures and the issue hasn't been updated in 14 days. Please reopen the issue to re-disable the test if you think this is a false positive.
Another case of trunk flakiness has been found here. Reopening issue. The list of platforms [linux] appears to contain all the recently affected platforms [linux].
Another case of trunk flakiness has been found here. The list of platforms [linux] appears to contain all the recently affected platforms [linux]. Either the change didn't propagate fast enough or the disable bot might be broken.
Resolving the issue because the test is not flaky anymore after 3000 reruns without any failures and the issue hasn't been updated in 14 days. Please reopen the issue to re-disable the test if you think this is a false positive.
Another case of trunk flakiness has been found here. Reopening issue. The list of platforms [linux] appears to contain all the recently affected platforms [linux].
Resolving the issue because the test is not flaky anymore after 2550 reruns without any failures and the issue hasn't been updated in 14 days. Please reopen the issue to re-disable the test if you think this is a false positive.
Platforms: linux
This test was disabled because it is failing in CI. See recent examples and the most recent trunk workflow logs.
Over the past 3 hours, it has been flagged as flaky in 2 workflows, with 6 failures and 2 successes.
Debugging instructions (after clicking on the recent samples link): DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. Flaky tests are now shielded from developers, so CI will be green even when the test fails, which makes the logs harder to parse. To find relevant log snippets, grep for the test name:
test_scaled_dot_product_attention_4D_input_dim_no_attn_mask_dropout_p_0_0_cuda
Sample error message:
```
Traceback (most recent call last):
  File "/var/lib/jenkins/workspace/test/test_transformers.py", line 1097, in test_scaled_dot_product_attention
    assert gradcheck(lambda *args, **kwargs:
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/testing/_internal/common_utils.py", line 4485, in gradcheck
    return torch.autograd.gradcheck(fn, inputs, **kwargs)
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/gradcheck.py", line 2053, in gradcheck
    return _gradcheck_helper(**args)
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/gradcheck.py", line 2082, in _gradcheck_helper
    _gradcheck_real_imag(
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/gradcheck.py", line 1492, in _gradcheck_real_imag
    gradcheck_fn(
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/gradcheck.py", line 1922, in _fast_gradcheck
    analytical_vJu = _get_analytical_vJu_backward_mode(
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/gradcheck.py", line 805, in _get_analytical_vJu_backward_mode
    all_vJ = _check_analytical_jacobian_attributes(
  File "/opt/conda/envs/py_3.10/lib/python3.10/site-packages/torch/autograd/gradcheck.py", line 791, in _check_analytical_jacobian_attributes
    raise GradcheckError(
torch.autograd.gradcheck.GradcheckError: Backward is not reentrant, i.e., running backward with same input and grad_output multiple times gives different values, although analytical gradient matches numerical gradient. The tolerance for nondeterminism was 0.0.
NOTE: If your op relies on non-deterministic operations, i.e., it is listed here: https://pytorch.org/docs/stable/generated/torch.use_deterministic_algorithms.html, this failure might be expected. If you are adding a new operator, please file an issue and then use one of the workarounds. The workaround depends on how your test invokes gradcheck/gradgradcheck. If the test
- manually invokes gradcheck/gradgradcheck, then call gradcheck/gradgradcheck with `nondet_tol=
```
Test file path: test_transformers.py
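For anyone debugging the same GradcheckError: the workaround the (truncated) message points to is passing a nonzero `nondet_tol` to gradcheck, which lets it tolerate bounded run-to-run differences in backward. Below is a minimal sketch of that pattern, assuming a setup mirroring the test name (4D input, no attn_mask, dropout_p=0.0, CUDA); the shapes and the tolerance value are illustrative assumptions, not the actual parameters from test_transformers.py.

```python
import torch
import torch.nn.functional as F
from torch.autograd import gradcheck

# Hypothetical repro sketch; not the actual test from test_transformers.py.
# gradcheck requires float64 inputs, which on CUDA routes
# scaled_dot_product_attention to the math backend.
device = "cuda" if torch.cuda.is_available() else "cpu"

# 4D input: (batch, num_heads, seq_len, head_dim); shapes are illustrative.
q, k, v = (
    torch.randn(2, 4, 8, 8, dtype=torch.float64, device=device, requires_grad=True)
    for _ in range(3)
)

# A nonzero nondet_tol lets gradcheck accept small run-to-run differences
# when backward is not reentrant (e.g. nondeterministic CUDA kernels).
# 1e-5 is an assumed value, not the tolerance used in CI.
gradcheck(
    lambda q, k, v: F.scaled_dot_product_attention(
        q, k, v, attn_mask=None, dropout_p=0.0
    ),
    (q, k, v),
    nondet_tol=1e-5,
)
```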
cc @albanD @mruberry @jbschlosser @walterddr @mikaylagawarecki @clee2000