Open · jataylo opened this issue 1 week ago
Traceback:

test_load_from_bias_seq_only_float16 (__main__.TestFlexAttention) ... python: /root/.triton/llvm/llvm-b5cc222d-ubuntu-x64/include/llvm/ADT/SmallVector.h:291: T& llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::operator[](llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::size_type) [with T = unsigned int; <template-parameter-1-2> = void; llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::reference = unsigned int&; llvm::SmallVectorTemplateCommon<T, <template-parameter-1-2> >::size_type = long unsigned int]: Assertion `idx < size()' failed.

cc: @antiagainst

An upstream PR has been filed that should resolve this: https://github.com/triton-lang/triton/pull/5084
Problem Description
More info: https://github.com/pytorch/pytorch/issues/139621

Encountered while testing the preview release/2.6 builds as part of https://github.com/pytorch/pytorch/issues/139175:

python test_flex_attention.py -k "test_load_from_bias_seq_only_float16"
Traceback: see above.
Failed version reproducer:

Note: this UT passed before the Triton upgrade. To reproduce the passing version, use:

rocm/pytorch-nightly:latest
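Putting the details above together, the passing case could be reproduced roughly as follows. This is only a sketch: the image tag and test command come from the report, but the docker flags and the test-file location inside the container are assumptions, not from the report.

```shell
# Hedged repro sketch. Image tag and test invocation are from the report;
# docker device flags and container paths are assumptions.
docker pull rocm/pytorch-nightly:latest

# Typical ROCm container invocation; adjust device flags for your setup.
docker run -it --rm \
  --device=/dev/kfd --device=/dev/dri \
  --group-add video \
  rocm/pytorch-nightly:latest bash

# Inside the container, from the PyTorch test directory (assumed location):
cd pytorch/test/inductor
python test_flex_attention.py -k "test_load_from_bias_seq_only_float16"
```

On the failing Triton version, the same command hits the SmallVector `idx < size()` assertion shown in the traceback above.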
Operating System
Ubuntu
CPU
.
GPU
MI250X
ROCm Version
ROCm 6.2.0
ROCm Component
No response
Steps to Reproduce
No response
(Optional for Linux users) Output of /opt/rocm/bin/rocminfo --support
No response
Additional Information
No response