Closed: erkintelnyx closed this issue 2 months ago.
@erkintelnyx Can you try now with current main branch as the torch now used is 2.5 which might fix your issue: see this PR https://github.com/vllm-project/vllm/pull/6352
Also, can someone add the "rocm" label to this issue so that it is easier for us to track?
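For reference, a quick way to confirm which torch build the environment actually picked up after switching to the main branch (a minimal sketch; the `torch_version` helper is illustrative, not part of vLLM):

```python
from importlib.metadata import PackageNotFoundError, version

def torch_version() -> str:
    """Return the installed torch version string, or a placeholder if torch is absent."""
    try:
        return version("torch")
    except PackageNotFoundError:
        return "not installed"

# On a main-branch build this should report a 2.5.x version.
print(torch_version())
```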
It is working with the main branch as you suggested.
Your current environment
🐛 Describe the bug
We are using a gfx908 GPU and recently updated to version 0.5.1, but came across the following issue: