zjchen77 opened this issue 1 year ago
@zjchen77 thanks! Our CK kernels https://github.com/ROCmSoftwarePlatform/Composable_kernel already support attention, and we are in the process of refining them. Indeed, exposing these kernels via MIOpen could be useful in many cases.
@JehandadKhan is this issue related to multi-head attention as well?
@CAHEK7 That is correct.
@junliume Has this request been added to MIOpen? Thanks!
Attention mechanisms are widely used in deep learning models, particularly in large language models, and a flexible attention kernel would help users build accelerated language models conveniently on AMD platforms. I have already designed an attention kernel for variable text lengths. I believe that adding an attention kernel to MIOpen would be a valuable enhancement for the community. Thank you for considering this request.
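For reference, here is a minimal NumPy sketch of what "attention for variable text lengths" typically means: scaled dot-product attention where padded key positions are masked out so each batch element only attends to its valid tokens. This is an illustrative sketch, not the requester's actual kernel or any MIOpen API; the function name `attention` and the `lengths` argument are assumptions made for the example.

```python
import numpy as np

def attention(q, k, v, lengths):
    """Illustrative masked scaled dot-product attention (not MIOpen's API).

    q, k, v: arrays of shape (batch, seq, dim), padded to a common seq length.
    lengths: number of valid tokens per batch element; keys beyond this
             length receive zero attention weight.
    """
    batch, seq, dim = q.shape
    # Raw attention scores, scaled by sqrt(dim): (batch, seq_q, seq_k)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dim)
    # Mask: True where the key position is a valid (non-padded) token
    valid = np.arange(seq)[None, None, :] < np.asarray(lengths)[:, None, None]
    scores = np.where(valid, scores, -np.inf)
    # Numerically stable softmax over the key axis; exp(-inf) -> 0,
    # so padded keys contribute nothing
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

A fused GPU kernel would compute the same result without materializing the full `(batch, seq, seq)` score matrix, which is where a dedicated MIOpen/CK implementation pays off.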