Closed · drisspg closed this pull request 1 year ago
This pull request was exported from Phabricator. Differential Revision: D41625335
Summary: Replaces the inline block of code in nn.functional.mha with _scaled_dot_product_attention. This function allows the fused kernels to be called if all the required input conditions are met.

cc VitalyFedyunin ngimel
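For context, here is a minimal sketch of calling the attention function this change routes through. Note this uses the public torch.nn.functional.scaled_dot_product_attention available in later PyTorch releases, not the private _scaled_dot_product_attention the PR itself calls, and the shapes, device, and dtype are illustrative assumptions, not taken from the PR.

```python
import torch
import torch.nn.functional as F

# Assumed setup for illustration only: fused kernels typically require
# a CUDA device and half precision, among other input conditions.
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

# Illustrative shapes: (batch, num_heads, seq_len, head_dim)
q = torch.randn(2, 8, 128, 64, device=device, dtype=dtype)
k = torch.randn_like(q)
v = torch.randn_like(q)

# When the inputs satisfy the fused-kernel constraints (device, dtype,
# head_dim, mask layout, etc.), this dispatches to a fused implementation;
# otherwise it falls back to the composite math path.
out = F.scaled_dot_product_attention(q, k, v, dropout_p=0.0, is_causal=False)
print(out.shape)  # torch.Size([2, 8, 128, 64])
```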
X-link: https://github.com/pytorch/pytorch/pull/89470
Reviewed By: cpuhrsch
Differential Revision: D41625335
Pulled By: drisspg