tma15 / paper-reading-list


Hydra Attention: Efficient Attention with Many Heads #22

Open tma15 opened 1 year ago

tma15 commented 1 year ago
caijiali123 commented 1 year ago

Nice work! Where can I find the code?

tma15 commented 1 year ago

@caijiali123 The code doesn't seem to be available at the moment, but you can implement it yourself by referring to Appendix C of the paper.
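
In case it helps, here is a minimal PyTorch sketch of the Hydra attention operation (as many heads as features, with a cosine-similarity kernel), roughly matching the pseudocode in Appendix C. This is my own sketch, not the official implementation; the tensor shapes and names are assumptions.

```python
import torch


def hydra_attention(q, k, v):
    """Hydra attention sketch: linear in both the number of tokens and features.

    q, k, v: tensors of assumed shape [batch, tokens, features].
    """
    # L2-normalize q and k along the feature dimension (cosine-similarity kernel).
    q = q / q.norm(dim=-1, keepdim=True)
    k = k / k.norm(dim=-1, keepdim=True)
    # Aggregate key-value products over the token dimension -> [batch, 1, features].
    kv = (k * v).sum(dim=-2, keepdim=True)
    # Gate the shared global feature vector with each query token.
    return q * kv


# Example usage with made-up shapes:
x = torch.randn(2, 196, 384)
out = hydra_attention(x, x, x)  # [2, 196, 384]
```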