SamsungLabs / Butterfly_Acc

The code and artifacts associated with our MICRO'22 paper: "Adaptable Butterfly Accelerator for Attention-based NNs via Hardware and Algorithm Co-design"

Hardware Code of Attention Processor #6

Closed ZhangYuanQing99 closed 1 month ago

ZhangYuanQing99 commented 2 months ago

Hi, I have been reviewing your code and found it highly insightful. However, I noticed that the implementation seems to only include the Butterfly Processor from the hardware architecture diagram presented in the paper. Could you kindly clarify if the Attention Processor has been implemented in the current codebase? If not, is there any plan to release its implementation, or any guidance on how one could proceed to implement it?

Thank you for your time and contributions.

os-hxfan commented 1 month ago

Thanks for your interest. @ZhangYuanQing99

  1. During the design space exploration in our experiments (Sections V and VI), our co-design framework found that the Attention Processor should be optimized away (please see the paper for details). In other words, a butterfly-only processor achieves the best algorithmic and hardware performance on our target tasks, so our final hardware design uses the butterfly-only configuration.
  2. You can still evaluate the performance of the Attention Processor using the simulator we provide; the simulator is cross-checked against our RTL design of the Attention Processor via waveform simulation. Although the Attention Processor's hardware design is not used in our experiments, we may consider open-sourcing its RTL as a baseline in a future paper.
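For readers unfamiliar with why a butterfly-only design can be sufficient: a butterfly linear transform factors an N×N matrix into log2(N) sparse levels, each applying an independent 2×2 block to every index pair, which costs O(N log N) multiplies instead of O(N²). The toy NumPy sketch below illustrates this structure only; the function names and layout are illustrative and are not taken from the repo's RTL or simulator.

```python
import numpy as np

def random_butterfly_factors(n, rng):
    """One 2x2 weight block per index pair, for each of log2(n) levels."""
    levels = int(np.log2(n))
    return [rng.standard_normal((n // 2, 2, 2)) for _ in range(levels)]

def apply_butterfly(factors, x):
    """Apply a butterfly-factored linear transform to vector x (len must be 2^k)."""
    n = len(x)
    y = np.asarray(x, dtype=float).copy()
    stride = n // 2
    for level in factors:
        out = np.empty_like(y)
        pair = 0
        # Partition indices into groups of 2*stride; pair j with j+stride.
        for start in range(0, n, 2 * stride):
            for j in range(start, start + stride):
                a, b = y[j], y[j + stride]
                m = level[pair]          # this pair's private 2x2 block
                out[j] = m[0, 0] * a + m[0, 1] * b
                out[j + stride] = m[1, 0] * a + m[1, 1] * b
                pair += 1
        y = out
        stride //= 2
    return y

rng = np.random.default_rng(0)
n = 8
factors = random_butterfly_factors(n, rng)
y = apply_butterfly(factors, np.ones(n))
# Parameter count: log2(n) levels * (n/2) pairs * 4 weights = 2*n*log2(n)
params = len(factors) * (n // 2) * 4   # 48 for n=8, vs 64 for a dense 8x8
```

For N = 8 this is 3 levels × 4 pairs × 4 weights = 48 parameters versus 64 for a dense 8×8 matrix, and the gap widens with N, which is the structural saving the Butterfly Processor exploits in hardware.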
ZhangYuanQing99 commented 1 month ago

Thank you so much for your prompt response! Your explanation clarified my doubts, and I also noticed the mention of N_ABfly=0 in your paper. I really appreciate your help and the effort you've put into maintaining this repository.