XPixelGroup / HAT

CVPR 2023 - Activating More Pixels in Image Super-Resolution Transformer | arXiv - HAT: Hybrid Attention Transformer for Image Restoration
Apache License 2.0

Value of rpi in hat_arch.py #134

Open bleachman08 opened 3 months ago

bleachman08 commented 3 months ago

I am trying to understand the attention mechanism in the HAT model. I am trying to run the code in hat_arch.py, but I don't understand what value to pass as rpi (used in the window attention block). Thanks for your help.
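For context, here is a minimal sketch of how such a relative position index is typically computed in Swin-style window attention, which is the lineage rpi in HAT appears to come from. The function name `calculate_rpi_sa` and the default window size of 16 are assumptions for illustration, not a quotation of the repository's code.

```python
import torch


def calculate_rpi_sa(window_size: int) -> torch.Tensor:
    """Sketch of a Swin-style relative position index for a square window.

    Returns a (window_size**2, window_size**2) LongTensor whose entries index
    into a relative position bias table of size (2*window_size - 1)**2.
    """
    coords_h = torch.arange(window_size)
    coords_w = torch.arange(window_size)
    # grid of (row, col) coordinates inside one window: shape (2, Wh, Ww)
    coords = torch.stack(torch.meshgrid([coords_h, coords_w], indexing='ij'))
    coords_flatten = torch.flatten(coords, 1)  # (2, Wh*Ww)
    # pairwise coordinate differences between all positions in the window
    relative_coords = coords_flatten[:, :, None] - coords_flatten[:, None, :]  # (2, Wh*Ww, Wh*Ww)
    relative_coords = relative_coords.permute(1, 2, 0).contiguous()  # (Wh*Ww, Wh*Ww, 2)
    # shift offsets to start from 0, then flatten the 2-D offset into one index
    relative_coords[:, :, 0] += window_size - 1
    relative_coords[:, :, 1] += window_size - 1
    relative_coords[:, :, 0] *= 2 * window_size - 1
    relative_position_index = relative_coords.sum(-1)  # (Wh*Ww, Wh*Ww)
    return relative_position_index


# Example: a 16x16 window gives a (256, 256) index tensor.
rpi = calculate_rpi_sa(16)
```

In the Swin/HAT family of models, this index is usually computed once by the top-level model (registered as a buffer) and passed down into the window attention blocks, so it normally does not need to be supplied by hand when running the full network; it only has to be provided explicitly if you instantiate an attention block on its own.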