whai362 / PVT

Official implementation of PVT series
Apache License 2.0

Question for PVTv2: in the paper the reduction ratio in Linear SRA is 7, but in the code it is sr_ratios=[8, 4, 2, 1] #90

Closed · StormArcher closed this 2 years ago

StormArcher commented 2 years ago

In the paper the reduction ratio in Linear SRA is 7, but in the code it is sr_ratios=[8, 4, 2, 1].

Is there something wrong with my understanding?

whai362 commented 2 years ago

Hi, 7 is the attention pooling window size of the PVTv2-Li models. In the "*-Li" (linear) models, sr_ratios is not used; the keys/values are instead reduced by adaptive average pooling to a fixed 7×7 output.


whai362 commented 2 years ago

See "7" at here: https://github.com/whai362/PVT/blob/v2/classification/pvt_v2.py#L77

StormArcher commented 2 years ago

Thank you. Very nice idea. Thanks for your response.
