Question for PVTv2: in the paper the reduction ratio of Linear SRA is 7, but in the code it is sr_ratios=[8, 4, 2, 1]. Is there something wrong with my understanding?
Hi, 7 is the attention pooling window size of PVTv2-Li. In the "*-Li" models, the sr_ratios are not used.
See the "7" here: https://github.com/whai362/PVT/blob/v2/classification/pvt_v2.py#L77
Thank you, very nice idea. Thanks for your response.