Atten4Vis / LW-DETR

This repository is an official implementation of the paper "LW-DETR: A Transformer Replacement to YOLO for Real-Time Detection".
Apache License 2.0

Question about the setting of hyperparameters #27

Open bartbuaa opened 3 months ago

bartbuaa commented 3 months ago

Great job! I'm curious whether there are comparative experiments on the window_block_indexes and out_feature_indexes settings. Why is window attention applied specifically at layers 0, 1, 3, 6, 7, and 9? For example, what impact would increasing or decreasing the number of window_block_indexes have on the metrics? Thanks.
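
For context on what these two options typically control: window_block_indexes usually lists the transformer blocks that use local window attention instead of global attention (a ViTDet-style interleaving), and out_feature_indexes lists the blocks whose outputs are taken as multi-level features for the detector head. The sketch below is a hypothetical illustration of that pattern, not the repository's actual code; the ToyBlock/ToyViT names and the simplified attention path are assumptions made only to keep the example short and runnable.

```python
# Hypothetical sketch (not LW-DETR's code): a ViT-style backbone where
# window_block_indexes selects which blocks use windowed attention and
# out_feature_indexes selects which block outputs are returned.

import torch
import torch.nn as nn


class ToyBlock(nn.Module):
    """Stand-in transformer block; `windowed` only tags the attention type."""

    def __init__(self, dim: int, windowed: bool):
        super().__init__()
        self.windowed = windowed  # True -> local window attention, False -> global
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # A real implementation would partition tokens into windows when
        # self.windowed is True; both paths share one attention call here
        # purely to keep the sketch self-contained.
        y = self.norm(x)
        out, _ = self.attn(y, y, y)
        return x + out


class ToyViT(nn.Module):
    def __init__(self, dim=64, depth=10,
                 window_block_indexes=(0, 1, 3, 6, 7, 9),  # blocks with window attention
                 out_feature_indexes=(2, 5, 9)):           # blocks whose outputs are kept
        super().__init__()
        windowed = set(window_block_indexes)
        self.blocks = nn.ModuleList(
            ToyBlock(dim, windowed=(i in windowed)) for i in range(depth)
        )
        self.out_feature_indexes = set(out_feature_indexes)

    def forward(self, x: torch.Tensor):
        feats = []
        for i, blk in enumerate(self.blocks):
            x = blk(x)
            if i in self.out_feature_indexes:
                feats.append(x)  # multi-level features for the decoder/neck
        return feats


if __name__ == "__main__":
    model = ToyViT()
    tokens = torch.randn(1, 196, 64)   # (batch, tokens, dim)
    outs = model(tokens)
    print([o.shape for o in outs])     # one feature map per out_feature_index
```

Under this reading, the question amounts to asking how shifting the local/global split (more or fewer entries in window_block_indexes) trades accuracy against latency, which is exactly the kind of ablation the issue is requesting.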