YiqunChen1999 / RefineBox

Implementation of "Enhancing Your Trained DETRs with Box Refinement"

is it possible to use rt-detr as well? #2

Open john09282922 opened 1 year ago

YiqunChen1999 commented 1 year ago

We don't conduct experiments on RT-DETR, but you can have a try.
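
For anyone who wants to have a try: a minimal sketch of the idea, assuming a generic DETR-style output dict. The output keys (`query_feats`, `pred_boxes`) and the `BoxRefiner` module below are hypothetical placeholders, not the actual RT-DETR or RefineBox API.

```python
import torch
import torch.nn as nn

class BoxRefiner(nn.Module):
    """Hypothetical RefineBox-style refiner: predicts per-box deltas
    from the query features and boxes of a frozen detector."""
    def __init__(self, feat_dim=256, hidden_dim=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + 4, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 4),  # (dcx, dcy, dw, dh)
        )

    def forward(self, query_feats, boxes):
        # query_feats: (B, K, C); boxes: (B, K, 4), normalized cxcywh
        deltas = self.mlp(torch.cat([query_feats, boxes], dim=-1))
        return boxes + deltas

# Assumed usage with a frozen RT-DETR whose forward returns a dict:
#   out = rtdetr(images)
#   refined = refiner(out["query_feats"], out["pred_boxes"])
```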

sdreamforchen commented 12 months ago

We don't conduct experiments on RT-DETR, but you can have a try.

I feel the architecture could be reworked so that it serves as an auxiliary head. That way the accuracy gain would come painlessly, at no inference cost. Not sure if that's right.

DuckJ commented 12 months ago

It would be great if changing the architecture into an auxiliary head could bring the gains. Adding the paper's two-stage, Cascade R-CNN-like refinement will certainly improve accuracy, but DETR is already slow at deployment and inference, and stacking the refine layers proposed in the paper on top hurts efficiency too much. Besides, the paper's approach does not improve recall, and for most DETR-family detectors low recall is arguably the bigger problem.

YiqunChen1999 commented 12 months ago

By "changing the architecture", do you mean using RT-DETR as the refinement network? I'll try that experiment when I have time.

Also, AR (average recall) does improve.

sdreamforchen commented 12 months ago

Hi, I feel it might be friendlier to use it as an auxiliary head.

YiqunChen1999 commented 12 months ago

Do you mean attaching the refinement network to DETR in parallel?

sdreamforchen commented 12 months ago

Yes. I think it's feasible and worth trying, and it would make the most of the authors' work.
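
A minimal sketch of the "parallel auxiliary head" idea being discussed, assuming a DETR whose forward returns an output dict: the refiner contributes only a training loss and is skipped at inference, so deployment cost stays that of the plain DETR. All names are illustrative, not the repo's API.

```python
import torch.nn as nn

class DETRWithAuxRefiner(nn.Module):
    """Wrap a base DETR with a refinement branch used as an
    auxiliary head: active in training, dropped at inference."""
    def __init__(self, detr, refiner):
        super().__init__()
        self.detr = detr
        self.refiner = refiner

    def forward(self, images):
        out = self.detr(images)  # assumed: {"pred_boxes", "query_feats", ...}
        if self.training:
            # Auxiliary branch: refined boxes get their own box loss,
            # which backpropagates into the shared DETR features.
            out["aux_refined_boxes"] = self.refiner(
                out["query_feats"], out["pred_boxes"]
            )
        return out  # at eval time only the plain DETR outputs remain
```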

YiqunChen1999 commented 12 months ago

Thanks a lot for the suggestion; I'll run the experiment when I have time. Let's keep in touch! 🤝

sdreamforchen commented 12 months ago

After testing, please share the source code and the comparison results. Haha, thanks.

DuckJ commented 12 months ago

By "changing the architecture", do you mean using RT-DETR as the refinement network? I'll try that experiment when I have time.

Also, AR (average recall) does improve.

Yes, my earlier understanding was wrong. Taking the high-score predicted boxes and refining them improves the regression, so at the same IoU threshold more boxes get recalled. One more question: for the boxes picked for refinement, does the paper directly inherit the label assignment from the DETR-style matching, without re-matching the boxes to the ground truth by IoU?
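
On the label-assignment question, a sketch of the "inherit the matching" option: the refinement loss reuses the (query, GT) index pairs produced by the base DETR's Hungarian matcher, with no IoU-based re-matching of the refined boxes. The interface follows the common DETR convention and is an assumption here, not a statement about the repo's code.

```python
import torch
import torch.nn.functional as F

def refine_loss_inherited(refined_boxes, gt_boxes, indices):
    """refined_boxes: (B, K, 4); gt_boxes: list of per-image (N_i, 4)
    tensors; indices: per-image (query_idx, gt_idx) LongTensor pairs
    taken directly from the detector's Hungarian matching."""
    losses = []
    for b, (q_idx, g_idx) in enumerate(indices):
        # Supervise each matched query's refined box with its GT box;
        # unmatched queries receive no box loss, as in DETR.
        losses.append(F.l1_loss(refined_boxes[b, q_idx], gt_boxes[b][g_idx]))
    return torch.stack(losses).mean()
```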

DuckJ commented 12 months ago

Do you mean attaching the refinement network to DETR in parallel?

What @sdreamforchen describes seems to have already been experimented with in the paper. As I understand it, the approach would be to train jointly with DETR, then at test time evaluate only whether DETR's own detection performance improves; the paper would read better with this experiment added. The paper attributes the accuracy drop of joint training, relative to freezing DETR and training only the RefineNet, to poor initialization of the latter. I'd suggest unfreezing all weights of the final RefineBox model, fine-tuning briefly, and then testing DETR's detection performance. It certainly won't be as good as with the refine module attached, but could it still improve over the original baseline? (See the sketch below.)
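
A sketch of the experiment proposed here, assuming the trained RefineBox model exposes its base detector as `model.detr` and that its forward returns a dict of losses in training mode (both assumptions): release all frozen weights, fine-tune jointly with a small learning rate, then evaluate the detector alone.

```python
import torch

def unfreeze_and_finetune(model, train_loader, epochs=2, lr=1e-5):
    """Release every parameter (RefineBox trains with the detector
    frozen), then briefly fine-tune the whole model jointly."""
    for p in model.parameters():
        p.requires_grad = True
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for images, targets in train_loader:
            loss_dict = model(images, targets)  # assumed: dict of losses
            loss = sum(loss_dict.values())
            opt.zero_grad()
            loss.backward()
            opt.step()
    # Afterwards, evaluate model.detr on its own to see whether the
    # base DETR improved over its original checkpoint.
    return model
```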

DuckJ commented 12 months ago

It would be great if a better auxiliary-head design could bring the gains.

sdreamforchen commented 12 months ago

I'd really like to test this later; Co-DETR also uses auxiliary heads. My focus is mainly on real-time networks.

sdreamforchen commented 12 months ago

The source code would probably need some detailed adjustments, or the backbone could be left unfrozen.

sdreamforchen commented 11 months ago

Hello. I read the paper again today and have a preliminary technical observation. For other DETRs, increasing K brings a sizeable accuracy gain, e.g. going from 300 to 500 to 900 queries. But with your RefineBox, K can be cut from 300 to 100 or even 30 while accuracy holds up very well. So my preliminary reading is: RefineBox makes each query's representation clearer and more compact, so a small K, roughly on the order of the number of objects, is enough; other DETRs, by contrast, may need multiple queries (along the K dimension) to express one proposal (assuming a single object here; in practice it may be multiple queries expressing multiple objects). Therefore:

1. This may be another very important contribution of the paper. I think it is a remarkable property (Deformable DETR also gains a lot from larger K), and it is a pity it is not called out in the paper. It would be worth analyzing further why the effect occurs; I suspect it is directly related to the description in your Training and Inference section.
2. An auxiliary head, or staged two-/multi-pass training, might further reduce the inference burden, or allow the refiner to be dropped at inference time. (This part is worth thinking about.)
3. Since the expressiveness of the selected queries can be increased, might this also push forward one-layer-decoder designs?
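
A small sketch of the K observation above, assuming standard DETR-style tensors: keep only the top-k scoring queries before refinement, which is the regime where the comment notes RefineBox stays accurate even at K = 30. Shapes and names are illustrative.

```python
import torch

def select_topk_queries(logits, boxes, query_feats, k=30):
    """logits: (B, K, num_classes); boxes: (B, K, 4);
    query_feats: (B, K, C). Keep the k best-scoring queries."""
    scores = logits.sigmoid().amax(dim=-1)   # (B, K): best class score
    top = scores.topk(k, dim=1).indices      # (B, k)

    def gather(t):  # gather along the query dimension
        idx = top.unsqueeze(-1).expand(-1, -1, t.size(-1))
        return t.gather(1, idx)

    return gather(boxes), gather(query_feats)
```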