Zhangyong-Tang / GMMT-AAAI2024

The official implementation of the AAAI2024 paper "Generative-based Fusion Mechanism for Multi-modal Object Tracking".

About the training steps #4

Open · LiShenglana opened this issue 2 weeks ago

LiShenglana commented 2 weeks ago

Hello, this paper mentions training the fusion module and tracking head during the first phase of training, training the diffusion model and a new tracking head during the second phase, and discarding the first-phase fusion module and tracking head during testing. Could you tell me whether directly discarding the first-stage tracking head will affect the test results, and whether it is possible to use the same tracking head in both the first and second stages? Looking forward to your reply. Thank you very much!
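
For readers following the question, here is a minimal sketch of the two-stage scheme as described above, assuming a PyTorch setup; the module names (`FusionModule`, `DiffusionFusion`, `TrackingHead`), feature dimensions, and optimizer settings are hypothetical placeholders, not the repository's actual code:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the paper's components; the repository defines
# its own architectures, so these placeholders are only for illustration.
class FusionModule(nn.Module):          # stage-1 fusion (discarded at test time)
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(256, 256)

    def forward(self, x):
        return self.net(x)

class DiffusionFusion(nn.Module):       # stage-2 generative (diffusion-based) fusion
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(256, 256)

    def forward(self, x):
        return self.net(x)

class TrackingHead(nn.Module):          # predicts box parameters from fused features
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(256, 4)

    def forward(self, x):
        return self.net(x)

fusion, head_s1 = FusionModule(), TrackingHead()        # stage 1 (discarded at test time)
diffusion, head_s2 = DiffusionFusion(), TrackingHead()  # stage 2 (kept for inference)

# Stage 1: optimize the fusion module together with its own tracking head.
opt_stage1 = torch.optim.AdamW(
    list(fusion.parameters()) + list(head_s1.parameters()), lr=1e-4)

# Stage 2: optimize the diffusion model and a new tracking head; the stage-1
# fusion module and head receive no further updates and are dropped at test time.
opt_stage2 = torch.optim.AdamW(
    list(diffusion.parameters()) + list(head_s2.parameters()), lr=1e-4)
```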

Zhangyong-Tang commented 2 weeks ago

Q1: Will directly discarding the first-stage tracking head affect the test results?
A1: There will be supervision for the fusion. But you can try training only the fusion module and freezing the tracking head if pre-trained parameters are available.

Q2: Is it possible to use the same tracking head in the first and second stages?
A2: From my perspective, it is possible. However, the latent output domain of the original fusion module and that of the diffusion-based fusion module may differ, so it is better to use different tracking heads.
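
A minimal sketch of the freezing suggestion in A1, again assuming PyTorch with hypothetical placeholder modules (the tracking head is assumed to already carry pre-trained weights); this is not the repository's code:

```python
import torch
import torch.nn as nn

# Hypothetical placeholders: a fusion module to be trained and a tracking
# head assumed to already carry pre-trained parameters.
fusion = nn.Linear(256, 256)   # placeholder fusion module
head = nn.Linear(256, 4)       # placeholder pre-trained tracking head

# Freeze the tracking head: it still supervises the fusion output through
# its predictions, but its own weights are not updated.
for p in head.parameters():
    p.requires_grad = False
head.eval()

# Only the fusion module's parameters are optimized.
optimizer = torch.optim.AdamW(fusion.parameters(), lr=1e-4)

features = torch.randn(8, 256)   # dummy fused features
targets = torch.randn(8, 4)      # dummy box targets
loss = nn.functional.l1_loss(head(fusion(features)), targets)
loss.backward()
optimizer.step()
```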

LiShenglana commented 2 weeks ago


Thank you very much for your reply!

Zhangyong-Tang commented 2 weeks ago

Sorry for the error in my previous reply. A1 should read: "There will be no supervision for the fusion..."; the rest of the answer is unchanged.

Many thanks for your attention!