GitHubOfHyl97 / SkeAttnCLR

The official PyTorch implementation of "Part Aware Contrastive Learning for Self-Supervised Action Recognition" (IJCAI 2023)

The fusion results of NTU 120 #4

Open JHang2020 opened 9 months ago

JHang2020 commented 9 months ago

Hi, thanks for the great work! I noticed that in the linear evaluation setting, the fusion results of the proposed method show a huge gain on NTU 120 (7+%), which is not observed on NTU 60. Meanwhile, for other methods, the performance gain from fusion is generally no more than 5%. This confuses me a lot, and I have not been able to reproduce it yet. What do you think might be causing this?

Thanks for your help!!!
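(For context, "fusion" in skeleton-based linear evaluation usually means combining the classification scores of separately trained streams, e.g. joint, bone, and motion. Below is a minimal sketch of such score-level fusion; the weighted softmax-averaging scheme and all names are illustrative assumptions, not the exact recipe from this repository.)

```python
import torch

def fuse_scores(stream_logits, weights=None):
    """Fuse per-modality logits by weighted averaging of softmax scores.

    stream_logits: dict mapping modality name -> (N, num_classes) tensor
    weights: optional dict mapping modality name -> float
    """
    if weights is None:
        weights = {name: 1.0 for name in stream_logits}
    # Average class probabilities across streams, then take the argmax.
    fused = sum(weights[name] * torch.softmax(logits, dim=-1)
                for name, logits in stream_logits.items())
    return fused.argmax(dim=-1)

# Example usage (tensors stand in for per-stream linear-probe outputs):
# preds = fuse_scores({"joint": joint_logits, "bone": bone_logits,
#                      "motion": motion_logits})
```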

GitHubOfHyl97 commented 9 months ago

Hello! Thank you for following my work. I think the reason may be that this work does not emphasize cross-modal consistency of the skeleton representations. As a result, under the attention mechanism, the encoders of the different modalities emphasize different local information, which makes the fusion operation more complementary. In addition, in our experiments the method improves most markedly on NTU 120, whose action classes are more fine-grained. Through t-SNE visualization, we found that the proposed method more readily clusters samples with strong local similarity together (see the supplementary material for details). My personal conclusion is that the proposed method achieves finer-grained representation learning by pulling locally similar samples closer together, and is therefore better suited to the more fine-grained classification task.
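(One way to probe this complementarity claim is to compare per-stream predictions directly: if the streams err on different samples, fusion can recover what each misses. The sketch below is a hypothetical diagnostic under that assumption; none of the names come from the SkeAttnCLR code.)

```python
import torch

def complementarity(pred_a, pred_b, labels):
    """pred_a, pred_b, labels: (N,) tensors of predicted / true class indices."""
    disagree = (pred_a != pred_b).float().mean()                       # prediction diversity
    oracle = ((pred_a == labels) | (pred_b == labels)).float().mean()  # ensemble upper bound
    return disagree.item(), oracle.item()
```

A high disagreement rate together with an oracle accuracy well above either single stream would indicate complementary errors, which is consistent with the large fusion gain reported on NTU 120.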


JHang2020 commented 9 months ago

Thanks for the reply!