Gardlin / PEAL

[CVPR2023] PEAL: Prior-embedded Explicit Attention Learning for Low-overlap Point Cloud Registration

About the One-Way attention #5

Open Fzuerzmj opened 1 year ago

Fzuerzmj commented 1 year ago

Thank you for your nice work!

I would like to ask a question about the one-way attention in the paper. Why is the one-way attention mechanism applied only to intra-frame attention and not to the inter-frame cross-attention? Have you run any experiments on this? Just a little confused!

Gardlin commented 1 year ago

Hi, thanks for your interest in our work. In our paper, we didn't evaluate the inter-frame one-way attention method, focusing instead on the more meaningful global inter-frame cross-attention approach.

Cross-attention is a pivotal technique for exchanging features between the two point clouds, enabling the model to explore feature similarities between the source and target. Global inter-frame cross-attention is therefore the more insightful choice. In contrast, I think applying one-way attention inter-frame would introduce superfluous computational overhead for limited benefit. Feel free to explore other variants.
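To make the distinction concrete, here is a minimal NumPy sketch (not PEAL's actual implementation; shapes and names are illustrative) contrasting intra-frame self-attention, where a frame attends over its own features, with global inter-frame cross-attention, where each source point attends over all target features and vice versa:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention: each query row attends over all key rows.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
src = rng.normal(size=(5, 8))  # source point-cloud features (5 points, dim 8)
tgt = rng.normal(size=(6, 8))  # target point-cloud features (6 points, dim 8)

# Intra-frame (self) attention: queries, keys, and values all come from
# the same frame, so each point aggregates context within its own cloud.
src_intra = attention(src, src, src)

# Global inter-frame cross-attention: source queries attend over target
# keys/values (and vice versa), so features are exchanged across clouds
# and source-target similarities can be discovered.
src_cross = attention(src, tgt, tgt)
tgt_cross = attention(tgt, src, src)
```

In both cases the output keeps the shape of the query frame; the difference is only which frame supplies the keys and values.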