Hi, thank you for the research. I have a question about my project. I am considering applying knowledge distillation to LightGlue in order to replace its self-attention and cross-attention layers with simpler operations, such as convolutional layers or multi-layer perceptrons. A rough sketch of what I mean is below. Has anyone explored this direction?
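To make the idea concrete, here is a minimal sketch of feature-level distillation: a frozen attention block acts as the teacher and a per-keypoint MLP as the student, trained to imitate the teacher's output features. All module and parameter names here are hypothetical stand-ins, not LightGlue's actual API; a real setup would load the pretrained LightGlue weights and handle cross-attention (which mixes information across the two images) with something richer than a pointwise MLP.

```python
# Hypothetical sketch: distill a frozen attention block into a simpler MLP student.
import torch
import torch.nn as nn

class TeacherSelfAttention(nn.Module):
    """Stand-in for one self-attention block of the teacher (kept frozen)."""
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(x, x, x)  # query = key = value = keypoint descriptors
        return out

class StudentMLP(nn.Module):
    """Simpler replacement: a per-keypoint MLP (no token mixing)."""
    def __init__(self, dim: int = 256, hidden: int = 512):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim, hidden), nn.GELU(), nn.Linear(hidden, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.mlp(x)

teacher = TeacherSelfAttention().eval()   # pretrained weights would be loaded here
student = StudentMLP()
opt = torch.optim.Adam(student.parameters(), lr=1e-4)

# Dummy descriptors: batch of 2 images, 512 keypoints, 256-dim features.
desc = torch.randn(2, 512, 256)

with torch.no_grad():
    target = teacher(desc)                # teacher features to imitate

pred = student(desc)
loss = nn.functional.mse_loss(pred, target)  # feature-level distillation loss
loss.backward()
opt.step()
```

In practice I imagine combining this per-layer feature loss with the original matching loss on the final assignment, since a pointwise MLP alone cannot reproduce the cross-keypoint interactions that attention provides.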