RunpeiDong / PointDistiller

[CVPR 2023] PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
https://arxiv.org/abs/2205.11098
MIT License

channel number in teacher and student #3

Closed chyohoo closed 1 year ago

chyohoo commented 1 year ago

Hello, thank you for your wonderful work. I was wondering how you handle the channel-number difference between teacher and student when computing the KD loss. Also, what kind of loss is used for KD?
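For readers waiting on the code release: a common way to bridge a teacher/student channel mismatch in feature-level KD is a learnable projection (a 1x1 conv in practice) that maps student channels to teacher channels before an MSE loss. Whether PointDistiller does exactly this is not confirmed in this thread; the sketch below is an assumption, with `W_align` a hypothetical projection matrix.

```python
import numpy as np

def kd_feature_loss(f_student, f_teacher, W_align):
    """MSE KD loss after projecting student features to teacher width.

    f_student: (N, C_s) per-point student features
    f_teacher: (N, C_t) per-point teacher features
    W_align:   (C_s, C_t) learnable projection (equivalent to a 1x1 conv)
    """
    f_proj = f_student @ W_align   # (N, C_t): align channel dimensions
    diff = f_proj - f_teacher
    return np.mean(diff ** 2)     # mean squared error over all entries

# toy example: 64-d student, 128-d teacher, 1024 points
rng = np.random.default_rng(0)
f_s = rng.standard_normal((1024, 64))
f_t = rng.standard_normal((1024, 128))
W = rng.standard_normal((64, 128)) * 0.01
loss = kd_feature_loss(f_s, f_t, W)
```

In a training framework `W_align` would be a trainable parameter optimized jointly with the student, and the MSE could be swapped for smooth-L1 or an attention-weighted variant.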

RunpeiDong commented 1 year ago

Hi @chyohoo,

Thanks for your interest in our work.

We plan to release the code after the camera-ready paper is available online; the details will be shown there.

chyohoo commented 1 year ago

Which layer is used for distillation, or do you distill over all layers in the backbone? You mentioned dynamic graph convolution: is KNN applied in feature space, in the same way as in the DGCNN paper, or solely in Euclidean space?
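For context on the question: in DGCNN the k-NN graph is built dynamically in feature space and recomputed per layer, whereas a Euclidean-space graph is built once from the xyz coordinates. How PointDistiller builds its graph is not answered in this thread; the following is only a minimal sketch of the DGCNN-style construction.

```python
import numpy as np

def knn_graph(features, k):
    """Return the k nearest neighbors of each point in feature space.

    features: (N, C) array. Distances are Euclidean in this C-dim
    feature space, so the graph changes as features change
    (DGCNN-style "dynamic" graph). Passing xyz coordinates instead
    yields a static Euclidean-space graph.
    """
    # pairwise squared distances via |a|^2 + |b|^2 - 2 a.b
    sq = np.sum(features ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * features @ features.T
    np.fill_diagonal(d2, np.inf)          # exclude self-matches
    return np.argsort(d2, axis=1)[:, :k]  # (N, k) neighbor indices
```

The same function serves both cases: call it on intermediate features for a dynamic graph, or on the raw point coordinates for a fixed one.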

RunpeiDong commented 1 year ago

Hi @chyohoo,

Hope this helps.

chyohoo commented 1 year ago

I appreciate your reply.