xinghaochen / TinySAM

Official PyTorch implementation of "TinySAM: Pushing the Envelope for Efficient Segment Anything Model"
Apache License 2.0
403 stars 23 forks

Which part is knowledge distillation #16

Open skill-diver opened 10 months ago

skill-diver commented 10 months ago

Hello, Author,

Your paper is very nice, thanks for sharing. My confusion is that I can't find which part of the code performs the knowledge distillation.

shuh15 commented 9 months ago

Hi, the released code is for inference only; the knowledge distillation happens during the training process.
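For readers looking for what that training-time step would look like: below is a minimal, hypothetical sketch of a distillation loss in PyTorch, where a student's mask logits are trained to match a frozen teacher's (e.g. TinySAM learning from SAM). The MSE loss and the `distillation_loss` name are illustrative assumptions, not the authors' released training code, which uses a more elaborate full-stage distillation scheme described in the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(teacher_logits: torch.Tensor,
                      student_logits: torch.Tensor) -> torch.Tensor:
    """Simple distillation objective: match the teacher's mask logits.

    This is a generic MSE variant for illustration; the actual TinySAM
    training objective is defined in the paper, not in the released code.
    """
    # The teacher is frozen, so its logits carry no gradient.
    return F.mse_loss(student_logits, teacher_logits.detach())

# Toy example: fake mask logits of shape (batch, 1, H, W).
teacher_logits = torch.randn(2, 1, 64, 64)
student_logits = torch.randn(2, 1, 64, 64, requires_grad=True)

loss = distillation_loss(teacher_logits, student_logits)
loss.backward()  # gradients flow only into the student
```

In an actual training loop, `teacher_logits` would come from the full SAM model in `eval()` mode and `student_logits` from the lightweight student being optimized.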

code-inflation commented 9 months ago

Are there any plans to release the code used for training?

chankeh commented 7 months ago

Hi, have you found the distillation part? I have the same confusion.