mpc001 / Lipreading_using_Temporal_Convolutional_Networks

ICASSP'22 Training Strategies for Improved Lip-Reading; ICASSP'21 Towards Practical Lipreading with Distilled and Efficient Models; ICASSP'20 Lipreading using Temporal Convolutional Networks

The code for the Knowledge Distillation loss #65

Open BaochaoZhu opened 4 weeks ago

BaochaoZhu commented 4 weeks ago

Hi.

Have you published the code for the Knowledge Distillation loss? I couldn't find it in the repository. If it isn't included, could you please release it?

Thanks
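
For reference, below is a minimal sketch of a standard knowledge-distillation loss (soft-target KL term at a temperature plus hard-label cross-entropy, as in Hinton et al.). This is not the authors' released implementation from the ICASSP'21 paper; the function name, temperature, and weighting factor are hypothetical placeholders.

```python
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, labels, temperature=3.0, alpha=0.9):
    """Sketch of a generic KD loss: softened teacher/student KL + hard-label CE.

    NOTE: not the repo's official loss; temperature and alpha are illustrative.
    """
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)

    # KL divergence between softened distributions, rescaled by T^2 as is customary.
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2

    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * distill + (1.0 - alpha) * ce
```

A usage sketch: compute `teacher_logits` with the frozen teacher under `torch.no_grad()`, compute `student_logits` with the student, and backpropagate only through the student.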

kyrie0520 commented 4 weeks ago


Hello, have you been able to run this project? If you can run it, could you give me some guidance?