RafaelDavisH / news-translation-tasks


[Auto](How Does Knowledge Distillation Work in Deep Learning Models?) #9

Closed RafaelDavisH closed 2 months ago

RafaelDavisH commented 2 months ago

name: Translation task (auto-crawled)
about: "Add a new article awaiting translation, crawled automatically via GitHub Actions."
title: "[Auto](How Does Knowledge Distillation Work in Deep Learning Models?)"
labels: Translation-needed
assignees: ''

How Does Knowledge Distillation Work in Deep Learning Models?