Tebmer / Awesome-Knowledge-Distillation-of-LLMs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". It breaks KD down into Knowledge Elicitation and Distillation Algorithms, and explores Skill Distillation and Vertical Distillation of LLMs.
573 stars · 35 forks
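Several of the issues listed below concern distillation objectives (e.g., #5 on Kullback-Leibler divergence in KD, #4 on a cross-tokenizer logit distillation loss). For orientation, here is a minimal, generic sketch of the classic temperature-scaled KD loss in PyTorch; the function name `kd_loss` and the default temperature are illustrative placeholders, not taken from this repository or any specific surveyed paper.

```python
# Minimal sketch of token-level knowledge distillation for language models.
# Assumes PyTorch; shapes and the temperature default are illustrative.
import torch
import torch.nn.functional as F

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            temperature: float = 2.0) -> torch.Tensor:
    """Forward KL(teacher || student) over next-token distributions.

    Both logit tensors have shape (batch, seq_len, vocab_size).
    """
    t = temperature
    # Soften both distributions with the same temperature.
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # F.kl_div expects log-probs as input and probs as target, and computes
    # KL(target || input). The t**2 factor keeps gradient magnitudes
    # comparable across temperatures (Hinton et al., 2015).
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * (t ** 2)
```

Variants discussed in the issues below (e.g., reverse KL, or cross-tokenizer losses that avoid assuming a shared vocabulary) replace this objective; see the linked papers for details.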
Issues
#8 · Update README.md · OpZest · opened 3 months ago · 0 comments
#7 · Further distillation papers to consider · begab · closed 5 months ago · 1 comment
#6 · [One paper] A new way to perform KD (verified on BERT compression) · wutaiqiang · closed 5 months ago · 2 comments
#5 · [One latest paper] Rethinking Kullback-Leibler Divergence in Knowledge Distillation for Large Language Models · wutaiqiang · closed 5 months ago · 3 comments
#4 · Missing paper (Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs) · Nicolas-BZRD · closed 7 months ago · 1 comment
#3 · Request for adding a reference · youganglyu · closed 7 months ago · 1 comment
#2 · Update README.md · alphadl · closed 7 months ago · 1 comment
#1 · Fix typos · alphadl · closed 7 months ago · 1 comment