codecaution / Awesome-Mixture-of-Experts-Papers
A curated reading list of research in Mixture-of-Experts (MoE).
Apache License 2.0 · 524 stars · 40 forks
Issues
#12  Recommendation of a new repo of awesome MoE: https://github.com/Oliver-FutureAI/Awesome-MoE (Oliver-FutureAI, opened 3 months ago, 0 comments)
#11  fix an arxiv link (Spico197, opened 1 year ago, 0 comments)
#10  Uni-Perceiver-MoE on NIPS2022 (Lechatelia, closed 1 year ago, 1 comment)
#9   Add 4 new papers (LQBDD, closed 2 years ago, 0 comments)
#8   Add 4 new papers (LQBDD, closed 2 years ago, 0 comments)
#7   FasterMoE paper on PPoPP'22 (laekov, closed 2 years ago, 1 comment)
#6   Three new papers about MoE (XueFuzhao, closed 2 years ago, 2 comments)
#5   Update README.md (LQBDD, closed 2 years ago, 0 comments)
#4   Add MoEfication & PLE (ZhengZixiang, closed 2 years ago, 1 comment)
#3   Adding 2 new papers (LQBDD, closed 2 years ago, 0 comments)
#2   Update README.md (LQBDD, closed 2 years ago, 0 comments)
#1   Update README.md (Hsword, closed 2 years ago, 0 comments)