SimiaoZuo/MoEBERT
This PyTorch package implements MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation (NAACL 2022).
Apache License 2.0 · 97 stars · 13 forks
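The package converts the feed-forward blocks of a pre-trained BERT into mixture-of-experts layers. Below is a minimal sketch of a token-level, top-1 gated MoE feed-forward layer for orientation only; the class name `MoEFeedForward`, the default sizes, and the routing loop are illustrative assumptions, not the repository's actual API or the paper's importance-guided adaptation procedure.

```python
# Illustrative sketch: a top-1 gated MoE feed-forward layer (hypothetical names,
# not MoEBERT's actual implementation).
import torch
import torch.nn as nn


class MoEFeedForward(nn.Module):
    def __init__(self, hidden_size=768, intermediate_size=3072, num_experts=4):
        super().__init__()
        # Each expert is a BERT-style FFN: Linear -> GELU -> Linear.
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(hidden_size, intermediate_size),
                nn.GELU(),
                nn.Linear(intermediate_size, hidden_size),
            )
            for _ in range(num_experts)
        ])
        # The gate scores each token and routes it to exactly one expert (top-1).
        self.gate = nn.Linear(hidden_size, num_experts)

    def forward(self, hidden_states):
        # hidden_states: (batch, seq_len, hidden_size)
        batch, seq_len, hidden = hidden_states.shape
        flat = hidden_states.reshape(-1, hidden)        # (tokens, hidden)
        gate_logits = self.gate(flat)                   # (tokens, num_experts)
        expert_idx = gate_logits.argmax(dim=-1)         # top-1 routing decision
        output = torch.zeros_like(flat)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i
            if mask.any():
                # Only the tokens routed to expert i pass through it.
                output[mask] = expert(flat[mask])
        return output.reshape(batch, seq_len, hidden)


# Usage: drop-in replacement for the FFN block of a transformer layer.
layer = MoEFeedForward()
x = torch.randn(2, 128, 768)
print(layer(x).shape)  # torch.Size([2, 128, 768])
```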
Issues
#6  Parameters are not shared in experts (tairan-w, opened 2 years ago, 0 comments)
#5  What is the bash script for fine-tuning without MoE? (CaffreyR, opened 2 years ago, 0 comments)
#4  Error when running `bash bert_base_mnli_example.sh` (CaffreyR, opened 2 years ago, 0 comments)
#3  What is the performance difference between the token gate and the sentence gate? (GeneZC, opened 2 years ago, 0 comments)
"Need to turn the model to a MoE first" error
#2
Harry-zzh
opened
2 years ago
5
#1  Should the model for the target task be fine-tuned from BERT or from MoEBERT? (LisaWang0306, closed 2 years ago, 3 comments)