tianyang-x / Mixture-of-Domain-Adapters
Codebase for ACL 2023 paper "Mixture-of-Domain-Adapters: Decoupling and Injecting Domain Knowledge to Pre-trained Language Models' Memories"
MIT License · 46 stars · 1 fork
Issues
#11 · Problem with package installation
by truongchau2602 · opened 4 months ago · 0 comments
#10 · Question regarding the baselines
by JohannaOm · opened 5 months ago · 1 comment
#9 · About the PEFT fine-tuning code used in the baselines
by lhyscau · opened 10 months ago · 1 comment
#8 · UserWarning: Total length of `CombinedLoader` across ranks is zero. Please make sure this was your intention.
by txchen-USTC · opened 1 year ago · 3 comments
#7 · Running stage_one_pretrain.sh raises an error
by txchen-USTC · closed 1 year ago · 0 comments
#6 · Cannot train domain adapter with unstructured knowledge
by nguyentuc · closed 10 months ago · 4 comments
#5 · Stage 2 does not seem to load the stage 1 weights; the keys in the state_dict are different
by JachinLin2022 · closed 10 months ago · 2 comments
#4 · Thanks for sharing the code! Why are adapters only added at layers 7 and 11?
by JachinLin2022 · closed 10 months ago · 1 comment
#3 · Domain Adapter Training
by DopamineLcy · closed 1 year ago · 2 comments
#2 · Does this model support Chinese?
by scofield687 · closed 10 months ago · 1 comment
#1 · Can we do this with the LLAMA model?
by kiran1501 · closed 10 months ago · 1 comment