TUDB-Labs/MoE-PEFT
An Efficient LLM Fine-Tuning Factory Optimized for MoE PEFT
Apache License 2.0
19 stars · 4 forks
Issues
#6 Question about the router loss of MixLoRA · opened by LouisDo2108 7 hours ago · 1 comment
#5 [bugfix] fix launch.py · by mikecovlee, closed 1 week ago · 0 comments
#4 Adding validation. · by loadingyy, closed 6 days ago · 2 comments
#3 Support for Multi-GPU Training · opened by kekethu 3 weeks ago · 1 comment
#2 [feature] sync patches from transformers, support LongRoPE for Phi3 models · by mikecovlee, closed 3 weeks ago · 0 comments
#1 [bugfix] fix bug of loading datasets and router loss of MixLoRA · by mikecovlee, closed 3 weeks ago · 0 comments