Lightning-AI / litgpt
Pretrain, finetune, deploy 20+ LLMs on your own data. Uses state-of-the-art techniques: flash attention, FSDP, 4-bit, LoRA, and more.
https://lightning.ai
Add Mixtral MoE to README #1365 (Closed)
lantiga closed this 3 weeks ago
lantiga commented 3 weeks ago:
Mixtral MoE was missing from README.md