mosaicml/llm-foundry
LLM training code for Databricks foundation models
https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm
Apache License 2.0
Configurable submesh #1236
Closed

dakinggg closed 4 months ago

dakinggg commented 4 months ago
Makes the submesh for MoE + FSDP more configurable
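The PR description is terse, so as a hypothetical illustration only: a "configurable submesh" for MoE + FSDP typically means letting the user choose how the flat set of ranks is factored into a 2-D (replicate, shard) device mesh instead of hard-coding that split. The sketch below derives such a shape from a configurable degree; the function name and parameters are assumptions for illustration, not llm-foundry's actual API.

```python
def moe_submesh_shape(world_size: int, expert_parallel_degree: int) -> tuple[int, int]:
    """Hypothetical helper: factor a flat world size into a 2-D
    (replicate, shard) submesh shape, where the shard dimension is
    the user-configured degree and the replicate dimension is whatever
    remains."""
    if world_size % expert_parallel_degree != 0:
        raise ValueError(
            f"world_size {world_size} is not divisible by "
            f"expert_parallel_degree {expert_parallel_degree}"
        )
    return (world_size // expert_parallel_degree, expert_parallel_degree)


# e.g. 8 ranks with a configured degree of 2 -> a (4, 2) submesh
print(moe_submesh_shape(8, 2))
```

In a real setup, a shape like this would be passed to something such as PyTorch's `torch.distributed.device_mesh.init_device_mesh` to build the mesh FSDP shards over, with the other dimension used for replication or expert parallelism; the exact wiring in this PR is not shown in the description.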