mosaicml / streaming

A Data Streaming Library for Efficient Neural Network Training
https://streaming.docs.mosaicml.com
Apache License 2.0

Pipeline Parallelism (Supported? How to?) #827

Open casper-hansen opened 1 week ago

casper-hansen commented 1 week ago

🚀 Feature Request

Supporting TP and SP seems quite easy to do with the `replication` parameter:

replication = tp * sp

I have tried various ways to enable PP without success. I tried folding pp into the equation when computing `replication` and `num_canonical_nodes`, but every variant produces an unexpectedly high loss.
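To make the attempt concrete, here is a minimal sketch of the two replication calculations being compared. The parallelism degrees below are hypothetical example values (not taken from the issue), and the `StreamingDataset` call is shown only as a comment since the rest of the configuration depends on the setup:

```python
# Hypothetical example degrees for a 4D-parallel run (not from the issue).
tp = 2  # tensor-parallel degree
sp = 2  # sequence-parallel degree
pp = 2  # pipeline-parallel degree (the problematic case)

# For TP + SP, ranks within one TP*SP group should consume identical samples,
# so the dataset is told to replicate each sample across that group:
replication = tp * sp  # -> 4

# The failing attempt folds PP into the same product, so that every rank in a
# TP*SP*PP group sees the same data:
replication_with_pp = tp * sp * pp  # -> 8

# The value would then be passed to the dataset, e.g.
#   dataset = StreamingDataset(..., replication=replication)
# possibly alongside an adjusted num_canonical_nodes.
print(replication, replication_with_pp)
```

Whether PP ranks should actually share samples depends on how the pipeline schedule feeds microbatches to the first stage, which may be why simply multiplying by `pp` does not behave like the TP/SP case.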

Motivation

I want to use the mosaicml streaming library with 4D parallel. Specifically, I rely on TorchTitan as my training tool and have simply swapped in the mosaicml streaming library by modifying the StreamingTextDataset implementation from LLM Foundry.

ethantang-db commented 6 days ago

We can look into this in more detail. In the meantime, have you tried using mosaicml/composer for training? Are there specific features you are relying on in TorchTitan?

casper-hansen commented 6 days ago

I would really appreciate it if you could look into it! TorchTitan uses `torch.distributed.pipelining`, most of which is only available from PyTorch 2.5.0 or in nightly builds.

There are many key features, such as FSDP2, 4D parallelism, FP8, and torch.compile, that make LLaMa models scale well in pretraining. You also get full control over the training loop, which is desirable if you want to experiment.