Wanted to make an issue for this instead of constantly asking in Discord.
I saw the other ticket for multi-GPU fp16 training, which is also nice. But DDP would let users scale training that currently fits on a single GPU out to multiple GPUs for a near-linear speedup.
@nivibilla We're actively working to prep for a later release that will most likely bring DDP to the OSS version :) We're still figuring out licensing and distribution methods, so we're sorting that out first :)