Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

Set DDP As default, allowing trainer.gpus=2 without specifying ddp #164

Closed SeanNaren closed 3 years ago

SeanNaren commented 3 years ago

Fixes #147

This does bring up the issue that there is no multi-GPU support within notebooks; however, given the intricacies of the scheduler, I'd rather ship this fix until there is demand.
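The behavior described here (defaulting to DDP whenever more than one GPU is requested, so `trainer.gpus=2` works without naming a backend) can be sketched as a small selection function. This is a minimal illustration only; the helper name `resolve_accelerator` is hypothetical, and the real change lives in lightning-transformers' trainer configuration:

```python
from typing import Optional


def resolve_accelerator(gpus: Optional[int],
                        accelerator: Optional[str] = None) -> Optional[str]:
    """Hypothetical sketch of the PR's default-selection logic.

    If the user asks for multiple GPUs but does not name a distributed
    backend, fall back to DDP; an explicit choice is always respected.
    """
    if accelerator is None and gpus is not None and gpus > 1:
        return "ddp"  # assumed default: DistributedDataParallel
    return accelerator


# Multi-GPU with no explicit backend picks up DDP:
print(resolve_accelerator(2))        # ddp
# Single GPU needs no distributed backend:
print(resolve_accelerator(1))        # None
# An explicit user choice is never overridden:
print(resolve_accelerator(2, "dp"))  # dp
```

Note that DDP spawns one process per device, which is why it does not work inside notebooks — the limitation SeanNaren accepts above.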

codecov[bot] commented 3 years ago

Codecov Report

Merging #164 (23e954c) into master (9dd2f10) will not change coverage. The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #164   +/-   ##
=======================================
  Coverage   84.60%   84.60%           
=======================================
  Files          66       66           
  Lines        1397     1397           
=======================================
  Hits         1182     1182           
  Misses        215      215           
Flag       Coverage Δ
unittests  84.60% <ø> (ø)




Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 9dd2f10...23e954c.