Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

Set correct plugin for deepspeed general trainer config #163

Closed: SeanNaren closed this 3 years ago

SeanNaren commented 3 years ago

Closes #161

I may remove this in the future and instead suggest that people use the plugins directly, since this config is just added complication; the same goes for sharded.yaml.
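
For context, a minimal sketch of what using the plugin directly could look like, instead of going through a wrapper trainer config. The Trainer arguments and plugin class below follow PyTorch Lightning of this PR's era and are assumptions for illustration, not code from this repository:

```python
# Illustrative sketch only: passing the DeepSpeed plugin to the Trainer
# directly rather than via a deepspeed.yaml trainer config. API names
# follow PyTorch Lightning ~1.2/1.3 and are assumptions.
import pytorch_lightning as pl
from pytorch_lightning.plugins import DeepSpeedPlugin

trainer = pl.Trainer(
    gpus=2,
    precision=16,                      # DeepSpeed is typically run in fp16
    plugins=DeepSpeedPlugin(stage=2),  # or the string shorthand plugins="deepspeed"
)
```

If direct usage like this is straightforward enough, a wrapper config that only sets the plugin adds little beyond the one-line setting it expresses, which is the complication the comment refers to.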

codecov[bot] commented 3 years ago

Codecov Report

Merging #163 (c5cc52a) into master (7704b32) will not change coverage. The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #163   +/-   ##
=======================================
  Coverage   84.60%   84.60%           
=======================================
  Files          66       66           
  Lines        1397     1397           
=======================================
  Hits         1182     1182           
  Misses        215      215           
| Flag | Coverage Δ |
|------|------------|
| unittests | 84.60% <ø> (ø) |

Flags with carried forward coverage won't be shown.


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 7704b32...c5cc52a.