Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

Add default optimizer/scheduler when instantiating + drop lr #193

Closed · SeanNaren closed this 3 years ago

SeanNaren commented 3 years ago

To better support using this package as a library, I've introduced a default AdamW optimizer and LR scheduler. This means you can instantiate the transformer and plug in values, rather than having to inherit and override configure_optimizers.

I've added a TODO to plug in a documentation link once the documentation has been added!
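
To illustrate the pattern (not the library's exact API at this commit), here is a minimal sketch of a LightningModule that ships a default AdamW optimizer plus a linear-warmup scheduler; the class name MyTaskTransformer and the keyword arguments lr, warmup_steps, and num_training_steps are assumptions for the example:

```python
import pytorch_lightning as pl
import torch
from transformers import AutoModelForSequenceClassification, get_linear_schedule_with_warmup


class MyTaskTransformer(pl.LightningModule):
    """Sketch of a task module with a built-in default optimizer/scheduler.

    Hyperparameter names (lr, warmup_steps, num_training_steps) are
    illustrative assumptions, not the lightning-transformers signature.
    """

    def __init__(self, pretrained_model_name: str, lr: float = 1e-5,
                 warmup_steps: int = 0, num_training_steps: int = 10_000):
        super().__init__()
        self.save_hyperparameters()
        self.model = AutoModelForSequenceClassification.from_pretrained(pretrained_model_name)

    def configure_optimizers(self):
        # Default AdamW over all parameters; callers tune it via __init__
        # kwargs instead of subclassing and overriding this method.
        optimizer = torch.optim.AdamW(self.parameters(), lr=self.hparams.lr)
        scheduler = get_linear_schedule_with_warmup(
            optimizer,
            num_warmup_steps=self.hparams.warmup_steps,
            num_training_steps=self.hparams.num_training_steps,
        )
        # Step the LR scheduler every optimizer step, not every epoch.
        return [optimizer], [{"scheduler": scheduler, "interval": "step"}]


# Plug values in at instantiation rather than inheriting:
model = MyTaskTransformer("bert-base-uncased", lr=2e-5, warmup_steps=100)
```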

codecov[bot] commented 3 years ago

Codecov Report

Merging #193 (90408bd) into master (bf45ddd) will increase coverage by 0.16%. The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master     #193      +/-   ##
==========================================
+ Coverage   90.04%   90.21%   +0.16%     
==========================================
  Files          71       71              
  Lines        1537     1543       +6     
==========================================
+ Hits         1384     1392       +8     
+ Misses        153      151       -2     
Flag        Coverage Δ
unittests   90.21% <100.00%> (+0.16%) ↑

Flags with carried forward coverage won't be shown.

Impacted Files                                           Coverage Δ
lightning_transformers/core/config.py                   100.00% <100.00%> (ø)
lightning_transformers/core/model.py                     93.93% <100.00%> (+2.27%) ↑
...lp/question_answering/datasets/squad/processing.py    84.13% <0.00%> (+0.68%) ↑

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update bf45ddd...90408bd.