Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

Set the effective block size to None in the default config, fix log message #160

Closed SeanNaren closed 3 years ago

SeanNaren commented 3 years ago

Should fix #159
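For context, a minimal sketch of the "effective block size" fallback this PR is about. The names (`effective_block_size`, the 1024 cap) follow the common Hugging Face language-modeling convention of deriving the chunk size from the tokenizer when `block_size` is `None`; they are assumptions for illustration, not necessarily this repository's exact code.

```python
def effective_block_size(block_size, tokenizer_max_length, cap=1024):
    """Resolve the block size used to chunk text for language modeling.

    With block_size=None as the default (the change in this PR), fall back
    to the tokenizer's model_max_length, capped so huge sentinel values
    (some tokenizers report e.g. 1e30) don't blow up memory.
    """
    if block_size is None:
        return min(tokenizer_max_length, cap)
    if block_size > tokenizer_max_length:
        # Warn and clamp (the log message this PR also fixes).
        print(
            f"block_size ({block_size}) is larger than the maximum length "
            f"the model supports ({tokenizer_max_length}); "
            f"using {tokenizer_max_length} instead."
        )
        return tokenizer_max_length
    return block_size
```

With this shape, `effective_block_size(None, 512)` resolves to 512, while an oversized explicit value such as `effective_block_size(2048, 512)` is clamped to 512 with a warning.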

codecov[bot] commented 3 years ago

Codecov Report

Merging #160 (59c143c) into master (e84601b) will decrease coverage by 0.13%. The diff coverage is n/a.


@@            Coverage Diff             @@
##           master     #160      +/-   ##
==========================================
- Coverage   84.73%   84.60%   -0.13%     
==========================================
  Files          66       66              
  Lines        1395     1397       +2     
==========================================
  Hits         1182     1182              
- Misses        213      215       +2     
Flag        Coverage Δ
unittests   84.60% <ø> (-0.13%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                                          Coverage Δ
...ng_transformers/task/nlp/language_modeling/data.py   92.85% <ø> (ø)
lightning_transformers/core/instantiator.py             75.92% <0.00%> (-2.93%) ⬇️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Powered by Codecov. Last update e84601b...59c143c.