Closed: lorr1 closed this pull request 3 years ago
Merging #101 (e5068a1) into master (3475115) will decrease coverage by 0.31%. The diff coverage is 28.57%.
```
@@            Coverage Diff             @@
##           master     #101      +/-   ##
==========================================
- Coverage   85.31%   85.00%   -0.32%
==========================================
  Files          40       40
  Lines        1955     1967      +12
  Branches      418      422       +4
==========================================
+ Hits         1668     1672       +4
- Misses        171      177       +6
- Partials      116      118       +2
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | 85.00% <28.57%> (-0.32%) :arrow_down: | |
Flags with carried forward coverage won't be shown.
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/emmental/learner.py | 67.51% <23.07%> (-1.47%) :arrow_down: | |
| src/emmental/utils/parse_args.py | 99.32% <100.00%> (+<0.01%) :arrow_up: | |
Description of the problems or issues
There is currently no flag to skip over already-seen batches in the data loader, so you cannot resume training at the same spot in the data where a previous run left off.
Description of the proposed changes
If the user provides steps_learned and has skip_learned_data turned on, I set start_step to 0 and the target train step to steps_learned. I then iterate over the batches without calling the model's forward pass until the train step reaches steps_learned, as sketched below.
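A minimal sketch of the skip logic described above, not the exact Emmental code: `steps_learned` and `skip_learned_data` are the options named in this PR, while the surrounding loop structure, argument names, and config access are assumptions for illustration.

```python
def train(model, dataloader, optimizer, config):
    """Sketch of a training loop that skips already-learned batches on resume."""
    total_steps = 0
    steps_learned = config.get("steps_learned", 0)
    skip_learned_data = config.get("skip_learned_data", False)

    for epoch in range(config["n_epochs"]):
        for batch in dataloader:
            if skip_learned_data and total_steps < steps_learned:
                # Consume the batch without running the model, so the data
                # loader advances to where the previous run stopped.
                total_steps += 1
                continue

            optimizer.zero_grad()
            loss = model(batch)  # forward pass only runs once past steps_learned
            loss.backward()
            optimizer.step()
            total_steps += 1
```

The design keeps the data loader's iteration order intact: batches before steps_learned are drawn and discarded rather than indexed around, so the resumed run sees exactly the batches the interrupted run would have seen next.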
Test plan
Updated the argument-parsing test so it passes with the new options. I have also manually tested the resume behavior and confirmed that the model continues training from the correct step when learned steps are set.
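For context, a hedged example of how the new options might be enabled when configuring training. The option names come from this PR, but their placement under `learner_config`, the example values, and the `emmental.init` / `Meta.update_config` setup are assumptions about the surrounding Emmental usage, not confirmed by this diff.

```python
import emmental
from emmental import Meta

emmental.init("logs")

# Assumed placement: the new flags sit alongside the other learner options.
Meta.update_config(
    config={
        "learner_config": {
            "n_epochs": 3,
            "skip_learned_data": True,  # flag added in this PR (name from the description)
            "steps_learned": 500,       # resume after 500 already-trained steps (hypothetical value)
        }
    }
)
```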
Checklist