Closed: PetrochukM closed this pull request 6 years ago.
Merging #29 into master will increase coverage by 0.03%. The diff coverage is 88.23%.
```diff
@@            Coverage Diff             @@
##           master      #29      +/-   ##
==========================================
+ Coverage   94.59%   94.63%   +0.03%
==========================================
  Files          55       55
  Lines        1536     1528       -8
==========================================
- Hits         1453     1446       -7
+ Misses         83       82       -1
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| torchnlp/nn/attention.py | 96.66% <ø> (ø) | :arrow_up: |
| torchnlp/utils.py | 93.06% <ø> (ø) | :arrow_up: |
| torchnlp/word_to_vector/pretrained_word_vectors.py | 78.82% <ø> (-0.25%) | :arrow_down: |
| torchnlp/nn/weight_drop.py | 100% <100%> (ø) | :arrow_up: |
| torchnlp/metrics/accuracy.py | 100% <100%> (ø) | :arrow_up: |
| torchnlp/nn/lock_dropout.py | 93.33% <100%> (+5.09%) | :arrow_up: |
| torchnlp/text_encoders/moses_encoder.py | 75% <66.66%> (-3.95%) | :arrow_down: |
| torchnlp/nn/sru.py | 98.91% <83.33%> (-0.01%) | :arrow_down: |
Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update 9fc48f5...00f043d.
This PR adds PyTorch 0.4 support.

Updates (see the migration sketch after this list):

- Removed `Variable` where appropriate
- Switched to the `item()` API
- Removed `volatile`
- Migrated `.new` to the `tensor.new_*` API
- Renamed `clip_grad_norm` to `clip_grad_norm_`
- Renamed `constant` to `constant_`
- Migrated the `Variable(tensor.data)` pattern to `.detach()`
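
In case it helps reviewers, here is a minimal before/after sketch of the 0.3.x-to-0.4 patterns listed above. The `nn.Linear` model and the tensors are placeholders for illustration, not code from this PR.

```python
import torch
import torch.nn as nn
from torch.nn.utils import clip_grad_norm_  # 0.3.x name: clip_grad_norm

model = nn.Linear(10, 2)
x = torch.randn(4, 10)
loss = model(x).sum()

# 0.3.x: loss.data[0]               -> 0.4: loss.item()
loss_value = loss.item()

# 0.3.x: Variable(tensor.data)      -> 0.4: tensor.detach()
detached = loss.detach()

# 0.3.x: clip_grad_norm(...)        -> 0.4: clip_grad_norm_(...)
loss.backward()
clip_grad_norm_(model.parameters(), max_norm=1.0)

# 0.3.x: Variable(x, volatile=True) -> 0.4: torch.no_grad()
with torch.no_grad():
    _ = model(x)

# 0.3.x: x.new(4, 10).zero_()       -> 0.4: x.new_zeros(...)
zeros = x.new_zeros((4, 10))

# 0.3.x: nn.init.constant(...)      -> 0.4: nn.init.constant_(...)
nn.init.constant_(model.bias, 0.0)
```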
Other:

- Removed `embedded_dropout` from the examples. It relied on old PyTorch 0.3.1 private API features and was not relevant to using PyTorch-NLP.
- Updated the `codecov` configuration to allow a small decrease in code coverage.
- Skipped tests against an unsafe dataset host with `@pytest.mark.skip(reason="Unsafe dataset host (SSL: CERTIFICATE_VERIFY_FAILED)")` (applied as shown below).
- Added `sacremoses` to fix the NLTK Moses dependency.
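
The skip marker above is a plain pytest decorator; a minimal sketch, with a hypothetical test name standing in for the actual tests:

```python
import pytest

# Hypothetical test name for illustration; the real skipped tests live in the
# PyTorch-NLP test suite and download from a host with a broken SSL certificate.
@pytest.mark.skip(reason="Unsafe dataset host (SSL: CERTIFICATE_VERIFY_FAILED)")
def test_download_dataset():
    ...
```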