Closed: Alexander-Barth closed this pull request 2 years ago.
Merging #93 (cc3960b) into master (7a7a25a) will not change coverage. The diff coverage is 100.00%.
:exclamation: Current head cc3960b differs from pull request most recent head bc444fc. Consider uploading reports for the commit bc444fc to get more accurate results.
@@           Coverage Diff           @@
##           master      #93   +/-  ##
=======================================
  Coverage   22.05%   22.05%
=======================================
  Files          69       69
  Lines        3378     3378
=======================================
  Hits          745      745
  Misses       2633     2633
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/basic/embeds/position_embed.jl | 70.27% <100.00%> (ø) | |
Continue to review full report at Codecov.
Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 7a7a25a...bc444fc. Read the comment docs.
Thanks!
This PR addresses issue #92.
By temporarily setting n to 100, I get the same values as shown in [1]
(but this PR does not touch n and keeps its original value of n = 1e4 😀).
[1] https://machinelearningmastery.com/a-gentle-introduction-to-positional-encoding-in-transformer-models-part-1/
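For reference, the sinusoidal positional encoding discussed in [1] can be sketched as below. This is an illustrative Python/NumPy sketch only, not the Julia code in src/basic/embeds/position_embed.jl; the function name `positional_encoding` is hypothetical. With the default n = 1e4 it follows the standard Transformer formula, and with n = 100 it reproduces the values tabulated in [1]:

```python
import numpy as np

def positional_encoding(seq_len, d, n=1e4):
    # Sinusoidal positional encoding:
    #   PE[pos, 2i]   = sin(pos / n^(2i/d))
    #   PE[pos, 2i+1] = cos(pos / n^(2i/d))
    pe = np.zeros((seq_len, d))
    pos = np.arange(seq_len)[:, None]   # shape (seq_len, 1)
    i = np.arange(0, d, 2)[None, :]     # even dimension indices, shape (1, d/2)
    angle = pos / n ** (i / d)          # broadcast to (seq_len, d/2)
    pe[:, 0::2] = np.sin(angle)         # even columns get sin
    pe[:, 1::2] = np.cos(angle)         # odd columns get cos
    return pe

# With n = 100, row 1 matches the worked example in [1]:
# [sin(1), cos(1), sin(0.1), cos(0.1)]
pe = positional_encoding(4, 4, n=100)
```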