ropensci / tokenizers

Fast, Consistent Tokenization of Natural Language Text
https://docs.ropensci.org/tokenizers

Comply with TIF requirements #63

Closed lmullen closed 6 years ago

lmullen commented 6 years ago

Closes #49

codecov-io commented 6 years ago

Codecov Report

Merging #63 into master will decrease coverage by 0.09%. The diff coverage is 99.56%.


@@            Coverage Diff            @@
##           master      #63     +/-   ##
=========================================
- Coverage   99.41%   99.32%   -0.1%     
=========================================
  Files          11       12      +1     
  Lines         341      443    +102     
=========================================
+ Hits          339      440    +101     
- Misses          2        3      +1
Impacted Files                        Coverage Δ
------------------------------------  ----------------
R/stem-tokenizers.R                   100% <ø> (ø) ↑
R/ptb-tokenizer.R                     100% <100%> (ø) ↑
R/character-shingles-tokenizers.R     100% <100%> (ø) ↑
R/ngram-tokenizers.R                  100% <100%> (ø) ↑
R/basic-tokenizers.R                  100% <100%> (ø) ↑
R/tokenize_tweets.R                   100% <100%> (ø) ↑
R/coercion.R                          92.85% <92.85%> (ø)

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 6ebbd22...e5f87e5.