ropensci / tokenizers

Fast, Consistent Tokenization of Natural Language Text
https://docs.ropensci.org/tokenizers

Deprecate tokenize_regex() #42

Closed · lmullen closed this issue 7 years ago

lmullen commented 7 years ago

Deprecate this function, since it will become a segmenting function in a later release and it is not especially useful now. Also remove it from the vignette and README.
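A minimal sketch of how the deprecation might be signalled, using base R's `.Deprecated()`. This is not the package's actual code; the argument names, the default pattern, and the stringi-based splitting shown here are assumptions for illustration only.

```r
# Hypothetical sketch: emit a deprecation warning, then fall through to the
# existing behaviour so current callers keep working for now.
tokenize_regex <- function(x, pattern = "\\s+") {
  .Deprecated(msg = paste(
    "tokenize_regex() is deprecated;",
    "it will become a segmenting function in a later release."
  ))
  # Assumed implementation detail: split on the regex with stringi.
  stringi::stri_split_regex(x, pattern)
}

# Calling it still works but now raises a deprecation warning:
# tokenize_regex("one two three")
```

Removing the function from the vignette and README would then be a documentation-only change, with the warning carrying the migration message to users.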