ropensci / tokenizers

Fast, Consistent Tokenization of Natural Language Text
https://docs.ropensci.org/tokenizers

tokenize_tweets replacement #84

Closed alanault closed 1 year ago

alanault commented 1 year ago

Hi there,

I noticed today that tokenize_tweets has been removed from the package without any deprecation warning.

I saw the thread about CRAN compliance, which is obviously a real pain. Do you have any suggestions on how to replicate the same functionality, or will the package be updated to bring this function back?

Many thanks Alan

lmullen commented 1 year ago

Sorry for the sudden removal. You can find the code for the tokenize_tweets() function in this repository; I suggest you simply copy and paste it into your own code base.
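
If copying the original function over isn't practical, a rough stand-in can be written in base R. This is a hedged sketch, not the original tokenize_tweets() implementation: the function name tokenize_tweets_basic and its behavior (lowercasing, splitting on whitespace, stripping punctuation while preserving #hashtags, @mentions, and URLs) are my assumptions about the minimum Twitter-aware behavior, not a claim about what the removed function did.

```r
# Hypothetical minimal stand-in for the removed tokenize_tweets().
# Splits on whitespace, optionally lowercases, and strips punctuation,
# but keeps #hashtags, @mentions, and URLs intact (trailing punctuation
# on those tokens is still trimmed).
tokenize_tweets_basic <- function(x, lowercase = TRUE) {
  lapply(x, function(txt) {
    if (lowercase) txt <- tolower(txt)
    words <- unlist(strsplit(txt, "\\s+"))
    special <- grepl("^[#@]", words) | grepl("^https?://", words)
    words[!special] <- gsub("[[:punct:]]", "", words[!special])
    words[special] <- gsub("[[:punct:]]+$", "", words[special])
    words[nzchar(words)]
  })
}

tokenize_tweets_basic("Loving #rstats! Thanks @ropensci")
# -> list of one character vector: "loving" "#rstats" "thanks" "@ropensci"
```

For anything closer to the original behavior (e.g. handling of retweet markers or stopwords), copying the removed function's source from this repository's git history is the safer route.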

alanault commented 1 year ago

Thanks for the quick response!