ropensci / tokenizers

Fast, Consistent Tokenization of Natural Language Text
https://docs.ropensci.org/tokenizers

Error: could not find function "%>%" #54

Closed — fahadshery closed this issue 7 years ago

fahadshery commented 7 years ago

Error when executing:

```r
james <- paste0(
  "The question thus becomes a verbal one\n",
  "again; and our knowledge of all these early stages of thought and feeling\n",
  "is in any case so conjectural and imperfect that farther discussion would\n",
  "not be worth while.\n",
  "\n",
  "Religion, therefore, as I now ask you arbitrarily to take it, shall mean\n",
  "for us _the feelings, acts, and experiences of individual men in their\n",
  "solitude, so far as they apprehend themselves to stand in relation to\n",
  "whatever they may consider the divine_. Since the relation may be either\n",
  "moral, physical, or ritual, it is evident that out of religion in the\n",
  "sense in which we take it, theologies, philosophies, and ecclesiastical\n",
  "organizations may secondarily grow.\n"
)

tokenize_ngrams(james, n = 5, n_min = 2)[[1]] %>% head(10)
```

lmullen commented 7 years ago

The `%>%` operator comes from the magrittr package, which you need to load first, as in the example in the README.
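
For reference, a minimal sketch of the fix: attach magrittr (or a package that re-exports its pipe, such as dplyr) before using `%>%`. The exact output will depend on the installed tokenizers version.

```r
# tokenizers provides tokenize_ngrams(); magrittr provides %>%
library(tokenizers)
library(magrittr)

# Shortened version of the text from the original example
james <- paste0(
  "The question thus becomes a verbal one\n",
  "again; and our knowledge of all these early stages of thought and feeling\n",
  "is in any case so conjectural and imperfect that farther discussion would\n",
  "not be worth while.\n"
)

# With magrittr attached, %>% is found and the first 10 n-grams are returned
tokenize_ngrams(james, n = 5, n_min = 2)[[1]] %>% head(10)
```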