mkearney / tweetbotornot

🤖 R package for detecting Twitter bots via machine learning
https://tweetbotornot.mikewk.com

Rate Limit Exceeded with Fast argument ON #24

Open ebravofm opened 5 years ago

ebravofm commented 5 years ago

Hi!

I'm running this snippet on a csv file containing 200 user names:

```r
library("tweetbotornot")
rts <- read.csv(file = "rts.csv", header = TRUE, sep = ",")
data <- tweetbotornot(rts$Twitter.Name, fast = TRUE)
```

I'm setting the fast argument to TRUE; however, I'm getting this error:

Warning message: “Rate limit exceeded - 88”
Error in if (n%/%200 < n.times) {: argument is of length zero
Traceback:

  1. botornot(rts$Twitter.Name, fast = TRUE)
  2. botornot.factor(rts$Twitter.Name, fast = TRUE)
  3. botornot(x, fast = fast)
  4. botornot.character(x, fast = fast)
  5. rtweet::get_timelines(x, n = 100)
  6. get_timeline(user, n, max_id, home, parse, check, token, ...)
  7. do.call("gettimeline", args)
  8. gettimeline(user = c("---list of users---"), n = 100, home = FALSE, max_id = NULL, parse = TRUE, check = TRUE, token = NULL)
  9. Map(get_timeline_call, user = user, n = n, home = home, MoreArgs = dots)
  10. mapply(FUN = f, ..., SIMPLIFY = FALSE)
  11.
```r
(function (user, n = 200, max_id = NULL, home = FALSE, parse = TRUE,
    check = TRUE, token = NULL, ...)
{
    stopifnot(is_n(n), is.atomic(user), is.atomic(max_id), is.logical(home))
    if (home) {
        query <- "statuses/home_timeline"
    }
    else {
        query <- "statuses/user_timeline"
    }
    if (length(user) > 1) {
        stop("can only return tweets for one user at a time.",
            call. = FALSE)
    }
    token <- check_token(token)
    if (check) {
        rl <- rate_limit(token, query)
        n.times <- rl[["remaining"]]
        if (n%/%200 < n.times) {
            n.times <- ceiling(n/200L)
        }
    }
    else {
        rl <- NULL
        n.times <- ceiling(n/200L)
    }
    if (n.times == 0L) {
        if (!is.null(rl)) {
            reset <- round(as.numeric(rl[["reset"]], "mins"), 2)
        }
        else {
            reset <- "An unknown number of"
        }
        warning("rate limit exceeded. ", round(reset, 2),
            " mins until rate limit resets.", call. = FALSE)
        return(data.frame())
    }
    if (n < 200) {
        count <- n
    }
    else {
        count <- 200
    }
    params <- list(user_type = user, count = count, max_id = max_id,
        tweet_mode = "extended", ...)
    names(params)[1] <- .id_type(user)
    url <- make_url(query = query, param = params)
    tm <- scroller(url, n, n.times, type = "timeline", token)
    if (parse) {
        tm <- tweets_with_users(tm)
    }
    tm
})(user = dots[[1L]][[1L]], n = dots[[2L]][[1L]], home = dots[[3L]][[1L]],
    max_id = NULL, parse = TRUE, check = TRUE, token = NULL)
```
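For what it's worth, the traceback suggests a plausible mechanism for the cryptic "argument is of length zero" error (this is my reading, not confirmed by the package authors): once the rate limit is hit, `rate_limit()` appears to return a data frame without a `remaining` column, so `rl[["remaining"]]` is `NULL`, and comparing a number against `NULL` yields a zero-length logical that `if` cannot evaluate. A minimal reproduction of that failure mode:

```r
# Assumption: rate_limit() effectively returns an empty data frame here.
rl <- data.frame()
n.times <- rl[["remaining"]]   # NULL -- the column does not exist
n <- 100
if (n %/% 200 < n.times) {     # 0 < NULL gives logical(0)
  n.times <- ceiling(n / 200)
}
# Error in if (n%/%200 < n.times) { : argument is of length zero
```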

Am I doing something wrong?

Thanks!

joaoakio commented 5 years ago

I'm having the same issue

jimeneztyler commented 5 years ago

I'm having the same problem as well. Did anyone find a solution?

Jupaoqq commented 5 years ago

@ebravofm @joaoakio @jimeneztyler

  1. Remove this package (tweetbotornot) from your R package library
  2. Fork this repo
  3. In (whatever your github username is)/tweetbotornot/R/tweetbotornot.R, lines 89 to 96, change

```r
botornot.character <- function(x, fast = FALSE) {
  x <- x[!is.na(x) & !duplicated(x)]
  x <- rtweet::get_timelines(x, n = 100)
  botornot(x, fast = fast)
}
```

to

```r
botornot.character <- function(x, fast = FALSE) {
  x <- x[!is.na(x) & !duplicated(x)]
  if (fast) {
    x <- rtweet::lookup_users(x)
  } else {
    x <- rtweet::get_timelines(x, n = 100)
  }
  botornot(x, fast = fast)
}
```

  4. Commit those changes
  5. Open an R session, then run

```r
library(devtools)
install_github("(whatever your github username is)/tweetbotornot", dependencies = TRUE)
```

  6. Run your R script

I've also opened a pull request to make those changes to this repo to fix this issue.
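If anyone still hits the limit even with the patch, one workaround is to process the user list in chunks and pause between chunks so the 15-minute rate-limit window can reset. This is only a sketch; the chunk size (50) and sleep duration are assumptions, not values from the package:

```r
library(tweetbotornot)

users  <- rts$Twitter.Name
chunks <- split(users, ceiling(seq_along(users) / 50))

results <- lapply(chunks, function(chunk) {
  out <- tweetbotornot(chunk, fast = TRUE)
  Sys.sleep(15 * 60)  # wait out the rate-limit window before the next chunk
  out
})
results <- do.call(rbind, results)
```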

jimeneztyler commented 5 years ago

@Jupaoqq thanks so much! This worked for me

jimeneztyler commented 5 years ago

Hi, I'm back again with the same issue. I have a large list of user names and keep getting the rate limit exceeded message when running tweetbotornot. I edited the code to include @Jupaoqq's suggestion, which worked previously but isn't working this time. Any ideas on what's going on? Or is anyone else having this issue?
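One thing worth checking before a long run: even with the `lookup_users()` patch, `fast = TRUE` still consumes rate-limited API calls, so a large enough list will exhaust the window. You can inspect how many calls remain with rtweet's `rate_limit()` (old rtweet API, as used by this package; the specific endpoint string is the one from the traceback above):

```r
library(rtweet)

# How many timeline requests are left in the current 15-minute window?
rl <- rate_limit(query = "statuses/user_timeline")
rl$remaining  # requests left in this window
rl$reset      # time until the window resets
```

If `remaining` is already low, waiting until `reset` before starting (or chunking the list) should avoid the error.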