cjbarrie / academictwitteR

Repo for academictwitteR package to query the Twitter Academic Research Product Track v2 API endpoint.

Program stops for no reason #225

Closed NicuSV closed 3 years ago

NicuSV commented 3 years ago

Hello,

I have academic API access from Twitter, and I saved the bearer token as suggested. When I run the following code, it stops at around 80 pages for no reason:

```r
tweets <- get_all_tweets(
  query_gsibs,
  start_tweets = "2007-01-01T00:00:00Z",
  end_tweets = "2017-09-29T00:00:00Z",
  data_path = path,
  file = "my_work",
  lang = "en",
  n = 1000000
)
```

Afterwards, I run the following code to restart the collection, but it does not work:

```r
resume_collection(data_path = path)
```

After this it prints: "Total pages queried: 1 (tweets captured this page: 496). Total tweets captured now reach 100 : finishing collection."

And the warning messages are as follows:

1. "Tweets will be bound in local memory as well as stored as JSONs."
2. "Directory already exists. Existing JSON files may be parsed and returned, choose a new path if this is not intended."
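The "finishing collection" message after only 100 tweets suggests the resumed run is being capped at a default `n` rather than the 1,000,000 requested originally. A minimal sketch of a possible workaround, assuming the installed version of `resume_collection()` forwards extra arguments on to `get_all_tweets()` (check `?resume_collection` in your version):

```r
library(academictwitteR)

# Hypothetical workaround: if resume_collection() passes additional
# arguments through to get_all_tweets(), supplying a higher `n`
# should keep the resumed run from stopping at the default cap.
tweets <- resume_collection(
  data_path = path,  # same directory used in the original get_all_tweets() call
  n = 1000000        # assumption: `n` is forwarded; if not, it is silently ignored
)
```

If your version does not accept extra arguments, upgrading the package or re-running `get_all_tweets()` with an adjusted `start_tweets` (set to the timestamp of the last captured tweet) would be the alternative.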

Thanks in advance for your help.