Open danmlf1 opened 4 years ago
same issue :(
Same issue, getting 0 tweets.
Yes, I'm having the same problem. I get the following error message along with "getting 0 tweets":

```
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='twitter.com', port=443): Max retries exceeded with url: /search?f=tweets&vertical=default&q=%23Harami%20since%3A2020-08-22%20until%3A2020-08-24&l=en (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7f5a3decb510>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution'))
```
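Worth noting: that particular traceback is a DNS failure ("Temporary failure in name resolution"), which is a local network problem rather than Twitter returning 0 tweets. A quick generic check (a sketch, unrelated to twitterscraper's internals) to rule DNS out:

```python
import socket

def can_resolve(host: str, port: int = 443) -> bool:
    """Return True if the hostname resolves via DNS, False otherwise."""
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        # Same class of failure as "Temporary failure in name resolution"
        return False
```

If `can_resolve("twitter.com")` is False, fix your network/DNS setup first; the scraper's own problems only start once the connection succeeds.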
I have the same issue, 0 tweets using query_tweets...
It doesn't work and returns 0 tweets again. Please help.
Too bad. Neither query_tweets_from_user() nor query_tweets() has worked since yesterday. Please help. Maybe there is relevant information here: https://blog.twitter.com/developer/en_us/topics/tips/2020/understanding-the-new-tweet-payload.html
0 Tweets again. Please help.
The same error. I agree with you.
Easy: Twitter made JavaScript mandatory and shut down the backdoor this code used. In detail: since the beginning of 2020, Twitter has been developing a new HTML frontend with randomly generated HTML description tags. The scroll mechanism for loading new tweets was already changed in January 2020 to work with JavaScript and unique session IDs. The URL "https://twitter.com/i/search/" with the "&max_position=cursor" parameter has been disabled on Twitter.
It was surprising that, with the Russia collusion story and other nonsense, they didn't start working to protect the data sooner.
So, as I said in April, the software is probably dead.
This doesn't work with the Twitter API, not even close.
They changed the HTML page to prevent non-consented web scraping.
So is this library just donzo? It's over?
Temporarily. Yeah. Hope it will be back soon
Thought I'd use this for my project. Guess that's not gonna happen anytime soon :(
Same here. Trying the branch of selenium alternatively.
Is there another good similar library that people are using?
same to me
Same issue here. Twitter is frustrating.
I got a 0 tweet count as well. I even posted a new tweet and still got zero. I spent a lot of time understanding the workflow, and still got 0!
just use tweepy instead
```python
import tweepy as tw
from credentials import *  # keys are kept as variables in a separate credentials.py

def twitter_creds():
    """Connects to the Twitter API.

    For confidentiality, the access keys are stored in a separate
    file (credentials.py) in the project folder.
    """
    # Authenticate using the keys (tells Twitter I am a valid user):
    auth = tw.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
    auth.set_access_token(ACCESS_TOKEN, ACCESS_SECRET)
    # Connect to the API:
    api = tw.API(auth)
    return api

api = twitter_creds()
search_words = "artificial+intelligence"
date_since = "2020-1-1"

tweets = tw.Cursor(api.search, q=search_words, lang="en",
                   since=date_since).items(1000)
for tweet in tweets:
    print(tweet.text)
```
As twitterscraper is (temporarily) dead, I thought I'd share my favorite alternative scraper. @mattwsutherland, your solution requires an API key (which is super hard to get), so unfortunately it's useless to most people here.
Simply go for TweetScraper; for me it's working perfectly. Also see this Medium article, which provides a useful pandas snippet to set up a nicely formatted DataFrame.
Installation for Ubuntu is straightforward as described. For Windows, just do the following:

```shell
git clone https://github.com/jonbakerfish/TweetScraper.git
conda create -n tweetscraper python=3.7.7 -y
conda activate tweetscraper
conda install -y -c conda-forge scrapy ipython ipdb
pip install scrapy-selenium
```

Change to the "TweetScraper" directory and test with:

```shell
scrapy crawl TweetScraper -a query="foo,#bar"
```
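The pandas snippet from the Medium article boils down to loading TweetScraper's output into a DataFrame. A rough sketch (the `Data/tweet/` default output path and field names like `usernameTweet` are assumptions about TweetScraper's defaults, so check your own settings):

```python
import json
from pathlib import Path

import pandas as pd

def load_tweets(tweet_dir: str) -> pd.DataFrame:
    """Collect TweetScraper's per-tweet JSON files into one DataFrame."""
    records = [json.loads(p.read_text())
               for p in Path(tweet_dir).glob("*") if p.is_file()]
    return pd.DataFrame(records)

# By default TweetScraper saves one JSON file per tweet under Data/tweet/
# (path and field names are assumptions; check your TweetScraper settings):
# df = load_tweets("Data/tweet")
# print(df[["usernameTweet", "text", "datetime"]].head())
```

From there the usual pandas cleanup (parsing timestamps, dropping duplicates) applies.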
Thanks, I will definitely try that 👍
The issue is still continuing. Has anyone found a solution or another workaround?
I think this scraper is dead and won't work until somebody makes changes.
@Guolin1996 Yes unfortunately ...
I have the same issue. "twitterscraper Trump --limit 1000 --output=tweets.json" gives an empty json file.
Where do I set the API credentials in this code?
Commenting to watch, same issue here
I guess it's still dead. Just tried, got 0 tweets.
Still 0 tweets
Sad. Still 0 tweets. Is this related to Elon Musk?