Closed. mick-net closed this issue 5 years ago.
hey vermaat,
first of all, I really hope you can somehow get these costs resolved... Do you have any more information on the specific error you got, or the code with which you called the API? Failures like this should definitely raise an exception and stop the process...
Regarding repeated requests for the same area, it would be better to store all the place IDs for the area and then just call get_id for each of those. That way you at least save yourself the nearby-search calls. I'm also adding a quota limit option, but that doesn't help you much right now :/
One thing I could add is a max_requests parameter for the nearby search. Not sure why so many requests were made in your case, though...
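The "store the place IDs and call get_id for each" idea could look roughly like this. This is only a sketch: the helper name and cache file are made up, and the per-place fetcher (in practice populartimes.get_id(api_key, place_id)) is passed in as a callable so the logic is easy to test without network access:

```python
import json
import os


def get_populartimes_for_area(api_key, place_ids, cache_path, get_id_fn):
    """Fetch popular-times data for already-known place IDs, one cheap call each.

    get_id_fn is the per-place fetcher, e.g. populartimes.get_id, injected
    here so the sketch stays testable without hitting the Places API.
    """
    # Load previously fetched results so reruns skip the expensive calls.
    cache = {}
    if os.path.exists(cache_path):
        with open(cache_path) as fp:
            cache = json.load(fp)

    for place_id in place_ids:
        if place_id in cache:
            continue  # already fetched on an earlier run
        cache[place_id] = get_id_fn(api_key, place_id)
        # Persist after every call so a crash loses at most one request.
        with open(cache_path, "w") as fp:
            json.dump(cache, fp)

    return cache
```

With the place IDs for an area stored once, reruns only hit the per-place endpoint and skip anything already cached, instead of repeating the whole grid of nearby searches.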
Thanks for your quick reply. I wish I had paid more attention to the error I got, but I didn't, since I assumed it was my own internet connection. I only vaguely remember seeing something like 'connection refused'. In the Google console I did not see any errors. I called the API using the following code; there are no loops or anything:
from time import gmtime, strftime
import json

import populartimes


def scrape_google_populartimes_json(name, lat1, lon1, lat2, lon2, placetypes):
    datenow = strftime("%Y-%m-%d_%H-%M-%S", gmtime())
    pt_filename = ('database/popular-times-' + str(datenow) + '_' + str(name) + '-' +
                   str(lat1) + '_' + str(lon1) + '_' + str(lat2) + '_' + str(lon2) + '.json')
    data = populartimes.get("####APIKEY###", placetypes, (lat1, lon1), (lat2, lon2))
    with open(pt_filename, 'w') as fp:
        json.dump(data, fp)
    return pt_filename


google_populartimes_json_file = scrape_google_populartimes_json(
    'Testname', -8.69295, 115.119625, -8.65078, 115.168205, ["bar"])
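Since the failed run above still produced billing but no usable data, one defensive pattern is to only write the output file when the fetch actually succeeded, and otherwise fail loudly. This is a generic sketch (the wrapper name is made up; any fetch callable such as populartimes.get can be passed in):

```python
import json
import logging


def scrape_safely(fetch_fn, filename, *args, **kwargs):
    """Run a fetch callable and write its result to `filename` only on success."""
    try:
        data = fetch_fn(*args, **kwargs)
    except Exception:
        # Fail loudly instead of leaving an empty or partial JSON file
        # behind while the API usage has already been billed.
        logging.exception("fetch failed, no file written")
        raise
    with open(filename, "w") as fp:
        json.dump(data, fp)
    return filename
```

This way a 'connection refused' or quota error surfaces immediately instead of being discovered later on the bill.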
Can you ask Google for the logs?
I didn't get the logs. The engineering team did an investigation, and several days later I got a message from the support team that they had adjusted the bill. So in the end I only paid a small amount for the actual valid charges. :-)
Great to hear that this got resolved! :)
Not sure if it's a good idea to post this under a closed issue. Until now I had not tried to get new data from Google's API using this module. I ran into similar issues again, only this time I got an error log and was able to stay within the free monthly API limit. Initially I got an API quota limit error. I made a new API credential with a quota limit of 1000 requests a day. I tried approximately 5 times, resulting in approximately 520 requests and a 45.7% failure rate. I don't get any data back (I also don't see any JSON data when printing the 'details' variables in crawler.py), but Google does register the usage (I killed the process manually, otherwise it keeps on trying and using credit even longer).
When I do a single place request with populartimes.get_id it works fine. Could it have something to do with requests per second? I checked all quotas and nothing is hitting its limit (https://console.developers.google.com/iam-admin/quotas?).
Maybe I am doing something wrong.
Python input for Bali, Indonesia:
populartimes.get(google_api_key, ["bar"], (-8.69295, 115.119625), (-8.65078, 115.168205))
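If a per-second limit is the suspect, one way to test the theory is to throttle the concurrent requests yourself before each call. This limiter is not part of populartimes; it is a generic, thread-safe sketch you could call from each worker before it fires a request:

```python
import threading
import time


class RateLimiter:
    """Allow at most `max_per_second` calls per second across all threads."""

    def __init__(self, max_per_second):
        self.min_interval = 1.0 / max_per_second
        self.lock = threading.Lock()
        self.next_allowed = 0.0  # monotonic timestamp of the next free slot

    def wait(self):
        # Reserve the next slot under the lock, then sleep outside it so
        # other threads can queue up their own slots in the meantime.
        with self.lock:
            now = time.monotonic()
            delay = max(0.0, self.next_allowed - now)
            self.next_allowed = max(now, self.next_allowed) + self.min_interval
        if delay > 0:
            time.sleep(delay)
```

Calling `limiter.wait()` immediately before each API request spaces the calls evenly; if the OVER_QUERY_LIMIT errors disappear at a low rate, a per-second quota was the cause.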
Terminal output:
2019-03-31_13-13-18
INFO:root:Adding places to queue...
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.6896948899809,115.12943879934265&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949775296864,115.13271006578792&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949311846657,115.1425238651101&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692948862440403,115.14906639797533&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.69294949441795,115.13925259867214&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692950000000003,115.119625&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949943824216,115.1261675328959&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689694960200645,115.1261675328959&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692948300682596,115.155608930823&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949101187473,115.14579513154467&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692947247386744,115.16542273005312&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949985956057,115.1228962664482&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689695016376442,115.119625&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692948595605445,115.1523376644016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949873604485,115.12943879934265&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692949648901353,115.13598133223125&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689695002332492,115.1228962664482&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692947977671865,115.15888019723906&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 2098
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.69294684011236,115.16869399645016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 1015
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689694791673265,115.13271006578792&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.68969432822298,115.1425238651101&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689694117563759,115.14579513154467&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689694665277733,115.13598133223125&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689693611981646,115.1523376644016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689693878816648,115.14906639797533&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689694510794304,115.13925259867214&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.692947626573245,115.16215146364925&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 2459
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689693317058746,115.155608930823&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
Exception in thread Thread-55:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689692994047961,115.15888019723906&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686440032198158,115.119625&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686440018154208,115.1228962664482&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-50:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439807494946,115.13271006578792&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-46:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439344044581,115.1425238651101&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-42:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439681099392,115.13598133223125&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439526615935,115.13925259867214&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686438894638172,115.14906639797533&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439976022356,115.1261675328959&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
Exception in thread Thread-57:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
Exception in thread Thread-38:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
Exception in thread Thread-44:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.68969264294928,115.16215146364925&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 907
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.689692263762714,115.16542273005312&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 1401
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686437658770593,115.16215146364925&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439133385324,115.14579513154467&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686437279583965,115.16542273005312&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-39:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.68969185648826,115.16869399645016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 1178
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686438627803124,115.1523376644016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686439905802601,115.12943879934265&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-54:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
Exception in thread Thread-47:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686438332880174,115.155608930823&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-43:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686438009869333,115.15888019723906&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
Exception in thread Thread-41:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.686436872309438,115.16869399645016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
Exception in thread Thread-52:
Traceback (most recent call last):
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/Users/verm/anaconda3/lib/python3.6/threading.py", line 864, in run
self._target(*self._args, **self._kwargs)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 112, in worker_radar
get_radar(item)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 135, in get_radar
check_response_code(resp)
File "/Users/verm/Documents/GitHub/BTApp3/BTapp3env/lib/python3.6/site-packages/populartimes/crawler.py", line 440, in check_response_code
"You exceeded your Query Limit for Google Places API Web Service, "
populartimes.crawler.PopulartimesException: ('Google Places OVER_QUERY_LIMIT', 'You exceeded your Query Limit for Google Places API Web Service, check https://developers.google.com/places/web-service/usage to upgrade your quota.')
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683185047465358,115.119625&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
[identical OVER_QUERY_LIMIT traceback repeated for Thread-40]
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683185033421404,115.1228962664482&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
[identical OVER_QUERY_LIMIT traceback repeated for Thread-53]
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683184921069778,115.12943879934265&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683184991289542,115.1261675328959&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
[identical OVER_QUERY_LIMIT traceback repeated for Thread-48]
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683184822762104,115.13271006578792&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.68318469636653,115.13598133223125&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
[identical OVER_QUERY_LIMIT tracebacks repeated for Thread-49 and Thread-51]
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683184541883048,115.13925259867214&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683184359311662,115.1425238651101&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 None
[identical OVER_QUERY_LIMIT traceback repeated for Thread-56]
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.68318414865237,115.14579513154467&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683183909905177,115.14906639797533&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.68318364307008,115.1523376644016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 1913
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683183348147084,115.155608930823&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 2018
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683183025136186,115.15888019723906&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 3300
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683182674037388,115.16215146364925&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 1951
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.683182294850694,115.16542273005312&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 1979
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.6831818875761,115.16869399645016&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679930062178236,115.119625&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679930048134283,115.1228962664482&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679930006002413,115.1261675328959&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679929935782635,115.12943879934265&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679929837474946,115.13271006578792&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679929711079348,115.13598133223125&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679929556595843,115.13925259867214&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679929374024422,115.1425238651101&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
DEBUG:urllib3.connectionpool:https://maps.googleapis.com:443 "GET /maps/api/place/nearbysearch/json?location=-8.679929163365097,115.14579513154467&radius=180&types=bar&key=AIzaSyBpQD2aiSdyKJwRoKRBM-rHud2TfU-1hg4 HTTP/1.1" 200 83
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): maps.googleapis.com:443
^CException ignored in: <socket.socket fd=6, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('0.0.0.0', 5000)>
ResourceWarning: unclosed <socket.socket fd=6, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0, laddr=('0.0.0.0', 5000)>
(BTapp3env) M-MacBook-Pro:BT3-run-app verm$
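The log above shows each worker thread failing independently on OVER_QUERY_LIMIT while the crawl keeps firing requests regardless. As a stopgap on the caller's side, a generic retry-with-backoff wrapper can stop a run from hammering the API once the quota is hit. This is just a sketch of mine, not part of populartimes; the names are illustrative:

```python
import time

def call_with_backoff(fn, *args, max_attempts=3, base_delay=1.0, **kwargs):
    """Call fn, retrying with exponential backoff on any exception.

    After max_attempts failures the last exception is re-raised, so a
    persistent OVER_QUERY_LIMIT aborts the run instead of burning quota.
    """
    for attempt in range(max_attempts):
        try:
            return fn(*args, **kwargs)
        except Exception:  # broad on purpose for this sketch
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # wait 1s, 2s, 4s, ...

# e.g. call_with_backoff(populartimes.get, api_key, ["bar"], p1, p2)
```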
Hey! I too am running into a similar issue (though I haven't yet reached 5,000 queries, so my bill is fine 😝). When running the following code
# points to delimit search area
p1 = (42.348023, -71.073929)
p2 = (42.368037, -71.053485)
bar_results = pt.get(maps_key, ['bar'], p1=p1, p2=p2)
I'm getting two warnings stating WARNING:root:Result limit in search radius reached, some data may get lost,
followed by this long error output. I apologize in advance for its length, but it looks like there is a problem either with my request library or with the threading. Perhaps my search area is too large; I found the error in crawler.py
here. Is there a specific search radius that should be used?
Any thoughts would be greatly appreciated! Thanks :)
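For a sense of scale: judging from the request log above, the crawler appears to tile the bounding box with nearby-search circles spaced roughly 2 × radius apart, so the number of requests grows quadratically as the box grows or the radius shrinks. A rough estimator, my own sketch based on that assumption rather than on the library's actual code:

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def estimate_nearby_requests(p1, p2, radius=180):
    """Roughly estimate how many Nearby Search calls a bounding box costs,
    assuming a grid of search circles spaced ~2*radius apart (a hypothetical
    model of the crawler, inferred from the request log)."""
    lat1, lat2 = sorted((p1[0], p2[0]))
    lon1, lon2 = sorted((p1[1], p2[1]))
    # metres per degree of latitude / longitude at the box's mean latitude
    m_per_deg_lat = math.pi * EARTH_RADIUS_M / 180
    m_per_deg_lon = m_per_deg_lat * math.cos(math.radians((lat1 + lat2) / 2))
    height_m = (lat2 - lat1) * m_per_deg_lat
    width_m = (lon2 - lon1) * m_per_deg_lon
    rows = math.ceil(height_m / (2 * radius)) + 1
    cols = math.ceil(width_m / (2 * radius)) + 1
    return rows * cols
```

Plugging in the Boston box above with the default 180 m radius gives a few dozen nearby-search calls before any detail requests, which may explain why the quota disappears faster than expected.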
Exception in thread Thread-65:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
[identical ConnectionRefusedError traceback repeated for Thread-74, Thread-79, Thread-71, Thread-73, Thread-66 and Thread-70; output truncated]
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-81:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-78:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-72:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-80:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-76:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-82:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-84:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-77:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-67:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-75:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
Exception in thread Thread-68:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/urllib/request.py", line 1317, in do_open
encode_chunked=req.has_header('Transfer-encoding'))
File "/anaconda3/lib/python3.7/http/client.py", line 1229, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1275, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1224, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/anaconda3/lib/python3.7/http/client.py", line 1016, in _send_output
self.send(msg)
File "/anaconda3/lib/python3.7/http/client.py", line 956, in send
self.connect()
File "/anaconda3/lib/python3.7/http/client.py", line 1384, in connect
super().connect()
File "/anaconda3/lib/python3.7/http/client.py", line 928, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/anaconda3/lib/python3.7/socket.py", line 727, in create_connection
raise err
File "/anaconda3/lib/python3.7/socket.py", line 716, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 61] Connection refused
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/anaconda3/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/anaconda3/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 168, in worker_detail
get_detail(item)
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 372, in get_detail
detail_json = get_populartimes_by_detail(params["API_key"], g_places[place_id])
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 409, in get_populartimes_by_detail
detail_json = add_optional_parameters(detail_json, detail, *get_populartimes_from_search(place_identifier))
File "/anaconda3/lib/python3.7/site-packages/populartimes/crawler.py", line 320, in get_populartimes_from_search
context=gcontext)
File "/anaconda3/lib/python3.7/urllib/request.py", line 222, in urlopen
return opener.open(url, data, timeout)
File "/anaconda3/lib/python3.7/urllib/request.py", line 525, in open
response = self._open(req, data)
File "/anaconda3/lib/python3.7/urllib/request.py", line 543, in _open
'_open', req)
File "/anaconda3/lib/python3.7/urllib/request.py", line 503, in _call_chain
result = func(*args)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1360, in https_open
context=self._context, check_hostname=self._check_hostname)
File "/anaconda3/lib/python3.7/urllib/request.py", line 1319, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 61] Connection refused>
(identical tracebacks repeated for Thread-69 and Thread-83)
Hi @JosiahParry Were you able to solve the error? I encountered a similar issue.
I've been using this great module for a while now, but recently I got myself into trouble. I've requested the same area (lat1, lon1, lat2, lon2) several times. Usually this resulted in approximately 1000 search API calls and about 200 places as a result, which is all fine. Recently, however, I noticed that the API call produced some kind of repetitive connection error (I didn't pay attention to it at the time because I assumed it was my own internet connection). After each failed request, the call was automatically retried. After several minutes I interrupted the Python process myself and retried later, this time successfully (getting 200 results).
Several days later I was shocked to find I had spent 1600 euros around the time of these 'connection errors'. I checked my API traffic, which showed 46,000 Google Places API calls and no error messages. I should have been more cautious and set a quota limit, but I hadn't, since I never expected to come anywhere near the free 200 USD credit. That said, I'm still pretty shocked that Google charged me for Places API calls even though I never received data from the failing requests. I contacted Google and they said the charges were valid. I cannot prove anything; only Google might be able to see that those 46,000 calls probably consist of the same requests repeated over and over.
Is there a way to prevent these rapid (and costly) charges? For example, exponential backoff for failed requests? Any other suggestions are also welcome.
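For reference, here is a minimal sketch of what exponential backoff with a hard retry cap could look like around the kind of `urllib` call the traceback shows. This is not part of the populartimes API; the function name and parameters are hypothetical, just to illustrate the idea of failing loudly instead of retrying forever:

```python
import time
import urllib.request
import urllib.error


def fetch_with_backoff(url, max_retries=5, base_delay=1.0):
    """Fetch a URL, retrying failed requests with exponential backoff.

    Re-raises the last error after max_retries attempts, so a persistent
    connection failure stops the process instead of looping (and billing)
    indefinitely.
    """
    for attempt in range(max_retries):
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                return response.read()
        except urllib.error.URLError:
            if attempt == max_retries - 1:
                raise  # out of retries: fail loudly
            # wait base_delay, 2*base_delay, 4*base_delay, ... between attempts
            time.sleep(base_delay * (2 ** attempt))
```

With a cap like this, the repeated "Connection refused" errors above would have aborted the run after a handful of attempts instead of generating tens of thousands of billed calls.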