Audace opened this issue 8 years ago (Open)
Yes, I got that too as of today. Yesterday everything was fine.
This doesn't sound good. I'll take a look at it later this evening.
If somebody else could try to find out the source of the issue, that would be appreciated!

On 7 Jan 2016 13:42, "William Fry" notifications@github.com wrote:
My call to the Google Places API passes in a keyword and a city, state (i.e. San Antonio, TX). It was working and all of a sudden started failing with `ValueError: lat_lng must be a dict with keys, 'lat' and 'lng'`. However, in the `__init__.py` file, this should only occur if it can't geocode the location. I've tried with different locations but now get the same error. Any idea what would have caused it? I haven't hit my API limits.
Actually, it might be a caching problem. I am taking a list of cities from Wikipedia and sending a request for each of them to Google Places with your library. When it starts requesting the 150th city, the error occurs and then persists.
```python
import os
import urllib2
from string import ascii_uppercase

import simplejson as json
from bs4 import BeautifulSoup
from googleplaces import GooglePlaces, types

MY_KEY = "........"
google_places = GooglePlaces(MY_KEY)

print os.path.abspath(os.curdir)
PATH_SAVE = os.path.join(os.path.abspath(os.curdir), 'SAVE')
SAVE_FILE = 'fileout6.json'
RADIUS = 30000

LOCATION_INDEX = [
    ['Berlin', 'Deutschland'],
    ['Paris', 'Frankreich'],
    # 'London, United Kingdom',
    # 'Madrid, Spanien',
    # 'Moskau, Russland'
]

# List for testing
SEARCH_INDEX = [
    types.TYPE_ART_GALLERY,
    types.TYPE_MUSEUM,
    types.TYPE_AQUARIUM,
    types.TYPE_CHURCH,
    types.TYPE_ZOO,
    types.TYPE_SYNAGOGUE,
    types.TYPE_MOSQUE,
    'STASI',
    'NKWD',
    'KGB'
]

if not os.path.isdir(PATH_SAVE):
    os.mkdir(PATH_SAVE, 777)


class papi():

    def run(self):
        # filea = open("file1.json", 'r')
        # dicta = json.loads(filea.read())
        # filea.close()
        listobject = {}
        LOCATION_INDEX = self.getList()
        listlocation = []
        for li in LOCATION_INDEX:
            dictplace = {}
            dictcountry = {}
            dictplace['city'] = ','.join(li)
            dictplace['country'] = li[1]
            # print 'XXXXX ', li, ' XXXXX'
            print '%s, %s' % (li[0], li[1])
            query_result_de = google_places.nearby_search(
                location='%s, %s' % (li[0], li[1]), radius=RADIUS,
                language='DE', types=SEARCH_INDEX)  # types=[types.TYPE_MUSEUM, types.TYPE_ART_GALLERY]
            if query_result_de.has_attributions:
                print query_result_de.html_attributions
            listplace = []
            for place in query_result_de.places:
                try:
                    # f = open(os.path.join(PATH_SAVE, place.place_id + '.json'), 'wb')
                    # print place.name
                    place.get_details()
                    listplace.append(place.details)
                    # for photo in place.photos:
                    #     # 'maxheight' or 'maxwidth' is required
                    #     photo.get(maxheight=500, maxwidth=500)
                    #     # MIME-type, e.g. 'image/jpeg': photo.mimetype
                    #     # Image URL: photo.url
                    #     # Original filename (optional): photo.filename
                    #     # Raw image data: photo.data
                except Exception as e:
                    print(e.message)
            dictplace['places'] = listplace
            # listlocation.append(dictcountry)
            listlocation.append(dictplace)
        # dicta['objectlist'].append(listlocation)
        listobject['objectlist'] = listlocation
        print('eintraege', len(listobject['objectlist']))
        with open(SAVE_FILE, 'w') as file_j:
            file_j.write(json.dumps(listobject, indent=2, skipkeys=True, sort_keys=True))
            # file_j.write(json.dumps(tempobj, indent=4, skipkeys=True, sort_keys=True))

    def getList(self):
        result = []
        for c in ascii_uppercase:
            url = "http://en.wikipedia.org/wiki/List_of_towns_and_cities_with_100,000_or_more_inhabitants/cityname:_" + c
            # print(url)
            response = urllib2.urlopen(url)
            html = response.read()
            soup = BeautifulSoup(html, "lxml")
            for lel in soup.select('table'):
                allrows = lel.findAll('tr')
                for row in allrows:
                    result.append([])
                    allcols = row.findAll('td')
                    for col in allcols:
                        thestrings = [s.encode('utf-8', 'ignore') for s in col.findAll(text=True)]
                        thetext = ''.join(thestrings)
                        result[-1].append(thetext)
        # print(result)  # remove the quote marks as well
        # for r in result:
        #     if len(r) > 0:
        #         print(r[0].decode('utf-8'))
        newlist = []
        for r in result:
            if len(r) > 1:
                newlist.append([r[0].decode('utf-8'), r[1].decode('utf-8')])
        return newlist


if __name__ == "__main__":
    # papi().getList("https://en.wikipedia.org/w/api.php?action=query&titles=List_of_towns_and_cities_with_100,000_or_more_inhabitants/cityname:_D&&prop=revisions&rvprop=content&format=json")
    papi().run()
```
I got the same error when I first tried:
```
query_result = google_places.nearby_search(
    location='London, England', keyword='Fish and Chips', radius=20000)

Traceback (most recent call last):
  File "C:\Python34\lib\googleplaces\__init__.py", line 546, in _generate_lat_lng_string
    else geocode_location(location))
  File "C:\Python34\lib\googleplaces\__init__.py", line 107, in geocode_location
    {'address': location, 'sensor': str(sensor).lower()})
  File "C:\Python34\lib\googleplaces\__init__.py", line 72, in _fetch_remote_json
    request_url, response = _fetch_remote(service_url, params, use_http_post)
  File "C:\Python34\lib\googleplaces\__init__.py", line 55, in _fetch_remote
    if isinstance(v, six.string_types):
NameError: name 'six' is not defined

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<pyshell#108>", line 3, in
```
I suspect I might have an API key issue. I've switched on the Google Places Web Service API in my Google account. Is that enough, or is this error caused by something else?
Thanks, Michael
I got the same error today as well! Any leads on what the problem could be?
Below is my full error report (platform: Python 3.5):
```
rslt = gps.nearby_search(location='london, england', keyword='fish and chips',
                         radius=20000, types=[types.TYPE_FOOD])

Traceback (most recent call last):
  File "/home/vzj182/apps/anaconda3/lib/python3.5/site-packages/googleplaces/__init__.py", line 546, in _generate_lat_lng_string
    else geocode_location(location))
  File "/home/vzj182/apps/anaconda3/lib/python3.5/site-packages/googleplaces/__init__.py", line 107, in geocode_location
    {'address': location, 'sensor': str(sensor).lower()})
  File "/home/vzj182/apps/anaconda3/lib/python3.5/site-packages/googleplaces/__init__.py", line 74, in _fetch_remote_json
    str_response = response.readall().decode('utf-8')
AttributeError: 'HTTPResponse' object has no attribute 'readall'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/vzj182/apps/anaconda3/lib/python3.5/site-packages/IPython/core/interactiveshell.py", line 3066, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "
```
I tried it on Python 2.7 (Anaconda) and have not found a problem there yet.
Just got the same message. In my case it was because I got disconnected from the internet. Without diving into the code, I assume the response from the nearby method is not coming back complete, or is failing without a catch and moving straight on to trying to read the places list.
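If that's what's happening, wrapping the call in a small retry helper might at least keep a long run alive through a transient drop. This is only a rough sketch, not a fix for the underlying bug; `nearby_with_retry` and its parameters are made up here and are not part of the library:

```python
import time

def nearby_with_retry(gp, attempts=3, wait=5, **kwargs):
    """Call gp.nearby_search(**kwargs), retrying a few times before giving up."""
    last_error = None
    for _ in range(attempts):
        try:
            return gp.nearby_search(**kwargs)
        except Exception as exc:  # also catches the misleading lat_lng ValueError
            last_error = exc
            time.sleep(wait)
    raise last_error

# e.g. nearby_with_retry(google_places, location='London, England',
#                        keyword='Fish and Chips', radius=20000)
```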
Bump, and suggested workaround: in `__init__.py`, change `str_response = response.readall().decode('utf-8')` in the following:
```python
def _fetch_remote_json(service_url, params={}, use_http_post=False):
    """Retrieves a JSON object from a URL."""
    request_url, response = _fetch_remote(service_url, params, use_http_post)
    if six.PY3:
        str_response = response.readall().decode('utf-8')
        return (request_url, json.loads(str_response, parse_float=Decimal))
    return (request_url, json.load(response, parse_float=Decimal))
```
to `str_response = response.read().decode('utf-8')`, giving:
```python
def _fetch_remote_json(service_url, params={}, use_http_post=False):
    """Retrieves a JSON object from a URL."""
    request_url, response = _fetch_remote(service_url, params, use_http_post)
    if six.PY3:
        str_response = response.read().decode('utf-8')
        return (request_url, json.loads(str_response, parse_float=Decimal))
    return (request_url, json.load(response, parse_float=Decimal))
```
Quickly tested with Python 2.7.10 and 3.5.1.
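If you'd rather not edit the installed package, a monkey-patch along these lines should apply the same change at runtime. This is an untested sketch; it assumes `_fetch_remote` and `_fetch_remote_json` are module-level names in `googleplaces/__init__.py`, as the tracebacks above suggest:

```python
import json
from decimal import Decimal

import googleplaces


def _patched_fetch_remote_json(service_url, params=None, use_http_post=False):
    """Same as the library's _fetch_remote_json, but uses read() instead of readall()."""
    request_url, response = googleplaces._fetch_remote(
        service_url, params if params is not None else {}, use_http_post)
    str_response = response.read().decode('utf-8')
    return (request_url, json.loads(str_response, parse_float=Decimal))


# Rebind the module-level name so the rest of the library picks up the patch.
googleplaces._fetch_remote_json = _patched_fetch_remote_json
```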
@Michels10's workaround makes it possible to use the library with Python 3.5, thanks!
@Michels10 still not working for me :/
I'm not sure if this is the exact cause of the bug, but it immediately stood out to me. Note the following code in some of the method definitions: `params={}`. This (and similar) can cause all kinds of problems, because if `params` is not passed in, and any code modifies the contents of `params`, then that modification affects all future calls for which `params` is not passed in. It's called the mutable default arguments problem, and it's described in more detail here. I strongly recommend you search-and-destroy any mutable default arguments in this library.
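For anyone who hasn't run into it before, here is a minimal, generic illustration of the pitfall (not code from this library):

```python
def add_param(key, value, params={}):
    # The {} default is created once, at function definition time,
    # so every call without `params` shares the same dict.
    params[key] = value
    return params

print(add_param('a', 1))  # {'a': 1}
print(add_param('b', 2))  # {'a': 1, 'b': 2} -- the same dict is reused

# The usual fix is a None default:
def add_param_fixed(key, value, params=None):
    if params is None:
        params = {}
    params[key] = value
    return params
```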
Reporting the same issue occurring again as of March 27, 2017. It works fine on a single call from the interpreter, but not from the script. The exact same script that returned results just recently now generates an error.
Encountered the same error tonight; it was working fine and then suddenly stopped with this error. The correction noted above is in the `__init__.py` file. Very frustrating.
Pulled the development version and have the same issue (2.7).
The error is also occurring for me, sometimes not at all and sometimes 10 queries in a row. Please help!
Do any of you have a stack trace of the error?
Have any of you seen the error on the first API call, or is it always on subsequent calls?
I'm having the same error. I'm using Python, and here's my code (`loc` is my location variable):

```python
query_result = google_places.nearby_search(location=loc)
place = query_result.places[0]
place.get_details()
loc = json.dumps(place.formatted_address)
```
I have no idea why it's telling me the lat_lng is wrong, since I'm not even using it.
It worked fine for a day, and then I started seeing this error in my code.
Hi, same issue here. Has anyone had any luck resolving it?
Hi! This error happened today (7/16/2017) when I was running some code. It was working fine, but all of a sudden I got this error, and now no matter what I pass to the following call, I get the error. I had started using this module and was getting some results, but now I am stuck with the error.

```python
query_result = google_places.nearby_search(
    location='97124', keyword='"Pizza"', radius=3200,
    types=[types.TYPE_GROCERY_OR_SUPERMARKET], rankby='prominence')
```

Please help resolve this! I would highly appreciate it.
OK folks, I will take a look at this tonight. Truth be told, I've never been able to replicate it; maybe I'll have better luck this evening.
Hi there!
Last evening (7/16/2017) when I got the error, I struggled for a bit and then shut down my computer. About two hours later I started it again, ran the script, and it worked.
I was running a for loop that iterated over multiple addresses. It runs for a while, but probably after a set number of addresses have been passed, it fails, and then it keeps giving the error. You would probably have to run it hundreds of times to reproduce this error. It is certainly intermittent, and when the computer is shut down and restarted, the error goes away.
One of the things I checked: even this subroutine may be running into OVER_QUERY_LIMIT, but the error is reported as `lat_lng must be a dict with the keys, 'lat' and 'lng'`. A quick check along the lines of the sketch below would confirm whether the key is actually over quota.
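For example, something like this (Python 3; `YOUR_KEY` and the address are placeholders, and it calls the Geocoding web service directly rather than going through the library):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

params = urlencode({'address': 'San Antonio, TX', 'key': 'YOUR_KEY'})
url = 'https://maps.googleapis.com/maps/api/geocode/json?' + params
body = json.loads(urlopen(url).read().decode('utf-8'))
print(body['status'])  # 'OK', 'OVER_QUERY_LIMIT', 'REQUEST_DENIED', ...
```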
I hope some of these observations may help you get to the root cause, and see for yourself what the problem may be.
Thank you for coming up with this module. It is very useful! We just need to make it more robust with some more testing :) I liked your work very much, and I am dependent on it :)
@rmalshe Do you want to try the code in master before I go ahead and submit the fix to PyPI?
In terms of testing - I'd actually prefer to re-write this entire library with full tests. This library is very old and my motivation for creating it was to simply get a personal project prototype done quickly. I never imagined that others would actually use it. I wouldn't quite call this project an unwanted child, but finding time to get to issues can be quite tough.
I'll try to help out as much as I can to ensure that folks like you can continue to use this library to get your work done.
Let me try to run a long job now to see if this works out. I will keep you posted. I re-installed it just now with `!pip install --upgrade https://github.com/slimkrazy/python-google-places/zipball/master`.
Your idea is actually nice. Sometimes it is better to rewrite a package than to keep maintaining it indefinitely. Your experience may have improved since then, and now that Google has changed a few things, it may be better to come up with a new package if that helps more people.
I think the issue some people are having is that when you are over the query limit on the Geocoding API, the library doesn't return an over-quota error:
```python
def _generate_lat_lng_string(self, lat_lng, location):
    try:
        return '%(lat)s,%(lng)s' % (lat_lng if lat_lng is not None
                                    else geocode_location(location))
    except:
        raise ValueError(
            'lat_lng must be a dict with the keys, \'lat\' and \'lng\'')
```
If any error occurs within `geocode_location`, such as being over the query limit, it will still raise `lat_lng must be a dict with the keys, 'lat' and 'lng'`. Maybe there is a way to propagate any errors from `geocode_location`, so users can recognise when it's an over-quota error?
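One way to do that, purely as a suggestion based on the snippet above (not the library's current code), would be to keep the underlying exception visible in the message instead of swallowing it:

```python
def _generate_lat_lng_string(self, lat_lng, location):
    try:
        return '%(lat)s,%(lng)s' % (lat_lng if lat_lng is not None
                                    else geocode_location(location))
    except Exception as exc:
        # Keep the original error visible so an OVER_QUERY_LIMIT (or any other
        # geocoding failure) isn't masked by the generic lat_lng message.
        raise ValueError(
            "lat_lng must be a dict with the keys, 'lat' and 'lng' "
            "(underlying error: %r)" % (exc,))
```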
Same issue occurred for me. The problem is that you have to enable the Geocoding API here: https://developers.google.com/maps/documentation/geocoding/start. You don't have to get another API key for the Geocoding API; just enable it in the same project as the Places API.
I enabled everything and still get this error after a few requests on a new day (so it cannot be an over quota error). Was anybody able to fix it?
I'm getting this error only when I try to visit the link to the next page of results, even though there's no issue with the initial call:

```
Traceback (most recent call last):
  File "googlePlacesAPI.py", line 60, in
```