apneadiving / Google-Maps-for-Rails

Enables easy Google map + overlays creation in Ruby apps
https://apneadiving.github.io/
MIT License

Retrieving OVER_QUERY_LIMIT status from a model? #341

Closed straydogstudio closed 10 years ago

straydogstudio commented 11 years ago

Is there some way to know (outside of the log) if you're getting an 'OVER_QUERY_LIMIT' from Google with an acts_as_gmappable model? If I call Gmaps4rails.geocode(address) it raises an exception with the status. I see the status is recorded in the log for the model, but the exception itself is not used to store the error message.

Thanks for the gem

straydogstudio commented 11 years ago

If I have time, I'll try to make it so the status or the exception itself are saved.

tansaku commented 11 years ago

We could really use support for the gem returning this status, so that we can throttle our Gmaps requests.

straydogstudio commented 11 years ago

Based on my first look at the code, there isn't an easy way to get that status, because the relevant instance isn't retained. I've not had time to try anything more serious. But I will if no one else gets to it.

In the meantime, it is possible to repeat a raw geocode request after the ActiveRecord model save fails:

begin
  customer.save!
rescue ActiveRecord::RecordInvalid => e
  # Save failed; check whether the failure came from the geocoding validation
  if e.message =~ /Gmaps4rails address Address invalid/
    begin
      # Repeat the raw geocode to surface the underlying Google status
      Gmaps4rails.geocode(customer.gmaps4rails_address)
    rescue Gmaps4rails::GeocodeStatus => e
      if e.message =~ /OVER_QUERY_LIMIT/
        # do something intelligent, e.g. re-enqueue the record for later
      end
    end
  end
end

That's a bit annoying. I put it in a method for easy reuse.
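For reuse, the retry logic above can be pulled into a small helper. This is a sketch of my own (the method name and the injected geocoder are hypothetical, not gem API); passing the geocoder in as a callable makes the helper testable without hitting Google:

```ruby
# Hypothetical helper: re-runs a geocode and classifies the failure.
# `geocoder` is any callable (e.g. ->(a) { Gmaps4rails.geocode(a) }),
# injected so this can be exercised without real requests.
def geocode_status(address, geocoder:)
  geocoder.call(address)
  :ok
rescue StandardError => e
  # The gem's exception message contains Google's status string
  e.message =~ /OVER_QUERY_LIMIT/ ? :over_query_limit : :invalid_address
end
```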

tansaku commented 11 years ago

Great idea, thanks! We were about to implement this when I realized it wouldn't work for us: we had monkey-patched the validation not to register an error, because we need the model to save even if it hasn't been geocoded properly. The better long-term solution for us is probably to turn the validations off entirely and run the geocoding as a separate post-processing step.
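The post-processing idea could look something like the following sketch. The model and field names here are hypothetical stand-ins (a plain Struct rather than an ActiveRecord model), and the geocoder is injected; the point is just the shape: save everything, then fill in coordinates in a later pass.

```ruby
# Stand-in for a saved record that may not have coordinates yet
Customer = Struct.new(:address, :latitude, :longitude)

# Second pass: geocode only the records that are still missing coordinates.
def geocode_pending(records, geocoder:)
  records.select { |r| r.latitude.nil? }.each do |r|
    result = geocoder.call(r.address)   # e.g. wraps Gmaps4rails.geocode
    r.latitude  = result[:lat]
    r.longitude = result[:lng]
  end
  records
end
```

In a real app this pass would run from a rake task or background job, where an OVER_QUERY_LIMIT failure can simply leave the record for the next run.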

tansaku commented 11 years ago

We also started experimenting with adding a new GeocodeOverLimit error to the gem; if the gem caught that, it could perhaps have a throttling policy built in.

LucasCioffi commented 11 years ago

Along these lines, does anyone know what the query limit from Google is? Is there a way to associate this gem with a premier Google account (see https://developers.google.com/appengine/docs/premier/ for details) to increase the rate limit? Many thanks!

straydogstudio commented 11 years ago

I'm pretty sure the default is 2500 per day.

LucasCioffi commented 11 years ago

Thanks, straydogstudio. Does anyone know if that is 2500/day per app or per IP address or something similar? I do not remember putting in a Google API Key into the gem or in an initializer file, so I am not clear on how Google tracks the number of requests my app makes against the limit.

Additionally, I saw a query limit error in my log yesterday (Heroku app) but I didn't come close to 2500 requests that day. Maybe I hit a rate limit, because I sent 10 requests at once? So maybe there is another rate limit which is per second or total concurrent requests?

Anyone else have experience with the specifics of the rate limit? Will connecting the app to a premier account on Google increase the rate limit?

steveo7 commented 11 years ago

See this article on recommended strategies: https://developers.google.com/maps/articles/geocodestrat

As suggested in the article, the best way to get around the limit is for the Geocoding to happen client side, then the limit only applies to each client.

straydogstudio commented 11 years ago

In my experience it is based on the visible IP address: all apps behind the same firewall share a total of 2500 requests per day.

straydogstudio commented 11 years ago

Rate is also an issue: if you go too fast, Google will block you for 24 hours. I find that putting a sleep 0.1 between requests is sufficient. As @steveo7 pointed out, this is only necessary if you are geocoding server side.
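The sleep-between-requests approach can be wrapped in a small iterator, sketched below. The helper name is mine, and the 0.1 second delay is just the value that worked for me; tune it to your own rate.

```ruby
# Iterate over items with a pause between each, to stay under a
# per-second request cap when geocoding server side.
def throttled_each(items, delay: 0.1)
  items.each_with_index do |item, i|
    sleep(delay) if i > 0   # no pause before the first request
    yield item
  end
end

# Usage sketch: throttled_each(customers) { |c| c.save! }
```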

LucasCioffi commented 11 years ago

Thanks, folks. That article and your responses answered my questions.

mattholtom commented 11 years ago

This would be an excellent feature. In the meantime, here are some suggestions as a band-aid.

I've been getting OVER_QUERY_LIMIT responses in my logs for some time now. I've had separate issues with the daily and per second caps. The application in question fetches data from an outside source at 5 minute intervals and Geocodes it at that time, for display to users later. (12 batches per hour x 24 hours x up to 10 intersections)

A) If you suspect the per-second limit is the problem and the speed of your geocodes is not an issue, a simple sleep X between each request fixed it for me. If there is any way to perform your geocoding asynchronously from user interaction, do it.

B) If you think you're bumping up against the 2500/day limit, you could set up a Google Maps for Business account and pay for lookups. The pricing actually seemed quite reasonable, and I believe gmaps4rails allows setting an API key (although the name of the key changed in v3 of the Gmaps API). However, I managed to eliminate the bulk of my geocodes by not looking up the same intersection multiple times once it has already been done; the default acts_as_gmappable options should avoid performing a second geocode. If your data has these patterns, you may be able to try a similar strategy.
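The deduplication idea can be sketched as a memoizing wrapper around any geocoder. The class name is hypothetical (nothing here is gem API); it just caches results per address string so repeated intersections cost a single lookup:

```ruby
# Wraps a geocoder callable and caches results per address, so the
# same intersection is only sent to Google once.
class CachingGeocoder
  def initialize(geocoder)
    @geocoder = geocoder   # e.g. ->(a) { Gmaps4rails.geocode(a) }
    @cache = {}
  end

  def call(address)
    @cache[address] ||= @geocoder.call(address)
  end
end
```

A persistent version would keep the cache in a database table keyed by the normalized address, so lookups survive restarts.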

straydogstudio commented 10 years ago

@apneadiving Have you addressed the issue, or is it just outside scope?