daroczig / fbRads

Analyze and manage Facebook ads from R using this client library to access their Marketing APIs
GNU Affero General Public License v3.0

Async call that 'exceeded rate limit' #83

Closed dataders closed 6 years ago

dataders commented 6 years ago

Firstly, @daroczig, this package is awesome! However, I'm getting locked out if I'm requesting too much data, with the error message: "Calls to this api have exceeded the rate limit." Shouldn't an asynchronous job_type take care of this?

library(fbRads)
library(jsonlite)
library(dplyr)

df_ads <- fb_insights(
    job_type = "async",
    time_increment = "1",
    time_range = toJSON(list(
        since = "2017-10-30",
        until = "2018-01-05"
    ), auto_unbox = TRUE),
    breakdowns = "country",
    level = "ad",
    fields = toJSON(c(
        "account_name",
        "campaign_name",
        "adset_name",
        "ad_name",
        "spend",
        "impressions",
        "reach",
        "total_actions",
        "total_unique_actions",
        "clicks",
        "unique_clicks",
        "inline_link_clicks",
        "social_clicks",
        "cpc",
        "ctr"
    ))) %>%
    bind_rows()

Cheers!

daroczig commented 6 years ago

Thanks :)

The async job handles issues with querying "too much data", but the API rate limit error is independent of that: https://developers.facebook.com/docs/marketing-api/api-rate-limiting

I'm not sure how many queries you are making, or whether your app is approved for standard or even basic access (i.e. not a development-stage app), but adding some sleep time between queries might help with this.

Can you share more about the number of queries before you start getting this error and if your app is approved for prod usage?

dataders commented 6 years ago

The app is still in open development mode. In the last 14 days (i.e. the entire lifetime of the app), I've made 929 Marketing API calls (all of them GET requests) with 72 errors. The page you shared mentions that error 17 is "API Level Rate Limiting", but the chart on my app dashboard says I'm nowhere near the rate-limiting threshold.

Yesterday I went to see how many queries I could run before encountering an error, and I was able to make the above query three times without issue. Today I successfully got a token and ran fbad_init() without issue, but the query above failed on the first attempt with this result.

I went to start over and reinitialize and got this error

Here's my script

I'd appreciate it if you have any insight!

daroczig commented 6 years ago

I don't know how many ads you have on that Ad Account, but querying daily data for ~2 months broken down by country can result in quite a lot of rows, and hitting the API multiple times in a row due to paging can trigger such a rate limit error.

BTW, what's the goal of querying such old data? Could you just run the script daily and warehouse the historical data? I think that would resolve the problem. But you could also do this query in a loop with some extra Sys.sleep between the iterations (e.g. getting daily or weekly data).
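Something along these lines could work for the daily warehousing idea. This is just a rough sketch, assuming fbad_init() has already been called; the warehouse file name and the trimmed field list are placeholders, not anything the package prescribes:

library(fbRads)
library(jsonlite)
library(dplyr)

## placeholder path for the local store of historical data
warehouse <- "fb_insights_history.csv"
yesterday <- as.character(Sys.Date() - 1)

## pull only yesterday's data, so each run is a small request
df_daily <- fb_insights(
    job_type       = "async",
    time_increment = "1",
    time_range     = toJSON(list(since = yesterday, until = yesterday),
                            auto_unbox = TRUE),
    breakdowns     = "country",
    level          = "ad",
    fields         = toJSON(c("account_name", "campaign_name", "adset_name",
                              "ad_name", "spend", "impressions", "clicks"))) %>%
    bind_rows()

## append yesterday's rows to the warehouse, writing the header only on the first run
write.table(df_daily, warehouse, sep = ",", row.names = FALSE,
            col.names = !file.exists(warehouse), append = file.exists(warehouse))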

dataders commented 6 years ago

Thanks for getting back to me. Apologies for being a newb, but I'm still trying to wrap my head around the Marketing API. The API docs say: "For requests with large results, use asynchronous jobs". The ad account I'm using has around 96 ads, and the resulting data frame is 1,318 x 17, about 308 KB as a CSV.

  1. Is 308KB of data a "large request?"
  2. Does fbRads implement async requests in line w/ FB's implementation of asynchronous requests?
  3. Do you agree with my thinking that I'm being throttled at the Ad account level?

Thanks again!

daroczig commented 6 years ago

Agreed that it's a pretty small amount of data :)

But with paging limited to 25 items per page, that's ~50 queries to the API (1,318 rows / 25 per page) with no rate limiting on the client side, and that might trigger an error for a dev app.

I don't think we want to implement client-side rate limiting in the package at the moment, so I suggest the above: instead of querying one large date interval, query e.g. one week at a time, Sys.sleep for a bit, then query the next week, and so on (see the sketch below). Also, get your FB app reviewed by Facebook and approved for standard access.
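A rough sketch of that loop, again assuming fbad_init() has already been called; the weekly chunks and the 10-second sleep are illustrative values, not something the package enforces:

library(fbRads)
library(jsonlite)
library(dplyr)

start_dates <- seq(as.Date("2017-10-30"), as.Date("2018-01-05"), by = "7 days")

weekly <- list()
for (i in seq_along(start_dates)) {

    since <- start_dates[i]
    until <- min(since + 6, as.Date("2018-01-05"))

    ## query one week at a time instead of the full date range
    weekly[[i]] <- fb_insights(
        job_type       = "async",
        time_increment = "1",
        time_range     = toJSON(list(since = as.character(since),
                                     until = as.character(until)),
                                auto_unbox = TRUE),
        breakdowns     = "country",
        level          = "ad",
        fields         = toJSON(c("account_name", "campaign_name", "adset_name",
                                  "ad_name", "spend", "impressions", "clicks"))) %>%
        bind_rows()

    ## back off a bit between iterations to stay under the rate limit
    Sys.sleep(10)
}

df_ads <- bind_rows(weekly)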

dataders commented 6 years ago

Awesome. I'll do exactly that. Thanks!