Closed: AlexStead closed this issue 2 years ago
I can't reproduce the error. Have you been making a lot of calls to Nomis? It does rate limit, although the limit is not transparent.
After successfully trying out the same command elsewhere, it does seem to be rate limiting.
But this is odd, as I haven't made any calls to Nomis for a couple of months at least; back in December-January I was making lots.
Have you encountered this before? How long does the rate limiting last?
tl;dr: try updating {readr} and have another go!
From the error message which {nomisr} helpfully shows us, it seems like the problem might be something to do with an argument called `show_col_types`, within a call to `readr::read_csv()`, which happens... somewhere. Not sure yet. Let's dig into what {nomisr} is doing behind the scenes:

- `nomisr::nomis_get_data()` ends up calling `httr::content()` to parse the response from the API, also passing in the argument `show_col_types = TRUE` via `...`
- `httr::content()` sees that the response's content type is `text/csv`, and passes the response over to `readr::read_csv()`, along with `...`
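To see why that forwarding step can blow up, here is a minimal sketch of the mechanism (the function names are made up for illustration, not {nomisr}'s actual internals): if `...` carries an argument that the receiving function is too old to know about, R raises an "unused argument" error.

```r
# An "old" reader that, like older versions of readr::read_csv(),
# has no show_col_types argument.
old_read_csv <- function(file) {
  paste("parsed", file)
}

# A wrapper in the spirit of httr::content(), forwarding `...` blindly.
parse_response <- function(file, ...) {
  old_read_csv(file, ...)
}

# Forwarding an argument the old function doesn't recognise fails,
# mirroring the error {nomisr} users see with an outdated {readr}.
res <- tryCatch(
  parse_response("data.csv", show_col_types = TRUE),
  error = function(e) conditionMessage(e)
)
print(res)  # an "unused argument" message
```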
- The `show_col_types` argument wasn't added to `readr::read_csv()` until v2.1.2, released Jan 2022, so if we're using an older version of {readr}, it doesn't like the `...` we're giving to it!

@evanodell For this reason it might be worth adding `readr (>= 2.1.2)` to {nomisr}'s DESCRIPTION file, even though it's not used directly. Or alternatively, checking the installed version of {readr} and passing the version-appropriate silencing argument via `...`.
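That version-check alternative could look roughly like this. This is only a sketch under the v2.1.2 cutoff mentioned above, with a hypothetical helper name, not code that {nomisr} actually contains:

```r
# Silence readr's column-spec messages in a version-appropriate way:
# newer readr understands show_col_types, while for older versions we
# can simply suppress the messages instead.
read_quietly <- function(path) {
  if (utils::packageVersion("readr") >= "2.1.2") {
    readr::read_csv(path, show_col_types = FALSE)
  } else {
    suppressMessages(readr::read_csv(path))
  }
}
```

`utils::packageVersion()` returns a `package_version` object, so the `>=` comparison against the string `"2.1.2"` does a proper component-wise version comparison rather than a lexicographic one.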
I'm guessing the different environments you were using, @AlexStead, had different versions of {readr} installed, one of which must have been older. Hopefully the possible rate limiting was a red herring!
@owenjonesuob Good find, I'll update that shortly.
`nomis_get_data()` was previously working fine for me but now throws an error message. E.g.:

```r
nomis_get_data(id = "NM_893_1", time = "latest", geography = "TYPE266")
```

Until recently this ran without any problem, but now I get the error message below. I've tried different requests to no avail.
Other commands (e.g. `nomis_get_metadata()`) appear to be working, so this looks like a bug rather than an issue with the API itself.