sensiblecodeio / data-services-helpers

Python module containing classes and functions that The Sensible Code Company's Data Services team often use
https://sensiblecode.io/
BSD 2-Clause "Simplified" License

Error response behaviour poor #15

Open scraperdragon opened 10 years ago

scraperdragon commented 10 years ago

1) request_url() takes ages due to retrying. It should probably fail straight away on a 404?

2) Raises an incredibly generic error carrying minimal information: RuntimeError: Max retries exceeded for <url>, rather than the original error. (The logs do contain more info.)

3) It's impossible to do any additional handling - e.g. "If it's a 404, that's fine; skip to the next item"
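For illustration, the caller-side handling that point 3 asks for might look like the sketch below. HTTPError and its status_code attribute are hypothetical here (nothing like them exists in the current helper); the sketch just assumes request_url() raised the original error with the status code attached:

```python
class HTTPError(Exception):
    """Hypothetical error carrying the original response's status code."""
    def __init__(self, status_code, url):
        self.status_code = status_code
        self.url = url
        super().__init__("HTTP {} for {}".format(status_code, url))

def process_all(urls, request_url):
    """Fetch each URL, treating a 404 as 'that's fine; skip to the next item'."""
    results = []
    for url in urls:
        try:
            results.append(request_url(url))
        except HTTPError as e:
            if e.status_code == 404:
                continue  # a missing page is fine; move on
            raise  # anything else is still a real error
    return results
```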

scraperdragon commented 10 years ago

4) Downloading with/without backoff gives different errors.

drj11 commented 10 years ago

In my opinion, if there is default retry behaviour, it should be configured for 5xx status codes only. So I agree with dragon that a 404 should not retry. Obviously, people who want to retry on all 4xx codes too should be able to do so with suitable configuration.
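A minimal sketch of that policy, with hypothetical names (fetch is a stand-in that returns a (status, body) pair; none of this is the helper's actual API): retry 5xx with exponential backoff, fail straight away on everything else, and make the retryable set configurable.

```python
import time

class HTTPError(Exception):
    """Hypothetical error exposing the status code of the failed response."""
    def __init__(self, status_code, url):
        self.status_code = status_code
        self.url = url
        super().__init__("HTTP {} for {}".format(status_code, url))

def request_with_retries(fetch, url, max_tries=5,
                         retry_statuses=range(500, 600), sleep=time.sleep):
    """Call fetch(url) -> (status, body); retry only statuses in retry_statuses.

    By default only 5xx responses are retried; a 404 (or any other 4xx)
    raises immediately. Callers who really do want to retry 4xx codes can
    pass e.g. retry_statuses=range(400, 600).
    """
    last_error = None
    for attempt in range(max_tries):
        status, body = fetch(url)
        if status < 400:
            return body
        last_error = HTTPError(status, url)
        if status not in retry_statuses:
            raise last_error  # e.g. 404: fail straight away
        sleep(min(2 ** attempt, 30))  # exponential backoff, capped
    raise last_error  # retries exhausted; original status still attached
```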

StevenMaude commented 9 years ago

+1 on the lack of information. It's not easy to get at the status code, if that's of interest. I'm currently having to use a separate request to retrieve this on failure.

fawkesley commented 9 years ago

Check out the 'backoff' library which may handle this better than my homebrew attempt ;)


StevenMaude commented 9 years ago

@paulfurley Looks nice; thanks for the tip.