scrapinghub / spidermon

Scrapy extension for monitoring spider execution.
https://spidermon.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Create a default setting for SPIDERMON_UNWANTED_HTTP_CODES #153

Closed ejulio closed 5 years ago

ejulio commented 5 years ago

Say that I want to increase the default threshold for SPIDERMON_UNWANTED_HTTP_CODES.

Currently, I need to copy/paste

'SPIDERMON_UNWANTED_HTTP_CODES': {
    code: 100 for code in [400, 407, 429, 500, 502, 503, 504, 523, 540, 541]
}
  1. Maybe we can create a base setting, like https://github.com/scrapy/scrapy/blob/master/scrapy/settings/default_settings.py#L93. This way, I can iterate over this dict instead of copy/pasting the error codes.

  2. Maybe we can create a new setting for the default threshold. This setting would then be used instead of the constant 10 for the default error codes.

Other ideas?

My preference is for 2. It looks cleaner and is easier to set up and implement.
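A minimal sketch of option 2, assuming a hypothetical setting name (`SPIDERMON_UNWANTED_HTTP_CODES_MAX_COUNT`) and a plain dict standing in for Scrapy settings; the actual names and lookup API would be decided in the PR:

```python
# Hypothetical defaults mirroring the snippet above; not spidermon's actual code.
DEFAULT_UNWANTED_HTTP_CODES = [400, 407, 429, 500, 502, 503, 504, 523, 540, 541]
DEFAULT_ERROR_COUNT = 10  # the constant 10 mentioned above


def get_unwanted_http_codes(settings):
    """Build the {status_code: max_allowed_count} dict.

    Users can override just the threshold via the (hypothetical)
    SPIDERMON_UNWANTED_HTTP_CODES_MAX_COUNT setting, without having to
    copy/paste the whole list of default error codes.
    """
    max_count = settings.get(
        'SPIDERMON_UNWANTED_HTTP_CODES_MAX_COUNT', DEFAULT_ERROR_COUNT
    )
    codes = settings.get(
        'SPIDERMON_UNWANTED_HTTP_CODES', DEFAULT_UNWANTED_HTTP_CODES
    )
    if isinstance(codes, list):
        # Expand a bare list of codes into a dict using the shared threshold.
        codes = {code: max_count for code in codes}
    return codes
```

With this, raising the threshold to 100 for all default codes is a one-line setting change, while passing an explicit dict still works as before.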

rosheen33 commented 5 years ago

Under Progress: https://github.com/scrapinghub/spidermon/compare/default_threshold?expand=1

rosheen33 commented 5 years ago

PR For Review: https://github.com/scrapinghub/spidermon/pull/156