`incrementUntil` can stop an increment at a failed download, but outside of that there is no way to passively allow download failures (e.g. when a site has broken links on half of its pages while the other half are still good). This option fixes that: with the `allowRequestErrors` flag (defaults to `false`), download failures will not bubble up. Instead, they will be marked as warnings in the log and emitted with `'<scraper>:failure'`.
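As a rough sketch of the consumer side (the `scrape()` entry point and its options shape are assumptions for illustration; only the `allowRequestErrors` flag and the `'<scraper>:failure'` event name come from this proposal):

```typescript
import { EventEmitter } from 'events'

// Hypothetical entry point, assumed to return a Node-style EventEmitter.
declare function scrape(options: { allowRequestErrors?: boolean }): EventEmitter

const emitter = scrape({ allowRequestErrors: true })

// 'image' stands in for whatever the scraper is named in the config.
emitter.on('image:failure', (error: Error) => {
  // The run keeps going; each failure is surfaced here and logged
  // as a warning rather than thrown.
  console.warn('image download failed:', error.message)
})
```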
Something to consider: there are also plans to add a retry download option (#22). When the two are used in tandem (e.g. the options include both `allowRequestErrors: true` and `retry: { limit: 5 }`), the download keeps retrying until the limit (5 in this case) is reached and then a warning is logged. Without `allowRequestErrors: true`, reaching the retry limit throws an error.
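To make the interplay concrete, here is a minimal sketch of that control flow for a single download, assuming a Node-style EventEmitter and a global `fetch`; the function and type names are illustrative, not the library's real API:

```typescript
import { EventEmitter } from 'events'

interface Options {
  allowRequestErrors?: boolean // defaults to false
  retry?: { limit: number }    // proposed in #22
}

// Illustrative sketch of the proposed semantics for one download.
async function downloadWithPolicy(
  url: string,
  scraperName: string,
  options: Options,
  emitter: EventEmitter
): Promise<string | undefined> {
  const attempts = 1 + (options.retry?.limit ?? 0)
  let lastError: Error = new Error('no attempts made')
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      const response = await fetch(url)
      if (!response.ok) throw new Error(`HTTP ${response.status} for ${url}`)
      return await response.text()
    } catch (error) {
      lastError = error as Error // retry until the limit is exhausted
    }
  }
  if (options.allowRequestErrors) {
    // Passive mode: warn and emit '<scraper>:failure' instead of throwing.
    console.warn(`download failed after ${attempts} attempt(s): ${lastError.message}`)
    emitter.emit(`${scraperName}:failure`, lastError)
    return undefined
  }
  throw lastError // default: exhausting the retry limit is an error
}
```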