Closed: ddseo closed this issue 5 years ago.
Hmm, maybe... but what should happen with the URLs that managed to load correctly before the failing one? Should the HAR file be generated for those only?
I think that seems ideal, even if not totally necessary.
Setting the capturer to cancel on failure might suggest that partial results (i.e., the HAR file for the URLs that loaded successfully before the failure) aren't that useful, but the HAR could still help with debugging, or in cases where partial results are better than none.
For context, this flag would be very useful with sequential URL loads (`options.parallel === false`) to avoid waiting for the remaining URLs to run. It may be less important with concurrent URL loads.
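The sequential case described above can be sketched as a plain loop that stops at the first failure while keeping the results gathered so far. This is a minimal illustration of the behaviour being discussed, not the actual chrome-har-capturer implementation; `loadUrl` and the `abortOnFailure` option name are assumptions for the sketch.

```javascript
// Sketch: load URLs one at a time (the options.parallel === false case)
// and optionally abort on the first failure. `loadUrl` is a hypothetical
// stand-in for a single page capture that resolves to a HAR entry.
async function runSequential(urls, loadUrl, { abortOnFailure = false } = {}) {
  const results = [];
  for (const url of urls) {
    try {
      results.push(await loadUrl(url));
    } catch (err) {
      if (abortOnFailure) break; // skip the remaining URLs entirely
      results.push({ url, failed: true }); // current behaviour: keep going
    }
  }
  // Partial results: entries for the URLs that loaded before the failure.
  return results;
}
```

With `abortOnFailure` set, the caller still receives the entries collected before the failing URL, which matches the "HAR for the successfully loaded URLs only" idea above.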
Implemented in 1c2b022d6a89046020348e4b0d613d9b1ea67676, please let me know what you think.
Works great. Thanks! There's one or two minor things to update in the README, see #73.
I totally missed that, thanks!
Currently, if one of the URLs fails to load for whatever reason (triggering the `'fail'` event), the remaining URLs in the `urls` array run anyway. Could a way be provided to disable this behaviour and skip the remaining URLs when one of them fails? I'm not aware of a way to do this now. It would likely be an option in the `options` object in the API params (or an options flag for the CLI).