Closed by benoit74 6 months ago
Thanks for the report, will try to repro. It should have been able to continue after the page crash.
This has hopefully been fixed in 0.11.2 - very hard to be 100% sure, but hopefully won't happen again.
Some good and some bad news on this topic.
I confirm the crawler is now continuing after a page crash. That's great.
However, it looks like we are hitting new situations around page crashes with 0.12.3 and 0.12.4.
Details are in https://github.com/openzim/zimit/issues/266 and https://github.com/openzim/zimit/issues/283
Help or any suggestion on what to test to make progress on this topic would be welcome. The most important issue for us is probably the new situation in https://github.com/openzim/zimit/issues/283, where the crawler returns exit code 11 even though it has actually faced a critical situation, not a limit. This is a problem for us because we consider hitting a limit "normal" and continue processing by creating our ZIM. It is more serious than a real crawler crash because we are not alerted to the issue. If it is easy to identify and fix what leads the crawler to "believe" it hit a limit, that would be a great enhancement.
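To illustrate why the wrong exit code matters downstream, here is a minimal sketch of the kind of wrapper logic described above. It assumes only what this thread states: exit code 11 means "a limit was hit" and is treated as normal, so the ZIM is still created; the `classify_exit` helper and the `crawl` command name are hypothetical, not part of any real API.

```python
import subprocess

# Per this thread: the crawler returns 11 when a configured limit is hit.
# (Hypothetical constant name; only the value 11 comes from the report.)
LIMIT_HIT = 11

def classify_exit(code: int) -> str:
    """Map a crawler exit code to a coarse outcome (hypothetical helper)."""
    if code == 0:
        return "complete"   # crawl finished normally -> build the ZIM
    if code == LIMIT_HIT:
        return "limit"      # limit reached: considered normal -> build the ZIM
    return "crashed"        # anything else is treated as a real failure

def run_crawler(args: list[str]) -> str:
    """Run the crawler (command name assumed) and classify its outcome."""
    result = subprocess.run(["crawl", *args])
    return classify_exit(result.returncode)
```

The bug described in openzim/zimit#283 is that a critical failure can still surface as code 11, so this wrapper would wrongly take the "limit" branch and silently produce a ZIM from a broken crawl.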
One side question: is it possible to ask the crawler to stop on the first page crash (instead of trying to continue)?
I confirm that crawler 1.x seems to have solved this issue.
Thank you all for the great work that has gone into the 1.x release(s)!
At Kiwix we have a crawl that got stuck without returning, with 0.11.1 (i.e. with #385 merged). A last log line is output, and then the process stays up but nothing more seems to happen.
Launch command (note that I modified the `userAgentSuffix`):

Version log line:

Last log line:
Do not hesitate to ask if more info is needed.