Some pages may fail during crawling. Is there any way to enable retries in the crawler, so that I can make sure all pages are downloaded? Also, is there any way to find out which pages were not downloaded? I could not find that information in the logs or in ES. Thanks a lot.
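If there is no built-in option, I could wrap the downloads myself with something like the rough sketch below (here `fetch_page` and the URL list are just placeholders, not the crawler's real API), but I would prefer a supported way:

```python
import logging
import time

def fetch_with_retry(url, fetch_page, max_retries=3, backoff_seconds=2):
    """Try to download a page, retrying a few times before giving up."""
    for attempt in range(1, max_retries + 1):
        try:
            return fetch_page(url)
        except Exception as exc:
            logging.warning("attempt %d/%d failed for %s: %s",
                            attempt, max_retries, url, exc)
            time.sleep(backoff_seconds * attempt)
    return None

def crawl(urls, fetch_page):
    """Crawl all URLs and return the ones that never downloaded successfully."""
    failed = []
    for url in urls:
        if fetch_with_retry(url, fetch_page) is None:
            failed.append(url)
    # Report the pages that were not downloaded, so none are silently lost.
    for url in failed:
        logging.error("page not downloaded: %s", url)
    return failed
```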