"Is it possible to have the scraper not completely quit and exit when the limit is reached, but instead offer an option to wait a bit and then hit something to continue (make it the non-default button)? It would be nice to be able to queue everything up and then just periodically hit continue instead of starting over.
The issue I run into is that if I queue up, say, 200, and it does 45 and quits, but 5 of those didn't auto-scrape, then when I restart it is going to re-attempt those 5, using up some of my searches. I'd like to be able to continue the queued-up job after waiting. As of now, I have to do no more than 40-50 at a time, which is brutal."
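The request above boils down to two behaviors: pause-and-resume on a rate limit instead of exiting, and tracking completed items so a restart never re-attempts work that already succeeded. A minimal sketch of that control flow is below. All names here (`RateLimitError`, `scrape`, `ask_continue`) are hypothetical stand-ins, not the scraper's actual API:

```python
class RateLimitError(Exception):
    """Raised by the (hypothetical) scrape function when the search quota is hit."""


def process_queue(items, scrape, ask_continue, completed=None):
    """Work through `items`, skipping anything already in `completed`.

    When `scrape` raises RateLimitError, pause and ask whether to continue
    instead of quitting. The pending queue and the `completed` set survive
    across pauses, so no item is ever re-attempted and no search is wasted.
    """
    completed = set() if completed is None else completed
    pending = [i for i in items if i not in completed]  # skip already-done items
    while pending:
        item = pending[0]
        try:
            scrape(item)
        except RateLimitError:
            # Limit reached: don't exit -- offer to wait and resume.
            if not ask_continue():
                break  # user declined; completed set is preserved for next run
            continue  # retry the same item after the wait
        completed.add(item)
        pending.pop(0)
    return completed
```

In an interactive version, `ask_continue` would sleep for a while and then show a prompt with "Continue" as the non-default choice, matching the request; persisting `completed` to disk between runs would cover the restart case as well.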