Open tommelo opened 7 years ago
Thanks @tommelo. This prevents an issue where a crawler gets stuck when there are fewer results than the configured limit. The original result callback doesn't inform the caller when there are no more URLs, which makes the following implementation fragile:
```js
let urls = [];

scraper.search(options, (err, url) => {
  urls.push(url);

  // handleResults is never called if there are fewer results.
  if (urls.length === options.limit) {
    handleResults(name, urls);
  }
});
```
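For illustration, here is a minimal sketch of how a completion signal fixes this pattern. The `search` and `collectAll` functions below are hypothetical stand-ins, not the scraper's actual API: `search` emits `null` as the url once results are exhausted, so the caller's callback always fires, even when fewer results than `options.limit` come back.

```js
// Hypothetical search(): yields each url, then signals end-of-results
// by passing null once no more urls are available.
function search(options, onResult) {
  const available = ['http://a.example', 'http://b.example']; // fewer than limit
  for (const url of available.slice(0, options.limit)) {
    onResult(null, url);
  }
  onResult(null, null); // end-of-results signal
}

// Caller no longer depends on reaching options.limit to finish.
function collectAll(options, done) {
  const urls = [];
  search(options, (err, url) => {
    if (err) return done(err);
    if (url === null) return done(null, urls); // fires even with fewer results
    urls.push(url);
  });
}

collectAll({ limit: 5 }, (err, urls) => {
  console.log(urls.length); // 2, even though limit is 5
});
```

Without the end-of-results signal, `collectAll` would hang exactly as in the snippet above whenever the source has fewer than `limit` urls.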
As mentioned in Issue #6, I have added the possibility of returning multiple results: