Closed · curita closed this issue 1 year ago
I'm checking, and this probably affects `--spider-args` and other arguments too.
Hi @curita, I fixed the issue in https://github.com/scrapinghub/hcf-backend/commit/3ab1105763b0932dc8d5e89d9265523e54de85b1 and released version 0.5.2.1.
Thank you for the quick turnaround 🙇♀️
Issue
No job settings are passed to jobs scheduled with HCFCrawlManager; only Frontera settings are. Job settings can still be sent to this manager via the `--job-settings` script argument, which it inherits from CrawlManager, but they aren't used.

Reproduce
I used the MyArticlesGraphManager from https://github.com/scrapinghub/shub-workflow/wiki/Graph-Managers-with-HCF (adapted to my project), and the `scrapers` task with the consumers didn't work as expected: the `consumer_settings` weren't provided to the consumer spiders. For that example, it meant that the start requests weren't skipped.
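To illustrate the behavior the fix needs, here is a minimal sketch of merging the `--job-settings` payload into the settings passed when scheduling a job. The function name and the settings keys are illustrative assumptions, not the actual hcf-backend API; `--job-settings` is assumed to carry a JSON object, as it does for CrawlManager.

```python
import json

def merged_job_settings(frontera_settings, job_settings_json):
    """Combine Frontera settings with the --job-settings payload.

    job_settings_json is the raw JSON string given on the command line.
    Explicit job settings take precedence over the Frontera defaults,
    so values such as the wiki example's consumer_settings reach the
    scheduled consumer spiders instead of being silently dropped.
    (Hypothetical helper for illustration only.)
    """
    job_settings = json.loads(job_settings_json) if job_settings_json else {}
    combined = dict(frontera_settings)
    combined.update(job_settings)
    return combined

# Example with made-up setting names:
settings = merged_job_settings(
    {"BACKEND": "hcf_backend.HCFBackend"},
    '{"CONCURRENT_REQUESTS": 8}',
)
print(settings)
# Both the Frontera default and the job setting are present.
```

The key point is that before the fix, only the first dict was forwarded to scheduled jobs, so anything supplied via `--job-settings` was accepted but never applied.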