Hi @kirankoduru, I have just started using your module and I may be wrong about this. Here is the situation.
In my case I need to specify a pipeline, but since the SCRAPY_SETTINGS dictionary is empty in default_settings.py, the pipeline is never enabled.
I dug around a bit in your source code, and it seems like the get_spider_settings method in scrapy_utils.py only updates/sets a setting if the key is already present in the SCRAPY_SETTINGS dict (line 66).
If I add 'ITEM_PIPELINES' as an empty dict to SCRAPY_SETTINGS in default_settings.py, the pipeline gets enabled. I feel like there is a more elegant way to make this work; I will submit a PR when I get time.
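Concretely, the workaround looks like this (a sketch of my edit; the surrounding file contents are an assumption on my part):

```python
# default_settings.py -- workaround: pre-seed the key so that
# get_spider_settings will later update it with my pipeline.
SCRAPY_SETTINGS = {
    'ITEM_PIPELINES': {},  # empty placeholder; spider settings fill it in
}
```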
Correct me if I am wrong.
EDIT - Just realized I am using Python 3.6 and Scrapy 1.5. Maybe the version difference could be causing this?
Hi @dmkitui, thanks for pointing that out. I am definitely open to accepting a PR for the change you suggested. I should have used a safe get to retrieve the key from the dict for the case you mentioned.
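For discussion, a safe-get merge could look something like this (a hypothetical sketch, not the library's actual code; the function signature and names are my own):

```python
def get_spider_settings(custom_settings, defaults=None):
    """Merge custom_settings over defaults.

    Unlike an update-only approach, keys that are missing from the
    defaults (e.g. ITEM_PIPELINES) are still carried through.
    """
    merged = dict(defaults or {})
    for key, value in custom_settings.items():
        # .get() makes the lookup safe when the key is absent
        existing = merged.get(key)
        if isinstance(existing, dict) and isinstance(value, dict):
            # merge nested dicts like ITEM_PIPELINES without
            # mutating the shared defaults dict
            merged[key] = {**existing, **value}
        else:
            merged[key] = value
    return merged
```

With this, `get_spider_settings({'ITEM_PIPELINES': {'my.Pipeline': 300}}, {})` would enable the pipeline even though SCRAPY_SETTINGS starts out empty.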