Open · caffeinatedMike opened this issue 3 years ago
Looking at the way this project handles scrapy.cfg files made it clear why my projects aren't being picked up. Currently, this project does not account for multiple projects that share the same root directory (see the Scrapy docs on sharing the root directory between projects). Can support for this be added?
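For context, the layout the Scrapy docs describe is several projects living under one root directory, all declared in a single scrapy.cfg, with the active project selected via the SCRAPY_PROJECT environment variable. A minimal sketch (the project names here are placeholders):

```ini
# scrapy.cfg at the shared root directory
[settings]
default = project_one.settings
project_one = project_one.settings
project_two = project_two.settings
```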
I have the same problem, except I have a single Scrapy project. I tried pointing scrapydweb at the directory containing the scrapy.cfg file, and also at the one below it, but it still does not see any available projects. How did you make this work?
Never mind, I figured it out: I just had to point scrapydweb one level higher in the path, i.e. at the parent of the folder that contains scrapy.cfg. It works now.
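In other words, SCRAPY_PROJECTS_DIR should point at the directory that contains the project folder, not at the project folder itself. A sketch with made-up names:

```
projects_dir/            <- point SCRAPY_PROJECTS_DIR here
    my_project/          <- the folder that contains scrapy.cfg
        scrapy.cfg
        my_project/
            settings.py
            spiders/
```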
Glad you got it figured out. I was just about to point you to the changes I made.
I'm currently developing locally on Windows 10 and have the SCRAPY_PROJECTS_DIR setting set to:

SCRAPY_PROJECTS_DIR = 'C:/Users/mhill/PycharmProjects/dScrapy/d_webscraping'

In that directory, I have the following structure:

d_webscraping/nutritional (first project)
d_webscraping/pricingandreviews (second project)
d_webscraping/scrapy.cfg
d_webscraping/scrapydweb_settings_v10.py
My scrapy.cfg file setup is like so:

[scrapy.cfg contents not shown]

In scrapydweb_settings_v10.py, the SCRAPYDWEB_BIND, SCRAPYD_SERVERS, and LOCAL_SCRAPYD_SERVER settings all have '127.0.0.1:6800' as the server address.

What am I doing wrong? Shouldn't the above two projects be showing up at http://127.0.0.1:5000/1/deploy/? Instead, I see this error message:

No projects found in 'C:/Users/mhill/PycharmProjects/dScrapy/d_webscraping' of the ScrapydWeb host, check and update the SCRAPY_PROJECTS_DIR option in the config file.
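For reference, the setup described above corresponds to roughly the following lines in scrapydweb_settings_v10.py. This is a sketch reconstructed from the description, not a copy of the actual file; note that SCRAPY_PROJECTS_DIR points at a directory that itself contains scrapy.cfg, i.e. the shared-root layout this issue asks to support:

```python
# scrapydweb_settings_v10.py -- sketch reconstructed from the description above;
# treat the exact values as assumptions, not a copy of the real file.
SCRAPYDWEB_BIND = '127.0.0.1'          # bind IP only; the UI port (5000 by default) is configured separately
SCRAPYD_SERVERS = ['127.0.0.1:6800']   # the local Scrapyd server
LOCAL_SCRAPYD_SERVER = '127.0.0.1:6800'

# The shared root that directly contains scrapy.cfg plus the
# 'nutritional' and 'pricingandreviews' project packages.
SCRAPY_PROJECTS_DIR = 'C:/Users/mhill/PycharmProjects/dScrapy/d_webscraping'
```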