I run the crawler with the "scrapy crawl woaidu" command, but it fails with the traceback below:
ligang.yao@localhost woaidu_crawler$scrapy crawl woaidu
Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 5, in <module>
    pkg_resources.run_script('Scrapy==0.17.0', 'scrapy')
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 489, in run_script
    self.require(requires)[0].run_script(script_name, ns)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 1207, in run_script
    execfile(script_filename, namespace, namespace)
  File "/Library/Python/2.7/site-packages/Scrapy-0.17.0-py2.7.egg/EGG-INFO/scripts/scrapy", line 4, in <module>
    execute()
  File "/Library/Python/2.7/site-packages/Scrapy-0.17.0-py2.7.egg/scrapy/cmdline.py", line 120, in execute
    crawler = CrawlerProcess(settings)
  File "/Library/Python/2.7/site-packages/Scrapy-0.17.0-py2.7.egg/scrapy/crawler.py", line 74, in __init__
    super(CrawlerProcess, self).__init__(*a, **kw)
  File "/Library/Python/2.7/site-packages/Scrapy-0.17.0-py2.7.egg/scrapy/crawler.py", line 21, in __init__
    self.stats = load_object(settings['STATS_CLASS'])(self)
  File "/Users/ligang.yao/github/distribute_crawler/woaidu_crawler/woaidu_crawler/statscol/graphite.py", line 209, in __init__
    self._graphiteclient = GraphiteClient(host,port)
  File "/Users/ligang.yao/github/distribute_crawler/woaidu_crawler/woaidu_crawler/statscol/graphite.py", line 27, in __init__
    self._sock.connect((host,port))
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py", line 224, in meth
    return getattr(self._sock,name)(*args)
socket.error: [Errno 61] Connection refused
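The `[Errno 61] Connection refused` at the bottom means nothing is accepting TCP connections on the host/port that `GraphiteClient` connects to in `graphite.py`, so the Graphite receiving daemon is probably not running (or is listening on a different address). One way to check before starting the crawl is to probe the port directly; a minimal sketch, where `127.0.0.1` and `2003` (Carbon's usual plaintext port) are assumptions to replace with whatever host/port your `graphite.py` uses:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
    except (socket.error, OSError):
        # Connection refused, timed out, host unreachable, etc.
        return False
    sock.close()
    return True

# e.g. probe the assumed Graphite/Carbon endpoint before crawling:
# port_open("127.0.0.1", 2003)
```

If this returns False, either start the Graphite/Carbon daemon first, or (while debugging) point the `STATS_CLASS` setting at a stats collector that does not require Graphite.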