Deleetdk / OKCubot

A Scrapy scraper for OKCupid.

Crawling error #1

Open. Deleetdk opened this issue 8 years ago

Deleetdk commented 8 years ago

System info: Mint 17.1 Cinnamon 64-bit, Python 2.7.10.

Fresh install of Scrapy, following the instructions.

Traceback:

wooga@wooga ~/OKCubot/okcubot $ scrapy crawl okcubot -auser=**REDACTED** -apass=**REDACTED**
/home/wooga/OKCubot/okcubot/okcubot/spiders/okcubot_spider.py:2: ScrapyDeprecationWarning: Module `scrapy.spider` is deprecated, use `scrapy.spiders` instead
  from scrapy.spider import Spider
/home/wooga/OKCubot/okcubot/okcubot/spiders/okcubot_spider.py:5: ScrapyDeprecationWarning: Module `scrapy.log` has been deprecated, Scrapy now relies on the builtin Python library for logging. Read the updated logging entry in the documentation to learn more.
  from scrapy import log
2015-11-10 10:50:00 [scrapy] INFO: Scrapy 1.0.3 started (bot: okcubot)
2015-11-10 10:50:00 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-11-10 10:50:00 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'okcubot.spiders', 'LOG_LEVEL': 'INFO', 'SPIDER_MODULES': ['okcubot.spiders'], 'CONCURRENT_REQUESTS_PER_IP': 1, 'BOT_NAME': 'okcubot', 'DEPTH_PRIORITY': 1, 'DOWNLOAD_DELAY': 2}
2015-11-10 10:50:00 [py.warnings] WARNING: /home/wooga/anaconda/lib/python2.7/site-packages/scrapy/utils/deprecate.py:155: ScrapyDeprecationWarning: `scrapy.contrib.feedexport.FeedExporter` class is deprecated, use `scrapy.extensions.feedexport.FeedExporter` instead
  ScrapyDeprecationWarning)

2015-11-10 10:50:01 [scrapy] INFO: Enabled extensions: CloseSpider, TelnetConsole, LogStats, CoreStats, SpiderState
2015-11-10 10:50:02 [boto] ERROR: Caught exception reading instance data
Traceback (most recent call last):
  File "/home/wooga/anaconda/lib/python2.7/site-packages/boto/utils.py", line 210, in retry_url
    r = opener.open(req, timeout=timeout)
  File "/home/wooga/anaconda/lib/python2.7/urllib2.py", line 431, in open
    response = self._open(req, data)
  File "/home/wooga/anaconda/lib/python2.7/urllib2.py", line 449, in _open
    '_open', req)
  File "/home/wooga/anaconda/lib/python2.7/urllib2.py", line 409, in _call_chain
    result = func(*args)
  File "/home/wooga/anaconda/lib/python2.7/urllib2.py", line 1227, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/home/wooga/anaconda/lib/python2.7/urllib2.py", line 1197, in do_open
    raise URLError(err)
URLError: <urlopen error timed out>
2015-11-10 10:50:02 [boto] ERROR: Unable to read instance data, giving up
2015-11-10 10:50:02 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-11-10 10:50:02 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-11-10 10:50:02 [py.warnings] WARNING: /home/wooga/OKCubot/okcubot/okcubot/pipelines.py:10: ScrapyDeprecationWarning: Module `scrapy.contrib.exporter` is deprecated, use `scrapy.exporters` instead
  from scrapy.contrib.exporter import CsvItemExporter

2015-11-10 10:50:02 [scrapy] INFO: Enabled item pipelines: DuplicatePipeline, AnswerSanitationPipeline, MultiTSVItemPipeline
2015-11-10 10:50:02 [scrapy] INFO: Spider opened
2015-11-10 10:50:02 [scrapy] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2015-11-10 10:50:03 [py.warnings] WARNING: /home/wooga/OKCubot/okcubot/okcubot/spiders/okcubot_spider.py:441: ScrapyDeprecationWarning: log.msg has been deprecated, create a python logger and log through it instead
  log.msg('Wrong number of parts in header. Assuming 200 OK', level=log.DEBUG)

2015-11-10 10:50:06 [py.warnings] WARNING: /home/wooga/OKCubot/okcubot/okcubot/spiders/okcubot_spider.py:142: ScrapyDeprecationWarning: log.msg has been deprecated, create a python logger and log through it instead
  log.msg('Credentials incorrect.', level=log.ERROR)

2015-11-10 10:50:06 [scrapy] ERROR: Credentials incorrect.
2015-11-10 10:50:06 [scrapy] INFO: Closing spider (finished)
2015-11-10 10:50:06 [scrapy] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 673,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 1,
 'downloader/request_method_count/POST': 1,
 'downloader/response_bytes': 24210,
 'downloader/response_count': 2,
 'downloader/response_status_count/200': 1,
 'downloader/response_status_count/302': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2015, 11, 10, 9, 50, 6, 371265),
 'log_count/ERROR': 3,
 'log_count/INFO': 7,
 'log_count/WARNING': 4,
 'response_received_count': 1,
 'scheduler/dequeued': 2,
 'scheduler/dequeued/memory': 2,
 'scheduler/enqueued': 2,
 'scheduler/enqueued/memory': 2,
 'start_time': datetime.datetime(2015, 11, 10, 9, 50, 2, 488294)}
2015-11-10 10:50:06 [scrapy] INFO: Spider closed (finished)
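
Side note: each deprecation warning above names its own fix, and the boto traceback is just boto probing EC2 instance metadata at startup, not a crawl failure. A minimal sketch of the Scrapy 1.0 modernisation (file paths taken from the warnings; the settings tweak to silence boto is an assumption based on common Scrapy advice, not something the repo currently does):

# okcubot/spiders/okcubot_spider.py
import logging
from scrapy.spiders import Spider           # was: from scrapy.spider import Spider

logger = logging.getLogger(__name__)        # replaces the deprecated scrapy.log
logger.error('Credentials incorrect.')      # was: log.msg(..., level=log.ERROR)

# okcubot/pipelines.py
from scrapy.exporters import CsvItemExporter  # was: from scrapy.contrib.exporter import ...

# okcubot/settings.py (assumed tweak): disabling the S3 download handler stops
# boto from probing http://169.254.169.254 for instance metadata on startup,
# which is what raised the URLError above.
DOWNLOAD_HANDLERS = {'s3': None}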
onbjerg commented 8 years ago

Scrapy version?

Deleetdk commented 8 years ago

Python package versions:

wooga@wooga ~/OKCubot/okcubot $ pip freeze
abstract-rendering==0.5.1
alabaster==0.7.3
argcomplete==0.8.9
astropy==1.0.3
Babel==1.3
backports.ssl-match-hostname==3.4.0.2
bcolz==0.9.0
beautifulsoup4==4.3.2
binstar==0.11.0
bitarray==0.8.1
blaze==0.8.0
blz==0.6.2
bokeh==0.9.0
boto==2.38.0
Bottleneck==1.0.0
cdecimal==2.3
certifi==14.5.14
cffi==1.1.0
characteristic==14.3.0
clyent==0.3.4
colorama==0.3.3
conda==3.14.1
conda-build==1.14.1
conda-env==2.2.3
configobj==5.0.6
cryptography==0.9.1
cssselect==0.9.1
Cython==0.22.1
cytoolz==0.7.3
datashape==0.4.5
decorator==3.4.2
docutils==0.12
enum34==1.0.4
fastcache==1.0.2
Flask==0.10.1
funcsigs==0.4
gevent==1.0.1
gevent-websocket==0.9.3
greenlet==0.4.7
grin==1.2.1
h5py==2.5.0
idna==2.0
ipaddress==1.0.7
ipython==3.2.0
itsdangerous==0.24
jdcal==1.0
jedi==0.8.1
Jinja2==2.7.3
jsonschema==2.4.0
llvmlite==0.5.0
loginform==1.1.1
lxml==3.4.4
MarkupSafe==0.23
matplotlib==1.4.3
mistune==0.5.1
mock==1.0.1
multipledispatch==0.4.7
networkx==1.9.1
nltk==3.0.3
nose==1.3.7
numba==0.19.1
numexpr==2.4.3
numpy==1.9.2
odo==0.3.2
openpyxl==1.8.5
pandas==0.16.2
patsy==0.3.0
pep8==1.6.2
Pillow==2.8.2
ply==3.6
psutil==2.2.1
ptyprocess==0.4
py==1.4.27
pyasn1==0.1.7
pyasn1-modules==0.0.8
pycosat==0.6.1
pycparser==2.14
pycrypto==2.6.1
pycurl==7.19.5.1
pyflakes==0.9.2
Pygments==2.0.2
pyOpenSSL==0.15.1
pyparsing==2.0.3
pytest==2.7.1
python-dateutil==2.4.2
pytz==2015.4
PyYAML==3.11
pyzmq==14.7.0
queuelib==1.4.2
redis==2.10.3
requests==2.7.0
rope==0.9.4
runipy==0.1.3
scikit-image==0.11.3
scikit-learn==0.16.1
scipy==0.15.1
scrapely==0.12.0
Scrapy==1.0.3
service-identity==14.0.0
six==1.9.0
snowballstemmer==1.2.0
sockjs-tornado==1.0.1
Sphinx==1.3.1
sphinx-rtd-theme==0.1.7
spyder==2.3.5.2
SQLAlchemy==1.0.5
statsmodels==0.6.1
sympy==0.7.6
tables==3.2.0
terminado==0.5
Theano==0.7.0
toolz==0.7.2
tornado==4.2
Twisted==15.4.0
ujson==1.33
unicodecsv==0.9.4
w3lib==1.13.0
Werkzeug==0.10.4
xlrd==0.9.3
XlsxWriter==0.7.3
xlwt==1.0.0
zope.interface==4.1.3
onbjerg commented 8 years ago

Try Scrapy 0.24; they've since released 1.0.x, and we haven't touched the code in ages.

Deleetdk commented 8 years ago

Installed the old Scrapy using: pip install Scrapy==0.24.5.

Output:

wooga@wooga ~/OKCubot/okcubot $ scrapy crawl okcubot -auser=USER -apass=PASS
2015-11-10 13:13:26+0100 [scrapy] INFO: Scrapy 0.24.5 started (bot: okcubot)
2015-11-10 13:13:26+0100 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-11-10 13:13:26+0100 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'okcubot.spiders', 'LOG_LEVEL': 'INFO', 'SPIDER_MODULES': ['okcubot.spiders'], 'CONCURRENT_REQUESTS_PER_IP': 1, 'BOT_NAME': 'okcubot', 'DEPTH_PRIORITY': 1, 'DOWNLOAD_DELAY': 2}
2015-11-10 13:13:26+0100 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2015-11-10 13:13:27+0100 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-11-10 13:13:27+0100 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-11-10 13:13:27+0100 [scrapy] INFO: Enabled item pipelines: DuplicatePipeline, AnswerSanitationPipeline, MultiTSVItemPipeline
2015-11-10 13:13:27+0100 [okcubot] INFO: Spider opened
2015-11-10 13:13:27+0100 [okcubot] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2015-11-10 13:13:31+0100 [scrapy] ERROR: Credentials incorrect.
2015-11-10 13:13:31+0100 [okcubot] INFO: Closing spider (finished)
2015-11-10 13:13:31+0100 [okcubot] INFO: Dumping Scrapy stats:
    {'downloader/request_bytes': 675,
     'downloader/request_count': 2,
     'downloader/request_method_count/GET': 1,
     'downloader/request_method_count/POST': 1,
     'downloader/response_bytes': 24365,
     'downloader/response_count': 2,
     'downloader/response_status_count/200': 1,
     'downloader/response_status_count/302': 1,
     'finish_reason': 'finished',
     'finish_time': datetime.datetime(2015, 11, 10, 12, 13, 31, 14403),
     'log_count/ERROR': 1,
     'log_count/INFO': 7,
     'response_received_count': 1,
     'scheduler/dequeued': 2,
     'scheduler/dequeued/memory': 2,
     'scheduler/enqueued': 2,
     'scheduler/enqueued/memory': 2,
     'start_time': datetime.datetime(2015, 11, 10, 12, 13, 27, 498663)}
2015-11-10 13:13:31+0100 [okcubot] INFO: Spider closed (finished)
onbjerg commented 8 years ago

Did 76224e01b55c3eed6286af6db0eb27bac155ed0f fix it?

Deleetdk commented 8 years ago

New output. Notice that it got to the seeding step.

wooga@wooga ~/OKCubot/okcubot $ scrapy crawl okcubot -auser=USER -apass=PASS
2015-11-10 13:36:30+0100 [scrapy] INFO: Scrapy 0.24.5 started (bot: okcubot)
2015-11-10 13:36:30+0100 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-11-10 13:36:30+0100 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'okcubot.spiders', 'LOG_LEVEL': 'INFO', 'SPIDER_MODULES': ['okcubot.spiders'], 'CONCURRENT_REQUESTS_PER_IP': 1, 'BOT_NAME': 'okcubot', 'DEPTH_PRIORITY': 1, 'DOWNLOAD_DELAY': 2}
2015-11-10 13:36:30+0100 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2015-11-10 13:36:31+0100 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-11-10 13:36:31+0100 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-11-10 13:36:31+0100 [scrapy] INFO: Enabled item pipelines: DuplicatePipeline, AnswerSanitationPipeline, MultiTSVItemPipeline
2015-11-10 13:36:31+0100 [okcubot] INFO: Spider opened
2015-11-10 13:36:31+0100 [okcubot] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2015-11-10 13:36:33+0100 [scrapy] INFO: Seeded bot with user (#)
2015-11-10 13:36:36+0100 [okcubot] INFO: Closing spider (finished)
2015-11-10 13:36:36+0100 [okcubot] INFO: Dumping Scrapy stats:
    {'downloader/request_bytes': 1509,
     'downloader/request_count': 4,
     'downloader/request_method_count/GET': 3,
     'downloader/request_method_count/POST': 1,
     'downloader/response_bytes': 25215,
     'downloader/response_count': 4,
     'downloader/response_status_count/200': 1,
     'downloader/response_status_count/301': 1,
     'downloader/response_status_count/302': 2,
     'dupefilter/filtered': 1,
     'finish_reason': 'finished',
     'finish_time': datetime.datetime(2015, 11, 10, 12, 36, 36, 925242),
     'log_count/INFO': 8,
     'request_depth_max': 1,
     'response_received_count': 1,
     'scheduler/dequeued': 4,
     'scheduler/dequeued/memory': 4,
     'scheduler/enqueued': 4,
     'scheduler/enqueued/memory': 4,
     'start_time': datetime.datetime(2015, 11, 10, 12, 36, 31, 341044)}
2015-11-10 13:36:36+0100 [okcubot] INFO: Spider closed (finished)
onbjerg commented 8 years ago

3e42897b6327dfac1911fd0ddcf9ce8c00808c5b?

Deleetdk commented 8 years ago

We got a step further. I note that users are now given by the URLs to their profiles; I think they used to be given by just their usernames. Perhaps an error.

We probably also don't want to fetch the URL with the query parameter (?cf=home_matches).
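
A minimal sketch of stripping that parameter with w3lib (already installed per the pip freeze above; where to hook it into the spider is an assumption, and the profile name is a placeholder):

from w3lib.url import url_query_cleaner

def clean_profile_url(url):
    # remove=True drops the listed parameters and keeps any others, so the
    # dupefilter sees one canonical URL per profile.
    return url_query_cleaner(url, ('cf',), remove=True)

clean_profile_url('https://www.okcupid.com/profile/example?cf=home_matches')
# -> 'https://www.okcupid.com/profile/example'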

Judging by the home matches, does it use the correct option of scraping users at random? This is crucial, and the reason we buy A-List.

2015-11-10 13:53:32+0100 [scrapy] INFO: Scrapy 0.24.5 started (bot: okcubot)
2015-11-10 13:53:32+0100 [scrapy] INFO: Optional features available: ssl, http11, boto
2015-11-10 13:53:32+0100 [scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'okcubot.spiders', 'LOG_LEVEL': 'INFO', 'SPIDER_MODULES': ['okcubot.spiders'], 'CONCURRENT_REQUESTS_PER_IP': 1, 'BOT_NAME': 'okcubot', 'DEPTH_PRIORITY': 1, 'DOWNLOAD_DELAY': 2}
2015-11-10 13:53:32+0100 [scrapy] INFO: Enabled extensions: LogStats, TelnetConsole, CloseSpider, WebService, CoreStats, SpiderState
2015-11-10 13:53:33+0100 [scrapy] INFO: Enabled downloader middlewares: HttpAuthMiddleware, DownloadTimeoutMiddleware, UserAgentMiddleware, RetryMiddleware, DefaultHeadersMiddleware, MetaRefreshMiddleware, HttpCompressionMiddleware, RedirectMiddleware, CookiesMiddleware, ChunkedTransferMiddleware, DownloaderStats
2015-11-10 13:53:33+0100 [scrapy] INFO: Enabled spider middlewares: HttpErrorMiddleware, OffsiteMiddleware, RefererMiddleware, UrlLengthMiddleware, DepthMiddleware
2015-11-10 13:53:33+0100 [scrapy] INFO: Enabled item pipelines: DuplicatePipeline, AnswerSanitationPipeline, MultiTSVItemPipeline
2015-11-10 13:53:33+0100 [okcubot] INFO: Spider opened
2015-11-10 13:53:33+0100 [okcubot] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/cyerp2590?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/ZaneDane?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/Chrlotte?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/mabj?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/LillyStardust?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/AMS-DK?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/Hannah13_aqua?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/Beatrix38?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/ekah27?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/JoannaTyr?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/scorekaj22?cf=home_matches)
2015-11-10 13:53:36+0100 [scrapy] INFO: Seeded bot with user (/profile/alyssaroses?cf=home_matches)
2015-11-10 13:53:43+0100 [okcubot] ERROR: Spider error processing <GET https://www.okcupid.com/profile/alyssaroses?cf=home_matches>
    Traceback (most recent call last):
      File "/home/wooga/anaconda/lib/python2.7/site-packages/twisted/internet/base.py", line 825, in runUntilCurrent
        call.func(*call.args, **call.kw)
      File "/home/wooga/anaconda/lib/python2.7/site-packages/twisted/internet/task.py", line 645, in _tick
        taskObj._oneWorkUnit()
      File "/home/wooga/anaconda/lib/python2.7/site-packages/twisted/internet/task.py", line 491, in _oneWorkUnit
        result = next(self._iterator)
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/utils/defer.py", line 57, in <genexpr>
        work = (callable(elem, *args, **named) for elem in iterable)
    --- <exception caught here> ---
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/utils/defer.py", line 96, in iter_errback
        yield next(it)
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/offsite.py", line 26, in process_spider_output
        for x in result:
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/referer.py", line 22, in <genexpr>
        return (_set_referer(r) for r in result or ())
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/urllength.py", line 33, in <genexpr>
        return (r for r in result or () if _filter(r))
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/depth.py", line 50, in <genexpr>
        return (r for r in result or () if _filter(r))
      File "/home/wooga/OKCubot/okcubot/okcubot/spiders/okcubot_spider.py", line 184, in parse_profile
        val = selector.css(ident).extract()[0]
    exceptions.IndexError: list index out of range

2015-11-10 13:53:44+0100 [okcubot] ERROR: Spider error processing <GET https://www.okcupid.com/profile/scorekaj22?cf=home_matches>
    Traceback (most recent call last):
      File "/home/wooga/anaconda/lib/python2.7/site-packages/twisted/internet/base.py", line 825, in runUntilCurrent
        call.func(*call.args, **call.kw)
      File "/home/wooga/anaconda/lib/python2.7/site-packages/twisted/internet/task.py", line 645, in _tick
        taskObj._oneWorkUnit()
      File "/home/wooga/anaconda/lib/python2.7/site-packages/twisted/internet/task.py", line 491, in _oneWorkUnit
        result = next(self._iterator)
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/utils/defer.py", line 57, in <genexpr>
        work = (callable(elem, *args, **named) for elem in iterable)
    --- <exception caught here> ---
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/utils/defer.py", line 96, in iter_errback
        yield next(it)
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/offsite.py", line 26, in process_spider_output
        for x in result:
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/referer.py", line 22, in <genexpr>
        return (_set_referer(r) for r in result or ())
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/urllength.py", line 33, in <genexpr>
        return (r for r in result or () if _filter(r))
      File "/home/wooga/anaconda/lib/python2.7/site-packages/scrapy/contrib/spidermiddleware/depth.py", line 50, in <genexpr>
        return (r for r in result or () if _filter(r))
      File "/home/wooga/OKCubot/okcubot/okcubot/spiders/okcubot_spider.py", line 184, in parse_profile
        val = selector.css(ident).extract()[0]
    exceptions.IndexError: list index out of range

2015-11-10 13:53:44+0100 [okcubot] INFO: Closing spider (finished)
2015-11-10 13:53:44+0100 [okcubot] INFO: Dumping Scrapy stats:
    {'downloader/request_bytes': 2482,
     'downloader/request_count': 6,
     'downloader/request_method_count/GET': 5,
     'downloader/request_method_count/POST': 1,
     'downloader/response_bytes': 75870,
     'downloader/response_count': 6,
     'downloader/response_status_count/200': 3,
     'downloader/response_status_count/301': 2,
     'downloader/response_status_count/302': 1,
     'finish_reason': 'finished',
     'finish_time': datetime.datetime(2015, 11, 10, 12, 53, 44, 908193),
     'log_count/ERROR': 2,
     'log_count/INFO': 19,
     'request_depth_max': 1,
     'response_received_count': 3,
     'scheduler/dequeued': 6,
     'scheduler/dequeued/memory': 6,
     'scheduler/enqueued': 6,
     'scheduler/enqueued/memory': 6,
     'spider_exceptions/IndexError': 2,
     'start_time': datetime.datetime(2015, 11, 10, 12, 53, 33, 933000)}
2015-11-10 13:53:44+0100 [okcubot] INFO: Spider closed (finished)
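
For reference, the IndexError above means selector.css(ident) matched nothing on those profiles, so indexing [0] blows up. A hedged sketch of a guard for parse_profile, using the names from the traceback:

matched = selector.css(ident).extract()
if matched:
    val = matched[0]
else:
    val = None  # field missing on this profile; log it and move on instead of crashing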