scrapy-plugins / scrapy-jsonrpc

Scrapy extension to control spiders using JSON-RPC
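
For readers arriving from the README: the extension is enabled through the Scrapy project settings. A minimal sketch, with the extension path and setting name taken from the README and worth verifying against the version you have installed:

# settings.py sketch for enabling the JSON-RPC web service; the extension
# path and setting name follow the README and should be checked against
# your installed release.
EXTENSIONS = {
    "scrapy_jsonrpc.webservice.WebService": 500,
}

JSONRPC_ENABLED = True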

Python 3 compatibility #12

Open redapple opened 7 years ago

redapple commented 7 years ago

scrapy-jsonrpc is not compatible with Python 3.

Apart from the example client code, which uses the Python 2-only urllib.urlopen():
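
For context, a Python 3 client can use urllib.request in place of the removed urllib.urlopen(). The sketch below is illustrative only: the endpoint, port, and method name are assumptions, so adjust them to your JSONRPC_HOST/JSONRPC_PORT settings and to whatever object path you actually want to call.

import json
from urllib.request import Request, urlopen

def jsonrpc_call(url, method, *params):
    # Build a JSON-RPC style request body; field names here are illustrative.
    payload = json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params,
        "id": 1,
    }).encode("utf-8")
    req = Request(url, data=payload,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as response:
        return json.loads(response.read().decode("utf-8"))

# Hypothetical call against a locally running web service:
# jsonrpc_call("http://localhost:6080/crawler", "stats.get_stats")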

mirceachira commented 7 years ago

Any updates on this issue?

mirceachira commented 7 years ago

I see there has been a pull request open for this for a while now. I'm running it locally and it works, but it would be nice to merge the branch so that master can be used directly. Please fix this ASAP so that people don't have to deal with it in the future :)

rustanacexd commented 6 years ago

Any update on this issue?

kouroshshafi commented 5 years ago

Any update? I cannot make it work with Python 3.


2019-02-26 22:57:06 [py.warnings] WARNING: /Users/XXX/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/webservice.py:4: ScrapyDeprecationWarning: Module `scrapy.log` has been deprecated, Scrapy now relies on the builtin Python library for logging. Read the updated logging entry in the documentation to learn more.
  from scrapy import log, signals

Traceback (most recent call last):
  File "/Users/XXX/anaconda3/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/Users/XXX/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 171, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 200, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 205, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/Users/kourosh/anaconda3/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/webservice.py", line 7, in <module>
    from scrapy_jsonrpc.jsonrpc import jsonrpc_server_call
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/jsonrpc.py", line 11, in <module>
    from scrapy_jsonrpc.serialize import ScrapyJSONDecoder
  File "/Users/kourosh/anaconda3/lib/python3.6/site-packages/scrapy_jsonrpc/serialize.py", line 8, in <module>
    from scrapy.spider import Spider
ModuleNotFoundError: No module named 'scrapy.spider'
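
For reference, the ScrapyDeprecationWarning at the top of that log comes from webservice.py importing the deprecated scrapy.log module. A minimal sketch of what a standard-library replacement could look like, offered as an assumption rather than the project's actual patch:

# Possible replacement for the deprecated `from scrapy import log` import in
# scrapy_jsonrpc/webservice.py; this mirrors Scrapy's own move to the builtin
# logging module and is an assumption, not a merged fix.
import logging

from scrapy import signals  # keep signals; drop the deprecated scrapy.log

logger = logging.getLogger(__name__)

# e.g. logger.info("JSON-RPC web service started") instead of log.msg(...)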

Digenis commented 5 years ago

@kouroshshafi, try downgrading Scrapy to 1.5; this error is probably unrelated to Python 3 (the scrapy.spider module was removed in newer Scrapy releases).
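
The ModuleNotFoundError in the traceback above comes from scrapy.spider having been replaced by scrapy.spiders, which is why pinning Scrapy 1.5 works around it. A compatibility import for scrapy_jsonrpc/serialize.py could look like the following sketch (an assumption, not a merged fix):

# Sketch of a version-tolerant Spider import for scrapy_jsonrpc/serialize.py;
# offered as an assumption about the fix, not the maintainers' actual change.
try:
    from scrapy.spiders import Spider  # Scrapy >= 1.0
except ImportError:
    from scrapy.spider import Spider   # older Scrapy releases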

ShrinkDW commented 4 years ago

Is anyone working on this?