MA3STR0 / kimsufi-crawler

Crawler that will send you an email alert as soon as servers on OVH/Kimsufi become available for purchase
MIT License

ImportError: cannot import name coroutine #10

Closed: jerom18 closed this issue 9 years ago

jerom18 commented 9 years ago

I have been trying all morning to get this script working on my NAS (a Netgear ReadyNAS RN104 running OS 6.20). I was under the impression that, given how Python works, there should be minimal disruption running scripts on different devices.

I have configured a gmail account and renamed the file as described in the readme.

I had Python installed from previous script use and installed Tornado as below.

```
$ aptitude install python-tornado
The following NEW packages will be installed:
  librtmp0{a} python-pycurl{a} python-tornado
The following packages are RECOMMENDED but will NOT be installed:
  python-mysqldb
0 packages upgraded, 3 newly installed, 0 to remove and 2 not upgraded.
Need to get 402 kB of archives. After unpacking 1,438 kB will be used.
Do you want to continue? [Y/n/?] y
Get: 1 http://mirrors.kernel.org/debian/ wheezy/main librtmp0 armel 2.4+20111222.git4e06e21-1 [58.7 kB]
Get: 2 http://mirrors.kernel.org/debian/ wheezy/main python-pycurl armel 7.19.0-5 [89.3 kB]
Get: 3 http://mirrors.kernel.org/debian/ wheezy/main python-tornado all 2.3-2 [254 kB]
Fetched 402 kB in 11s (34.0 kB/s)
debconf: delaying package configuration, since apt-utils is not installed
Selecting previously unselected package librtmp0:armel.
(Reading database ... 36003 files and directories currently installed.)
Unpacking librtmp0:armel (from .../librtmp0_2.4+20111222.git4e06e21-1_armel.deb) ...
Selecting previously unselected package python-pycurl.
Unpacking python-pycurl (from .../python-pycurl_7.19.0-5_armel.deb) ...
Selecting previously unselected package python-tornado.
Unpacking python-tornado (from .../python-tornado_2.3-2_all.deb) ...
Setting up librtmp0:armel (2.4+20111222.git4e06e21-1) ...
Setting up python-pycurl (7.19.0-5) ...
Setting up python-tornado (2.3-2) ...
Processing triggers for libc-bin ...
Processing triggers for python-support ...
```

On running `python crawler.py` as root I receive the following error:

```
$ python crawler.py
Traceback (most recent call last):
  File "crawler.py", line 13, in <module>
    from tornado.gen import coroutine
ImportError: cannot import name coroutine
```

I am not sure how to move forward from this point.
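For anyone hitting this: the traceback means the installed `tornado.gen` module simply does not define a name called `coroutine`. The same shape of ImportError can be reproduced with any module that lacks the requested name, for example (using `os.path` purely as an arbitrary stand-in):

```python
# Reproduce the ImportError shape with a stdlib module that has no
# attribute named 'coroutine' (os.path is just an arbitrary example).
try:
    from os.path import coroutine
except ImportError as err:
    # The exact wording varies between Python versions, but the message
    # always names the missing attribute.
    print(err)
```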

MA3STR0 commented 9 years ago

What Python and Tornado versions do you have? You can find them by running:

```
python --version
python -c "import tornado; print tornado.version"
```

Also, if you have python-setuptools or python-pip, it's much better to use them for installing Python libs: `sudo pip install tornado` or `sudo easy_install tornado`.

jerom18 commented 9 years ago

Python version:

```
$ python --version
Python 2.7.3
```

Tornado version:

```
$ python -c "import tornado;print tornado.version"
2.3
```

Pip install of Tornado:

```
$ pip install tornado
Requirement already satisfied (use --upgrade to upgrade): tornado in /usr/lib/python2.7/dist-packages
Cleaning up...
```

I also tried the other suggestion:

```
$ easy_install tornado
Searching for tornado
Best match: tornado 2.3
Adding tornado 2.3 to easy-install.pth file
Using /usr/lib/python2.7/dist-packages
Processing dependencies for tornado
Finished processing dependencies for tornado
```

The issue is still present after a restart.

MA3STR0 commented 9 years ago

OK, so the problem is your Tornado version: the current release is 4.0.2, coroutines have been available since 3.0, and you have 2.3. Try `sudo easy_install tornado==4.0.2` and check the version again.

If that does not help, you need to remove the old Tornado and install the new one again. Sadly, with aptitude you always get somewhat outdated packages; that's why I recommended pip. Thanks for reporting the issue, though; I will add minimum version requirements to the readme.

MA3STR0 commented 9 years ago

So, try this:

```
sudo apt-get remove python-tornado
sudo easy_install tornado==4.0.2
```
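For context, `coroutine` was added to `tornado.gen` in Tornado 3.0, so any 2.x install fails at import time. A minimal sketch of the version check such a requirement implies (`meets_minimum` is a hypothetical helper, not part of the crawler):

```python
def meets_minimum(version_string, minimum=(3, 0)):
    """Return True if a dotted version string (such as tornado.version,
    which is a string like "2.3" or "4.0.2") is at least `minimum`."""
    parts = tuple(int(p) for p in version_string.split(".")[:2])
    return parts >= minimum

# The reported install fails the check; the suggested upgrade passes it.
print(meets_minimum("2.3"))    # False: Tornado 2.3 predates coroutines
print(meets_minimum("4.0.2"))  # True
```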
jerom18 commented 9 years ago

And like magic that problem was solved. However, another has occurred...

```
Traceback (most recent call last):
  File "crawler.py", line 19, in <module>
    config = json.loads(configfile.read())
  File "/usr/lib/python2.7/json/__init__.py", line 326, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 365, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 381, in raw_decode
    obj, end = self.scan_once(s, idx)
ValueError: Expecting , delimiter: line 3 column 5 (char 30)
```

Unfortunately I'm not at home at the moment, so I have only been able to SSH in remotely to try your suggestion. Tomorrow I will look at the lines in detail and see if there is anything obvious that I have configured improperly.

Great work on the script all the same; even though I've yet to get it working, I'm glad something already exists for this problem!

MA3STR0 commented 9 years ago

Yeah, I hope this one will be easier to fix. Most likely you have a typo in your JSON config file; copy and paste its contents into a validator (like http://jsonlint.com) and see what's wrong. Good luck tomorrow ;)
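To illustrate (with a made-up config fragment, not the crawler's actual schema), a missing comma between two entries produces exactly this kind of ValueError, and the message's line/column point at the token right after the gap:

```python
import json

# Hypothetical two-key config with the comma between entries missing.
broken = '''{
    "email": "user@example.com"
    "use_smtp": true
}'''

try:
    json.loads(broken)
except ValueError as err:
    # Points at line 3, where the parser expected a ',' delimiter.
    print(err)
```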

jerom18 commented 9 years ago

I was missing a comma in the config, my bad!

I have yet to receive an email, but the script does appear to be working just fine.

MA3STR0 commented 9 years ago

Great, so I'm closing this issue. By the way, to test whether the script is working correctly, you can put some popular server types in your config, e.g. "KS-2" in "rbx".