Closed: rafaelcapucho closed this issue 4 years ago.
Hi! I looked at both the full output and the short output, and everything seems OK. I'm using Python 3.6 with the latest pip and virtualenv, and I've deployed spiders using scrapy-link-filter and scrapy-count-filter dozens of times without any problems. I just deleted the library and reinstalled it; this is my output when installing for the first time:
```
pip install git+https://github.com/croqaz/scrapy-count-filter
Collecting git+https://github.com/croqaz/scrapy-count-filter
  Cloning https://github.com/croqaz/scrapy-count-filter to /private/var/folders/k2/.../T/pip-req-build-l0q8r0fs
  Running command git clone -q https://github.com/croqaz/scrapy-count-filter /private/var/folders/k2/.../T/pip-req-build-l0q8r0fs
Building wheels for collected packages: scrapy-count-filter
  Building wheel for scrapy-count-filter (setup.py) ... done
  Created wheel for scrapy-count-filter: filename=scrapy_count_filter-0.2.0-cp36-none-any.whl size=5161 sha256=12e58e762af227db74300d71922d0aa776d44e5dd7e765047bbd3c943701e982
  Stored in directory: /private/var/folders/k2/.../T/pip-ephem-wheel-cache-k2rxqdc2/wheels/c8/63/e0/7fa943c1cea868745cb63fd83480f2ef29958bc3af8247462c
Successfully built scrapy-count-filter
Installing collected packages: scrapy-count-filter
Successfully installed scrapy-count-filter-0.2.0
```
And for reinstalling:
```
pip install git+https://github.com/croqaz/scrapy-count-filter
Collecting git+https://github.com/croqaz/scrapy-count-filter
  Cloning https://github.com/croqaz/scrapy-count-filter to /private/var/folders/k2/.../T/pip-req-build-u3v__6v9
  Running command git clone -q https://github.com/croqaz/scrapy-count-filter /private/var/folders/k2/.../T/pip-req-build-u3v__6v9
Requirement already satisfied (use --upgrade to upgrade): scrapy-count-filter==0.2.0 from git+https://github.com/croqaz/scrapy-count-filter in /Users/croqaz/Dev/ScrapingHub/env/lib/python3.6/site-packages
Building wheels for collected packages: scrapy-count-filter
  Building wheel for scrapy-count-filter (setup.py) ... done
  Created wheel for scrapy-count-filter: filename=scrapy_count_filter-0.2.0-cp36-none-any.whl size=5161 sha256=3dc30252bae6609eca3cb992fd4c281d25e0a33e6897b378590a49445d3cb9e7
  Stored in directory: /private/var/folders/k2/.../T/pip-ephem-wheel-cache-aa4rk8sj/wheels/c8/63/e0/7fa943c1cea868745cb63fd83480f2ef29958bc3af8247462c
Successfully built scrapy-count-filter
```
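As the "Requirement already satisfied" line above notes, pip skips the install when the same version is already present. A minimal sketch of forcing a fresh install from the Git URL, using standard pip flags:

```shell
# Reinstall the package even if the same version is already installed.
pip install --upgrade --force-reinstall \
    git+https://github.com/croqaz/scrapy-count-filter
```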
And then:
```
python
Python 3.6.9 (default, Sep 23 2019)
[GCC 4.2.1 Compatible Apple LLVM 10.0.1 (clang-1001.0.46.4)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import scrapy_count_filter
>>> dir(scrapy_count_filter)
['GlobalCountFilterMiddleware', 'HostsCountFilterMiddleware', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'middleware']
>>>
```
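The `dir()` check above confirms the import works. A related sketch for checking *where* a module is loaded from, which helps distinguish a virtualenv install from a global one (the stdlib `json` module stands in for `scrapy_count_filter` here so the snippet runs anywhere):

```python
import importlib.util

# Print the file a module would be loaded from; with "scrapy_count_filter"
# substituted for "json", the printed path shows whether the package lives
# in the virtualenv's site-packages or in the global one.
spec = importlib.util.find_spec("json")  # stand-in for "scrapy_count_filter"
print(spec.origin)
```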
Can you try on a different machine, if you have one?
I was just talking to Viktor; it turns out my Scrapy was installed in the global environment. I rebuilt everything and now it is working, thank you!
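For anyone hitting the same global-vs-virtualenv mix-up, a quick sanity check (the last line assumes Scrapy is installed in the active environment):

```shell
# Which interpreter is active, and which environment prefix it uses.
which python
python -c "import sys; print(sys.prefix)"
# Where a specific package resolves from; the path should be inside the venv.
python -c "import scrapy; print(scrapy.__file__)"
```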
Hi @croqaz, I created a virtual environment and installed everything from requirements.txt, plus scrapy 1.6.0. Just to make sure it is really installed, I ran:

```
pip freeze
```

It returns the full error output. Any idea? Thank you!