scrapy / scrapyd-client

Command line client for Scrapyd server
BSD 3-Clause "New" or "Revised" License

cannot deploy scrapy #57

Closed aliank516 closed 5 years ago

aliank516 commented 5 years ago

Hi everyone

Does anyone know why, when I run the command scrapyd-deploy, it just hangs at:

Packing version 1556280070
Deploying to project "scrapy_project" in http://localhost:6800/addversion.json
Deploy failed (400):

If I do scrapyd-deploy -l I can see:

default              http://localhost:6800/

This is my scrapy.cfg:

[settings]
default = scrapy_project.settings

[deploy]
url = http://localhost:6800/
project = scrapy_project
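For reference, the `[deploy]` target above can be read the same way scrapyd-deploy reads it, with Python's standard configparser. A minimal sketch (the inline string mirrors the config shown above):

```python
import configparser

# minimal sketch: parse a scrapy.cfg-style [deploy] target
cfg = configparser.ConfigParser()
cfg.read_string("""\
[settings]
default = scrapy_project.settings

[deploy]
url = http://localhost:6800/
project = scrapy_project
""")

print(cfg.get("deploy", "url"))      # http://localhost:6800/
print(cfg.get("deploy", "project"))  # scrapy_project
```

If either key is missing or misspelled, configparser raises NoOptionError, which is a quick way to rule out a malformed config as the cause of a failed deploy.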

is that why it may hang?

Thanks

my8100 commented 5 years ago

Is the OS and/or the Python environment newly installed?

aliank516 commented 5 years ago

@my8100

my8100 commented 5 years ago

Check out the link in my previous comment and retry.

Digenis commented 5 years ago

@aliank516, try downgrading to Scrapy 1.5. Make sure that you also do this for the environment running scrapyd, not just the environment from which you deploy.

aliank516 commented 5 years ago

@Digenis I tried downgrading to Scrapy 1.5.2, but I get the same errors, and I'm pretty sure I've started scrapyd. You mean the scrapy project and the scrapyd service can't be on the same machine, right?

This is some environment information:

[root@instance-3du7ykjk scrapy_project]# python3.7
Python 3.7.2 (default, Mar 21 2019, 16:35:19)

[root@instance-3du7ykjk scrapy_project]# scrapy
Scrapy 1.5.2 - project: scrapy_project

[root@instance-3du7ykjk scrapy_project]# ps -ef|grep scrapyd
root 23871 23778 0 11:44 pts/1 00:00:00 /usr/local/bin/python3.7 /usr/local/bin/scrapyd
root 23919 23578 0 11:50 pts/0 00:00:00 grep --color=auto scrapyd

[root@instance-3du7ykjk scrapy_project]# scrapyd-deploy
Packing version 1556336654
Deploying to project "scrapy_project" in http://localhost:6800/addversion.json
Deploy failed (400):
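A 400 response only comes back when something is actually listening on the port, so the process check above can be complemented by a quick TCP probe. A sketch, assuming the default scrapyd host and port used throughout this thread:

```python
import socket

def scrapyd_listening(host="localhost", port=6800, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("scrapyd listening:", scrapyd_listening())
```

If this prints False, the problem is connectivity rather than the egg; if True, the 400 means the request reached scrapyd and was rejected there.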

my8100 commented 5 years ago

Try the "Auto Packaging" feature of ScrapydWeb and post the output shown in the web UI.

aliank516 commented 5 years ago

@my8100 I've noticed that. Maybe I'll use ScrapydWeb, but for now I just want to know what caused this.

my8100 commented 5 years ago

ScrapydWeb would help by showing the reason in the web UI! BTW, why not use a non-root account and a virtual environment?

my8100 commented 5 years ago

What's the output of Scrapyd? What happens if you deploy with ScrapydWeb? BTW, check out https://guides.github.com/features/mastering-markdown/ to learn how to post code in Markdown.

aliank516 commented 5 years ago

@my8100 I tried using a non-root account but got these errors:

[alian@instance-3du7ykjk scrapy_project]$ scrapyd-deploy
Packing version 1556342023
Traceback (most recent call last):
  File "/usr/local/bin/scrapyd-deploy", line 11, in <module>
    load_entry_point('scrapyd-client==1.2.0a1', 'console_scripts', 'scrapyd-deploy')()
  File "/usr/local/lib/python3.7/site-packages/scrapyd_client/deploy.py", line 103, in main
    exitcode, tmpdir = _build_egg_and_deploy_target(target, version, opts)
  File "/usr/local/lib/python3.7/site-packages/scrapyd_client/deploy.py", line 124, in _build_egg_and_deploy_target
    egg, tmpdir = _build_egg()
  File "/usr/local/lib/python3.7/site-packages/scrapyd_client/deploy.py", line 277, in _build_egg
    stdout=o, stderr=e)
  File "/usr/local/lib/python3.7/site-packages/scrapy/utils/python.py", line 353, in retry_on_eintr
    return function(*args, **kw)
  File "/usr/local/lib/python3.7/subprocess.py", line 347, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/usr/local/bin/python3.7', 'setup.py', 'clean', '-a', 'bdist_egg', '-d', '/tmp/scrapydeploy-uinokwxd']' returned non-zero exit status 1.
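The traceback above ends in a bare CalledProcessError because check_call only reports the child's exit status; it does not capture stderr, so the actual setup.py failure message is swallowed. A minimal reproduction of that behaviour, using the current interpreter as the failing child process:

```python
import subprocess
import sys

# check_call raises on a non-zero exit status, but the child's stderr
# is not attached to the exception, so the underlying error is lost
try:
    subprocess.check_call([sys.executable, "-c", "import sys; sys.exit(1)"])
except subprocess.CalledProcessError as exc:
    print("exit status:", exc.returncode)  # exit status: 1
```

This is why running the setup.py command by hand (as suggested later in this thread) is the fastest way to see the real error.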
my8100 commented 5 years ago

What's the output of Scrapyd? What happens if you deploy with ScrapydWeb? Why not use a virtual environment?

Digenis commented 5 years ago

@aliank516 Does it work without scrapyd? Try running the scrapy crawl command.

Digenis commented 5 years ago

possible duplicate https://github.com/scrapy/scrapyd/issues/309

aliank516 commented 5 years ago

@Digenis

> Does it work without scrapyd?

scrapy crawl does not work with Scrapy 1.5.2, but works with 1.6.

> possible duplicate scrapy/scrapyd#309

I viewed it, but didn't find any useful information.

Digenis commented 5 years ago

@aliank516, instead of scrapyd-deploy, build your project with

python setup.py clean -a bdist_egg -d .

and then deploy it with

curl -v http://localhost:6800/addversion.json -F project=scrapy_project -F version=$(date +%s) -F egg=@./project-1.0-py3.7.egg
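The curl invocation above is just a multipart POST to scrapyd's addversion.json endpoint. For anyone more comfortable in Python, the same request can be built with the third-party requests package. A sketch: it only prepares the request without sending it, so nothing needs to be listening, and the egg bytes are a placeholder:

```python
import time
import requests

# build, without sending, the same multipart request that curl -F produces
req = requests.Request(
    "POST",
    "http://localhost:6800/addversion.json",
    data={"project": "scrapy_project", "version": str(int(time.time()))},
    files={"egg": ("scrapy_project.egg", b"<egg bytes>")},  # placeholder payload
).prepare()

print(req.headers["Content-Type"])  # multipart/form-data; boundary=...
```

Sending it for real would be `requests.Session().send(req)` with the actual egg file opened in binary mode in place of the placeholder bytes.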

aliank516 commented 5 years ago

@Digenis, I built the project with:

 scrapyd-deploy -p scrapy_project -v 20190427 --build-egg=scrapy_project.egg
 python3.7 setup.py clean -a bdist_egg -d .
 curl -v http://localhost:6800/addversion.json -F project=scrapy_project -F version=20190427 -F egg=@scrapy_project.egg

but got these errors:

[root@instance-3du7ykjk scrapy_project]# curl -v http://localhost:6800/addversion.json -F project=scrapy_project -F version=20190427 -F egg=@scrapy_project.egg
* About to connect() to localhost port 6800 (#0)
*   Trying ::1...
* Connection refused
*   Trying 127.0.0.1...
* Connected to localhost (127.0.0.1) port 6800 (#0)
> POST /addversion.json HTTP/1.1
> User-Agent: curl/7.29.0
> Host: localhost:6800
> Accept: */*
> Content-Length: 20878
> Expect: 100-continue
> Content-Type: multipart/form-data; boundary=----------------------------93e404b40f9a
> 
< HTTP/1.1 100 Continue
< HTTP/1.1 400 Bad Request
* no chunk, no close, no size. Assume close to signal end
< 
* Closing connection 0
Digenis commented 5 years ago

Closing in favour of https://github.com/scrapy/scrapyd/issues/309