scrapy / scrapyd-client

Command line client for Scrapyd server

scrapyd-client auth not working & contribution guidelines #122

Closed apetryla closed 1 year ago

apetryla commented 1 year ago

Hello, on my end scrapyd-client projects isn't working: no authentication credentials are sent with the /listprojects.json request (checked in Wireshark). My scrapy.cfg:

[settings]
default = ecom_scraper.settings

[scrapyd]
bind_address = 0.0.0.0
username = usr
password = 123

[deploy]
url = http://localhost:6800/
project = ecom_scraper
eggs_dir = scrapyd_eggs
username = usr
password = 123
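
(As a quick aside, one way to confirm the server itself accepts these credentials independently of scrapyd-client, assuming Scrapyd's standard HTTP Basic Auth, is to call the endpoint directly, e.g. with requests; usr/123 are just the values from the scrapy.cfg above:)

import requests

# Call the Scrapyd API directly with HTTP Basic Auth, bypassing scrapyd-client,
# to check whether the server accepts the configured credentials.
response = requests.get(
    "http://localhost:6800/listprojects.json",
    auth=("usr", "123"),  # credentials from the [deploy] section above
)
print(response.status_code)  # 200 if authentication succeeded
print(response.json())       # e.g. {"status": "ok", "projects": [...]}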

I could try to debug it on my end, but I couldn't find a contribution doc explaining the local setup. What do you think about adding a contribution doc that explains the preferred way to set up the dev environment and the pull request requirements? :)

jpmckinney commented 1 year ago

Hi @apetryla, I'll add some docs but it's very simple:

pip install .[test]
pytest

jpmckinney commented 1 year ago

Which command did you run to send the request?

apetryla commented 1 year ago

Hey, thanks for the description! I looked a bit more into the code and noticed that auth=get_auth(url=url, username=username, password=password) was introduced in https://github.com/scrapy/scrapyd-client/commit/39162ae5bc1728ba907115b3fca541bd2f6804ef, but VERSION wasn't updated. So, using the latest version on PyPI (1.2.2), I was getting:

$ scrapyd-client projects
Received a malformed response:
Unauthorized

However, after pulling the current repo and running pip install ., it authenticates successfully. I admit I still get confused by Python versioning and setup stuff. :)
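
(For context, a rough sketch of what that change amounts to — reading the [deploy] credentials from scrapy.cfg and attaching them as HTTP Basic Auth to the API request; this is only an illustration with configparser and requests, not the library's actual code:)

from configparser import ConfigParser

import requests

# Read the deploy target and credentials from scrapy.cfg.
config = ConfigParser()
config.read("scrapy.cfg")
url = config.get("deploy", "url")
username = config.get("deploy", "username", fallback=None)
password = config.get("deploy", "password", fallback="")

# Only attach Basic Auth if a username is configured.
auth = (username, password) if username else None

response = requests.get(url.rstrip("/") + "/listprojects.json", auth=auth)
print(response.json())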

jpmckinney commented 1 year ago

Version 1.2.3 is now released, so you shouldn't need to install from the repo :)

apetryla commented 1 year ago

Oh great, thanks for the quick development and great support. I saw some cool features, so I can't wait to try them out!