Closed: juananpe closed this issue 8 years ago
Answering my own question: it seems the boto library throws this error when it can't connect to an S3/AWS host. Just set the variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY to an empty string and you're good to go:
$ export AWS_ACCESS_KEY_ID="" && export AWS_SECRET_ACCESS_KEY="" && scrapy shell http://web:9312/properties/property_000000.html
(I suppose there's a more elegant way to bypass this error, but that line works for me :)
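One possibly cleaner route, which I haven't verified against the book's setup, would be disabling Scrapy's S3 download handler in settings.py: Scrapy lets you turn a handler off by mapping its URI scheme to None, and with the s3 handler disabled boto should never get initialized at all. A sketch:

# settings.py - sketch: disable the s3 download handler so boto is not used
DOWNLOAD_HANDLERS = {
    's3': None,
}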
Thanks a lot @juananpe for the clarification. This refers to page 113 of the e-book, or page 92 of the printed book.
You are right about exporting the AWS_* credentials. This is exactly what I do as well in settings.py for Chapter 5 and every other chapter. As you say, it's just an annoying boto detail and a non-elegant workaround is fine.
Hopefully this won't be a very common problem. If you run your scrapy shell command from within the ch05/properties directory, it should work fine, because scrapy shell automatically picks up settings.py. My guess is that you ran scrapy shell from a top-level directory.
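In other words, something along these lines should work without exporting any AWS variables yourself (assuming the repository layout from the book):

$ cd ch05/properties
$ scrapy shell http://web:9312/properties/property_000000.html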
That's a great clarification. Thanks a million!
I just added the following to settings.py and hey presto:

# Disable S3
AWS_ACCESS_KEY_ID = ""
AWS_SECRET_ACCESS_KEY = ""
Hope this helps someone else.

Newbie to Py and Scrapy,
Richard
Hi there,
I am having fun trying to set up the Vagrant/Docker network on OSX 10.11.3 (El Capitan). First, a warning for all the other OSX folks out there: don't use Vagrant 1.7.x or you will be stuck with nonsensical errors in your console. Use 1.8.1 (or a newer version).
Now, on to my problem. I can see all the Docker boxes. I can even ssh into them (vagrant ssh works like a charm). From there, I can see that the web box is running OK and responding to HTTP queries on tcp/9312 as well.
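The kind of check I mean is something like the following, using the hostname and path from the book's setup (exact commands may vary):

$ vagrant ssh
$ curl http://web:9312/properties/property_000000.html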
But now, following the book (p. 113, section "The URL"), if I try to use the scrapy shell to connect to http://web:9312, I get a timeout error that I can't grok.
Any help will be much appreciated.
Greetings,
Juanan