Closed mic0331 closed 7 years ago
Yes, this always happens if you start a new scrapy project with `scrapy startproject`. It happens inside the VM, outside the VM... everywhere. It's inconsequential, and all the projects in the book include this simple fix in their settings.py:
```python
# Disable S3
AWS_ACCESS_KEY_ID = ""
AWS_SECRET_ACCESS_KEY = ""
```
Here are the Scrapy issue and two Stack Overflow questions [1][2].
I noticed an error message when using scrapy inside the VM. I think it has no consequence for the examples. The error is linked to urllib2 raising a timeout. Is this normal behavior, and will it be fixed in a later version?
Feel free to close this if it is not relevant. Thanks!