Closed: Alaminpramanik closed this issue 3 years ago.
Same for me. Here is my Dockerfile:
```dockerfile
# As Scrapy runs on Python, I chose the official Python 3 Docker image.
FROM python:3
EXPOSE 6800

# Copy requirements.txt from the local host into the container at the working directory.
COPY requirements.txt ./

# Install the dependencies (including Scrapy) specified in requirements.txt.
RUN pip3 install --upgrade pip
RUN pip3 install --no-cache-dir -r requirements.txt

# Copy the project source code from the local host into the container.
RUN mkdir /app
WORKDIR /app
COPY ./app /app

# Run the scraper client.
COPY run.sh /usr/local/bin
RUN chmod +x /usr/local/bin/run.sh
COPY ./scrapyd.conf /etc/scrapyd/

# Run the crawler when the container launches.
# CMD [ "python3", "./go-spider.py" ]
# ENTRYPOINT ["tail", "-f", "/dev/null"]
CMD ["scrapyd", "--pidfile=abc.pid"]
```
And my docker-compose.yml:
```yaml
version: '3'
services:
  scraper:
    build:
      context: .
    volumes:
      - ./app:/app
      - logvolume01:/var/log
      - csvdata:/var/lib/scraper
    ports:
      - '6800:6800'
      - '8000:8000'
volumes:
  logvolume01: {}
  csvdata: {}
```
I fixed it by changing

```
bind_address = 127.0.0.1
```

to

```
bind_address = 0.0.0.0
```

within /etc/scrapyd/scrapyd.conf.

https://scrapyd.readthedocs.io/en/stable/config.html#example-configuration-file
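For reference, a minimal sketch of the relevant section of scrapyd.conf with the fix applied (only `bind_address` differs from the stock example; `http_port = 6800` is shown for clarity and matches the `EXPOSE` in the Dockerfile above):

```ini
[scrapyd]
# Bind to all interfaces so the daemon is reachable from outside the container.
# With 127.0.0.1, only connections from inside the container itself are accepted.
bind_address = 0.0.0.0
http_port    = 6800
```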
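Why the change matters can be sketched with plain sockets (a minimal illustration, not Scrapyd's own code): a server bound to 127.0.0.1 only accepts connections arriving on the loopback interface, while 0.0.0.0 accepts connections on every interface. Requests forwarded by Docker's port mapping arrive on the container's eth0 interface, so a loopback-only bind never sees them.

```python
import socket

def bind_and_report(address):
    """Bind a TCP socket to the given address and report what it bound to."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((address, 0))  # port 0: let the OS pick a free port
    srv.listen(1)
    host, port = srv.getsockname()
    srv.close()
    return host, port

print(bind_and_report("127.0.0.1")[0])  # loopback only: unreachable via port mapping
print(bind_and_report("0.0.0.0")[0])    # all interfaces: reachable from the host
```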
Closing as resolved.
Please show your Docker files.