scrapy / scrapyd

A service daemon to run Scrapy spiders
https://scrapyd.readthedocs.io/en/stable/
BSD 3-Clause "New" or "Revised" License

I'm experiencing the same issue: Scrapyd is accessible from within the container, but not from the outside. #405

Closed. Alaminpramanik closed this issue 3 years ago.

psdon commented 3 years ago

Show your Docker files.

nhuzaa commented 3 years ago

Same for me. Here is my Dockerfile:

# As Scrapy runs on Python, I chose the official Python 3 Docker image.
FROM python:3

EXPOSE 6800

# Copy the file from the local host to the filesystem of the container at the working directory.
COPY requirements.txt ./

# Install Scrapy specified in requirements.txt.
RUN pip3 install --upgrade pip 
RUN pip3 install --no-cache-dir -r requirements.txt

# Copy the project source code from the local host to the filesystem of the container at the working directory.
RUN mkdir /app
WORKDIR /app
COPY ./app /app

# Copy the scraper launch script into the image and make it executable.
COPY run.sh /usr/local/bin
RUN chmod +x /usr/local/bin/run.sh
COPY ./scrapyd.conf /etc/scrapyd/

# Run the crawler when the container launches.
# CMD [ "python3", "./go-spider.py" ]
# ENTRYPOINT ["tail", "-f", "/dev/null"]
CMD ["scrapyd", "--pidfile=abc.pid"]

And my docker-compose.yml:

version: '3'

services:
  scraper:
    build: 
      context: .
    volumes:
      - ./app:/app
      - logvolume01:/var/log
      - csvdata:/var/lib/scraper
    ports:
      - '6800:6800'
      - '8000:8000'
volumes:
  logvolume01: {}
  csvdata: {}
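
With the files above, one way to check from the host whether Scrapyd is actually reachable (a minimal sketch, assuming the container is running and port 6800 is published as in the compose file; daemonstatus.json is Scrapyd's standard status endpoint):

# Run on the host, not inside the container.
curl http://localhost:6800/daemonstatus.json
# A reachable Scrapyd returns a small JSON object with "status": "ok".
# If this only works from inside the container, the bind address is the likely culprit.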
nhuzaa commented 3 years ago

I fixed it by changing bind_address = 127.0.0.1 to bind_address = 0.0.0.0 in /etc/scrapyd/scrapyd.conf.

https://scrapyd.readthedocs.io/en/stable/config.html#example-configuration-file
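
For reference, a minimal sketch of the relevant part of /etc/scrapyd/scrapyd.conf with the fix applied (only the two options touched here are shown; see the linked example configuration for the full file):

[scrapyd]
# 127.0.0.1 only accepts connections from inside the container itself;
# 0.0.0.0 listens on all interfaces, so Docker's published port can reach it.
bind_address = 0.0.0.0
http_port    = 6800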

jpmckinney commented 3 years ago

> I fixed it by changing bind_address = 127.0.0.1 to bind_address = 0.0.0.0 in /etc/scrapyd/scrapyd.conf.
>
> https://scrapyd.readthedocs.io/en/stable/config.html#example-configuration-file

Closing as resolved.