Current Behavior
Hi, I use Scrapy (2.8.0), Scrapoxy (via the Docker image fabienvauchelles/scrapoxy:latest), and Splash (3.5) to scrape data, but I get a 500 Internal Server Error when Splash is running. To illustrate the error, I use the website https://quotes.toscrape.com/login
Scrapy is running on macOS on host 192.168.0.12. Scrapoxy is running via its Docker image on Debian 11.9 on host 192.168.0.103. Splash is running via its Docker image on Debian 11.9 on host 192.168.0.102.
Scrapy settings.py configuration:
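The original settings block did not survive; below is a minimal sketch of how such a settings.py typically wires scrapy-splash together with Scrapoxy's Scrapy middleware. The hosts match the setup described above, but the credentials are placeholders and the exact middleware priorities are assumptions based on the scrapy-splash and Scrapoxy documentation, not the reporter's actual file.

```python
# settings.py -- illustrative sketch, not the reporter's original file.

# Splash instance (host from the report).
SPLASH_URL = "http://192.168.0.102:8050"

DOWNLOADER_MIDDLEWARES = {
    # scrapy-splash middlewares (priorities per the scrapy-splash README).
    "scrapy_splash.SplashCookiesMiddleware": 723,
    "scrapy_splash.SplashMiddleware": 725,
    "scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware": 810,
    # Scrapoxy middleware (from the `scrapoxy` pip package).
    "scrapoxy.ProxyDownloaderMiddleware": 100,
}

SPIDER_MIDDLEWARES = {
    "scrapy_splash.SplashDeduplicateArgsMiddleware": 100,
}
DUPEFILTER_CLASS = "scrapy_splash.SplashAwareDupeFilter"

# Scrapoxy endpoints (host from the report); credentials are placeholders.
SCRAPOXY_MASTER = "http://192.168.0.103:8888"
SCRAPOXY_API = "http://192.168.0.103:8890/api"
SCRAPOXY_USERNAME = "USERNAME"
SCRAPOXY_PASSWORD = "PASSWORD"
```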
Scrapy spider:
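The spider code is also missing; here is a hedged sketch of a spider that would reproduce the request against https://quotes.toscrape.com/login through Splash. The spider name, wait time, and parsing logic are illustrative assumptions; it uses the standard scrapy-splash SplashRequest API.

```python
# quotes_login.py -- illustrative sketch, not the reporter's original spider.
import scrapy
from scrapy_splash import SplashRequest


class QuotesLoginSpider(scrapy.Spider):
    name = "quotes_login"  # hypothetical spider name

    def start_requests(self):
        # Render the login page through Splash; the 500 error is
        # reported to occur on this request when Splash is running.
        yield SplashRequest(
            url="https://quotes.toscrape.com/login",
            callback=self.parse,
            args={"wait": 1},
        )

    def parse(self, response):
        # Example extraction: the CSRF token from the login form.
        token = response.css(
            'form input[name="csrf_token"]::attr(value)'
        ).get()
        yield {"csrf_token": token}
```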
Expected Behavior
Scrapy works together with Scrapoxy alone, and Scrapy works together with Splash alone.
The aim, however, is to use Scrapy, Scrapoxy, and Splash together in the same Scrapy project.
Steps to Reproduce
I use OVH Public Cloud with 6 proxies provisioned through Scrapoxy.
Failure Logs
Scrapoxy Version
Docker version (fabienvauchelles/scrapoxy:latest)
Custom Version
Deployment
Operating System
Storage
Additional Information
No response