Open PedroTroller opened 9 years ago
Yes, this would be an interesting approach. In this case the browser must share the same session as the Guzzle client used in the centipede crawler.
It is indeed a simple extension point without reinventing the wheel, I like it.
Thanks @PedroTroller!
You could use Symfony's Client and DomCrawler classes for this (even if not asynchronous).
Hi.
Just a little reflection: why not provide a system that allows you to describe scenarios to execute before crawling URLs?
For example:
And then, the crawler would execute the scenarios one by one and crawl the URLs each time.
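A minimal sketch of what such a scenario description could look like. This is purely hypothetical: the `scenarios` key and the step names (`visit`, `fill`, `press`) are illustrative, not an existing centipede feature.

```yaml
# Hypothetical scenario file — all field names are illustrative only.
scenarios:
    anonymous: []              # crawl with no prior steps
    logged_in:                 # steps executed before crawling starts
        - visit: /login
        - fill:
            _username: admin
            _password: secret
        - press: "Log in"
```

The crawler would run each scenario in turn, carry the resulting session (e.g. cookies) over to the HTTP client, and then crawl the URLs under that state, so the same pages can be checked both anonymously and as a logged-in user.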