vezaynk / Sitemap-Generator-Crawler

PHP script to recursively crawl websites and generate a sitemap. Zero dependencies.
https://www.bbss.dev
MIT License

Tracking deferred scans #67

Closed vezaynk closed 6 years ago

vezaynk commented 6 years ago

I started tracing how it works on my whiteboard and realized that duplicate links were getting deferred and thus evading the initial duplication check. To counter that, I had the option of either doing an expensive refactoring of how $scanned links are tracked, or also tracking the deferred links so that duplicates are avoided. I went with the latter.
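
A minimal sketch of that second option, assuming hypothetical names ($deferred, defer_link()) around the $scanned tracking and is_scanned() check the script already has, rather than the crawler's actual code:

<?php
// Illustrative only: $deferred and defer_link() are assumed names,
// while $scanned and is_scanned() mirror what the issue refers to.
$scanned  = []; // URLs that have already been crawled
$deferred = []; // URLs queued to be crawled in a later pass

function is_scanned($url) {
    global $scanned;
    return isset($scanned[$url]);
}

// Only defer a URL that is neither scanned nor already deferred,
// so duplicates can no longer slip past the check by being deferred.
function defer_link($url) {
    global $scanned, $deferred;
    if (is_scanned($url) || isset($deferred[$url])) {
        return;
    }
    $deferred[$url] = true;
}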

The only downside of this is increased memory usage. The solution to that was twofold:

1) All scanned links are removed from the deferred list (sketched below); any further duplicates are caught by the is_scanned() function.
2) It is no longer a real problem: the increased memory usage will not exceed the previous peak memory usage, so if the script runs out of memory, it will simply fail faster than before.
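
As a rough illustration of point 1, again with assumed names rather than the script's real helpers, a URL can be dropped from the deferred tracking as soon as it is scanned:

<?php
// Hypothetical continuation of the sketch above; mark_scanned() is an
// assumed helper, not a function from the actual script.
$scanned  = [];
$deferred = ['https://example.com/about' => true];

function mark_scanned($url) {
    global $scanned, $deferred;
    $scanned[$url] = true;  // later duplicates are caught by is_scanned()
    unset($deferred[$url]); // the deferred list never holds scanned URLs
}

mark_scanned('https://example.com/about');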

Before:

▶ time php sitemap.php site=https://blog.codinghorror.com
[+] Sitemap has been generated in 838.66 seconds and saved to sitemap.xml
[+] Scanned a total of 3137 pages and indexed 1704 pages.
[+] Operation Completed
php sitemap.php site=https://blog.codinghorror.com  5.10s user 1.39s system 0% cpu 13:58.70 total

After:

▶ time php sitemap.php site=https://blog.codinghorror.com
[+] Sitemap has been generated in 746.65 seconds and saved to sitemap.xml
[+] Scanned a total of 3137 pages and indexed 1704 pages.
[+] Operation Completed
php sitemap.php site=https://blog.codinghorror.com  6.04s user 1.42s system 0% cpu 12:26.71 total

For comparison, the same test took 5m50s before and 5m35s after on my server.


vezaynk commented 6 years ago

This has been implemented in commit b8943622bb004d90c617fbac91d73d84cbdfdc68.