I started tracing how it works on my whiteboard and realized that duplicate links were getting deferred, and thus evaded the initial duplicate check. To counter that, I could either do an expensive refactoring of how $scanned links are tracked, or also track the deferred links so that duplicates are avoided.
The only downside of the latter is increased memory usage. The solution to that was two-fold:
1) Removed all scanned links from the deferred list. Any further duplicates will be caught by the is_scanned() function.
2) Accepted the rest: the increased memory usage will not exceed the previous peak memory usage, so if the script was going to run out of memory, it will now just fail faster.
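The two steps above can be sketched roughly like this. This is a hypothetical illustration, not the project's actual code: the function names defer_link() and prune_deferred() are made up here, and I'm assuming $scanned and $deferred are associative arrays keyed by URL, matching the is_scanned() check described above.

```php
<?php
// Assumed data shapes: URL-keyed maps, so membership checks are O(1).
$scanned  = [];   // URLs already crawled
$deferred = [];   // URLs queued for a later pass

// Mirrors the is_scanned() check mentioned in the post (assumed signature).
function is_scanned(string $url, array $scanned): bool {
    return isset($scanned[$url]);
}

// Track deferred links too: a link is deferred only if it is neither
// scanned nor already deferred, so duplicates never enter the queue.
function defer_link(string $url, array &$deferred, array $scanned): void {
    if (!is_scanned($url, $scanned) && !isset($deferred[$url])) {
        $deferred[$url] = true;
    }
}

// Step 1 of the fix: drop deferred links that have since been scanned,
// keeping the deferred list (and its memory footprint) small.
function prune_deferred(array &$deferred, array $scanned): void {
    $deferred = array_diff_key($deferred, $scanned);
}
```

Because both structures are keyed by URL, the dedup check and the pruning pass stay cheap, which is why the extra tracking never pushes memory past the old peak.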
Before:
▶ time php sitemap.php site=https://blog.codinghorror.com
[+] Sitemap has been generated in 838.66 seconds and saved to sitemap.xml
[+] Scanned a total of 3137 pages and indexed 1704 pages.
[+] Operation Completed
php sitemap.php site=https://blog.codinghorror.com 5.10s user 1.39s system 0% cpu 13:58.70 total
After:
▶ time php sitemap.php site=https://blog.codinghorror.com
[+] Sitemap has been generated in 746.65 seconds and saved to sitemap.xml
[+] Scanned a total of 3137 pages and indexed 1704 pages.
[+] Operation Completed
php sitemap.php site=https://blog.codinghorror.com 6.04s user 1.42s system 0% cpu 12:26.71 total
For comparison, the same test took "5m50" and "5m35" respectively on my server.