Purple A11y is a customisable, automated web accessibility testing tool that allows software development teams to find and fix accessibility problems to improve persons with disabilities (PWDs) access to digital services.
MIT License
Make it easier to amplify the sitemap.xml crawl #256
The ability to scan sitemap.xml files in Purple A11y is powerful. However, sitemaps are generally too big to be crawled effectively by this script.
I've created a script that I think can help make existing sitemap.xml files more powerful.
Scanning just a handful of sites is a problem. Scanning all web pages in a site also brings challenges with it, particularly for larger sites. To have confidence in accessibility, a random sampling of pages should give us statistical certainty about the accessibility of a whole site.
The trouble is that there are often too many URLs, and you have sitemaps of sitemaps.
My script aggregates the XML files into just one, removes everything we don't want to analyze (.doc, .pdf, .zip, etc.), and then produces a random sampling of URLs. This XML file can be reused in the future to test whether the exact same URLs have improved over time (or not). I'm capping sitemap size at 2000 URLs, as that is a pretty decent sample for the sites we work with.
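For illustration, here is a minimal sketch of the approach described above: pull `<loc>` entries out of sitemap (or sitemap-index) XML, filter out non-HTML file extensions, randomly sample up to 2000 URLs, and write the result back out as a single sitemap. The extension list and the 2000 cap come from this issue; everything else (function names, the namespace handling) is just one possible way to structure it, not the actual script.

```python
import random
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
EXCLUDED_EXTS = (".doc", ".docx", ".pdf", ".zip")  # non-HTML files to skip
MAX_URLS = 2000  # cap from the issue: a decent sample for most sites

def extract_urls(xml_text):
    """Pull <loc> values out of a sitemap or sitemap-index document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc") if loc.text]

def filter_urls(urls, excluded=EXCLUDED_EXTS):
    """Drop URLs that point at files we don't want to analyze."""
    return [u for u in urls if not u.lower().endswith(excluded)]

def sample_urls(urls, cap=MAX_URLS):
    """Take a random sample of at most `cap` URLs."""
    if len(urls) <= cap:
        return list(urls)
    return random.sample(urls, cap)

def build_sitemap(urls):
    """Serialize the sampled URLs back into a single sitemap.xml string."""
    ET.register_namespace("", "http://www.sitemaps.org/schemas/sitemap/0.9")
    urlset = ET.Element(SITEMAP_NS + "urlset")
    for u in urls:
        url_el = ET.SubElement(urlset, SITEMAP_NS + "url")
        ET.SubElement(url_el, SITEMAP_NS + "loc").text = u
    return ET.tostring(urlset, encoding="unicode")
```

For a sitemap of sitemaps, the same `extract_urls` works on the index file: each `<loc>` it returns is a child sitemap to fetch and feed back through the pipeline before filtering and sampling.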
It could be enhanced in the future to ensure that pages like the home page, search page, representative landing pages, and any unusual pages are included in the scan. This could be something that is simply appended.
Are there lists of sitemap.xml tools that folks find useful?