Closed: mgifford closed this issue 1 month ago.
The config is now:
```ts
module.exports = {
  scanner: {
    include: [
      "/",
      "/about",
      "/foia",
      "/inspector",
      "/privacy",
      "/search",
      "/sitemap",
      "/accessibility",
      "/contact",
      "/fear",
      "/espanol",
      "/es",
      "/sitemap",
      "/sitemap.xml",
      "/blog",
      "/*"
    ],
    samples: 1,
    device: 'desktop',
    throttle: true,
    maxRoutes: 300,
    skipJavascript: false,
    sitemap: true,
  },
  chrome: {
    useSystem: true
  },
  debug: false,
};
```
I've done some testing with your setup and it appears to be correct, at least on the latest version. I've pushed several fixes since this issue was opened, so it may already be resolved.
Just make sure you have dynamic sampling disabled.
If you still have issues, could you provide a similar config reproduction? :pray:
```ts
module.exports = {
  scanner: {
    // ...
    dynamicSampling: false, // ensure all routes are scanned
  },
};
```
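For context (based on my reading of the scanner options, so treat this as an assumption rather than confirmed behaviour): when dynamic sampling is left enabled, similar paths are grouped and only a small sample from each group is scanned, which is exactly the kind of thing that caps a crawl at a few dozen results. Disabling it makes the scanner visit every discovered route, at the cost of a longer run.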
Describe the bug
I have this running:
```sh
% npx unlighthouse-ci --site https://organdonor.gov/ --throttle --yes --reporter csvExpanded --expose-gc --timeout 600000 --protocoll-timeout 300000 --navigation-timeout 60000 --log-level error
```
My config (unlighthouse.config.ts) is:
On some sites it works as expected, but on others I only get 10-30 results.
I haven't checked on all of them, but there are a lot of other URLs in the sitemap that aren't being scanned: https://www.organdonor.gov/sitemap.xml
Their robots.txt looks pretty standard too: https://www.organdonor.gov/robots.txt
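For what it's worth, a quick standalone sanity check (nothing unlighthouse-specific; it only assumes Node 18+ for the global fetch, and the file name is just illustrative) is to count the <loc> entries in the sitemap and compare them against the maxRoutes cap from the config above:

```ts
// count-sitemap-urls.ts: standalone sketch, not part of unlighthouse.
// Counts <loc> entries in the sitemap and compares them to the maxRoutes cap.
const SITEMAP_URL = "https://www.organdonor.gov/sitemap.xml";
const MAX_ROUTES = 300; // scanner.maxRoutes from the config above

async function main() {
  const xml = await (await fetch(SITEMAP_URL)).text();
  const urls = [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1]);
  console.log(`sitemap URLs: ${urls.length}, maxRoutes cap: ${MAX_ROUTES}`);
  if (urls.length > MAX_ROUTES) {
    console.log("maxRoutes is lower than the sitemap size; raise it to cover every URL.");
  }
}

main().catch(console.error);
```

If the sitemap count comes back well above 300, the cap alone would explain the missing URLs.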
Reproduction
No response
System / Nuxt Info