Some items being scraped will always fail, e.g. a non-public GitLab repo. Currently, these are retried forever, generating errors and delaying other items in the scrape queue. It would be nice if an admin could disable scraping of specific items, based on the error logs.
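A minimal sketch of what this could look like: each scrape target carries an admin-settable `disabled` flag, and the scrape loop skips disabled targets instead of retrying them. All names here (`ScrapeTarget`, `run`, `scrape`) are hypothetical and not from the actual codebase.

```python
# Hypothetical sketch, not the real scraper: an admin-settable
# "disabled" flag lets permanently failing targets be skipped.
from dataclasses import dataclass

@dataclass
class ScrapeTarget:
    url: str
    disabled: bool = False  # set by an admin after reviewing error logs
    failures: int = 0

def scrape(target: ScrapeTarget) -> None:
    # Placeholder for the real fetch; here it always fails,
    # simulating e.g. a non-public GitLab repo.
    raise PermissionError(f"non-public repo: {target.url}")

def run(targets: list[ScrapeTarget]) -> list[str]:
    scraped = []
    for t in targets:
        if t.disabled:
            continue  # disabled targets no longer delay the queue
        try:
            scrape(t)
            scraped.append(t.url)
        except Exception:
            t.failures += 1  # surfaced in error logs for the admin
    return scraped

targets = [
    ScrapeTarget("https://gitlab.example/private", disabled=True),
    ScrapeTarget("https://gitlab.example/also-private"),
]
run(targets)
```

The same flag could be exposed in an admin UI or CLI, so that once the error logs show a target consistently failing, it can be switched off without a code change.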