One thing, though: you might want to send a specific User-Agent, or a random one drawn from a pool of user-agent strings. Not doing so causes every request your checkers make to announce itself as `python-requests/2.xx.x` to the servers you are querying.
While this might work, it's poor opsec and will eventually increase the risk of being banned, of endpoints being hardened, or of cool methods being removed, since it's pretty noisy.
Of course, users can monkey-patch this globally, e.g. by inserting something like `requests.utils.default_user_agent = lambda: 'Mozilla Firefox....'`, but imho the tool should either provide proper opsec itself or clearly state in the README that it broadcasts itself as a crawler.
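A cleaner alternative to the global monkey patch would be to set the header per request. A minimal sketch (the pool below contains made-up example strings, and `random_headers` is a hypothetical helper name):

```python
import random

# Hypothetical pool of browser-like user-agent strings (examples only;
# in practice you'd maintain a larger, up-to-date list).
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36",
    "Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0",
]

def random_headers():
    """Return a headers dict with a randomly chosen User-Agent."""
    return {"User-Agent": random.choice(USER_AGENTS)}
```

The result can then be passed straight into each call, e.g. `requests.get(url, headers=random_headers())`, or set once on a `requests.Session` via `session.headers.update(random_headers())`, so nothing global needs to be patched.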
Thanks for putting this together.