JayBizzle / Crawler-Detect

🕷 CrawlerDetect is a PHP class for detecting bots/crawlers/spiders via the user agent
https://crawlerdetect.io
MIT License

The ability to extend crawlers list for monitoring-like systems? #480

Closed s-chizhik closed 1 year ago

s-chizhik commented 2 years ago

Hello guys.

First of all, thank you for your package. It has been really helpful for the projects I've been working on.

I'm essentially referencing the existing issue #309, but from a slightly different point of view. My goal is not simply to add a few exotic bots to the list. I want to exclude the custom-configured monitoring systems that we use, like Zabbix, Munin, etc. They perform health checks using HTTP requests with a specific UA suffix that we configure ourselves, e.g. ${PROJECT_NAME}Monitoring, ${DOMAIN}Robot, ${NODE_NAME}Zabbix.
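
To make the goal concrete, here is roughly what we would like to be able to do (the UA strings are hypothetical examples of the self-configured suffixes described above):

```php
<?php

require 'vendor/autoload.php';

use Jaybizzle\CrawlerDetect\CrawlerDetect;

$detect = new CrawlerDetect();

// Health-check requests from our self-configured monitoring agents.
$healthCheckUas = [
    'Mozilla/5.0 (compatible; example.comMonitoring/1.0)',
    'Mozilla/5.0 (compatible; node01Zabbix health-check)',
];

foreach ($healthCheckUas as $ua) {
    // We would like these to be flagged like any other bot, but the
    // project-specific suffixes are not part of the bundled pattern list.
    var_dump($detect->isCrawler($ua));
}
```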

And at this point, we're stuck, because of:

So my question: is there any chance you'll reconsider your position on code extensibility, or do you have any workaround?

For now, the only (and somewhat clumsy) way I see to achieve this, other than forking the project, is to create custom classes that extend Crawlers and CrawlerDetect, so that CustomCrawlers contains the extra entries for our custom monitoring systems and CustomCrawlerDetect uses it, as sketched below.
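
Sketched out, that workaround would look something like this. It assumes the current internals (a Crawlers fixture exposing getAll(), and a protected $compiledRegex on CrawlerDetect built via compileRegex()); the exact property and method names would need checking against the source, and the extra patterns are just examples.

```php
<?php

require 'vendor/autoload.php';

use Jaybizzle\CrawlerDetect\CrawlerDetect;
use Jaybizzle\CrawlerDetect\Fixtures\Crawlers;

// Extend the bundled pattern list with our self-configured monitoring UAs.
class CustomCrawlers extends Crawlers
{
    public function getAll()
    {
        return array_merge(parent::getAll(), [
            'example\.comMonitoring',
            'node\d+Zabbix',
        ]);
    }
}

// Swap in the extended list and rebuild the compiled crawler regex.
class CustomCrawlerDetect extends CrawlerDetect
{
    public function __construct(array $headers = null, $userAgent = null)
    {
        parent::__construct($headers, $userAgent);

        $this->crawlers = new CustomCrawlers();
        $this->compiledRegex = $this->compileRegex($this->crawlers->getAll());
    }
}

$detect = new CustomCrawlerDetect();
var_dump($detect->isCrawler('Mozilla/5.0 (compatible; node01Zabbix health-check)')); // bool(true)
```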

s-chizhik commented 2 years ago

@JayBizzle pew pew?

JayBizzle commented 2 years ago

Sorry for the late reply. We have discussed this internally again and we are still not sure it is something we would like to support.

How would you see this being implemented?

dreamwerx commented 2 years ago

Hi - I have a use case for this also. What about a method addCustomUserAgentRegex(string $pattern), which would then append the pattern to the crawlers array?
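
Roughly what I have in mind, shown on a hypothetical subclass rather than as an existing API (same assumptions about the internals as the sketch above):

```php
<?php

require 'vendor/autoload.php';

use Jaybizzle\CrawlerDetect\CrawlerDetect;

class ExtendableCrawlerDetect extends CrawlerDetect
{
    /**
     * Append a user-supplied pattern and recompile the crawler regex.
     */
    public function addCustomUserAgentRegex(string $pattern): void
    {
        $patterns = $this->crawlers->getAll();
        $patterns[] = $pattern;

        $this->compiledRegex = $this->compileRegex($patterns);
    }
}

$detect = new ExtendableCrawlerDetect();
$detect->addCustomUserAgentRegex('example\.comMonitoring');

var_dump($detect->isCrawler('example.comMonitoring/1.0')); // bool(true)
```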

JayBizzle commented 1 year ago

If someone wants to put together a PR as an initial discussion point, we can take it from there.