Hello!

I've been playing around with munin and am very much enjoying the prospect of having a growing database of the hashes I run. The issue I've run into is that every configured API key seems to be queried for each hash I run. Those APIs have different rate limits, many of which make it impossible to process large hash sets in a reasonable time.
Suggestion:
Allow primary and secondary APIs. Primary APIs are queried for every hash run through munin. Secondary APIs are only queried when a primary lookup flags the hash as known bad or suspicious.
Example:
A VirusTotal customer (who has much larger limits than the free tiers listed below) runs a hash list of 100K+ MD5s. That same individual declares VT as primary, with MalShare and Hybrid Analysis as secondary. VT returns 10 hits on known-bad hashes, so only those 10 hashes are then sent on to the secondary services.
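To make the suggestion concrete, here is a minimal sketch of how that two-tier flow could look. It only illustrates the proposed control flow, not munin's actual code; the `query_virustotal` / `query_malshare` / `query_hybrid_analysis` functions are hypothetical placeholders.

```python
from typing import Callable, Dict, List


def query_virustotal(md5: str) -> Dict:
    # Placeholder: a real implementation would call the VirusTotal API
    # and set "malicious" based on the detection results.
    return {"service": "VirusTotal", "malicious": False}


def query_malshare(md5: str) -> Dict:
    # Placeholder for a MalShare lookup.
    return {"service": "MalShare", "malicious": False}


def query_hybrid_analysis(md5: str) -> Dict:
    # Placeholder for a Hybrid Analysis lookup.
    return {"service": "Hybrid Analysis", "malicious": False}


# Primary services are queried for every hash; secondary services only for hits.
PRIMARY: List[Callable[[str], Dict]] = [query_virustotal]
SECONDARY: List[Callable[[str], Dict]] = [query_malshare, query_hybrid_analysis]


def process_hashes(md5_list: List[str]) -> Dict[str, Dict]:
    results: Dict[str, Dict] = {}
    for md5 in md5_list:
        # Primary tier: always queried (the service with the large quota).
        primary_hits = [svc(md5) for svc in PRIMARY]
        suspicious = any(hit["malicious"] for hit in primary_hits)
        entry = {"primary": primary_hits, "secondary": []}
        # Secondary tier: only known-bad / suspicious hashes are escalated,
        # so the low-quota services see 10 requests instead of 100K+.
        if suspicious:
            entry["secondary"] = [svc(md5) for svc in SECONDARY]
        results[md5] = entry
    return results
```

The point is simply that the low-quota secondary services only ever see the hashes the primary service has already flagged, so a 100K+ hash run stays within their free-tier rate limits.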
Free API Limits: