Closed — tzylla closed this issue 6 years ago
Hi Flo, any news on that?
I hope I'm able to create this plugin next week.
Hi Thomas,
I was able to merge this into the existing check_db_performance.py plugin, now there is an additional parameter to monitor schema sizes:
Options:
-h shows this help
-V shows the plugin version
-H <license server> domain or IP of your license server
-d <db instance> the name of your DB instance
-u <user login> EXAoperation login user
-p <password> EXAoperation login password
-l <dbuser login> DB instance login user
-a <dbuser passwd> DB instance login password
-s <threshold> (optional) monitor schemata, threshold = max. usage in percent
-c <timeout in sec> (optional) time until a transaction conflict creates a warning
-o <ODBC driver name> (optional) ODBC driver name
"-s" enables this new feature: pass a value like "-s 10" or "-s 20" to get a warning once 10% or 20% of the possible maximum space usage has been reached. In addition, the performance data contains a new element, "biggest_schema", which reports the current size in GiB (compressed size) of the biggest schema.
Keep in mind that this check is expensive and takes between 3 and 10 seconds to execute its queries on the database instance. You may need to prevent your monitoring system from killing the check before it has finished.
Can you please check if this fits your needs?
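For illustration, the threshold logic described above could look roughly like the following Python sketch. This is not the actual plugin code: the function name, the way maximum space is obtained, and the perfdata formatting are all assumptions; only the semantics of "-s" (warn at N% of the possible maximum space usage) and the "biggest_schema" perfdata element come from the description above.

```python
# Hypothetical sketch of the "-s" threshold check (not the real plugin code):
# given per-schema compressed sizes and the instance's maximum possible space
# usage, warn once the biggest schema crosses the given percentage.

OK, WARNING = 0, 1  # standard Nagios exit codes

def check_schema_usage(schema_sizes_gib, max_space_gib, threshold_percent):
    """Return (exit_code, message, perfdata) for the biggest schema.

    schema_sizes_gib:  dict mapping schema name -> compressed size in GiB
    max_space_gib:     maximum possible space usage of the instance in GiB
    threshold_percent: the value passed via "-s"
    """
    # Find the biggest schema by compressed size.
    name, size = max(schema_sizes_gib.items(), key=lambda kv: kv[1])
    usage = 100.0 * size / max_space_gib
    # New perfdata element described above; exact format is an assumption.
    perfdata = "biggest_schema=%.2fGiB" % size
    if usage >= threshold_percent:
        msg = "schema %s uses %.1f%% of possible space" % (name, usage)
        return (WARNING, msg, perfdata)
    return (OK, "biggest schema %s at %.1f%%" % (name, usage), perfdata)
```

With "-s 10" and a 100 GiB instance, a 12 GiB schema would trigger a WARNING, while a 5 GiB schema would return OK.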
Hi, we would need the possibility to monitor individual schemas or tables, e.g.: warn when table YY in schema XX uses ZZ disk space.
Thank you, Thomas
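The per-object monitoring requested above could be sketched as follows. Everything here is an assumption, not existing plugin behavior: the rule format ("SCHEMA.TABLE" mapped to a size limit in GiB) and the function are illustrative, and the sizes would have to be queried from the database (e.g. from a system table, which is not shown here).

```python
# Hypothetical sketch of per-schema/per-table monitoring as requested:
# a rule "XX.YY": ZZ would warn once table YY in schema XX exceeds ZZ GiB.
# The rule format and function are illustrative assumptions only.

OK, WARNING = 0, 1  # standard Nagios exit codes

def check_object_sizes(object_sizes_gib, rules_gib):
    """object_sizes_gib: dict "SCHEMA.TABLE" -> current size in GiB
                         (assumed to be queried from the database)
    rules_gib:           dict "SCHEMA.TABLE" -> maximum allowed size in GiB
    Returns (exit_code, message)."""
    problems = []
    for obj, limit in rules_gib.items():
        size = object_sizes_gib.get(obj, 0.0)
        if size >= limit:
            problems.append("%s uses %.1f GiB (limit %.1f)" % (obj, size, limit))
    if problems:
        return (WARNING, "; ".join(problems))
    return (OK, "all monitored objects within limits")
```

A check like this could be driven by an extra command-line option or a small config file listing the objects and their limits; which of the two fits the plugin better would be a design decision for the author.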