Coverage decreased (-0.1%) to 56.429% when pulling 8230b81c6cb31e129772f6e54e663e93f8c621c9 on labrax:master into d3cbb5f1004180aef85ce27d94b3112c136aee69 on JackStat:master.
Hi Tyler,
Hope you are doing well. Thanks for this great package.
On https://github.com/JackStat/ModelMetrics/blob/master/src/confusionMatrix_.cpp:

`confusionMatrix_` considers predictions as negative if the value is <= 0, while `tnr_` considers predictions as negative if the value is < 0. I suggest standardising on the `confusionMatrix_` behaviour, as in the attached patch.

I am not fully certain, but I strongly suggest also checking the functions `mcc_` and `kappa_`, as they might produce different results due to the different way the negative class is determined.

Best wishes,
Victor