dkpro / dkpro-tc

UIMA-based text classification framework built on top of DKPro Core and DKPro Lab.
https://dkpro.github.io/dkpro-tc/

Settings metrics for evaluation #482

Closed zepp133 closed 6 years ago

zepp133 commented 6 years ago

When running an ExperimentCrossValidation, the final results file only seems to contain the accuracy for each classifier and fold. Is it currently (version 1.0.0-SNAPSHOT) possible to add additional metrics (recall, precision, etc.) to a BatchCrossValidationReport?

zepp133 commented 6 years ago

Are additional metrics currently not supported? In version 0.9.0 the evaluation results file contained additional metrics besides accuracy.

Horsmann commented 6 years ago

Hi,

For classification, you should also find a file with F-score, recall, and precision in the WekaTestTask folder. I am not sure if such a file also exists in the Evaluation folder, but it should be there in the WekaTestTask folder.

Which other metrics are you missing?

zepp133 commented 6 years ago

Thank you for the hint! Precision and recall for each fold of a cross-validation experiment can be found in the scorePerCategory.txt files in the WekaTestTask* directories under DKPRO_HOME.
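If you want a single overview instead of one file per fold, a small script can collect the per-fold values. The sketch below is a minimal, hypothetical example: it assumes the scorePerCategory.txt files sit in directories matching `*WekaTestTask*` under DKPRO_HOME and contain tab-separated lines of the form `category<TAB>precision<TAB>recall<TAB>f-score`; the actual column order and layout of your DKPro TC version may differ, so check one of the files first.

```python
import glob
import os
import statistics


def parse_score_file(path):
    """Read one scorePerCategory.txt into {category: (precision, recall, f_score)}.

    Assumes tab-separated lines: category, precision, recall, f-score.
    """
    scores = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.strip().split("\t")
            if len(parts) == 4:
                category, precision, recall, f_score = parts
                scores[category] = (float(precision), float(recall), float(f_score))
    return scores


def average_precision_per_category(dkpro_home):
    """Average the per-category precision over all WekaTestTask* fold directories."""
    per_category = {}
    pattern = os.path.join(dkpro_home, "*WekaTestTask*", "scorePerCategory.txt")
    for path in sorted(glob.glob(pattern)):
        for category, (precision, _recall, _f_score) in parse_score_file(path).items():
            per_category.setdefault(category, []).append(precision)
    return {cat: statistics.mean(vals) for cat, vals in per_category.items()}
```

The same pattern works for recall or F-score by picking a different tuple element, or by averaging all three at once.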