[Closed] fengaccoumt closed this issue 5 months ago
Hi,
I also suggest looking at other metrics as well, such as the "duration" of the behavior, which is less sensitive to false categorization than "count". LabGym categorizes behavior at every frame. If a behavior "x" lasts for 10 frames, like "xxxxxxxxxx", the count of behavior "x" is 1 and the duration is 10 frames; but if a false categorization "y" happens in the middle of those 10 frames, like "xxyxxxyxxx", the count of behavior "x" becomes 3 while the duration is still 8 frames.
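The difference between the two metrics can be sketched in a few lines of Python (this is an illustrative snippet, not LabGym's actual code; the per-frame labels are written as a string for brevity):

```python
from itertools import groupby

def count_and_duration(frames, behavior):
    """Given a per-frame label sequence, return (bout count, total frames)
    for one behavior. A 'bout' is a maximal run of consecutive frames."""
    count = sum(1 for label, _ in groupby(frames) if label == behavior)
    duration = sum(1 for label in frames if label == behavior)
    return count, duration

print(count_and_duration("xxxxxxxxxx", "x"))  # (1, 10)
print(count_and_duration("xxyxxxyxxx", "x"))  # (3, 8)
```

As the two examples show, two misclassified frames change the count from 1 to 3 but only shift the duration from 10 to 8 frames.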
Thanks for your advice. I want to generate a confusion matrix from the training report, but I don't know how to do it.
I calculated it using the following formulas and the values from the training report. Is this calculation correct?
The formulas are here: https://nonmeyet.tistory.com/entry/Confusion-matrix%EC%99%80-Precision-Recall-F1score%EC%9D%98-%EC%9D%B4%ED%95%B4
These are the values from the training report: for example, for the model aa, precision = 0.9, recall = 1, f1-score = 0.95, accuracy = 0.91.
This is my calculation process:
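For reference, the standard definitions tie these numbers together as precision = TP/(TP+FP), recall = TP/(TP+FN), and f1 = 2·precision·recall/(precision+recall). A quick consistency check on the reported values (note: precision, recall, and f1 alone fix only the *ratios* of TP/FP/FN, so the absolute confusion-matrix counts cannot be recovered without the number of test examples per class):

```python
# Values quoted from the training report above.
precision, recall = 0.9, 1.0

# Harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 0.95, matching the reported f1-score

# With recall = 1 there are no false negatives; the FP/TP ratio follows
# from precision alone: FP = TP * (1 - precision) / precision.
fp_per_tp = (1 - precision) / precision
print(round(fp_per_tp, 3))
```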
Very much looking forward to your reply!
Hi, I'm not sure whether I fully understand your issue / question. Were you trying to use LabGym to generate a confusion matrix? The current version of LabGym doesn't output a confusion matrix; it outputs a summary of precision, recall, and f1 score.
Hi, good afternoon! Thank you very much for your earlier help.
Since the trained categorizer's accuracy was 0.91 last time, I used it to test a real video. The result reports a count of 7 for a specific behavior, but the actual count in the video is 1. So I want to know:
This is the categorizer's training report:
This is the categorizer's testing report:
Very much looking forward to your reply!