basnetpro3 opened 1 year ago
How should I describe the feature contributions in this type of problem? Could someone help, please?
Hi @basnetpro3, good question. These graphs aren't the easiest to interpret, but we haven't yet found a better way to render multiclass local explanations within the space constraints of our visualization system.
This is essentially a set of standard (binary/regression) local explanations stacked on top of each other, ordered by class. One thing that can aid understanding is to remove classes -- you can do this by clicking the square boxes in the legend on the right to toggle which classes are visible at a time.
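To make the "stacked binary explanations" idea concrete: each class gets its own additive score -- its intercept plus that class's per-feature contributions -- and the predicted class is the one with the highest total. Here is a minimal NumPy sketch of that structure. All numbers below are hypothetical illustrations, not values read off your figure:

```python
import numpy as np

# Hypothetical per-class local explanation for one sample.
# Rows = classes 0..3, columns = features ["RBD", "E", "V", "N"].
contributions = np.array([
    [ 0.00, -0.10,  0.05, -0.20],  # class 0
    [ 0.02,  0.25,  0.02,  0.03],  # class 1: E contributes positively
    [ 0.15, -0.30,  0.30,  0.45],  # class 2: E negative, RBD/V/N positive
    [-0.15, -0.20, -0.05, -0.10],  # class 3
])
intercepts = np.array([-0.20, 0.30, -0.05, -0.30])

# Each class's total log-odds score is its intercept plus the sum of its
# feature contributions -- one "binary-style" explanation per class.
totals = intercepts + contributions.sum(axis=1)
predicted = int(np.argmax(totals))  # class with the most accumulated evidence
```

Toggling a class off in the legend is just hiding one row of this matrix so you can read the remaining rows in isolation.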
Basically, just focus on one color at a time, and you can see where evidence piles up for or against that specific class. So in your example, let's look at class 2 and see what went wrong. The model has a small intercept biased against class 2, and the value of feature "E" provides strong evidence against it being class 2 (note that the green bar is to the left of the 0 mark), while features RBD, V, and N all contribute positive evidence for class 2 (with N contributing over 0.4 log odds).
Class 1, on the other hand, gets a strong contribution from the intercept and feature E, with marginal contributions from the other features. If you sum up the log odds of each class, you'll see that class 1's total is only slightly higher than class 2's -- the model could have gone either way.
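Here's what "the model could have gone either way" looks like numerically: sum each class's log odds and pass the totals through a softmax. The totals below are made up for illustration (not taken from your figure), chosen so that class 1 edges out class 2 by a sliver:

```python
import numpy as np

# Hypothetical total log odds per class (intercept + feature contributions).
totals = np.array([-0.80, 0.62, 0.55, -1.10])  # classes 0..3

# Softmax converts per-class log odds into predicted probabilities.
probs = np.exp(totals) / np.exp(totals).sum()

predicted = int(np.argmax(totals))  # class 1 wins
margin = totals[1] - totals[2]      # ~0.07 log odds: a near tie
```

With a gap that small, a modest shift in any single feature's contribution would have flipped the prediction to class 2, which is exactly the story the stacked bars are telling.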
Hope this helps! Again, IMO the best way to understand these graphs is to toggle off the irrelevant classes.
Thank you very much. There are very few papers on multiclass problems, so I needed clarification. I'm implementing multiclass EBM in my research, and your answer helped me a lot -- I can now explain the reasoning behind the wrong classification. I really appreciate your effort.
Sure, happy to help! Good luck with your research -- definitely let us know if you have any further questions!
In binary classification it is very easy to explain a wrong prediction using a local explanation. In a multiclass problem with four classes, how do I explain a wrong prediction using EBM? See the figure below: the actual class is 2, but it is predicted as class 1. How should I interpret this?