Closed · man0007 closed this issue 3 years ago
I assume you mean to compare SHAP with QII; the latter is described in great detail here.
To my understanding, in the context of game theory, the Shapley value can be approximated by sampling over the set of permutations to estimate the contribution of each agent i in the set of agents N. This works assuming you have a utility function f(·) that, given any coalition of agents S ⊆ N, returns that coalition's reward/utility.
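The permutation-sampling idea can be sketched as follows. This is a minimal illustration, not any particular library's implementation: the `weights` game and all names are hypothetical, chosen so the game is additive and the true Shapley value of each agent equals its weight.

```python
import random

def shapley_sampling(agents, utility, n_samples=2000, seed=0):
    """Monte Carlo Shapley estimate: average each agent's marginal
    contribution over randomly drawn permutations of the agents."""
    rng = random.Random(seed)
    phi = {a: 0.0 for a in agents}
    for _ in range(n_samples):
        perm = list(agents)
        rng.shuffle(perm)
        coalition = set()
        prev = utility(coalition)          # utility of the empty coalition
        for a in perm:
            coalition.add(a)
            cur = utility(coalition)
            phi[a] += cur - prev           # marginal contribution of a
            prev = cur
    return {a: total / n_samples for a, total in phi.items()}

# Toy additive game: v(S) = sum of per-agent weights,
# so each agent's Shapley value is exactly its weight.
weights = {"x1": 3.0, "x2": 1.0, "x3": 0.0}
def v(S):
    return sum(weights[a] for a in S)

est = shapley_sampling(list(weights), v)
```

For this additive game every marginal contribution is constant, so the estimate is exact; for a general f(·) the estimate converges as n_samples grows.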
I would say QII is an adaptation of the Shapley value to the interpretable machine learning setting: it defines a way to encode the absence of a feature (agent) i in an input, together with a suitable utility function f(·).
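As a rough sketch of that idea: one way QII-style methods encode the "absence" of feature i is to replace its value with draws from that feature's marginal distribution in the data, and compare the prediction with and without the intervention. Everything below (the `model`, `data`, and function names) is hypothetical and only illustrates the intervention, not the exact QII estimator.

```python
import random

def prediction_with_i_absent(model, x, i, data, rng, n_draws=500):
    """Approximate E[model(x with feature i randomized)] by sampling
    feature i's value from its marginal distribution in `data`."""
    total = 0.0
    for _ in range(n_draws):
        x_mod = list(x)
        x_mod[i] = rng.choice(data)[i]   # intervene on feature i only
        total += model(x_mod)
    return total / n_draws

def unary_influence(model, x, i, data, n_draws=500, seed=0):
    """Influence of feature i: prediction on x minus the prediction
    when feature i is replaced by random draws (its 'absence')."""
    rng = random.Random(seed)
    return model(list(x)) - prediction_with_i_absent(model, x, i, data, rng, n_draws)

# Toy model that depends only on feature 0 and ignores feature 1.
model = lambda z: 2.0 * z[0]
data = [(a, b) for a in (0.0, 1.0) for b in (0.0, 1.0)]
x = (1.0, 1.0)
q0 = unary_influence(model, x, 0, data)   # nonzero: feature 0 matters
q1 = unary_influence(model, x, 1, data)   # ~0: feature 1 is ignored
```

Plugging an intervention like this in as the coalition utility is what connects the game-theoretic Shapley value to feature attribution.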
Hope this helps.
QII stands for quantitative input influence, and Shapley values appear to state the same thing: a fair distribution of the gains and costs among the features involved in a decision. So what is the difference between QII and Shapley values? Are they the same?