Closed DominicTheHOST closed 1 year ago
Hi @DominicTheHOST, what is the goal of your analysis?
Is there any way to compute the predicted values and the residuals with no access to the probabilities in the model?
Probabilities are the predicted values, so no.
I am trying to wrap an mlr3-learner with the predict_type = 'response' with the DALEXtra function explain_mlr3().
Please note that all explanations are computed using predicted values (probabilities for classification).
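To make this concrete, here is a minimal sketch of the probability-based setup this implies, assuming the `rpart` learner and the `titanic_imputed` data shipped with DALEX (both chosen only for illustration):

```r
library(mlr3)
library(mlr3learners)
library(DALEXtra)

# Classification task; the target must be a factor for mlr3
df <- DALEX::titanic_imputed
df$survived <- as.factor(df$survived)
task <- as_task_classif(df, target = "survived")

# The learner must return probabilities, because DALEX explanations
# for classifiers are computed on predicted probabilities
learner <- lrn("classif.rpart", predict_type = "prob")
learner$train(task)

# explain_mlr3() picks up the probability predictions automatically
explainer <- explain_mlr3(
  learner,
  data  = df,
  y     = as.numeric(as.character(df$survived)),
  label = "rpart (prob)"
)
```

With such an explainer, `model_parts()` and `model_profile()` work out of the box.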
Thank you @hbaniecki for your quick response!
what is the goal of your analysis?
I am working on a project involving mlr3 and shiny for creating and explaining different types of supervised ML models. I want to create an explainer depending on the user's input, whether it be linear regression, random forest, etc. For these models, I want to use DALEX to compute feature importance with model_parts, as well as PDP and ALE plots with model_profile.

Sure, so both PD and ALE explain the classifier through probabilities. Permutational feature importance can work without probabilities if you use a 1-accuracy loss function.
Treating this issue as answered.
I am trying to wrap an mlr3 learner with predict_type = 'response' with the DALEXtra function explain_mlr3(). Due to mlr3pipelines syntax restrictions, I am unable to change the predict_type to include the probabilities. When I try to create the explainer, an error occurs with the prediction function. Unfortunately, I have been unable to find an example where this has been addressed. Is there any way to compute the predicted values and the residuals with no access to the probabilities in the model?