Open yuanqingye opened 10 months ago
xgboost math explanation: I think this article explains the details very well. The only big part it doesn't cover is shrinkage, i.e. the learning rate. I think the discussion here may also provide some insight: Discussion regarding learning rate
Hi, I am trying to figure out how the split gain is calculated here, since it is a key measure.

I noticed that in issue #1230 a supporter wrote: "The split gain and leaf output is calculated by sum_grad / sum_hess."

I want to know why. It seems the split gain is related to the way we measure impurity (Gini, entropy, etc.). In the entropy case, I remember the split gain should be H(Y) - H(Y|X), so how does that relate to sum_grad / sum_hess?
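To make the question concrete, here is a toy sketch of what I mean by the entropy-style gain (my own example, not how this library computes anything):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy H(Y) of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(y, y_left, y_right):
    """Split gain as H(Y) - H(Y|X) for a binary split into left/right children."""
    n = len(y)
    h_parent = entropy(y)
    h_children = (len(y_left) / n) * entropy(y_left) + (len(y_right) / n) * entropy(y_right)
    return h_parent - h_children

# Toy example: a split that perfectly separates the two classes
y = np.array([0, 0, 1, 1])
print(information_gain(y, y[:2], y[2:]))  # 1.0 bit of gain
```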
And should it differ between the classification and regression cases? It seems that regression and classification should each have their own way of calculating it. But if the calculation really is the same, then perhaps the same logic can be used to measure impurity in both cases.
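For example, my rough understanding is that the per-sample gradient and hessian look quite different for the two objectives (a sketch under my own assumptions, not taken from this codebase):

```python
import numpy as np

def grad_hess_squared_error(pred, y):
    """Squared-error objective L = 0.5 * (pred - y)^2: grad = pred - y, hess = 1."""
    return pred - y, np.ones_like(pred)

def grad_hess_logistic(margin, y):
    """Logistic loss on raw margins: with p = sigmoid(margin), grad = p - y, hess = p * (1 - p)."""
    p = 1.0 / (1.0 + np.exp(-margin))
    return p - y, p * (1.0 - p)
```

If the split gain is then built only from the sums of these gradients and hessians, the same G/H formula could cover both cases; that is what I am trying to confirm.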
Any material regarding this is welcome.