Hi @marthaisabelhilton,
this check verifies that the uniform loss does not try to involve samples (events) from classes that are not included in the uniformity requirement.
It is, in fact, an internal check. Most probably you hit it because there is something wrong with the weights.
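For reference, here is a minimal, self-contained sketch of the setup this check expects (the data, column names, and parameter values below are invented for illustration and are not taken from your script): labels should be 0/1 integers, `sample_weight` should be a positive array aligned with `X`, and `uniform_label` in the loss must be a class label that actually occurs in `y`.

```python
import numpy
import pandas
from hep_ml.gradientboosting import UGradientBoostingClassifier
from hep_ml.losses import BinFlatnessLossFunction

# Toy dataset: 'mass' is the variable we want the response to be flat in;
# 'feature1' / 'feature2' are the training features. All names are made up.
rng = numpy.random.RandomState(42)
n = 1000
X = pandas.DataFrame({
    'mass': rng.uniform(0., 1., size=n),
    'feature1': rng.normal(size=n),
    'feature2': rng.normal(size=n),
})
y = rng.randint(0, 2, size=n)   # class labels must be 0 / 1
w = numpy.ones(n)               # positive per-event weights, same length as X

# uniform_label selects the class the flatness requirement applies to;
# it has to be a label that is actually present in y (here: signal = 1).
loss = BinFlatnessLossFunction(uniform_features=['mass'], uniform_label=1)

clf = UGradientBoostingClassifier(loss=loss,
                                  train_features=['feature1', 'feature2'],
                                  n_estimators=50)
clf.fit(X, y, sample_weight=w)
```

If your labels are not 0/1, or the label you pass as `uniform_label` never appears in `y`, or the weights are misaligned with `X`, the assertion you quoted can fire.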
Hi, I am wondering if you can help me.
I am getting the following error when trying to use UGradientBoostingClassifier:
```
Traceback (most recent call last):
  File "uBoost_test.py", line 196, in <module>
    main()
  File "uBoost_test.py", line 30, in main
    train_classifier(dataframe, mode, year)
  File "uBoost_test.py", line 101, in train_classifier
    ugradientboost.fit(X_train, Y_train, w_train)
  File "/afs/cern.ch/user/m/mhilton/.local/lib/python3.6/site-packages/hep_ml/gradientboosting.py", line 205, in fit
    return UGradientBoostingBase.fit(self, X, y, sample_weight=sample_weight)
  File "/afs/cern.ch/user/m/mhilton/.local/lib/python3.6/site-packages/hep_ml/gradientboosting.py", line 131, in fit
    residual, weights = self.loss.prepare_tree_params(y_pred)
  File "/afs/cern.ch/user/m/mhilton/.local/lib/python3.6/site-packages/hep_ml/losses.py", line 118, in prepare_tree_params
    return self.negative_gradient(y_pred), numpy.ones(len(y_pred))
  File "/afs/cern.ch/user/m/mhilton/.local/lib/python3.6/site-packages/hep_ml/losses.py", line 753, in negative_gradient
    neg_gradient = self._compute_fl_derivatives(y_pred) * self.fl_coefficient
  File "/afs/cern.ch/user/m/mhilton/.local/lib/python3.6/site-packages/hep_ml/losses.py", line 748, in _compute_fl_derivatives
    assert numpy.all(neg_gradient[~numpy.in1d(self.y, self.uniform_label)] == 0)
AssertionError
```
I am wondering if you can explain this last assert line and why it is happening? Many thanks.