kowaalczyk opened this issue 5 years ago (status: open)

Status:
Best LGB params:
boosting_type=gbdt, colsample_bytree=0.5, learning_rate=0.03, max_depth=5, metric=multi_logloss, min_child_weight=10, min_split_gain=0.01, n_estimators=1000, num_class=14, objective=multiclass, reg_alpha=0.001, reg_lambda=0.1, silent=-1, subsample=0.9, verbose=-1
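For convenience, the parameter list above can be collected into a Python dict that is ready to pass to a LightGBM classifier. This is a minimal sketch, assuming the scikit-learn-style `lightgbm` API; note that `silent` has been removed in recent lightgbm releases, so it may need to be dropped depending on the installed version.

```python
# The reported best LightGBM parameters, verbatim from the issue status.
best_lgb_params = {
    "boosting_type": "gbdt",
    "colsample_bytree": 0.5,
    "learning_rate": 0.03,
    "max_depth": 5,
    "metric": "multi_logloss",
    "min_child_weight": 10,
    "min_split_gain": 0.01,
    "n_estimators": 1000,
    "num_class": 14,
    "objective": "multiclass",
    "reg_alpha": 0.001,
    "reg_lambda": 0.1,
    "silent": -1,        # dropped in newer lightgbm versions
    "subsample": 0.9,
    "verbose": -1,
}

# Hypothetical usage (not run here):
#   import lightgbm as lgb
#   clf = lgb.LGBMClassifier(**best_lgb_params).fit(X_train, y_train)
```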
Implement training with the same default parameters as in this Kaggle kernel, aiming to reproduce similar feature importances. Do this for both XGBoost and CatBoost.
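The task above could be sketched as follows: fit XGBoost and CatBoost with library defaults and collect per-model feature importances for comparison against the kernel. This is an assumption-laden sketch, not the issue's actual implementation: `X`, `y`, and `feature_names` are hypothetical placeholders, and imports are guarded since the issue does not state which libraries or versions are installed.

```python
def default_importances(X, y, feature_names):
    """Train XGBoost and CatBoost with default parameters and return
    {model_name: {feature: importance}} for whichever libraries are available."""
    importances = {}
    try:
        from xgboost import XGBClassifier
        xgb = XGBClassifier().fit(X, y)  # library defaults, as in the kernel
        importances["xgb"] = dict(zip(feature_names, xgb.feature_importances_))
    except ImportError:
        pass  # xgboost not installed
    try:
        from catboost import CatBoostClassifier
        cb = CatBoostClassifier(verbose=False).fit(X, y)  # library defaults
        importances["catboost"] = dict(zip(feature_names, cb.feature_importances_))
    except ImportError:
        pass  # catboost not installed
    return importances
```

Once both dicts are populated, the importances can be ranked per model and compared feature-by-feature with the LightGBM results.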