jinlow / forust

A lightweight gradient boosted decision tree package.
https://jinlow.github.io/forust/
Apache License 2.0

[Question] How to use early stopping from rust #96

Closed The-Mr-L closed 6 months ago

The-Mr-L commented 6 months ago

Hi again. I am trying to use early stopping, but without any luck: best_iteration returns 0, and the model does not fit the data at all.

This is the setup:

let mut model = GradientBooster::default()
        .set_iterations(2000)
        .set_objective_type(forust_ml::objective::ObjectiveType::LogLoss)
        .set_learning_rate(0.01)
        .set_early_stopping_rounds(Some(50))
        .set_max_depth(5)
        .set_gamma(0.)
        .set_initialize_base_score(false)
        .set_base_score(0.5)
        .set_min_leaf_weight(1.)
        .set_l2(1.);
    let w_ = Vec::new();
    model.fit_unweighted(
        &x_train,
        &y_train,
        Some(Vec::from_iter([(x_eval, y_eval.as_slice(), w_.as_slice())])),
    )?;
    dbg!(model.best_iteration); // prints 0, while the Python version prints a value above 0

Can you please tell me what I am doing wrong? Thanks :)

jinlow commented 6 months ago

Does it work if you define the weight like this? let w_ = vec![1.0; y_eval.len()]; Right now you are passing an empty vector, which might be causing the problem.
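
For reference, a minimal sketch of the setup with that fix applied, reusing the builder calls from the snippet above (x_train, y_train, x_eval, and y_eval are the user's own data, assumed to already be in the shapes the original code passes):

    // Same configuration as in the original snippet.
    let mut model = GradientBooster::default()
        .set_iterations(2000)
        .set_objective_type(forust_ml::objective::ObjectiveType::LogLoss)
        .set_learning_rate(0.01)
        .set_early_stopping_rounds(Some(50))
        .set_max_depth(5)
        .set_gamma(0.)
        .set_initialize_base_score(false)
        .set_base_score(0.5)
        .set_min_leaf_weight(1.)
        .set_l2(1.);

    // One weight per evaluation row instead of an empty vector.
    let w_ = vec![1.0; y_eval.len()];
    model.fit_unweighted(
        &x_train,
        &y_train,
        Some(Vec::from_iter([(x_eval, y_eval.as_slice(), w_.as_slice())])),
    )?;

    // With a populated evaluation set, early stopping can track the
    // evaluation metric, and best_iteration is no longer stuck at 0.
    dbg!(model.best_iteration);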

The-Mr-L commented 6 months ago

That's it :) Thanks!