JuliaAI / DecisionTree.jl

Julia implementation of Decision Tree (CART) and Random Forest algorithms

Add support for specifying the `loss` used in random forest and AdaBoost models #217

Open ablaom opened 1 year ago

ablaom commented 1 year ago

As far as I can tell, the `loss` parameter is only exposed for single trees. I think this would be pretty easy to add to the ensemble models.

Issue raised at #211.
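
For context, a minimal sketch of the asymmetry against the package's native builders; the keyword spelling `loss` and the built-in impurity names under `DecisionTree.util` are assumptions here, not verified:

```julia
using DecisionTree

features, labels = load_data("iris")   # demo dataset bundled with the package
features = float.(features)
labels   = string.(labels)

# Single classification tree: a loss/impurity can be specified (the issue's premise).
# The keyword name `loss` and `DecisionTree.util.gini` are assumptions.
tree = build_tree(labels, features; loss=DecisionTree.util.gini)

# Ensemble builders: no `loss` keyword is currently exposed, so the default impurity is used.
forest = build_forest(labels, features)
stumps, coeffs = build_adaboost_stumps(labels, features, 10)
```

Presumably the ensemble builders would just forward such a keyword to their internal `build_tree` calls.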

fipelle commented 1 year ago

Also, it seems that `loss` is only available for classification trees, not regression trees.

Is it possible to repurpose the existing code for classification trees to run regression tasks? It would be convenient both for … and for multi-target problems (the current implementation for regression trees does not allow for features that are not Float64 - i.e., single targets).

ablaom commented 1 year ago

> multi-target problems (the current implementation for regression trees does not allow for features that are not Float64 - i.e., single targets).

Do you mean features here or, rather, labels (aka target)?

fipelle commented 1 year ago

Labels, as in this example.
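
For illustration only (not the linked example), a minimal sketch of the restriction being discussed, assuming the native regression `build_tree` method accepts only a `Float64` label vector, so a label matrix (multi-target) has no matching method:

```julia
using DecisionTree

X = rand(100, 3)   # Float64 feature matrix
y = rand(100)      # single Float64 target vector
Y = rand(100, 2)   # two targets stored as a matrix (multi-target case)

tree = build_tree(y, X)   # regression tree on a single Float64 target
# build_tree(Y, X)        # assumed to fail: the regression method expects a label vector, not a matrix
```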

ablaom commented 1 year ago

Right. Your interesting question is a little orthogonal to the initial post, so I'm addressing it here.