JuliaAI / DecisionTree.jl

Julia implementation of Decision Tree (CART) and Random Forest algorithms

Obscenely slow prediction #159

Open · tecosaur opened this issue 2 years ago

tecosaur commented 2 years ago

Hello,

I'd love to use DecisionTree.jl for a project I'm currently working on, as it's great in a lot of ways: speedy to train, plays nicely with AbstractTrees, etc.

Unfortunately, saying the prediction performance is "not good" is putting things mildly. I did a test run with a simplified version of one of the data sets I'm working with, and recorded the training and prediction times of DecisionTree.jl as well as a number of other common random forest implementations.

| Tool | Train time | Predict time | Ratio (predict/train) |
|-----------------|-------|------|------|
| DecisionTree.jl | 0.6s | 175s | 292 |
| randomForest | 24.4s | 4.2s | 0.17 |
| ranger | 1.9s | 0.5s | 0.26 |
| sklearn | 63s | 1.7s | 0.03 |

The competitive training time gives me hope that DecisionTree.jl can be made competitive on prediction too 🙂.

ablaom commented 2 years ago

Thanks for reporting. That high prediction time is intriguing.

Could you please provide enough detail here for others to reproduce the results, using publicly available data, or synthetic data?

tecosaur commented 2 years ago

In the example above, I ran 1000 trees on a 1-dimensional binary classification data set with ~70,000 entries. It should be pretty easy to generate something like this, for instance:
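The following is a sketch only: the generating rule is invented rather than my actual data, though `build_forest`/`apply_forest` are the standard DecisionTree.jl calls.

```julia
using DecisionTree, Random

Random.seed!(1)
n = 70_000
X = reshape(randn(n), n, 1)                             # a single feature column
y = ifelse.(vec(X) .+ 0.5 .* randn(n) .> 0, "a", "b")   # noisy binary labels

forest = build_forest(y, X, 1, 1000)   # 1 subfeature per split, 1000 trees
@time apply_forest(forest, X)          # the prediction step in question
```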

If you have a look at https://github.com/tecosaur/TreeComparison and run `julia bulkreport.jl`, a number of reports will be generated. See `forest.jl` for the code running each implementation. While the results aren't as extreme, the disparity in the predict/train time ratio is still quite apparent. For example, with the Iris data:

| Tool | Train time | Predict time | Ratio (predict/train) |
|-----------------|-------|-------|------|
| DecisionTree.jl | 0.01s | 0.32s | 32 |
| randomForest | 1.28s | 0.6s | 0.47 |
| ranger | 0.08s | 0.03s | 0.38 |
| sklearn | 1.12s | 0.1s | 0.09 |

tecosaur commented 2 years ago

Ok, I've started looking into this, and I've identified at least two major sub-optimalities in the design. One is the implementation of tree evaluation/prediction, the other is the design of the Leaf struct.

I'm currently trying to replace Leaf with this structure:

```julia
struct Leaf{T, N}
    features :: NTuple{N, T}    # the distinct class labels seen at this leaf
    majority :: Int             # index into `features` of the majority class
    values   :: NTuple{N, Int}  # per-class sample counts
    total    :: Int             # total samples, i.e. sum(values)
end
```

This should make probabilistic prediction on a leaf O(1) instead of O(n). I am unfortunately finding a few parts of the code base hard to work with though, such as src/classification/tree.jl (functions with 18 positional arguments should be outlawed!).
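To spell out the complexity claim (an illustrative sketch: `p_old` and `p_new` are not package functions, and the current layout is assumed to keep the raw label vector at each leaf):

```julia
# Assumed current behaviour: the leaf stores every training label that
# reached it, so each probability query counts matches afresh: O(n).
p_old(labels::Vector, class) = count(==(class), labels) / length(labels)

# With the proposed Leaf, per-class counts are tallied once at build time,
# so a probability query is a single indexed division: O(1).
p_new(leaf::Leaf, i::Integer) = leaf.values[i] / leaf.total
```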

tecosaur commented 2 years ago

I've just done a bit more than the bare minimum, and so far the performance improvement for probabilistic prediction is 2-10x on a sample Iris dataset and a large-ish unidimensional data set. See: https://tecosaur.com/public/treeperf.html
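(For reference, the operation being timed is probabilistic forest prediction, for which `apply_forest_proba` is the package's entry point; `forest` and `X` here refer to my earlier synthetic sketch:)

```julia
@time apply_forest_proba(forest, X, ["a", "b"])  # per-row class probabilities
```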

ablaom commented 2 years ago

This looks like progress to me. Do you think we could get away with marking the proposed change to the Leaf struct as non-breaking? As far as I can tell, this is non-public API, and in any case we are preserving the existing properties and the first type parameter.
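To illustrate what "preserving the existing properties" means here (the pre-change layout below is assumed, and `predicted_label` is a hypothetical helper, not package code):

```julia
# Pre-change layout (assumed):
#   struct Leaf{T}
#       majority :: T          # the majority label itself
#       values   :: Vector{T}  # every label that reached the leaf
#   end
# The new struct keeps the property names `majority` and `values` and the
# first type parameter T; `majority` becomes an index, so the predicted
# label would now be recovered as:
predicted_label(leaf::Leaf) = leaf.features[leaf.majority]
```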

Happy to review a PR.

tecosaur commented 2 years ago

OK, I was hoping to make further improvements first, but it's probably worth PRing the basic tree changes I've made.