interpretml / interpret

Fit interpretable models. Explain blackbox machine learning.
https://interpret.ml/docs
MIT License

Computational complexity of EBM #495

Open JWKKWJ123 opened 4 months ago

JWKKWJ123 commented 4 months ago

Hi all, I'm interested in comparing the computational complexity of EBM with that of other machine learning models. I think calculating the number of trainable and non-trainable parameters of a model is one approach. I'd like to know whether EBM can report the total number of parameters of the fitted model (both trainable and non-trainable) given the input features and the model's hyperparameters?

paulbkoch commented 4 months ago

Hi @JWKKWJ123 -- I think what you're asking for is the total number of bins within the `term_scores_` attribute of the EBM model? I'm not clear on what a trainable vs non-trainable parameter would be in the context of EBMs. We don't currently expose the ability to freeze parts of an EBM in the way you might with a neural net, although we do offer the init_score parameter for these scenarios.

JWKKWJ123 commented 4 months ago

> term_scores_

Hi Paul, thanks for the reply. I understand now that all parameters are 'trainable' in an EBM. Because I'm not particularly familiar with GBDTs, I'm still confused about how to calculate the total number of parameters.

Suppose I have a dataset with 10 features and the EBM classifier uses the default settings:

interpret.glassbox.ExplainableBoostingClassifier(feature_names=None, feature_types=None, max_bins=256, max_interaction_bins=32, interactions=10, exclude=[], validation_size=0.15, outer_bags=8, inner_bags=0, learning_rate=0.01, greediness=0.0, smoothing_rounds=0, max_rounds=5000, early_stopping_rounds=50, early_stopping_tolerance=0.0001, min_samples_leaf=2, max_leaves=3, objective='log_loss', n_jobs=-2, random_state=42)

How would I then calculate the total number of parameters? For example (I suspect this is not correct):

N(parameters) ≈ N(features) × max_bins + N(pairwise terms) × max_interaction_bins
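As a quick sanity check, here is a minimal sketch of that approximation with the default hyperparameters above (hypothetical numbers, not library code). Note that each pairwise term is really a 2-D grid of bins, as Paul's later comment about raveling implies, so the pair contribution is closer to max_interaction_bins squared than linear:

```python
# A minimal sketch of the approximation above, assuming the default
# hyperparameters. Real counts are lower when features have fewer
# unique values than max_bins.
n_features = 10            # main-effect terms
n_pairs = 10               # the default interactions=10
max_bins = 256
max_interaction_bins = 32

approx_main = n_features * max_bins
# Each pairwise term is a 2-D grid of bins, hence the square.
approx_pairs = n_pairs * max_interaction_bins ** 2

print(approx_main + approx_pairs)  # 2560 + 10240 = 12800
```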

Harsha-Nori commented 4 months ago

As Paul mentioned, it's not ideal to think of each bin as an independent parameter, but it's probably the closest approximation we have to a "trainable parameter" in EBMs.

Your formula for this is approximately right for binary classification and regression. You'll need to multiply by the number of classes in the case of multiclass classification with >= 3 classes.

That said, the exact number of parameters varies beyond that formula because 1) the number of bins may be smaller than max_bins if there aren't enough unique values in the data (e.g. a boolean feature will always have only 2 bins, not 256), and 2) categorical features are handled separately and can have either more or fewer than max_bins values.

In practice, for any specific dataset, the way to calculate this exactly is to just sum up the lengths of the values in term_scores_ as Paul mentioned. But your formula is a reasonable approximation!

paulbkoch commented 4 months ago

> In practice, for any specific dataset, the way to calculate this exactly is to just sum up the lengths of the values in term_scores_ as Paul mentioned.

I agree with everything Harsha said, but wanted to add one detail to this sentence. When you get the length of the arrays in `term_scores_`, you'll want to ravel them before taking the length if you have pairs or multiclass.
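Putting Harsha's and Paul's suggestions together, a minimal sketch of the exact count (using sklearn's breast-cancer data purely as a stand-in dataset):

```python
from interpret.glassbox import ExplainableBoostingClassifier
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)  # any tabular dataset works
ebm = ExplainableBoostingClassifier().fit(X, y)

# Each entry of term_scores_ is the score tensor for one term:
# 1-D for main effects, 2-D for pairs (plus a class axis in multiclass).
# ravel() flattens each tensor so every bin score is counted once.
n_params = sum(len(scores.ravel()) for scores in ebm.term_scores_)
print(n_params)
```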

JWKKWJ123 commented 4 months ago

Hi all, thank you very much! The approximation of the total parameter count is enough for me for now, so I'll calculate it based on the number of features and bins.