
AugBoost

Gradient Boosting Enhanced with Step-Wise Feature Augmentation.

About

The code in this repository is based heavily on scikit-learn's 'gradient_boosting.py'. We started this as a fork of sklearn, but split away when we saw it would be more convenient. Thanks! =]

Prerequisites

This project requires a number of small packages, all of which are included in Anaconda. The most important prerequisite is probably the version of sklearn, although we haven't checked which of the packages are strictly necessary.

Getting Started

After cloning the repository, the 2 modules can be imported using these lines of code:

from AugBoost import AugBoostClassifier as ABC
from AugBoost import AugBoostRegressor as ABR  # the regression module has an issue and doesn't work yet

For now, only the code for classification tasks works =[

Create your model using code that looks like this:

model = ABC(n_estimators=10, max_epochs=1000, learning_rate=0.1,
    n_features_per_subset=round(len(X_train.columns)/3),
    trees_between_feature_update=10, augmentation_method='nn',
    save_mid_experiment_accuracy_results=False)
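Note that `n_features_per_subset` is computed from `X_train.columns`, so the training data is assumed to be a pandas DataFrame. A minimal sketch of preparing such a split (the column names, data, and split ratio here are illustrative, not taken from this repository):

```python
import numpy as np
import pandas as pd

# Illustrative data: a DataFrame with named columns, as assumed by
# the expression round(len(X_train.columns) / 3) above.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(100, 6)),
                 columns=[f"f{i}" for i in range(6)])
y = pd.Series((X["f0"] + X["f1"] > 0).astype(int))

# Simple 80/20 train/validation split by row position.
split = int(len(X) * 0.8)
X_train, X_val = X.iloc[:split], X.iloc[split:]
y_train, y_val = y.iloc[:split], y.iloc[split:]

# With 6 columns, n_features_per_subset would come out to 2.
print(round(len(X_train.columns) / 3))
```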

And then train and predict like this:

model.fit(X_train, y_train)
model.predict(X_val)
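Since `model.predict(X_val)` returns class labels, a quick validation accuracy can be computed by comparing them against `y_val`. A small sketch using stand-in arrays (the values below are made up for illustration, not output from a trained AugBoost model):

```python
import numpy as np

# Stand-in arrays; in practice these would be model.predict(X_val) and y_val.
preds = np.array([0, 1, 1, 0, 1])
y_val = np.array([0, 1, 0, 0, 1])

# Fraction of validation predictions that match the true labels.
accuracy = float((preds == y_val).mean())
print(accuracy)
```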

The file 'notebook for experiments.ipynb' contains example code for running experiments with AugBoost.