Closed: cpereir1 closed this issue 6 years ago.
StackingEstimator is a meta-transformer for adding predictions and/or class probabilities as synthetic feature(s). Your interpretation is correct for the part below:
make_pipeline(
    SelectPercentile(score_func=f_classif, percentile=42),
    StackingEstimator(estimator=KNeighborsClassifier(n_neighbors=35, p=1, weights="uniform")),
    GaussianNB()
)
Afterward, the predictions from this part are added to the input X as synthetic features, and the augmented X is then passed to the DecisionTreeClassifier.
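To make the "synthetic features" idea concrete, here is a minimal numpy-only sketch. `MiniStackingEstimator` and `OneNN` are illustrative stand-ins written for this example, not TPOT's or scikit-learn's actual classes; TPOT's real StackingEstimator additionally prepends class probabilities when the inner estimator supports `predict_proba`.

```python
import numpy as np

class MiniStackingEstimator:
    """Sketch of the StackingEstimator idea: fit an inner model,
    then return its predictions stacked next to the original
    features, so downstream estimators see both."""
    def __init__(self, estimator):
        self.estimator = estimator

    def fit(self, X, y):
        self.estimator.fit(X, y)
        return self

    def transform(self, X):
        preds = self.estimator.predict(X).reshape(-1, 1)
        # synthetic prediction column + the untouched original features
        return np.hstack([preds, X])

class OneNN:
    """Tiny 1-nearest-neighbour classifier, standing in for
    KNeighborsClassifier to keep the sketch dependency-free."""
    def fit(self, X, y):
        self.X_, self.y_ = np.asarray(X, float), np.asarray(y)
        return self

    def predict(self, X):
        # Manhattan distance from every query row to every training row
        d = np.abs(np.asarray(X, float)[:, None, :] - self.X_[None, :, :]).sum(-1)
        return self.y_[d.argmin(axis=1)]

X = np.array([[0.0, 1.0], [1.0, 0.0], [0.9, 0.1], [0.1, 0.8]])
y = np.array([0, 1, 1, 0])

Xt = MiniStackingEstimator(OneNN()).fit(X, y).transform(X)
print(Xt.shape)  # (4, 3): 1 synthetic prediction column + 2 original features
```

The downstream estimator (GaussianNB above, or the final DecisionTreeClassifier) then trains on this widened matrix, which is the whole point of stacking as a feature transform.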
Thanks, weixuanfu, for your reply!
Hi! I ran the TPOT classifier as follows:
pipeline_optimizer = tpot.TPOTClassifier(warm_start=True, periodic_checkpoint_folder=r"C:\Users...", verbosity=3, max_eval_time_mins=20, config_dict='TPOT light')
on a training set of shape (1871, 18).
I obtained the following exported pipeline:
exported_pipeline = make_pipeline(
    make_union(
        StackingEstimator(estimator=make_pipeline(
            SelectPercentile(score_func=f_classif, percentile=42),
            StackingEstimator(estimator=KNeighborsClassifier(n_neighbors=35, p=1, weights="uniform")),
            GaussianNB()
        )),
        FunctionTransformer(copy)
    ),
    DecisionTreeClassifier(criterion="gini", max_depth=5, min_samples_leaf=1, min_samples_split=13)
)
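For readers puzzling over the same pipeline, the column bookkeeping can be sketched with plain numpy. The numbers below are assumptions for illustration: a binary target and the (1871, 18) training set mentioned in this thread; `pred` and `proba` stand in for the inner pipeline's `predict()` and `predict_proba()` outputs, and the stacking branch is modeled as synthetic columns stacked next to the original features.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_classes = 1871, 18, 2  # binary target assumed

X = rng.random((n_samples, n_features))
pred = rng.integers(0, n_classes, (n_samples, 1)).astype(float)   # inner pipeline's predict()
proba = rng.random((n_samples, n_classes))                        # inner pipeline's predict_proba()

# Branch A: StackingEstimator(inner pipeline) -> synthetic columns + original X
branch_a = np.hstack([pred, proba, X])
# Branch B: FunctionTransformer(copy) -> X passed through unchanged
branch_b = X.copy()

# make_union concatenates both branches column-wise; this widened
# matrix is what the final DecisionTreeClassifier is trained on.
X_tree = np.hstack([branch_a, branch_b])
print(X_tree.shape)  # (1871, 39): (1 + 2 + 18) + 18
```

So the tree never sees only the stacked predictions: the `FunctionTransformer(copy)` branch guarantees a full copy of the original 18 features survives the union.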
I am unsure about the data flow this pipeline implies. My interpretation is:
Is this correct?
Thanks for your help!
BR