AdaBoost with sampling instead of weighting could be implemented for classification problems.
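The resampling variant could look roughly like the following sketch (Python for illustration only, not HeuristicLab code): instead of passing instance weights to the base learner, each round trains on a weighted bootstrap sample drawn according to the current weights. `fit_stump` is a hypothetical base learner supplied by the caller.

```python
import numpy as np

def fit_stump(X, y):
    """Illustrative base learner: exhaustive-search decision stump
    for labels in {-1, +1}; knows nothing about instance weights."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (-1.0, 1.0):
                pred = np.where(X[:, j] <= t, s, -s)
                err = np.mean(pred != y)
                if err < best_err:
                    best_err, best = err, (j, t, s)
    j, t, s = best
    return lambda Xq: np.where(Xq[:, j] <= t, s, -s)

def adaboost_resample(X, y, fit_base, n_rounds=10, rng=None):
    """AdaBoost.M1 where the instance weights drive a weighted
    bootstrap sample instead of a weighted fitting procedure."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners, alphas = [], []
    for _ in range(n_rounds):
        idx = rng.choice(n, size=n, p=w)        # sampling replaces weighting
        h = fit_base(X[idx], y[idx])            # unweighted fit on the sample
        pred = h(X)
        err = float(np.sum(w[pred != y]))       # weighted error on full set
        if err >= 0.5:                          # no better than chance: stop
            break
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        w = w * np.exp(-alpha * y * pred)       # upweight misclassified points
        w /= w.sum()
        learners.append(h)
        alphas.append(alpha)
    def predict(Xq):
        return np.sign(sum(a * h(Xq) for a, h in zip(alphas, learners)))
    return predict
```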
Friedman's "Stochastic Gradient Boosting" (1999) could be implemented for regression and classification problems.
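The core of Friedman's method is that each stage fits the base learner to the pseudo-residuals of a random subsample rather than the full training set. A minimal regression sketch (Python for illustration, squared-error loss assumed; `fit_reg_stump` is a hypothetical base learner):

```python
import numpy as np

def fit_reg_stump(X, r):
    """Illustrative base learner: least-squares regression stump."""
    best_sse, best = np.inf, (0, np.inf, r.mean(), r.mean())
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:
            left = X[:, j] <= t
            ml, mr = r[left].mean(), r[~left].mean()
            sse = ((r[left] - ml) ** 2).sum() + ((r[~left] - mr) ** 2).sum()
            if sse < best_sse:
                best_sse, best = sse, (j, t, ml, mr)
    j, t, ml, mr = best
    return lambda Xq: np.where(Xq[:, j] <= t, ml, mr)

def stochastic_gradient_boost(X, y, fit_base, n_stages=50,
                              learning_rate=0.1, subsample=0.5, rng=None):
    """Stochastic gradient boosting for squared error: each stage fits
    the base learner to the negative gradient (here the plain residuals)
    of a random subsample of the training set."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = len(y)
    const = y.mean()                     # constant initial model
    f = np.full(n, const)
    models = []
    for _ in range(n_stages):
        residual = y - f                 # negative gradient of 0.5*(y-f)^2
        m = max(1, int(subsample * n))
        idx = rng.choice(n, size=m, replace=False)   # the "stochastic" part
        h = fit_base(X[idx], residual[idx])
        f = f + learning_rate * h(X)
        models.append(h)
    def predict(Xq):
        return const + learning_rate * sum(h(Xq) for h in models)
    return predict
```

With `subsample=1.0` this degenerates to ordinary gradient boosting; Friedman reports that subsampling around 0.5 often improves generalization.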
One important question is where the gradient of the loss function is implemented. In gradient boosting the boosting loop must use the gradient that matches the currently selected fitness (loss) function; a hard-coded squared-error residual would be wrong for other losses.
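One way to organize this is a table of negative gradients ("pseudo-residuals") keyed by loss, which the boosting loop consults for the selected loss. A sketch under that assumption (Python for illustration; the loss names are hypothetical, the logistic form follows Friedman's two-class deviance with labels in {-1, +1}):

```python
import numpy as np

# Negative gradient -dL/df for each supported loss, evaluated
# elementwise at the current model output f.
LOSSES = {
    # L = 0.5 * (y - f)^2        ->  -dL/df = y - f
    "squared": lambda y, f: y - f,
    # L = |y - f|                ->  -dL/df = sign(y - f)
    "absolute": lambda y, f: np.sign(y - f),
    # L = log(1 + exp(-2*y*f)),  y in {-1, +1}
    #                            ->  -dL/df = 2*y / (1 + exp(2*y*f))
    "logistic": lambda y, f: 2.0 * y / (1.0 + np.exp(2.0 * y * f)),
}

def pseudo_residuals(loss, y, f):
    """Targets for the next base learner under the selected loss."""
    return LOSSES[loss](y, f)
```

The boosting loop then stays loss-agnostic: it only ever fits the base learner to `pseudo_residuals(selected_loss, y, f)`.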
Issue migrated from trac ticket # 2186
milestone: HeuristicLab 3.3.x Backlog | component: Algorithms.DataAnalysis | priority: medium | resolution: duplicate
2014-05-21 17:21:01: @gkronber created the issue