Sum02dean / STRINGSCORE


Build meta-learner to combine all models #36

Closed Sum02dean closed 2 years ago

Sum02dean commented 2 years ago

The idea is simple: instead of preferring one model over another, take the estimated probabilities from each model (across all k-fold COG splits) and use them as the inputs to a stacked ensemble estimator. This way, the differential weighting of the final predictions becomes a prediction task in itself.
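
A minimal sketch of what this could look like, assuming scikit-learn; the base models, the toy data, and the plain 5-fold CV here are placeholders (the real setup would use the project's models and grouped COG splits, e.g. via a GroupKFold-style splitter):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

# Toy data standing in for the STRING feature matrix and labels.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Hypothetical base models (placeholders for the project's actual models).
base_models = {
    "rf": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm": SVC(probability=True, random_state=0),
}

# Out-of-fold probabilities from each base model become the meta-features.
# With COG splits, pass the grouped cv splitter and group labels instead of cv=5.
oof_probas = np.column_stack([
    cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]
    for model in base_models.values()
])

# Meta-learner: learns how to weight each base model's probability.
meta_learner = LogisticRegression()
meta_learner.fit(oof_probas, y)

# At prediction time: refit base models on all data, collect their
# probabilities, and let the meta-learner combine them into final scores.
for model in base_models.values():
    model.fit(X, y)
new_probas = np.column_stack([
    model.predict_proba(X)[:, 1] for model in base_models.values()
])
stacked_scores = meta_learner.predict_proba(new_probas)[:, 1]
```

Using out-of-fold probabilities to train the meta-learner keeps it from simply rewarding whichever base model overfits its own training folds.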