Title: Extreme Ensemble Selection
Authors: Geoff Crew, Alex Ksikes, Alex Niculescu, Rich Caruana
(Alex and I will be presenting.)

Abstract:

An ensemble is a set of models whose predictions are combined by voting or averaging. Typically, ensembles outperform the individual models they combine. Ensemble methods such as bagging and boosting generate the models they combine by training a single model type (e.g., decision trees) on different samples of the training data.
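
For concreteness, here is a minimal sketch of bagging in Python. This is a sketch under assumed scikit-learn tooling, not the experimental setup used in the paper; the synthetic dataset, the 25-tree ensemble size, and the 0.5 decision threshold are all illustrative choices:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=1000, random_state=0)

# Bagging: train one model type (here, decision trees) on bootstrap
# samples of the training data, then average the trees' predictions.
trees = []
for _ in range(25):                                # 25 trees is arbitrary
    idx = rng.integers(0, len(X), size=len(X))     # bootstrap sample
    trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))

avg_proba = np.mean([t.predict_proba(X)[:, 1] for t in trees], axis=0)
bagged_pred = (avg_proba >= 0.5).astype(int)       # averaged vote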

We present a new method for constructing ensembles. Instead of using a single model type, we train models with many different learning procedures such as decision trees, neural nets, support vector machines, and k-nearest neighbors. Instead of averaging or voting over every model, we select the subset of models that makes the ensemble perform best on a chosen performance measure. This gives our ensembles two advantages: 1) the ensemble average uses only those models that contribute to good performance; and 2) we can optimize the ensemble for any performance criterion, even one we do not know how to optimize when training the individual models.
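
The selection step can be pictured as greedy forward selection from a library of trained models, hillclimbing on a held-out validation set. The Python sketch below is illustrative rather than the exact procedure reported in the paper; select_ensemble, library_probs, y_val, and metric are hypothetical names, and the early-stopping rule is one plausible choice:

import numpy as np

def select_ensemble(library_probs, y_val, metric, max_models=20):
    """Greedily add models from the library (with replacement) so long as
    adding the best candidate improves `metric` on the validation set.
    library_probs: one array of validation-set probabilities per model.
    Returns the indices of the selected models (repeats allowed)."""
    selected = []
    sum_probs = np.zeros_like(y_val, dtype=float)
    best_score = -np.inf
    for _ in range(max_models):
        # Score the ensemble that would result from adding each candidate.
        scores = [metric(y_val, (sum_probs + p) / (len(selected) + 1))
                  for p in library_probs]
        i = int(np.argmax(scores))
        if scores[i] <= best_score:   # no candidate improves the metric
            break
        best_score = scores[i]
        selected.append(i)
        sum_probs += library_probs[i]
    return selected

# Example: optimize ROC area (sklearn's roc_auc_score has the needed
# (y_true, y_score) signature):
#   from sklearn.metrics import roc_auc_score
#   chosen = select_ensemble(library_probs, y_val, roc_auc_score)

Because selection is with replacement, a strong model can be picked repeatedly, which effectively gives it more weight in the average.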

Experiments with three datasets and four performance measures show that our extreme ensembles not only outperform bagging and boosting, but also achieve performance that cannot be matched by any of the learning methods individually.