Feating is an ensemble learning algorithm that combines local rather than global models. To our knowledge, it is the only generic ensemble learning technique that can improve the accuracy of stable learners such as support vector machines (SVM) and naive Bayes (NB).
The Weka implementation of Feating is available for download.
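To illustrate the idea of local rather than global models, here is a minimal sketch of a Feating-style ensemble in pure Python. It is not the authors' algorithm or the Weka implementation: each ensemble member partitions the data on one feature (here, at the feature's median rather than by the subspace trees of the paper) and trains a separate local model in each region; prediction routes an instance to its local model in every member and takes a majority vote. A 1-nearest-neighbour classifier stands in for a stable base learner such as SVM. All function names are illustrative.

```python
import statistics
from collections import Counter

def nn_classifier(train):
    """Return a 1-nearest-neighbour predictor over (x, label) pairs.
    1-NN stands in here for any stable base learner (e.g. SVM)."""
    def predict(x):
        return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]
    return predict

def feating_like_fit(X, y, n_features):
    """Build one ensemble member per feature: split the data at that
    feature's median and train a local model in each half."""
    members = []
    for f in range(n_features):
        cut = statistics.median(x[f] for x in X)
        lo = [(x, t) for x, t in zip(X, y) if x[f] <= cut]
        hi = [(x, t) for x, t in zip(X, y) if x[f] > cut]
        # Guard against an empty partition by falling back to a global model.
        local = {
            False: nn_classifier(lo or list(zip(X, y))),
            True: nn_classifier(hi or list(zip(X, y))),
        }
        members.append((f, cut, local))
    return members

def feating_like_predict(members, x):
    """Each member routes x to its local model; majority vote decides."""
    votes = [local[x[f] > cut](x) for f, cut, local in members]
    return Counter(votes).most_common(1)[0][0]
```

Because each local model sees only the instances in its region, training cost per model drops as localisation increases, which is the effect the 2011 paper analyses for SVM ensembles on large data sets.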
Publications
Feature-subspace aggregating: Ensembles for stable and unstable learners.
Ting, K. M., Wells, J., Tan, S., Teng, S., & Webb, G. I.
Machine Learning, 82(3), 375-397, 2011.
@Article{TingEtAl11,
author = {Ting, K. M. and Wells, J. and Tan, S. and Teng, S. and Webb, G. I.},
journal = {Machine Learning},
title = {Feature-subspace aggregating: Ensembles for stable and unstable learners},
year = {2011},
issn = {0885-6125},
number = {3},
pages = {375--397},
volume = {82},
abstract = {This paper introduces a new ensemble approach, Feature-Subspace Aggregating (Feating), which builds local models instead of global models. Feating is a generic ensemble approach that can enhance the predictive performance of both stable and unstable learners. In contrast, most existing ensemble approaches can improve the predictive performance of unstable learners only. Our analysis shows that the new approach reduces the execution time to generate a model in an ensemble through an increased level of localisation in Feating. Our empirical evaluation shows that Feating performs significantly better than Boosting, Random Subspace and Bagging in terms of predictive accuracy, when a stable learner SVM is used as the base learner. The speed up achieved by Feating makes feasible SVM ensembles that would otherwise be infeasible for large data sets. When SVM is the preferred base learner, we show that Feating SVM performs better than Boosting decision trees and Random Forests. We further demonstrate that Feating also substantially reduces the error of another stable learner, k-nearest neighbour, and an unstable learner, decision tree.},
address = {Netherlands},
doi = {10.1007/s10994-010-5224-5},
keywords = {Feating and Multiboosting and Boosting},
publisher = {Springer},
related = {feating},
urltext = {Link to paper via SpringerLink},
}
FaSS: Ensembles for Stable Learners.
Ting, K. M., Wells, J. R., Tan, S. C., Teng, S. W., & Webb, G. I.
Proceedings of the 8th International Workshop on Multiple Classifier Systems, MCS 2009, Berlin, pp. 364-374, 2009.
@InProceedings{TingEtAl09,
author = {Ting, K. M. and Wells, J. R. and Tan, S. C. and Teng, S. W. and Webb, G. I.},
booktitle = {Proceedings of the 8th International Workshop on Multiple Classifier Systems, MCS 2009},
title = {FaSS: Ensembles for Stable Learners},
year = {2009},
address = {Berlin},
pages = {364--374},
publisher = {Springer},
doi = {10.1007/978-3-642-02326-2_37},
keywords = {Feating and Multiboosting and Boosting},
location = {Reykjavik, Iceland},
related = {feating},
}