Abstract
Neural network ensembles are widely used for classification and regression problems as an alternative to isolated networks. In many applications, ensembles have proven to perform better than a single network.
In this paper we present a new approach to neural network ensembles that we call “cascade ensembles”. The approach is based on two ideas: (i) the ensemble is created constructively, and (ii) the output of each network is fed to the inputs of the subsequent networks. In this way we build a cascade of networks.
This method is compared with standard ensembles on several classification problems, with excellent results.
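As a rough illustration of the architecture described in the abstract, the following is a minimal sketch of a cascade ensemble, assuming scikit-learn MLPs as the member networks. The class name CascadeEnsemble, the network sizes, the training settings, and the choice to take the last network's output as the ensemble prediction are all illustrative assumptions, not the authors' exact method.

```python
# Sketch of the cascade-ensemble idea: networks are added one at a time
# (constructive creation), and each new network receives the original
# features augmented with the outputs of every earlier network.
# All hyperparameters below are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier


class CascadeEnsemble:
    """Grows a cascade of networks; each new network sees the original
    inputs plus the outputs of all previous networks."""

    def __init__(self, n_networks=5, hidden_layer_sizes=(10,)):
        self.n_networks = n_networks
        self.hidden_layer_sizes = hidden_layer_sizes
        self.networks = []

    def _augmented_input(self, X):
        # Feed each earlier network its own augmented input and append
        # its class-probability outputs as extra features.
        features = [X]
        for net in self.networks:
            features.append(net.predict_proba(np.hstack(features)))
        return np.hstack(features)

    def fit(self, X, y):
        for _ in range(self.n_networks):
            Xa = self._augmented_input(X)
            net = MLPClassifier(hidden_layer_sizes=self.hidden_layer_sizes,
                                max_iter=500)
            net.fit(Xa, y)
            self.networks.append(net)  # grow the cascade constructively
        return self

    def predict(self, X):
        # The last network has seen the outputs of all earlier ones, so
        # its decision serves as the cascade's prediction in this sketch.
        features = [X]
        for net in self.networks[:-1]:
            features.append(net.predict_proba(np.hstack(features)))
        return self.networks[-1].predict(np.hstack(features))
```

Under these assumptions, CascadeEnsemble(n_networks=3).fit(X_train, y_train).predict(X_test) grows the cascade one network at a time, with each later network free to correct its predecessors' outputs.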
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
García-Pedrajas, N., Ortiz-Boyer, D., del Castillo-Gomariz, R., Hervás-Martínez, C. (2005). Cascade Ensembles. In: Cabestany, J., Prieto, A., Sandoval, F. (eds) Computational Intelligence and Bioinspired Systems. IWANN 2005. Lecture Notes in Computer Science, vol 3512. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494669_73
DOI: https://doi.org/10.1007/11494669_73
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-26208-4
Online ISBN: 978-3-540-32106-4
eBook Packages: Computer Science (R0)