
Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3512)


Abstract

Neural network ensembles are widely used for classification and regression problems as an alternative to isolated networks. In many applications, an ensemble has proven to outperform any single network.

In this paper we present a new approach to neural network ensembles that we call “cascade ensembles”. The approach is based on two ideas: (i) the ensemble is created constructively, and (ii) the output of each network is fed to the inputs of the subsequent networks. In this way the networks form a cascade.

The method is compared with standard ensembles on several classification problems and shows excellent performance.
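The two ideas above translate naturally into a small program. The following is a minimal, hypothetical Python sketch of the cascade idea only, assuming scikit-learn MLPs as the member networks, class probabilities as the signal passed down the cascade, and the last network's output as the ensemble prediction; the paper's actual training procedure and combination rule may differ.

```python
# Sketch of a cascade ensemble: networks are added one at a time
# (constructive creation), and each new network sees the original
# inputs augmented with the outputs of all previous networks.
# The class name, member-network choice, and final combination rule
# are illustrative assumptions, not the authors' exact formulation.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier


class CascadeEnsemble:
    def __init__(self, n_networks=3, hidden=(10,), random_state=0):
        self.n_networks = n_networks
        self.hidden = hidden
        self.random_state = random_state
        self.networks = []

    def fit(self, X, y):
        augmented = X
        for i in range(self.n_networks):
            # Constructive step: train one more network on the
            # current (augmented) input representation.
            net = MLPClassifier(hidden_layer_sizes=self.hidden,
                                max_iter=2000,
                                random_state=self.random_state + i)
            net.fit(augmented, y)
            # Cascade step: append this network's class probabilities
            # to the inputs seen by all subsequent networks.
            augmented = np.hstack([augmented, net.predict_proba(augmented)])
            self.networks.append(net)
        return self

    def predict(self, X):
        # Rebuild the same augmented representation at test time,
        # then let the last network in the cascade decide.
        augmented = X
        for net in self.networks[:-1]:
            augmented = np.hstack([augmented, net.predict_proba(augmented)])
        return self.networks[-1].predict(augmented)


X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = CascadeEnsemble().fit(X_tr, y_tr)
print("test accuracy:", (model.predict(X_te) == y_te).mean())
```

Note the contrast with a standard ensemble, where members are trained independently and their outputs are merged by voting or averaging: here each network can correct its predecessors because their outputs are part of its input.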




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

García-Pedrajas, N., Ortiz-Boyer, D., del Castillo-Gomariz, R., Hervás-Martínez, C. (2005). Cascade Ensembles. In: Cabestany, J., Prieto, A., Sandoval, F. (eds) Computational Intelligence and Bioinspired Systems. IWANN 2005. Lecture Notes in Computer Science, vol 3512. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494669_73


  • DOI: https://doi.org/10.1007/11494669_73

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26208-4

  • Online ISBN: 978-3-540-32106-4

