Abstract
This paper presents a method to iteratively grow a compact Support Vector Regressor so that the balance between the size of the machine and its performance can be controlled by the user. The algorithm is able to combine Gaussian kernels with different spread parameters, avoiding the a priori estimation of this parameter by progressively incorporating nodes with decreasing values of the spread until a cross-validation stopping criterion is met. Experimental results show that machines trained with the new algorithm are significantly smaller while retaining good generalization capabilities.
This work has been partially supported by the Spanish Government under CICYT grant TIC2002-03713.
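As a rough illustration of the growing scheme described in the abstract (and not the authors' actual training procedure), the following Python sketch adds Gaussian nodes with a decreasing spread schedule, refits all output weights by ridge-regularized least squares as a stand-in for the paper's SVR weight training, and stops growing when a held-out error, used here in place of the cross-validation criterion, no longer improves. The function names and parameters (grow_multikernel_regressor, sigmas, nodes_per_stage, ridge, tol) are hypothetical choices for this example.

import numpy as np

def gaussian_design(X, centers, spreads):
    # Matrix of Gaussian activations exp(-||x - c||^2 / (2*sigma^2)), one column per node.
    if not centers:
        return np.zeros((len(X), 0))
    cols = [np.exp(-np.sum((X - c) ** 2, axis=1) / (2.0 * s ** 2))
            for c, s in zip(centers, spreads)]
    return np.column_stack(cols)

def grow_multikernel_regressor(X, y, X_val, y_val,
                               sigmas=(4.0, 2.0, 1.0, 0.5),
                               nodes_per_stage=10, ridge=1e-3, tol=1e-4):
    # Hypothetical sketch: grow f(x) = sum_k w_k * exp(-||x - c_k||^2 / (2*sigma_k^2))
    # with progressively smaller spreads, stopping when held-out error stops improving.
    centers, spreads, w = [], [], np.zeros(0)
    best = (list(centers), list(spreads), w, np.inf)   # best model found so far

    for sigma in sigmas:                               # decreasing spread schedule
        for _ in range(nodes_per_stage):
            # Take the training sample with the largest current residual as the next center.
            residual = y - gaussian_design(X, centers, spreads) @ w
            centers.append(X[int(np.argmax(np.abs(residual)))])
            spreads.append(sigma)

            # Refit all output weights by ridge-regularized least squares
            # (a simplification; the paper trains the weights as an SVR).
            Phi = gaussian_design(X, centers, spreads)
            w = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(Phi.shape[1]), Phi.T @ y)

            val_err = float(np.mean((gaussian_design(X_val, centers, spreads) @ w - y_val) ** 2))
            if val_err < best[3] - tol:
                best = (list(centers), list(spreads), w.copy(), val_err)
            else:
                return best[:3]                        # stop: no further improvement
    return best[:3]

Calling grow_multikernel_regressor(X_train, y_train, X_val, y_val) would return the centers, their spreads and the output weights of the grown machine; allowing more nodes per stage or a longer spread schedule versus tightening the stopping tolerance mirrors the size/performance trade-off discussed in the abstract.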
Cite this paper
Gutiérrez-González, D., Parrado-Hernández, E., Navia-Vázquez, A. (2005). Multi-kernel Growing Support Vector Regressor. In: Cabestany, J., Prieto, A., Sandoval, F. (eds) Computational Intelligence and Bioinspired Systems. IWANN 2005. Lecture Notes in Computer Science, vol 3512. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494669_44