Abstract
The least squares support vector machine (LSSVM) is computationally efficient because it replaces the quadratic programming problem of standard SVM training with a set of linear equations. This efficiency comes at the cost of sparseness, however, since every training sample receives a nonzero coefficient in the solution. Sparse LSSVM algorithms have been proposed to improve both prediction speed and generalization capability. In this paper, two sparse LSSVM algorithms, the SMRLSSVM and the RQRLSSVM, are proposed based on the Localized Generalization Error model of the LSSVM. Experimental results show that the RQRLSSVM yields both better generalization capability and greater sparseness than other sparse LSSVM algorithms.
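To make the abstract's claim concrete, the following is a minimal sketch of dense LSSVM classifier training as introduced by Suykens and Vandewalle: the dual problem reduces to a single linear system in the bias `b` and the coefficients `alpha`. The function names, the RBF kernel choice, and the hyper-parameter values are illustrative assumptions, not part of the paper's proposed algorithms.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """RBF kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Train an LSSVM classifier by solving its KKT linear system:

        [ 0        1^T         ] [ b     ]   [ 0 ]
        [ 1   K + (1/gamma) I  ] [ alpha ] = [ y ]

    No quadratic program is solved; one call to a linear solver suffices.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, alpha, b, X_test, sigma=1.0):
    """Decision: sign( sum_i alpha_i K(x, x_i) + b )."""
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ alpha + b)
```

Note that `np.linalg.solve` returns a nonzero `alpha_i` for essentially every training sample, so every sample must be kept for prediction. This is the lack of sparseness that pruning-based methods such as those studied in this paper are designed to remove.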
Acknowledgments
This work was supported by the National Natural Science Foundation of China (61272201 and 61572201) and the Fundamental Research Funds for the Central Universities (2015ZZ023).
Cite this article
Sun, B., Ng, W.W.Y. & Chan, P.P.K. Improved sparse LSSVMS based on the localized generalization error model. Int. J. Mach. Learn. & Cyber. 8, 1853–1861 (2017). https://doi.org/10.1007/s13042-016-0563-6