Improved sparse LSSVMS based on the localized generalization error model

  • Original Article
  • International Journal of Machine Learning and Cybernetics

Abstract

The least squares support vector machine (LSSVM) is computationally efficient because it replaces the quadratic programming (QP) problem in SVM training with a set of linear equations. However, the LSSVM solution loses the sparseness of the SVM: in general, every training sample receives a nonzero coefficient. Sparse LSSVMs have therefore been proposed to improve prediction speed and generalization capability. In this paper, two sparse LSSVM algorithms, the SMRLSSVM and the RQRLSSVM, are proposed based on the localized generalization error model of the LSSVM. Experimental results show that the RQRLSSVM yields both better generalization capability and greater sparseness than other sparse LSSVM algorithms.
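To make the abstract's first claim concrete, the sketch below trains an LSSVM classifier by solving the (n+1)×(n+1) KKT linear system of Suykens and Vandewalle's formulation instead of a QP. This is a minimal illustration, not the paper's implementation: the RBF kernel choice, the function names lssvm_train and lssvm_predict, and the hyperparameter defaults are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # K_ij = exp(-||a_i - b_j||^2 / (2 sigma^2))
    sq_a = np.sum(A**2, axis=1)[:, None]
    sq_b = np.sum(B**2, axis=1)[None, :]
    return np.exp(-(sq_a + sq_b - 2 * A @ B.T) / (2 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LSSVM KKT system (Suykens & Vandewalle, 1999):
        [ 0         y^T       ] [ b     ]   [ 0 ]
        [ y   Omega + I/gamma ] [ alpha ] = [ 1 ]
    where Omega_ij = y_i * y_j * K(x_i, x_j)."""
    n = X.shape[0]
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)  # one linear solve replaces the SVM's QP
    return sol[1:], sol[0]         # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    # Decision function: sign( sum_i alpha_i y_i K(x_i, x) + b )
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage (illustrative data): two Gaussian blobs, labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
alpha, b = lssvm_train(X, y)
print((lssvm_predict(X, y, alpha, b, X) == y).mean())  # training accuracy
```

Note that the equality constraints of the LSSVM force almost every alpha_i to be nonzero, so the entire training set must be retained at prediction time; this loss of sparseness is exactly what sparse LSSVM algorithms, including the two proposed here, aim to repair.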



Acknowledgments

This work was supported by the National Natural Science Foundation of China (61272201 and 61572201) and the Fundamental Research Funds for the Central Universities (2015ZZ023).

Author information

Correspondence to Wing W. Y. Ng.


About this article


Cite this article

Sun, B., Ng, W.W.Y. & Chan, P.P.K. Improved sparse LSSVMS based on the localized generalization error model. Int. J. Mach. Learn. & Cyber. 8, 1853–1861 (2017). https://doi.org/10.1007/s13042-016-0563-6

