Abstract: Least squares support vector machine (LS-SVM) is a hyperplane-based classifier. Because it lacks feature selection ability, LS-SVM performs poorly on high-dimensional, small-sample data sets, so improving its feature selection ability is necessary. To this end, an ℓ0-norm regularization term is introduced into the objective function of LS-SVM. However, owing to the ℓ0-norm, the new model is not only non-convex and non-smooth but also NP-hard. To overcome these difficulties, a non-convex, non-smooth continuous function is first used to approximate the ℓ0-norm; the approximate problem is then decomposed into a DC (difference of convex functions) program, which is solved by the DCA (difference of convex functions algorithm). The main advantage of the new method is that the DCA subproblems have closed-form solutions, which greatly reduces training time. Numerical experiments show that the proposed method achieves better generalization performance and feature selection ability than competing methods while remaining computationally fast.
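To make the DCA scheme described above concrete, the following is a minimal sketch, not the paper's exact model: the LS-SVM equality constraints and bias term are omitted, the ℓ0-norm is approximated by the common exponential surrogate sum_j (1 - exp(-alpha*|w_j|)), and the convex subproblem is solved by plain ISTA (soft thresholding) rather than the paper's closed-form step. The function name `dca_sparse_ls`, the surrogate parameter `alpha`, and all iteration counts are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # elementwise proximal operator of the weighted l1 norm
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def dca_sparse_ls(X, y, C=1.0, alpha=5.0, n_outer=20, n_inner=200):
    """DCA sketch for min_w (C/2)||Xw - y||^2 + sum_j (1 - exp(-alpha*|w_j|)).

    DC split of the objective (both parts convex):
      g(w) = (C/2)||Xw - y||^2 + alpha*||w||_1
      h(w) = sum_j (alpha*|w_j| - 1 + exp(-alpha*|w_j|))
    Each DCA iteration linearizes h at the current point, leaving a
    reweighted-lasso subproblem with per-coordinate l1 weights
    alpha*exp(-alpha*|w_j|), solved here by ISTA.
    """
    n, d = X.shape
    w = np.zeros(d)
    # Lipschitz constant of the gradient of the quadratic part
    L = C * np.linalg.norm(X, 2) ** 2
    for _ in range(n_outer):
        # weights left after subtracting the linearization of h:
        # alpha - (alpha - alpha*exp(-alpha*|w_j|)) = alpha*exp(-alpha*|w_j|)
        c = alpha * np.exp(-alpha * np.abs(w))
        v = w.copy()
        for _ in range(n_inner):
            grad = C * X.T @ (X @ v - y)
            v = soft_threshold(v - grad / L, c / L)
        w = v
    return w
```

Because the surrogate weight alpha*exp(-alpha*|w_j|) decays toward zero for coordinates that are already large, the penalty concentrates on small coefficients, which is what drives the feature selection behavior the abstract describes.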