Abstract
Method: In this paper, we introduce a bi-level optimization formulation for the model and feature selection problems of support vector machines (SVMs). A bi-level optimization model is proposed to select the best model, in which the standard convex quadratic optimization problem of SVM training is cast as a subproblem.
Feasibility: The optimal objective value of the SVM quadratic problem is minimized over a feasible range of the kernel parameters at the master level of the bi-level model. Since the optimal objective value of the subproblem is a continuous function of the kernel parameters, though only implicitly defined over a certain region, a solution of this bi-level problem always exists. The problem of feature selection can be handled in a similar manner.
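The abstract does not give the formulas, but a generic sketch of such a bi-level model (assuming the usual SVM dual as the training QP, with kernel parameters $\theta$ ranging over a feasible set $\Theta$, dual variables $\alpha$, labels $y$, penalty $C$, and kernel matrix $K_{\theta}$; all of this notation is introduced here and may differ from the paper's) could read:

$$
\min_{\theta \in \Theta} \; f(\theta),
\qquad
f(\theta) \;=\; \min_{\alpha}\Bigl\{\tfrac{1}{2}\,\alpha^{\top}\bigl(K_{\theta}\circ yy^{\top}\bigr)\alpha - e^{\top}\alpha
\;:\; y^{\top}\alpha = 0,\ 0 \le \alpha \le C e\Bigr\}.
$$

Here $f(\theta)$, the optimal value of the training QP, is the continuous, implicitly defined function that the master level minimizes.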
Experiments and results: Two approaches for solving the bi-level problem of model and feature selection are considered as well. Experimental results show that the bi-level formulation provides a plausible tool for model selection.
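As an illustration only (the abstract does not describe the two solution approaches), the following Python sketch mirrors the master/subproblem structure: the lower level solves the SVM dual QP for a fixed RBF kernel width, and the master level picks, from a candidate range, the width with the smallest optimal QP value. The RBF kernel, the SLSQP solver, and names such as `gamma_grid` and `C` are assumptions made for this sketch, not details from the paper.

```python
# Illustrative sketch (not the paper's algorithm): lower level = SVM dual QP
# for a fixed kernel parameter; master level = search over a feasible range.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, gamma):
    # K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def svm_dual_optimum(X, y, gamma, C=1.0):
    """Optimal value of the SVM dual QP:
       min 0.5*a^T Q a - e^T a  s.t.  y^T a = 0, 0 <= a <= C."""
    n = len(y)
    Q = rbf_kernel(X, gamma) * np.outer(y, y)
    Q = 0.5 * (Q + Q.T) + 1e-10 * np.eye(n)      # symmetrize and regularize
    fun = lambda a: 0.5 * a @ Q @ a - a.sum()
    jac = lambda a: Q @ a - np.ones(n)
    cons = [{"type": "eq", "fun": lambda a: y @ a, "jac": lambda a: y}]
    res = minimize(fun, np.zeros(n), jac=jac, bounds=[(0.0, C)] * n,
                   constraints=cons, method="SLSQP")
    return res.fun

def select_gamma(X, y, gamma_grid, C=1.0):
    # Master level: minimize the subproblem's optimal value over the grid.
    values = [svm_dual_optimum(X, y, g, C) for g in gamma_grid]
    return gamma_grid[int(np.argmin(values))], values

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    best_gamma, vals = select_gamma(X, y, gamma_grid=[0.01, 0.1, 1.0, 10.0])
    print("selected gamma:", best_gamma)
```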
Year | DOI | Venue |
---|---|---|
2009 | 10.1007/s10287-008-0071-6 | Comput. Manag. Science |
Keywords | Field | DocType
---|---|---|
Support vector machines (SVMs),Machine learning,Model selection,Feature selection,Bi-level programming | Structured support vector machine,Mathematical optimization,Least squares support vector machine,Vector optimization,Support vector machine,Model selection,Quadratic programming,Relevance vector machine,Sequential minimal optimization,Mathematics | Journal |
Volume | Issue | ISSN
---|---|---|
6 | 1 | 1619-697X |
Citations | PageRank | References
---|---|---|
4 | 0.47 | 10 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Peng Du | 1 | 4 | 0.47 |
Ji-Ming Peng | 2 | 500 | 45.74 |
Tamás Terlaky | 3 | 677 | 65.75 |