SVM Model Selection Based on Genetic Algorithms and Empirical Error Minimization

Journal of Nanjing Normal University (Engineering and Technology Edition) [ISSN:1006-6977/CN:61-1281/TN]

Issue:
2009, No. 02
Page:
65-71
Research Field:
Publishing date:

Info

Title:
SVM Model Selection Based on Genetic Algorithms and Empirical Error Minimization
Author(s):
Zhou Xin, Xu Jianhua
School of Computer Sciences,Nanjing Normal University,Nanjing 210097,China
Keywords:
support vector machine; kernel function; kernel parameter; empirical error; genetic algorithm
PACS:
TP18
DOI:
-
Abstract:
The generalization capacity of the support vector machine (SVM) depends largely on the selection of the kernel function, its parameters, and the penalty factor; this is the model selection problem. Having analyzed the parameters' influence on the classifiers' recognition accuracy, we propose a new method for SVM model selection using a genetic algorithm and empirical error minimization. Experiments on 13 different UCI benchmarks show its correctness, effectiveness, and good generalization performance.
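The abstract describes evolving SVM hyperparameters (kernel parameter and penalty factor) with a genetic algorithm so as to minimize an empirical error criterion. A minimal sketch of that idea is given below; it is an illustration, not the authors' algorithm. The surrogate `empirical_error` function, the search bounds, and all GA settings (population size, truncation selection, arithmetic crossover, Gaussian mutation) are assumptions for the example — in practice the fitness would be the validation error of an SVM trained with the candidate `C` and kernel parameter.

```python
import random

def empirical_error(log_C, log_gamma):
    # Placeholder fitness: in a real run this would train an SVM with
    # C = 10**log_C and gamma = 10**log_gamma and return its error on
    # held-out data. Here we use a smooth surrogate whose minimum sits
    # at (log_C, log_gamma) = (1.0, -1.0) so the GA's behavior is visible.
    return (log_C - 1.0) ** 2 + (log_gamma + 1.0) ** 2

def ga_select(pop_size=30, generations=60, bounds=(-3.0, 3.0), seed=0):
    """Evolve (log_C, log_gamma) pairs to minimize empirical_error."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: empirical_error(*ind))
        elite = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            w = rng.random()                    # arithmetic crossover
            child = [w * a[i] + (1.0 - w) * b[i] for i in range(2)]
            if rng.random() < 0.2:              # Gaussian mutation, clipped to bounds
                j = rng.randrange(2)
                child[j] = min(hi, max(lo, child[j] + rng.gauss(0.0, 0.3)))
            children.append(tuple(child))
        pop = elite + children
    return min(pop, key=lambda ind: empirical_error(*ind))

best = ga_select()
print(best)  # typically converges near (1.0, -1.0) on this surrogate
```

Encoding the hyperparameters on a log scale is a common choice, since useful values of `C` and `gamma` span several orders of magnitude.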

References:

[1] Ratsch G, Onoda T, Muller K R. Soft margins for AdaBoost [J]. Machine Learning, 2001, 42: 287-320.
[2] Chapelle O, Vapnik V, Bousquet O, et al. Choosing multiple parameters for support vector machines [J]. Machine Learning, 2002, 46: 131-159.
[3] Keerthi S S. Efficient tuning of SVM hyperparameters using radius margin bound and iterative algorithms [J]. IEEE Transactions on Neural Networks, 2002, 13: 1225-1229.
[4] Duan K, Keerthi S S, Poo A N. Evaluation of simple performance measures for tuning SVM hyperparameters [J]. Neurocomputing, 2003, 51: 41-59.
[5] Ayat N E, Cheriet M, Suen C Y. Optimization of the SVM kernels using an empirical error minimization scheme [C] // Lee S W, Verri A. Pattern Recognition with Support Vector Machines. Berlin Heidelberg: Springer, 2002, 2388: 354-369.
[6] Adankon M M, Cheriet M, Ayat N E. Optimizing resources in model selection for support vector machines [C] // 2005 International Joint Conference on Neural Networks. Montreal, Canada, 2005: 925-930.
[7] Ayat N E, Cheriet M, Suen C Y. Automatic model selection for the optimization of the SVM kernels [J]. Pattern Recognition, 2005, 38: 1733-1745.
[8] Adankon M M, Cheriet M. New formulation of SVM for model selection [C] // 2006 International Joint Conference on Neural Networks. Vancouver, Canada: IEEE Press, 2006: 1900-1907.
[9] Zheng C H, Li C J. Automatic parameters selection for SVM based on GA [C] // 5th World Congress on Intelligent Control and Automation. Hangzhou, China: IEEE Press, 2004: 1869-1872.
[10] Javier A, Saturnino M, Philip S. Tuning L1-SVM hyper-parameters with modified radius margin bounds and simulated annealing [C] // Computational and Ambient Intelligence. Berlin Heidelberg: Springer-Verlag, 2007, 4507: 284-291.
[11] Guo X C, Liang Y C, Wu C G, et al. PSO-based hyper-parameters selection for LS-SVM classifiers [C] // Neural Information Processing. Hong Kong, China: IEEE Press, 2006, 4233: 1138-1147.
[12] Platt J. Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods [C] // Bartlett P J, Scholkopf B, Schuurmans D. Advances in Large Margin Classifiers. Cambridge, MA: MIT Press, 1999: 67-74.
[13] Ratsch G. Benchmark data sets [EB/OL]. http://ida.first.fhg.de/projects/bench/benchmarks.htm. 1999/2003-7.

Memo

Memo:
-
Last Update: 2013-04-23