
Multi-Classifier Ensemble Based on Random Feature Subspace

Journal of Nanjing Normal University (Engineering and Technology Edition) [ISSN: 1006-6977 / CN: 61-1281/TN]

Issue:
2008, No. 4
Pages:
87-90

Info

Title:
Multi-Classifier Ensemble Based on Random Feature Subspace
Author(s):
Ye Yunlong, Yang Ming
School of Mathematics and Computer Science, Nanjing Normal University, Nanjing 210097, China
Keywords:
random subspace; classifier ensemble; re-sampling
PACS:
TP391.41
DOI:
-
Abstract:
In this paper, we propose an ensemble algorithm called RFSEn, which is based on random feature subspaces. First, an appropriate feature subset size is selected; subsets of features are then drawn at random, the training set is projected onto each subset, and the primary subspace classifiers are obtained; an ensemble classifier is then formed from these primary classifiers. Finally, the ensemble classifier is used to classify the text. We compare the algorithm with the bagging algorithm, which is based on re-sampling techniques, and with a single classifier on standard datasets. The results show that the RFSEn algorithm is not only superior to a single classifier in performance, but also better than the bagging algorithm to some degree.
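
The abstract describes the procedure only in prose, so the following Python sketch illustrates the general random-feature-subspace ensemble scheme it outlines: choose a subset size, draw random feature subsets, train one primary classifier per subset, and combine their outputs. The base learner (Gaussian naive Bayes), the subset size k, the ensemble size, and majority voting as the combination rule are illustrative assumptions on our part, not details taken from the paper.

import numpy as np
from sklearn.base import clone
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

def train_subspace_ensemble(X, y, base_estimator, n_estimators=15, k=20):
    # Train one primary classifier per randomly drawn feature subset of size k.
    # n_estimators and k are illustrative values, not taken from the paper.
    models = []
    for _ in range(n_estimators):
        idx = rng.choice(X.shape[1], size=k, replace=False)  # random feature subspace
        models.append((idx, clone(base_estimator).fit(X[:, idx], y)))
    return models

def predict_subspace_ensemble(models, X):
    # Combine the primary classifiers by unweighted majority vote
    # (one assumed combination rule; the paper does not specify it here).
    votes = np.stack([clf.predict(X[:, idx]) for idx, clf in models])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

# Synthetic stand-in for a text-classification dataset.
X, y = make_classification(n_samples=500, n_features=100, n_informative=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = train_subspace_ensemble(X_tr, y_tr, GaussianNB())
accuracy = (predict_subspace_ensemble(ensemble, X_te) == y_te).mean()
print(f"subspace-ensemble accuracy: {accuracy:.3f}")

A single GaussianNB trained on the full feature set can be evaluated the same way, mirroring the single-classifier baseline the abstract compares against.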

References:

[1] Dietterich T G. Machine learning research: four current directions [J]. AI Magazine, 1997, 18(4): 97-136.
[2] Freund Y, Schapire R E. Experiments with a new boosting algorithm [C] // Proceedings of the 13th International Conference on Machine Learning. San Francisco: Morgan Kaufmann, 1996: 148-156.
[3] Breiman L. Bagging predictors [J]. Machine Learning, 1996, 24(2): 123-140.
[4] Zhou Z H, Wu J, Tang W. Ensembling neural networks: many could be better than all [J]. Artificial Intelligence, 2002, 137(1/2): 239-263.
[5] Weiss S M, Apte C, Damerau F J. Maximizing text-mining performance [J]. IEEE Intelligent Systems, 1999, 14(4): 63-69.
[6] Schapire R E, Singer Y. BoosTexter: a boosting-based system for text categorization [J]. Machine Learning, 2000, 39(2/3): 135-168.
[7] Tumer K, Ghosh J. Classifier combining: analytical results and implications [C] // Proceedings of the AAAI-96 Workshop on Integrating Multiple Learned Models for Improving and Scaling Machine Learning Algorithms. Portland: AAAI Press, 1996.
[8] Wang Xiaogang, Tang Xiaoou. Using random subspace to combine multiple features for face recognition [C] // Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition. Los Alamitos: IEEE Computer Society Press, 2004: 284-289.
[9] Bay S D. Combining nearest neighbor classifiers through multiple feature subsets [C] // Proceedings of the 17th International Conference on Machine Learning. Madison, WI: Morgan Kaufmann, 1998: 37-45.
[10] Bryll R. Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets [J]. Pattern Recognition, 2003, 36(6): 1291-1302.
[11] Lewis D D. Naive (Bayes) at forty: the independence assumption in information retrieval [C] // Proceedings of the 10th European Conference on Machine Learning. Chemnitz, DE: Springer-Verlag, 1998: 4-15.

Last Update: 2013-04-24