[1] Ye Yunlong, Yang Ming. Multi-Classifier Ensemble Based on Random Feature Subspace[J]. Journal of Nanjing Normal University (Engineering and Technology), 2008, 8(4): 87-90.

Multi-Classifier Ensemble Based on Random Feature Subspace

Journal of Nanjing Normal University (Engineering and Technology) [ISSN: 1006-6977 / CN: 61-1281/TN]

Volume:
8
Issue:
2008, No. 4
Pages:
87-90
Publication Date:
2008-12-30

Article Info

Title:
Multi-Classifier Ensemble Based on Random Feature Subspace
Author(s):
Ye Yunlong; Yang Ming
School of Mathematics and Computer Science, Nanjing Normal University, Nanjing 210097, China
Keywords:
random subspace; classifier ensemble; re-sampling
CLC Number:
TP391.41
Abstract:
In this paper, we propose an ensemble algorithm called RFSEn, based on random feature subspaces. First, an appropriate feature subset size is selected; then feature subsets are drawn at random and the training set is projected onto each of them, yielding one base classifier per subspace; finally, these base classifiers are combined into an ensemble classifier, which is used to classify text. We compare the algorithm with a single classifier and with the bagging algorithm, which is based on re-sampling, on standard datasets. The results show that RFSEn not only outperforms the single classifier but also, to some degree, outperforms bagging.
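The procedure described in the abstract (choose a subspace size, repeatedly draw a random feature subset, project the training set onto it, train a base classifier there, and combine the base classifiers into an ensemble) can be illustrated with the following minimal Python sketch. The choice of base learner (multinomial naive Bayes, a common choice for text), the majority-vote combination rule, and all parameter names are assumptions for illustration only, not the authors' implementation of RFSEn.

```python
import numpy as np
from sklearn.naive_bayes import MultinomialNB  # assumed base learner for text


class RandomSubspaceEnsemble:
    """Sketch of a random-feature-subspace ensemble (the RFSEn idea).

    Each base classifier is trained on a random subset of the features;
    predictions are combined by majority vote (an assumption here).
    Class labels are assumed to be non-negative integers.
    """

    def __init__(self, n_estimators=10, subspace_size=0.5, random_state=None):
        self.n_estimators = n_estimators        # number of base classifiers
        self.subspace_size = subspace_size      # fraction of features per subspace
        self.random_state = random_state

    def fit(self, X, y):
        rng = np.random.default_rng(self.random_state)
        n_features = X.shape[1]
        k = max(1, int(self.subspace_size * n_features))
        self.subspaces_, self.estimators_ = [], []
        for _ in range(self.n_estimators):
            # Randomly select a feature subset and project the data onto it.
            idx = rng.choice(n_features, size=k, replace=False)
            self.subspaces_.append(idx)
            self.estimators_.append(MultinomialNB().fit(X[:, idx], y))
        return self

    def predict(self, X):
        # Collect each base classifier's predictions, then take a majority vote.
        votes = np.array([clf.predict(X[:, idx])
                          for clf, idx in zip(self.estimators_, self.subspaces_)])
        return np.array([np.bincount(col).argmax() for col in votes.T])
```

For comparison with standard bagging (re-sampling of training examples rather than features), scikit-learn's BaggingClassifier can be configured either way: bootstrap=True with max_features=1.0 gives ordinary bagging, while bootstrap=False with max_features below 1.0 gives a random-subspace ensemble similar in spirit to the sketch above.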

Similar Articles:

[1] Ye Yunlong, Yang Ming. A Multi-modality-based Random Subspace Classifier Ensemble Algorithm[J]. Journal of Nanjing Normal University (Engineering and Technology), 2009, 9(4): 57.

Memo:
Corresponding author: Yang Ming, Professor, Ph.D. Research interests: data mining, machine learning, and rough set theory and its applications. E-mail: myang@njnu.edu.cn
Last Update: 2013-04-24