[1] Kononenko I. Estimating attributes: analysis and extensions of RELIEF[C] // Proceedings of the 7th European Conference on Machine Learning. Berlin: Springer, 1994: 171-182.
[2] Liu H, Setiono R. Feature selection and classification: a probabilistic wrapper approach[C] // Proceedings of the 9th International Conference on Industrial and Engineering Applications of AI and ES. Fukuoka: Springer, 1996: 419-424.
[3] Dash M, Liu H. Feature selection for classification[J]. Intelligent Data Analysis, 1997, 1(3): 131-156.
[4] Schapire R E. The strength of weak learnability[J]. Machine Learning, 1990, 5(2): 197-227.
[5] Fred A L N, Jain A K. Data clustering using evidence accumulation[C] // Proceedings of the 16th International Conference on Pattern Recognition. Quebec: IEEE Press, 2002: 276-280.
[6] Newman D J, Hettich S, Blake C L, et al. UCI repository of machine learning databases[EB/OL]. [2006-12-21]. http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.
[7] Law M H C, Figueiredo M A T, Jain A K. Simultaneous feature selection and clustering using mixture models[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(9): 1154-1166.
[8] Modha D S, Spangler W S. Feature weighting in k-means clustering[J]. Machine Learning, 2003, 52(3): 217-237.