Research on Support Vector Machine Ensemble Learning Algorithms
Abstract
Ensemble learning trains multiple individual learners and combines their outputs, which markedly improves the generalization ability of a learning system. It has attracted growing attention as the fourth major research direction in machine learning and offers a further route to better generalization. The support vector machine (SVM), as a "stable" learning algorithm, poses a new challenge to ensemble learning techniques, and the design of novel SVM ensemble learning algorithms has become a research hotspot. Research on SVM ensembles started relatively late and remains limited, so designing more effective SVM ensemble learning algorithms is the key problem in this area. This thesis studies both stages of ensemble learning, individual generation and conclusion combination, in order to fully exploit the advantages and potential of SVM ensembles.
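For orientation, a minimal sketch of the two stages just mentioned, individual generation and conclusion combination, is given below: individual SVMs are generated on bootstrap resamples (bagging) and their conclusions are combined by majority voting. This is a generic baseline only, not one of the algorithms proposed in the thesis; scikit-learn's SVC, the RBF kernel and all numeric settings are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def train_svm_ensemble(X, y, n_members=10, rng=None):
    """Individual generation: train each SVM on a bootstrap resample of the data."""
    rng = np.random.default_rng(rng)
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))          # bootstrap sample with replacement
        members.append(SVC(kernel="rbf", C=1.0, gamma="scale").fit(X[idx], y[idx]))
    return members

def majority_vote(members, X):
    """Conclusion combination: majority voting (assumes integer class labels >= 0)."""
    votes = np.stack([m.predict(X) for m in members])       # shape (n_members, n_samples)
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
```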
SVMs are sensitive to the choice of kernel function and to perturbations of its parameters, yet existing parameter-perturbation methods select one kernel in advance and ignore the influence of the kernel type on SVM performance. A more flexible hybrid kernel function is therefore introduced and all of its parameters are perturbed, which in effect perturbs the whole SVM model; on this basis an SVM ensemble algorithm with single, model-level perturbation is proposed. Simulation results show that, by bringing more parameters into the perturbation, the algorithm improves both the diversity and the generalization performance of the SVM ensemble.
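The following sketch illustrates the model-perturbation idea under stated assumptions: the hybrid kernel is taken to be a convex combination of an RBF and a polynomial kernel, and each ensemble member draws its mixing weight, kernel parameters and penalty C at random, so the whole SVM model is perturbed. The kernel form, the parameter ranges and the use of scikit-learn's precomputed-kernel interface are assumptions for illustration, not the thesis's exact construction.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def hybrid_kernel(X, Y, lam, gamma, degree):
    """Convex combination of an RBF and a polynomial kernel (illustrative hybrid form)."""
    return lam * rbf_kernel(X, Y, gamma=gamma) + (1.0 - lam) * polynomial_kernel(X, Y, degree=degree)

def train_model_perturbed_ensemble(X, y, n_members=10, rng=None):
    """Each member gets its own randomly perturbed hybrid-kernel parameters and penalty C."""
    rng = np.random.default_rng(rng)
    members = []
    for _ in range(n_members):
        params = dict(lam=rng.uniform(0.0, 1.0),            # RBF / polynomial mixing weight
                      gamma=10.0 ** rng.uniform(-2, 1),      # RBF width
                      degree=int(rng.integers(2, 5)))        # polynomial degree
        C = 10.0 ** rng.uniform(-1, 2)                       # margin / penalty trade-off
        svm = SVC(C=C, kernel="precomputed").fit(hybrid_kernel(X, X, **params), y)
        members.append((svm, params))
    return members

def predict_model_perturbed(members, X_train, X_test):
    """Majority vote; each member evaluates its own hybrid kernel against the training set."""
    votes = np.stack([svm.predict(hybrid_kernel(X_test, X_train, **p)) for svm, p in members])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)  # integer labels assumed
```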
On top of model perturbation, a feature-perturbation mechanism is combined with it to study an SVM ensemble algorithm based on double perturbation of model and features. Existing feature-perturbation methods work in the original feature space and do not consider how the correlation between features affects the performance of the individual SVMs and the diversity of the ensemble. Independent component analysis (ICA) is therefore introduced to transform the feature space and remove the correlation between features; in the transformed independent-component space, an SVM ensemble algorithm based on the model-and-feature double-perturbation mechanism is presented. Simulation results show that this method further improves the diversity and the generalization performance of the ensemble.
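A minimal sketch of the double-perturbation idea, assuming FastICA from scikit-learn as the ICA transform: the data are first mapped into the independent-component space, and each member is then trained on a random subset of components (feature perturbation) with randomly drawn kernel parameters (model perturbation). The subset fraction and the parameter ranges are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def train_double_perturbed_ensemble(X, y, n_members=10, feat_frac=0.7, rng=None):
    """Model + feature double perturbation in the ICA-transformed space (illustrative sketch)."""
    rng = np.random.default_rng(rng)
    ica = FastICA(random_state=0)
    S = ica.fit_transform(X)                                 # independent components, correlation removed
    n_feat = max(1, int(feat_frac * S.shape[1]))
    members = []
    for _ in range(n_members):
        feats = rng.choice(S.shape[1], size=n_feat, replace=False)   # feature perturbation
        C = 10.0 ** rng.uniform(-1, 2)                                # model perturbation
        gamma = 10.0 ** rng.uniform(-2, 1)
        members.append((SVC(C=C, kernel="rbf", gamma=gamma).fit(S[:, feats], y), feats))
    return ica, members

def predict_double_perturbed(ica, members, X_test):
    S_test = ica.transform(X_test)
    votes = np.stack([svm.predict(S_test[:, feats]) for svm, feats in members])
    return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)   # integer labels assumed
```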
Selective ensemble methods pick only part of the individuals in an ensemble system to take part in the combination and thereby improve the generalization performance of the ensemble. Classical selective ensemble algorithms still suffer from high computational complexity, poor learning efficiency and limited performance. The artificial fish swarm optimization algorithm, with its global search ability, insensitivity to initial values, strong robustness and fast convergence, is therefore used to optimize the weights of the conclusion combination, yielding a new selective SVM ensemble algorithm. Simulation results show improvements in generalization performance, learning efficiency and ensemble size.
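Below is a compact sketch of optimizing the combination weights with a much-simplified artificial fish swarm search (only the prey and follow behaviours, with a fixed visual range and step); members whose optimized weight falls below a threshold are pruned, which yields the selective ensemble. The fitness function (validation accuracy of the weighted vote), the pruning threshold and all numeric settings are assumptions for illustration, not the thesis's algorithm.

```python
import numpy as np

def afsa_weights(fitness, dim, n_fish=20, n_iter=60, visual=0.3, step=0.2, tries=5, rng=None):
    """Much-simplified artificial fish swarm search over weight vectors in [0, 1]^dim."""
    rng = np.random.default_rng(rng)
    fish = rng.random((n_fish, dim))                         # one "fish" = one candidate weight vector
    scores = np.array([fitness(f) for f in fish])
    best, best_score = fish[scores.argmax()].copy(), scores.max()
    for _ in range(n_iter):
        for i in range(n_fish):
            cand, cand_score = fish[i].copy(), scores[i]
            for _ in range(tries):                           # "prey": random move, kept only if it improves
                trial = np.clip(fish[i] + visual * rng.uniform(-1, 1, dim), 0, 1)
                s = fitness(trial)
                if s > cand_score:
                    cand, cand_score = trial, s
                    break
            near = np.linalg.norm(fish - fish[i], axis=1) < visual
            better = near & (scores > scores[i])
            if better.any():                                 # "follow": step toward the best visible fish
                j = np.where(better)[0][scores[better].argmax()]
                trial = np.clip(fish[i] + step * (fish[j] - fish[i]), 0, 1)
                s = fitness(trial)
                if s > cand_score:
                    cand, cand_score = trial, s
            fish[i], scores[i] = cand, cand_score
            if cand_score > best_score:
                best, best_score = cand.copy(), cand_score
    return best

def selective_svm_ensemble(members, X_val, y_val, threshold=0.5):
    """Prune members whose optimized weight is below the threshold (integer class labels assumed)."""
    preds = np.stack([m.predict(X_val) for m in members])    # (n_members, n_val)

    def fitness(w):                                          # validation accuracy of the weighted vote
        keep = w > threshold
        if not keep.any():
            return 0.0
        agg = [np.bincount(preds[keep, j], weights=w[keep]).argmax() for j in range(preds.shape[1])]
        return float(np.mean(np.asarray(agg) == y_val))

    w = afsa_weights(fitness, dim=len(members))
    return [m for m, k in zip(members, w > threshold) if k], w
```

In this sketch the swarming behaviour of the full algorithm is omitted for brevity; the prey and follow moves are enough to show how the weight vector is searched globally before the threshold turns it into a member-selection decision.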
SVM ensemble algorithms based on the fuzzy integral can make full use of the measurement-level outputs of the SVMs and further improve the generalization performance of the ensemble. Existing fuzzy-integral fusion methods derive the fuzzy density values from prior, static information about the training samples, so the densities stay fixed for every test sample and cannot reflect the different confidence that each individual SVM has when classifying different samples under test. An SVM ensemble algorithm based on an adaptive fuzzy integral is therefore proposed: the confidence of each individual SVM classifier for the sample under test is determined from its measurement-level output, and the fuzzy densities are assigned adaptively on that basis. Simulation results show that this method further improves the generalization performance of the SVM ensemble.
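The sketch below illustrates an adaptive fuzzy-integral combination under stated assumptions: each member is an SVC trained with probability=True so that measurement-level (posterior-like) outputs are available, the per-sample fuzzy density of a member is a scaled version of its own confidence on that sample, the parameter of the Sugeno lambda-measure is found by bisection, and classes are ranked by the Sugeno integral. The density mapping and the choice of the Sugeno (rather than Choquet) integral are illustrative, not necessarily those of the thesis.

```python
import numpy as np

def lambda_measure(g):
    """Solve prod(1 + lam * g_i) = 1 + lam for the Sugeno lambda-measure parameter (bisection).

    Assumes densities in (0, 1) and a modest number of members, so a bracket in (-1, 0) or (0, inf) exists.
    """
    g = np.asarray(g, dtype=float)
    if g.size < 2 or np.isclose(g.sum(), 1.0):
        return 0.0                                   # densities add to one (or single member): additive measure
    f = lambda lam: np.prod(1.0 + lam * g) - 1.0 - lam
    lo, hi = (-1.0 + 1e-9, -1e-9) if g.sum() > 1.0 else (1e-9, 1.0)
    while g.sum() < 1.0 and f(hi) < 0:               # expand the bracket until the sign changes
        hi *= 2.0
    for _ in range(100):                             # plain bisection is accurate enough here
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def sugeno_integral(h, g):
    """Sugeno fuzzy integral of classifier supports h with fuzzy densities g."""
    order = np.argsort(h)[::-1]                      # supports in decreasing order
    h, g = np.asarray(h, float)[order], np.asarray(g, float)[order]
    lam = lambda_measure(g)
    G, value = 0.0, 0.0
    for hk, gk in zip(h, g):
        G = gk + G + lam * gk * G                    # measure of the growing coalition of classifiers
        value = max(value, min(hk, G))
    return value

def adaptive_fuzzy_integral_predict(members, x):
    """Fuse measurement-level outputs; the densities adapt to the sample under test (illustrative)."""
    P = np.stack([m.predict_proba(x.reshape(1, -1))[0] for m in members])   # (n_members, n_classes)
    g = 0.5 * P.max(axis=1)                          # adaptive density = scaled per-member confidence
    scores = [sugeno_integral(P[:, c], g) for c in range(P.shape[1])]
    return int(np.argmax(scores))
```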