Research on the Theory and Methods of Fusing Soft and Hard Computing for the Vertical Prediction of Petroleum Reservoirs
Abstract
The topic and content of this dissertation were jointly supported by the National Natural Science Foundation of China project "A Study on Integration of Soft Computing Theories and Methods in Oil Exploration Management" (No. 70573101), directed by Professor Zhu Kejun, and by the Specialized Research Fund for the Doctoral Program of Higher Education project "A Study on Fusion of Soft Computing and Hard Computing Theories and Methods of Oil Reservoir Forecast" (No. 20070491011), directed by the author.
In 1996 China produced 15,729×10⁴ t of oil and consumed 17,307×10⁴ t, giving an external dependence rate of 9.12%. In 2004 production was 17,499×10⁴ t and consumption 31,873×10⁴ t, for a dependence rate of 45.10%; in that year China ranked second in the world in oil consumption and third in oil imports. In 2007 crude production was 18,665.7×10⁴ t, consumption 34,593.7×10⁴ t, and the dependence rate 46.04%. These figures show that although China's crude output has grown substantially, the national economy has grown faster still, so the annual growth of oil production lags behind demand and the supply-demand imbalance continues to widen. Moreover, as oil exploration extends into new areas, the objects of reservoir prediction become increasingly complex, and the gap between existing prediction techniques and ever higher interpretation requirements grows more pronounced. Making effective use of new techniques and ideas to predict reservoir lithology and oil-bearing properties scientifically is therefore of great practical importance for guiding oil exploration and development, and has attracted wide attention.
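The dependence-rate figures quoted above are consistent with net imports taken as a share of consumption; a minimal Python check (the formula is inferred from the reported numbers, not stated in the text):

```python
# Reproduce the external-dependence rates quoted above, assuming
# dependence = (consumption - production) / consumption, i.e. net imports
# as a share of consumption (stock changes ignored).
data = {  # year: (production, consumption), in 10^4 t
    1996: (15729.0, 17307.0),
    2004: (17499.0, 31873.0),
    2007: (18665.7, 34593.7),
}
for year, (prod, cons) in sorted(data.items()):
    print(f"{year}: {(cons - prod) / cons:.2%}")  # 9.12%, 45.10%, 46.04%
```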
Taking a block of the Jianghan oilfield as its case study in vertical reservoir prediction, this research treats interpretation as a dynamic process of turning the block's well-logging data into information and that information into knowledge. On this basis, soft computing and hard computing are fused: fusion patterns are proposed, and algorithms are designed under these patterns to predict the reservoir along the vertical (depth) direction:
(1) After analyzing the fundamental theory and principles of soft computing and hard computing, four fusion patterns are proposed (the isolated, parallel, cascaded and nested patterns), and the characteristics of each pattern are analyzed.
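Purely as an illustration of the four patterns named in (1) (the dissertation itself gives no code, and all names below are hypothetical), the patterns can be read as four ways of composing a hard-computing model and a soft-computing model:

```python
# Hypothetical composition of a hard-computing callable `hard` (e.g. a
# regression model) and a soft-computing callable `soft` (e.g. a neural,
# fuzzy or evolutionary model) in the four fusion patterns of (1).
def isolated(x, hard, soft, use_soft):
    # Isolated: the two methods address the task separately; one is chosen.
    return soft(x) if use_soft else hard(x)

def parallel(x, hard, soft, w=0.5):
    # Parallel: both run side by side and their outputs are combined.
    return w * soft(x) + (1 - w) * hard(x)

def cascaded(x, hard, soft):
    # Cascaded: the output of one method feeds the input of the other.
    return soft(hard(x))

def nested(x, hard_factory, soft):
    # Nested: one method is embedded in the other, e.g. the soft method
    # chooses the configuration with which the hard method is built and run.
    return hard_factory(soft(x))(x)

# Example with trivial stand-ins:
print(parallel(2.0, hard=lambda v: v + 1, soft=lambda v: 2 * v))  # 3.5
```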
(2) A three-dimensional table is built to organize attribute-optimization (feature-selection) methods. Its three dimensions are search strategy (exhaustive, heuristic and random search), evaluation criterion (filter, wrapper and hybrid models) and recognition task (classification and clustering). On this basis an algorithm-selection platform is proposed, through which a matching algorithm can be chosen from the catalogued feature-selection methods for attribute optimization.
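A minimal sketch of how the three-dimensional table and the algorithm-selection platform of (2) might be organized as a lookup (the single entry shown is an illustrative placeholder, not a reproduction of the dissertation's full catalogue):

```python
# Sketch of the three-dimensional categorization in (2): each feature-selection
# method is filed under (search strategy, evaluation criterion, task), and the
# "algorithm selection platform" reduces to a lookup over that table.
SEARCH = ("exhaustive", "heuristic", "random")
EVALUATION = ("filter", "wrapper", "hybrid")
TASK = ("classification", "clustering")

table = {(s, e, t): [] for s in SEARCH for e in EVALUATION for t in TASK}
table[("exhaustive", "filter", "classification")].append("branch-and-bound")

def select_algorithms(search, evaluation, task):
    """Return the feature-selection methods filed under one cell of the table."""
    return table[(search, evaluation, task)]

print(select_algorithms("exhaustive", "filter", "classification"))
```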
(3) In reservoir lithology recognition, the data are first prepared with hard-computing methods. The algorithm-selection platform then yields matching feature-selection algorithms, which are combined with classifiers drawn from various soft- and hard-computing methods; the optimal attribute subset for recognizing lithology in this block is porosity (POR). An artificial neural network (ANN) is trained on the data, the recognition function for each class (a hard-computing function) is extracted, and, taking these recognition functions as objective functions, a genetic algorithm (GA) extracts the rules: ① if POR is medium, the layer is sandstone; ② if POR is low, the layer is sandstone; ③ if POR is high, the layer is mudstone.
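The three extracted lithology rules of (3) can be written out directly; the numeric cut-offs separating the low, medium and high POR levels are given in the dissertation body rather than the abstract, so a qualitative level is taken as input here:

```python
# The three lithology rules extracted in (3), written out directly.
def lithology_from_por(por_level: str) -> str:
    if por_level in ("low", "medium"):
        return "sandstone"
    if por_level == "high":
        return "mudstone"
    raise ValueError(f"unknown POR level: {por_level!r}")

print(lithology_from_por("medium"))  # sandstone
```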
(4) In reservoir oil-bearing (oiliness) recognition, the algorithm-selection platform again yields matching algorithms to combine with classifiers from various soft- and hard-computing methods; the optimal attribute subset for this block is acoustic transit time (AC) and oil saturation (So). A data-driven grey relational prediction method is then proposed to recognize the oil-bearing character of the reservoir, the samples are reduced, an ANN is trained on the data, and the per-class recognition functions (hard-computing functions) are extracted; with these as objective functions, a GA extracts the rules: ① if AC is low and So is low, the layer is a dry layer; ② if AC is medium and So is low, the layer is a dry layer; ③ if AC is high and So is low, the layer is a water layer; ④ if AC is medium and So is medium, the layer is a poor oil layer; ⑤ if AC is low and So is medium, the layer is an oil layer; ⑥ if AC is high and So is medium, the layer is an oil layer; ⑦ if So is high, the layer is an oil layer.
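Likewise, the seven oil-bearing rules of (4), again over qualitative AC and So levels whose membership boundaries appear only in the dissertation body:

```python
# The seven oil-bearing rules extracted in (4), written out directly.
def oiliness_from_ac_so(ac_level: str, so_level: str) -> str:
    if so_level == "low":
        return "water layer" if ac_level == "high" else "dry layer"      # rules 1-3
    if so_level == "medium":
        return "poor oil layer" if ac_level == "medium" else "oil layer"  # rules 4-6
    if so_level == "high":
        return "oil layer"                                                # rule 7
    raise ValueError(f"unknown So level: {so_level!r}")

print(oiliness_from_ac_so("high", "low"))  # water layer
```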
(5) Since POR is the optimal attribute subset for lithology and AC and So for oiliness, the question arises of how to build models that predict POR, AC and So without extra expenditure (drilling, laboratory analysis, etc.) when these key attributes are missing from, or insufficient in, the well-logging data set. First, regression equations for predicting POR, AC and So are established with the regression models of hard computing. Then a nested combination of a genetic algorithm and a BP neural network (GA-BP, in which the GA selects the input-attribute combination and the number of hidden-layer neurons of the BP network) yields satisfactory BP models for predicting and recognizing POR, AC and So. Finally, the hard-computing and soft-computing models are compared, and the GA-BP models prove superior to the multiple-regression models.
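A minimal sketch of the nested GA-BP idea in (5): a genetic algorithm searches over the input-attribute subset and the hidden-layer size, while a BP-type network (here scikit-learn's MLPRegressor) is trained inside the fitness function. The synthetic data, population size and mutation-only selection scheme below are placeholders, not the dissertation's actual configuration:

```python
# Illustrative GA-BP nesting: the GA chromosome is (attribute mask, hidden size);
# fitness is the cross-validated R^2 of a BP-type network trained on the
# selected attributes. Synthetic data stands in for the well-logging curves.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                              # 6 candidate logging attributes
y = X[:, 0] - 0.5 * X[:, 2] + 0.1 * rng.normal(size=200)   # target, e.g. POR

def fitness(mask, hidden):
    if not mask.any():
        return -np.inf                                     # at least one input is required
    net = MLPRegressor(hidden_layer_sizes=(hidden,), max_iter=1000, random_state=0)
    return cross_val_score(net, X[:, mask], y, cv=3, scoring="r2").mean()

def ga_bp(pop_size=10, generations=8):
    # Mutation-only, truncation-selection GA, kept deliberately simple.
    population = [(rng.random(6) < 0.5, int(rng.integers(2, 16)))
                  for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(population, key=lambda c: fitness(*c),
                         reverse=True)[: pop_size // 2]
        children = []
        for mask, hidden in parents:
            child_mask = mask.copy()
            child_mask[rng.integers(0, 6)] ^= True         # flip one attribute bit
            children.append((child_mask, max(2, hidden + int(rng.integers(-2, 3)))))
        population = parents + children
    return max(population, key=lambda c: fitness(*c))

best_mask, best_hidden = ga_bp()
print("selected attributes:", np.flatnonzero(best_mask), "hidden neurons:", best_hidden)
```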
(6) The reservoirs of the study block are divided into four classes: ① if the lithology of a layer is sandstone and its oiliness is oil layer, the reservoir belongs to class Ⅰ; ② if the lithology is mudstone and the oiliness is oil layer, class Ⅱ; ③ if the lithology is sandstone and the oiliness is poor oil layer, class Ⅲ; ④ if the lithology is mudstone and the oiliness is poor oil layer, class Ⅳ.
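The four-class scheme of (6) then simply combines the lithology and oiliness labels produced by the rules above:

```python
# Reservoir classes of (6): (lithology, oiliness) -> class.
CLASSES = {
    ("sandstone", "oil layer"): "I",
    ("mudstone", "oil layer"): "II",
    ("sandstone", "poor oil layer"): "III",
    ("mudstone", "poor oil layer"): "IV",
}

def reservoir_class(lithology: str, oiliness: str) -> str:
    return CLASSES[(lithology, oiliness)]

print(reservoir_class("sandstone", "oil layer"))  # I
```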
Vertical prediction of oil reservoirs thus provides crucial decision support for reducing exploration risk, estimating reserves accurately, drawing up sound development plans and improving oil recovery.
