
Text-Based Data Mining
…O) algorithm. It is expected that further research will more fully exploit the strengths of support vector machines, namely their wide range of applicability and strong classification ability, and will combine them with other classification methods so that classification quality and efficiency improve further.

Appendix: Code

Training and testing code of the support vector machine classifier:

void svmtrain(DOC *docs, long *label, long sumline, long maxid,
              TRAIN_PARM *train_parm, MODEL *model)
{
    /* Set up and initialize parameters */
    long *inconsistent, i;           /* non-fixed (active) part while processing samples */
    long misclassified = 0;          /* count of misclassified samples */
    long upsvnum;                    /* number of upper-bound support vectors */
    double maxchange, *lin, *a;      /* whether to update samples in the non-fixed part */
    long iterations;                 /* number of shrinking-optimization rounds */
    long *unlabeled, transduction;   /* training samples without a class label */
    long trainpos = 0, trainneg = 0; /* counters for positive/negative labels */
    SHRINK shrink;                   /* shrinking struct: update status of each sample */

    train_parm->maxid = maxid;               /* largest feature index */
    train_parm->newvars = train_parm->qsize; /* number of non-fixed samples */

    /* Initialize memory for shrinking; by default every sample is non-fixed */
    initshrink(&shrink, sumline, (long)10000);

    /* Allocate memory to keep track of unlabeled samples */
    inconsistent = (long *)setmemory(sizeof(long) * sumline);
    unlabeled    = (long *)setmemory(sizeof(long) * sumline);
    a            = (double *)setmemory(sizeof(double) * sumline);
    lin          = (double *)setmemory(sizeof(double) * sumline); /* one block per sample row */
    train_parm->varbound = (double *)setmemory(sizeof(double) * sumline);

    /* Allocate the MODEL struct that stores the trained model */
    model->supvec = (DOC **)setmemory(sizeof(DOC *) * (sumline + 2));
    model->alpha  = (double *)setmemory(sizeof(double) * (sumline + 2));
    model->index  = (long *)setmemory(sizeof(long) * (sumline + 2));

    /* Parameter settings in the model struct */
    model->upbound   = 0;       /* accumulator for upper-bound support vectors */
    model->b         = 0;       /* threshold (bias) of the separating hyperplane */
    model->supvec[0] = 0;       /* support vectors */
    model->alpha[0]  = 0;       /* alpha array, one alpha per sample */
    model->maxid   = maxid;     /* largest feature index */
    model->sumline = sumline;   /* number of samples */
    model->svnum   = 1;         /* number of support vectors */
    transduction = 0;           /* samples without a class label */

    /* Display the penalty parameter in the UI */
    sprintf(temstr, "Set penalty parameter C: %.2f", train_parm->svm_c);
    printview(temstr);

    for (i = 0; i < sumline; i++) /* initialize the parameters of each sample */
    {