Biological Sequence Mining Using Plausible Neural Network and its Application to Exon/Intron Boundaries Prediction
Biological Sequence Mining Using Plausible Neural Network and its Application to Exon/Intron Boundaries Prediction

Kuochen Li, Dar-jen Chang, and Eric Rouchka, CECS, University of Louisville, Louisville, KY 40292, USA
Yuan Yan Chen, PNN Technologies Inc, PO Box 7051, Woodbridge, VA 22195, USA

Abstract — Biological sequences usually contain yet-to-be-found knowledge, and mining biological sequences usually involves a huge dataset and long computation time. Common tasks for biological sequence mining are pattern discovery, classification, and clustering. The newly developed model, the Plausible Neural Network (PNN), provides an intuitive and unified architecture for such large-dataset analysis. This paper introduces the basic concepts of the PNN and explains how it is applied to biological sequence mining. A specific biological sequence mining task, exon/intron boundary prediction, is implemented using PNN. The experimental results show the capability of PNN to solve biological sequence mining tasks.

I. INTRODUCTION

Research in computational biology has grown rapidly in the past ten years due to advances in molecular biology techniques yielding high-throughput data. One of the most important research interests in computational biology is sequence mining. It is much harder to extract knowledge hidden in the sequences than to generate biological sequences. With biological mutation and evolution, sequence datasets usually are enormous and complex, so the analysis model becomes a critical factor for biological sequence mining. Several tasks related to sequence mining, such as pattern discovery, classification, prediction, and clustering, can be implemented by statistical, neural network, or data mining models [1][2]. Those models can be used to capture the knowledge or patterns in order to predict, classify, or analyze sequence data. A newly developed neural network model, the plausible neural network (PNN), which combines probabilistic and possibilistic inferences for binary random variables, was introduced in [3] and subsequently patented by Chen in 2003 [4]. PNN is similar to Hopfield neural networks in that both are bidirectional and symmetrical.
But in its inference interpretation, PNN is similar to Bayesian neural networks: the synaptic weights in PNN and in Bayesian neural networks have probabilistic meanings and are thus transparent. However, PNN, with weights based on mutual information content as described in section II, provides two important features. First, mutual-information-content weights put the PNN architecture of winner-take-all activation of competing neurons on a solid statistical inference foundation, in which no prior distribution is required, as it is in Bayesian neural networks. Second, mutual-information-content weights can be used to compute the Shannon mutual information between attributes at little added cost.

Several advantages of PNN motivate applying it to biological sequence mining. The PNN model is intuitive in that the user can easily train it with different initial states to discover patterns hidden in large-scale, complex data. In addition, the model is general enough to be applied to clustering, prediction, classification, and pattern discovery with the same architecture. Moreover, PNN represents missing data as a null vector, so missing values do not participate in the PNN inference process, whereas most other clustering and classification methods handle missing data by adding artificial data or removing the incomplete records. Finally, since the learning process and activation method in PNN are simple, training and activation can be completed in a short period of time. This paper applies PNN to one of the biological sequence mining tasks, the prediction of exon/intron boundaries. The accuracy rate, compared with other computational intelligence models, shows the quality of the PNN results.

II. A PNN MODEL FOR BIOLOGICAL SEQUENCE MINING

A. PNN Architecture

A basic PNN architecture for biological sequence mining consists of two layers (an input layer and a hidden layer) of cooperative and competitive neurons. The connections between input neurons and hidden neurons are complete, bidirectional, and symmetric. Competitive neurons are grouped into winner-take-all (WTA) ensembles and, in turn, the WTA ensembles cooperate in decision making.
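The mutual-information-content weight discussed here (defined formally as equation (2) in section II.C below) can be estimated directly from firing frequencies. The following is a minimal NumPy sketch, not code from the paper; the function name, the epsilon smoothing, and the matrix layout (rows = data samples, columns = neurons) are our assumptions:

```python
import numpy as np

def mi_weights(x_fire, y_fire, eps=1e-9):
    """Estimate PNN connection weights from co-firing history.

    x_fire, y_fire: 0/1 firing matrices (rows = data samples,
    columns = input neurons X_i / hidden neurons Y_j).
    Returns w with w[i, j] = ln( P(X_i=1, Y_j=1) / (P(X_i=1) P(Y_j=1)) ).
    """
    n = x_fire.shape[0]
    p_x = x_fire.mean(axis=0)          # marginal firing rates P(X_i = 1)
    p_y = y_fire.mean(axis=0)          # marginal firing rates P(Y_j = 1)
    p_xy = x_fire.T @ y_fire / n       # joint firing rates P(X_i = 1, Y_j = 1)
    # eps keeps the logarithm finite when a pair never co-fires
    return np.log((p_xy + eps) / (np.outer(p_x, p_y) + eps))
```

A positive entry marks an excitatory (positively associated) connection, a near-zero entry statistical independence, and a negative entry an inhibitory connection, matching the interpretation given in section II.C.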
Fig. 1 illustrates the architecture for DNA sequence data. The hidden-neuron WTA ensemble represents un-labeled (thus unknown, or hidden) patterns of similar sequences. Each input WTA ensemble encodes the values of an input attribute, which is either a sequence base or the sequence class. For example, in Fig. 1 the input WTA ensemble (IE, EI, N) encodes the values of the attribute Class (i.e., the DNA sequence class label), and each WTA ensemble (A, C, G, T) encodes a DNA base in the sequence.

Fig. 1. PNN architecture for predicting exon/intron boundaries.

B. Attribute Value Coding

PNN encodes categorical or continuous attributes into WTA ensembles. For sequence mining, only categorical attribute encoding is required. A categorical attribute with k categories is encoded as a WTA ensemble of k binary-valued (i.e., 0 or 1) neurons (X_1, X_2, ..., X_k). For example, splice-junction gene sequence datasets may have a class attribute with three possible values, IE, EI, and N, representing intron-to-exon boundary, exon-to-intron boundary, and None, respectively. For such examples, the class attribute can be encoded in a three-neuron WTA ensemble (IE, EI, N). The input vector (1, 0, 0) for the class attribute WTA ensemble indicates that the sequence is of the IE class, (0, 1, 0) indicates EI, and (0, 0, 1) indicates N. Each base in a DNA sequence is encoded by a four-neuron WTA ensemble (A, C, G, T). This encoding also allows for uncertain base values given in the IUPAC code. For example, if the base value is S (meaning G or C), the input vector for the base is (0, 0.5, 0.5, 0).

Given this coding scheme, an input WTA ensemble (X_1, X_2, ..., X_k) representing an attribute value satisfies the following conditions:

    Σ_{i=1}^{k} X_i = 1  and  0 ≤ X_i ≤ 1 for all i    (1)

Missing attribute values are represented by a null vector, i.e., all X_i in the attribute ensemble are zero. With this coding, since the input from every neuron in the attribute ensemble is zero, the activation potential (i.e., input times weight) contributed by the attribute is zero. As a consequence, an attribute with a missing value will not participate in the WTA activation of the competing hidden neurons.

C. Connection Weights

Assume each neuron in the PNN is a binary random variable. The connection weight ω between an input neuron X and a hidden neuron Y is given as follows:

    ω = ln[ P(X = 1, Y = 1) / ( P(X = 1) P(Y = 1) ) ]    (2)

In (2), P denotes the probability of the enclosed event. The weight (2) captures the firing history, or mutual information content, of the two connected neurons. It is clear that X and Y are positively associated (excitatory), statistically independent, or negatively associated (inhibitory) if ω is positive, zero, or negative, respectively. Note that binary random variables are assumed in defining (2), but the estimation of (2) as described in section II.F, and PNN applications generally, also apply to neurons satisfying (1).

D. Forward and Reverse Firing

The PNN is bidirectional since firing can be carried out in both directions between input neurons and hidden neurons. For convenience, forward firing refers to the activation of the hidden neurons given the states of the input neurons, and reverse firing refers to the activation of the input neurons given the states of the hidden neurons. Suppose an ensemble of competitive WTA hidden neurons Y_1, Y_2, ..., Y_n receives input signals from the input neurons X_1, X_2, ..., X_m. The firing states of Y_1, Y_2, ..., Y_n are computed in the following steps:

1. Compute the activation potentials z_j, j = 1, 2, ..., n, as follows:

    z_j = exp( Σ_{i=1}^{m} ω_ij X_i )    (3)

2. Compute Max, the maximum value of z_j, j = 1, 2, ..., n.
3. Perform an α-cut (0 < α ≤ 1) on z_j, j = 1, 2, ..., n, i.e., set z_j to zero if z_j < Max · α.
4. Finally, calculate the activation level of Y_j, 1 ≤ j ≤ n, by

    Y_j = z_j / ( z_1 + z_2 + ... + z_n )    (4)

Step 3 is a soft WTA competition, since there can be multiple winners with different activation levels. Alternatively, the single winner with the largest activation level can be chosen; the latter is called hard WTA. Step 4 is a normalization step that makes the sum of the Y_j equal to one.

In reverse firing, the states of the hidden neurons Y_1, Y_2, ..., Y_n are given and the computation of the activation of
input neurons is carried out for each input WTA ensemble separately, using steps 1-4 above with some modifications. Specifically, for each input WTA ensemble (X_1, X_2, ..., X_k), the activation of the neurons X_i is computed using steps 1-4 with these modifications:

a. z_i = exp( Σ_{j=1}^{n} ϖ_ij Y_j ), i = 1, 2, ..., k, where ϖ_ij = ω_ji. (That ϖ_ij = ω_ji is why the PNN is said to be symmetric.)
b. In the normalization step 4, X_i = z_i / ( z_1 + z_2 + ... + z_k ), 1 ≤ i ≤ k.
c. Usually, the α value chosen for this activation is much smaller than that used for the forward-firing activation.

E. The Principle of Inverse Inference

The PNN processes input data rows via the WTA competition among the hidden neurons. Assuming hard WTA competition is used, one hidden neuron will become the winner for an input data row. The outcome of the WTA competition is judged by the activation level of each hidden neuron from the forward firing of a data row, as described in subsection D. The hidden neuron with the highest activation level becomes the winner, i.e., the data row in question belongs to the cluster represented by that hidden neuron. This competition process can be justified by the principle of inverse inference, which is stated as: "Given the evidence, the more probable a hypothesis can produce such evidence, the more likely it is to be true" [3]. To paraphrase the principle in PNN terms: given an input data row (i.e., given the input neurons' values), the more probably a hidden neuron can respond to such an input, the more likely it is that the data row belongs to the cluster represented by that hidden neuron. Suppose x_1, x_2, ..., x_m denote the states (0 or 1) of the input neurons in a PNN; the evidence (or the possibility) of a hidden neuron Y responding to the input can be measured by the following conditional probability:

    P( x_1, x_2, ..., x_m | Y )    (5)

Therefore, to justify the principle of inverse inference as applied to PNN, it must be shown that the winning hidden neuron Y, as judged by the forward firing using the weights (2), indeed has the highest quantity (5).
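The forward-firing steps of subsection D, in both the soft and hard WTA variants, can be sketched directly. This is an illustrative NumPy version, not the authors' implementation; the names and the tie handling in the hard-WTA branch are our choices:

```python
import numpy as np

def forward_fire(x, w, alpha=0.8, hard=False):
    """Winner-take-all forward firing of a PNN hidden ensemble.

    x     : input vector (a concatenated row of WTA-encoded attributes)
    w     : weight matrix, w[i, j] = weight between input i and hidden j
    alpha : alpha-cut threshold, 0 < alpha <= 1
    """
    z = np.exp(x @ w)                  # step 1: activation potentials (eq. 3)
    if hard:
        keep = z == z.max()            # hard WTA: only the largest survives
    else:
        keep = z >= alpha * z.max()    # step 3: soft WTA alpha-cut
    z = np.where(keep, z, 0.0)
    return z / z.sum()                 # step 4: normalize to sum to one (eq. 4)
```

Because a missing attribute is encoded as a null vector, it contributes nothing to `x @ w`, which is exactly the behavior described in subsection B. Reverse firing would reuse the same routine with the transposed weights, applied to each input ensemble separately.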
Using (2) to compute the activation potential in (3), and writing each weight in conditional form, yields

    ω_i = ln P( x_i | Y ) − ln P( x_i )    (6)

If the input attributes are statistically independent, (6) can be reduced to:

    Σ_{i=1}^{m} ω_i x_i = ln[ P( x_1, x_2, ..., x_m | Y ) / ( P(x_1) P(x_2) ... P(x_m) ) ]    (7)

Plugging (7) into (3), the hidden neuron Y's pre-WTA activation value z becomes:

    z = P( x_1, x_2, ..., x_m | Y ) / ( P(x_1) P(x_2) ... P(x_m) )    (8)

Comparing (8) and (5), z is equal to the conditional probability (5) divided by a factor common to all hidden neurons. As a result, using (8) to determine the winning hidden neuron yields the same result as using (5). This concludes the justification of the principle of inverse inference as used in PNN. Furthermore, since the sum of (5) over all Y is one, the normalization step actually computes (5) as the activation level of the hidden neuron Y.

The statistical independence assumption on the input attributes is a limitation of PNN, as it is in naïve Bayesian neural networks. However, in data exploration without any prior knowledge of the data, this limitation is not a serious problem in PNN applications. One important note about PNN is that no prior distribution is needed, due to the uniform prior implicit in the denominator of (8). This is a major advantage over Bayesian neural networks, which require the difficult task of estimating a prior distribution.

F. PNN Training

To apply PNN, a training (learning) procedure is needed to estimate the weights (2). The PNN training algorithm estimates the weights from a given dataset. Given the past co-firing history of two neurons X and Y, (x_i, y_i), i = 1, 2, ..., n, the weight (2) between X and Y can be estimated by

    ω = ln[ n Σ_{i=1}^{n} x_i y_i / ( Σ_{i=1}^{n} x_i · Σ_{i=1}^{n} y_i ) ]    (9)

Suppose that a training set of DNA sequences is given. First, fire the hidden neurons randomly for each sequence to produce an initial fire matrix. Based on the initial fire matrix and the training sequences, calculate the new weights between input neurons and hidden neurons using (9). In turn, use the new weights in the forward firing of the training sequences to get a new fire matrix. Repeat this process until the PNN is stable, i.e., until the fire matrix no longer changes. Fig. 2 shows the PNN training algorithm.
Fig. 2. The PNN training algorithm.

III. EXPERIMENTS

A. Datasets

Two datasets are used to test the performance of PNN. The first dataset is downloaded from [7]. This dataset contains 3190 sequences taken from GenBank. Each data sample contains a class attribute (I/E, E/I, or N), an instance name, and a 60-base sequence with 30 bases on either side of the boundary. 767 sequences are classified as E/I, 768 as I/E, and the remainder as N.

In order to determine whether PNNs could be used for splice site prediction with a large dataset, a set of sequences used to benchmark gene prediction programs was downloaded from [5]. The second dataset contains 9204 sequences extracted from 570 different genes. From this set of sequences, 2629 Exon/Intron (E/I) boundaries and 2634 Intron/Exon (I/E) boundaries were extracted, with 30 bases of sequence on either side of the boundary reported. In addition, base sequences not occurring at an E/I or I/E boundary (reported as N) were extracted. This dataset was then separated randomly into two parts. One part was used to train a PNN with 8 hidden neurons to distinguish between these three classes. The other was used to test the performance of the PNN for I/E and E/I recognition.

B. Comparing methods

Eight different classification algorithms were tested on the first sequence dataset in [8]. The algorithms are listed as follows:

1. ID3: a decision tree algorithm
2. BP: the backpropagation algorithm
3. LVQ: a combination of LVQ procedures
4. FS+LVQ: LVQ learning with feature selection
5. 1-nn: a 1-nearest neighbor neural network
6. FS1NN: a 1-nearest neighbor neural network with prior feature selection
7. k-nn: a k-nearest neighbor neural network
8. FSKNN: a k-nearest neighbor neural network with prior feature selection

In order to compare PNN to the above algorithms, the same dataset is used to test the PNN and its accuracy rate is evaluated. The tested PNN contains 8 hidden neurons and uses the soft WTA competition. Table 1 shows the accuracy rates of PNN and of the other algorithms as obtained from [8]. The results show that PNN achieved a higher accuracy rate than any of the algorithms listed above.

Table 1. Accuracy rate of E/I and I/E prediction using different algorithms (PNN, ID3, BP, LVQ, FS+LVQ, 1-nn, FS1NN, k-nn, FSKNN).

C. Evaluating the performance of PNN

The second dataset is employed to test the PNN's performance in handling a large dataset. From the total of 9204 sequences, 6204 randomly selected sequences were used to train the PNN, and the other 3000 sequences were used to test it. The PNN contains 8 hidden neurons and 243 input neurons (3 for the class attribute and 4 for each of the 60 bases). The α-cut was set to 0.8 in order to obtain more precise predictions. The PNN was trained for 200 iterations; the training completed in minutes but had yet to meet the stability condition. The 3000 testing sequences were then input to the trained PNN to test its performance. The result showed that 2830 out of 3000 sequences were correctly classified; in other words, a 94.3% accuracy rate was achieved.

The performance parameters sensitivity (Sn), specificity (Sp), and correlation coefficient (CC) were calculated to assess the performance of the PNN. The definitions of Sn, Sp, and CC are as follows:

    Sn = TP / ( TP + FN )
    Sp = TN / ( TN + FP )
    CC = ( TP · TN − FP · FN ) / sqrt( ( TP + FP )( TP + FN )( TN + FP )( TN + FN ) )

where TP stands for True Positive, FP for False Positive, TN for True Negative, and FN for False Negative. Table 2 shows the performance parameters of this experiment. Notice that Sn, Sp, and CC have rather high percentages for both the I/E and E/I predictions, even though the PNN had not yet reached an optimal state. The training was extended for another couple of hundred iterations, but the performance did not improve significantly. To improve training performance, the next experiment used a larger training dataset.
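The three performance parameters defined above translate directly into code. A minimal sketch (the function name and argument order are ours; Sp here is the true-negative rate TN/(TN+FP), exactly as the paper defines it):

```python
from math import sqrt

def performance(tp, fp, tn, fn):
    """Sensitivity, specificity, and correlation coefficient (CC)
    from true/false positive/negative counts."""
    sn = tp / (tp + fn)                       # sensitivity
    sp = tn / (tn + fp)                       # specificity (true-negative rate)
    cc = (tp * tn - fp * fn) / sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return sn, sp, cc
```

CC ranges from −1 to 1 and, unlike raw accuracy, penalizes a classifier that buys sensitivity at the cost of false positives, which is why it is the customary summary statistic for splice-site predictors.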
Table 2. Performance parameters of E/I and I/E predictions for 3000 testing sequences.

In the next experiment, the training dataset was increased to 7204 randomly selected sequences, and the remaining 2000 sequences were used to test the performance of the PNN. The same PNN configuration from the previous experiment was used. The training of the PNN finished, surprisingly, in only 58 iterations, and 1883 out of the 2000 testing sequences were correctly classified. Therefore, the PNN achieved about a 94% accuracy rate in a very short training time. Table 3 shows the performance parameters of this experiment.

Table 3. Performance parameters of E/I and I/E predictions for 2000 testing sequences.

The accuracy measured in this experiment is similar to that of the previous experiment. Additionally, for misclassifications, the PNN provides some useful information (e.g., 0.45: N, 0.55: E/I) which could be used for further analysis. Since the training of PNN is based on mutual information instead of error correction, a simple tuning algorithm can be applied to improve the performance: repeat the training process and record the configuration of the PNN that yields the best classification accuracy rate.

Table 4. Measurement of E/I and I/E prediction for 2000 testing sequences.

Table 4 shows the measurements for the PNN with the tuning algorithm. Compared to the 117 misclassified testing sequences of the previous experiment, only 89 misclassified testing sequences were found out of 2000. Furthermore, the I/E prediction improved significantly in Sn and CC, as shown in Table 4.

IV. CONCLUSION

PNN not only shows good performance for the prediction of exon/intron boundaries but can also discover the patterns of exon/intron boundaries. In a trained PNN, each hidden neuron represents a discovered pattern, and the pattern can be easily extracted by reverse-firing that hidden neuron. The extracted knowledge then provides a better understanding of the dataset. Thus, the number of hidden neurons is crucial to PNN performance. In this paper, different numbers of hidden neurons were tested in the experiments; the PNN with 8 hidden neurons showed the best performance with respect to accuracy rate and training time. However, how to determine the minimum number of hidden neurons needed to achieve desired performance criteria is still an open issue.

V. FUTURE WORK

PNN has shown the ability to solve problems such as classification, clustering, pattern recognition, and function approximation [9][10]. In this paper, PNN has shown promise for gene splice-junction recognition. The fast convergence of PNN makes it feasible for large-scale biological sequence analysis. For future work, the discovery of new junction patterns and the improvement of PNN for sequence mining in general are of interest. In addition, the possibility of applying PNN to promoter prediction and protein profile pattern recognition will also be considered.

REFERENCES

[1] B. S. Everitt and G. Dunn, Applied Multivariate Data Analysis. London: Arnold, 2001, ch. 6.
[2] J. Han and M. Kamber, Data Mining: Concepts and Techniques. Morgan Kaufmann Publishers, 2001, ch. 7 and ch. 8.
[3] Y. Y. Chen, "Plausible neural networks," in Advances in Neural Networks World, A. Grmela and N. E. Mastorakis, Eds. WSEAS Press, 2002.
[4] Y. Y. Chen, "Plausible neural network with supervised and unsupervised cluster analysis," U.S. Patent, July 24, 2003.
[5] M. Burset and R. Guigó, "Evaluation of gene structure prediction programs," Genomics, vol. 34, 1996.
[6] E. Y. Chen, M. Zollo, R. A. Mazzarella, A. Ciccodicola, C. N. Chen, L. Kuo, C. Heiner, F. W. Burough, M. Ripetto, D. Schlessinger, and M. D'Urso, "Long-range sequence analysis in Xq28: thirteen known and six candidate genes in high GC DNA between the RCP/GCP and G6PD loci," Hum. Mol. Genet., vol. 5, no. 5, 1996.
[7] D. J. Newman, S. Hettich, C. L. Blake, and C. J. Merz, UCI Repository of Machine Learning Databases, University of California, Department of Information and Computer Science, Irvine, CA, 1998.
[8] M. Mar Abad Grau and L. D. H. Molinero, "Feature selection in codebook based methods provides high accuracy," in Proceedings of IJCNN '99, vol. 3, 1999.
[9] K. Li, D. Chang, and Y. Y. Chen, "High-speed Bi-directional Function Approximation Using Plausible Neural Networks," in Proceedings of IJCNN '06, 2006.
[10] K. Li and D. Chang, "Fuzzy Membership Function Elicitation using Plausible Neural Network," in Proceedings of ICAI '06, vol. I, 2006.
More informationSLAM Summer School 2006 Practical 2: SLAM using Monocular Vision
SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,
More informationMachine Learning. Topic 6: Clustering
Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess
More informationProblem Definitions and Evaluation Criteria for Computational Expensive Optimization
Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty
More informationExtraction of Fuzzy Rules from Trained Neural Network Using Evolutionary Algorithm *
Extracton of Fuzzy Rules from Traned Neural Network Usng Evolutonary Algorthm * Urszula Markowska-Kaczmar, Wojcech Trelak Wrocław Unversty of Technology, Poland kaczmar@c.pwr.wroc.pl, trelak@c.pwr.wroc.pl
More informationTsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance
Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for
More informationAnnouncements. Supervised Learning
Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples
More informationYan et al. / J Zhejiang Univ-Sci C (Comput & Electron) in press 1. Improving Naive Bayes classifier by dividing its decision regions *
Yan et al. / J Zhejang Unv-Sc C (Comput & Electron) n press 1 Journal of Zhejang Unversty-SCIENCE C (Computers & Electroncs) ISSN 1869-1951 (Prnt); ISSN 1869-196X (Onlne) www.zju.edu.cn/jzus; www.sprngerlnk.com
More informationAnalysis of Continuous Beams in General
Analyss of Contnuous Beams n General Contnuous beams consdered here are prsmatc, rgdly connected to each beam segment and supported at varous ponts along the beam. onts are selected at ponts of support,
More informationArtificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling
Artfcal Intellgence Technques for Steam Generator Modellng Sarah Wrght and Tshldz Marwala Abstract Ths paper nvestgates the use of dfferent Artfcal Intellgence methods to predct the values of several contnuous
More informationMachine Learning: Algorithms and Applications
14/05/1 Machne Learnng: Algorthms and Applcatons Florano Zn Free Unversty of Bozen-Bolzano Faculty of Computer Scence Academc Year 011-01 Lecture 10: 14 May 01 Unsupervsed Learnng cont Sldes courtesy of
More informationHelsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)
Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute
More informationMathematics 256 a course in differential equations for engineering students
Mathematcs 56 a course n dfferental equatons for engneerng students Chapter 5. More effcent methods of numercal soluton Euler s method s qute neffcent. Because the error s essentally proportonal to the
More informationA Similarity-Based Prognostics Approach for Remaining Useful Life Estimation of Engineered Systems
2008 INTERNATIONAL CONFERENCE ON PROGNOSTICS AND HEALTH MANAGEMENT A Smlarty-Based Prognostcs Approach for Remanng Useful Lfe Estmaton of Engneered Systems Tany Wang, Janbo Yu, Davd Segel, and Jay Lee
More informationAn Anti-Noise Text Categorization Method based on Support Vector Machines *
An Ant-Nose Text ategorzaton Method based on Support Vector Machnes * hen Ln, Huang Je and Gong Zheng-Hu School of omputer Scence, Natonal Unversty of Defense Technology, hangsha, 410073, hna chenln@nudt.edu.cn,
More informationSimulation: Solving Dynamic Models ABE 5646 Week 11 Chapter 2, Spring 2010
Smulaton: Solvng Dynamc Models ABE 5646 Week Chapter 2, Sprng 200 Week Descrpton Readng Materal Mar 5- Mar 9 Evaluatng [Crop] Models Comparng a model wth data - Graphcal, errors - Measures of agreement
More informationUnder-Sampling Approaches for Improving Prediction of the Minority Class in an Imbalanced Dataset
Under-Samplng Approaches for Improvng Predcton of the Mnorty Class n an Imbalanced Dataset Show-Jane Yen and Yue-Sh Lee Department of Computer Scence and Informaton Engneerng, Mng Chuan Unversty 5 The-Mng
More informationPredicting the Cellular Localization Sites of Proteins Using Bayesian Networks and Bayesian Model Averaging
Predctng the Cellular Localzaton Stes of Protens Usng Bayesan Networs and Bayesan Model Averagng Yetan Chen Department of Computer Scence Iowa State Unversty yetanc@cs.astate.edu Abstract In ths wor, we
More informationSome Advanced SPC Tools 1. Cumulative Sum Control (Cusum) Chart For the data shown in Table 9-1, the x chart can be generated.
Some Advanced SP Tools 1. umulatve Sum ontrol (usum) hart For the data shown n Table 9-1, the x chart can be generated. However, the shft taken place at sample #21 s not apparent. 92 For ths set samples,
More informationUser Authentication Based On Behavioral Mouse Dynamics Biometrics
User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA
More informationUnsupervised Learning and Clustering
Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned
More informationA Post Randomization Framework for Privacy-Preserving Bayesian. Network Parameter Learning
A Post Randomzaton Framework for Prvacy-Preservng Bayesan Network Parameter Learnng JIANJIE MA K.SIVAKUMAR School Electrcal Engneerng and Computer Scence, Washngton State Unversty Pullman, WA. 9964-75
More informationy and the total sum of
Lnear regresson Testng for non-lnearty In analytcal chemstry, lnear regresson s commonly used n the constructon of calbraton functons requred for analytcal technques such as gas chromatography, atomc absorpton
More informationSum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints
Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan
More informationProblem Set 3 Solutions
Introducton to Algorthms October 4, 2002 Massachusetts Insttute of Technology 6046J/18410J Professors Erk Demane and Shaf Goldwasser Handout 14 Problem Set 3 Solutons (Exercses were not to be turned n,
More informationA Saturation Binary Neural Network for Crossbar Switching Problem
A Saturaton Bnary Neural Network for Crossbar Swtchng Problem Cu Zhang 1, L-Qng Zhao 2, and Rong-Long Wang 2 1 Department of Autocontrol, Laonng Insttute of Scence and Technology, Benx, Chna bxlkyzhangcu@163.com
More informationOptimizing Document Scoring for Query Retrieval
Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng
More informationContext-Specific Bayesian Clustering for Gene Expression Data
Context-Specfc Bayesan Clusterng for Gene Expresson Data Yoseph Barash School of Computer Scence & Engneerng Hebrew Unversty, Jerusalem, 91904, Israel hoan@cs.huj.ac.l Nr Fredman School of Computer Scence
More informationHierarchical clustering for gene expression data analysis
Herarchcal clusterng for gene expresson data analyss Gorgo Valentn e-mal: valentn@ds.unm.t Clusterng of Mcroarray Data. Clusterng of gene expresson profles (rows) => dscovery of co-regulated and functonally
More informationJournal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data
Avalable onlne www.ocpr.com Journal of Chemcal and Pharmaceutcal Research, 2014, 6(6):2860-2866 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCPRC5 A selectve ensemble classfcaton method on mcroarray
More informationAn Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation
17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed
More informationAdaptive Transfer Learning
Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk
More informationAn efficient iterative source routing algorithm
An effcent teratve source routng algorthm Gang Cheng Ye Tan Nrwan Ansar Advanced Networng Lab Department of Electrcal Computer Engneerng New Jersey Insttute of Technology Newar NJ 7 {gc yt Ansar}@ntedu
More informationExperiments in Text Categorization Using Term Selection by Distance to Transition Point
Experments n Text Categorzaton Usng Term Selecton by Dstance to Transton Pont Edgar Moyotl-Hernández, Héctor Jménez-Salazar Facultad de Cencas de la Computacón, B. Unversdad Autónoma de Puebla, 14 Sur
More informationEmpirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap
Int. Journal of Math. Analyss, Vol. 8, 4, no. 5, 7-7 HIKARI Ltd, www.m-hkar.com http://dx.do.org/.988/jma.4.494 Emprcal Dstrbutons of Parameter Estmates n Bnary Logstc Regresson Usng Bootstrap Anwar Ftranto*
More informationReducing Frame Rate for Object Tracking
Reducng Frame Rate for Object Trackng Pavel Korshunov 1 and We Tsang Oo 2 1 Natonal Unversty of Sngapore, Sngapore 11977, pavelkor@comp.nus.edu.sg 2 Natonal Unversty of Sngapore, Sngapore 11977, oowt@comp.nus.edu.sg
More informationKOHONEN'S SELF ORGANIZING NETWORKS WITH "CONSCIENCE"
Kohonen's Self Organzng Maps and ther use n Interpretaton, Dr. M. Turhan (Tury) Taner, Rock Sold Images Page: 1 KOHONEN'S SELF ORGANIZING NETWORKS WITH "CONSCIENCE" By: Dr. M. Turhan (Tury) Taner, Rock
More informationA Robust Method for Estimating the Fundamental Matrix
Proc. VIIth Dgtal Image Computng: Technques and Applcatons, Sun C., Talbot H., Ourseln S. and Adraansen T. (Eds.), 0- Dec. 003, Sydney A Robust Method for Estmatng the Fundamental Matrx C.L. Feng and Y.S.
More informationUsing Neural Networks and Support Vector Machines in Data Mining
Usng eural etworks and Support Vector Machnes n Data Mnng RICHARD A. WASIOWSKI Computer Scence Department Calforna State Unversty Domnguez Hlls Carson, CA 90747 USA Abstract: - Multvarate data analyss
More informationAPPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT
3. - 5. 5., Brno, Czech Republc, EU APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT Abstract Josef TOŠENOVSKÝ ) Lenka MONSPORTOVÁ ) Flp TOŠENOVSKÝ
More informationLearning an Image Manifold for Retrieval
Learnng an Image Manfold for Retreval Xaofe He*, We-Yng Ma, and Hong-Jang Zhang Mcrosoft Research Asa Bejng, Chna, 100080 {wyma,hjzhang}@mcrosoft.com *Department of Computer Scence, The Unversty of Chcago
More informationA Background Subtraction for a Vision-based User Interface *
A Background Subtracton for a Vson-based User Interface * Dongpyo Hong and Woontack Woo KJIST U-VR Lab. {dhon wwoo}@kjst.ac.kr Abstract In ths paper, we propose a robust and effcent background subtracton
More informationThree supervised learning methods on pen digits character recognition dataset
Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru
More informationSVM-based Learning for Multiple Model Estimation
SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:
More informationA Binarization Algorithm specialized on Document Images and Photos
A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a
More informationHybridization of Expectation-Maximization and K-Means Algorithms for Better Clustering Performance
BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 2 Sofa 2016 Prnt ISSN: 1311-9702; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-2016-0017 Hybrdzaton of Expectaton-Maxmzaton
More informationA MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS
Proceedngs of the Wnter Smulaton Conference M E Kuhl, N M Steger, F B Armstrong, and J A Jones, eds A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS Mark W Brantley Chun-Hung
More informationC2 Training: June 8 9, Combining effect sizes across studies. Create a set of independent effect sizes. Introduction to meta-analysis
C2 Tranng: June 8 9, 2010 Introducton to meta-analyss The Campbell Collaboraton www.campbellcollaboraton.org Combnng effect szes across studes Compute effect szes wthn each study Create a set of ndependent
More informationConcurrent Apriori Data Mining Algorithms
Concurrent Apror Data Mnng Algorthms Vassl Halatchev Department of Electrcal Engneerng and Computer Scence York Unversty, Toronto October 8, 2015 Outlne Why t s mportant Introducton to Assocaton Rule Mnng
More informationWishing you all a Total Quality New Year!
Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma
More informationDeep Classification in Large-scale Text Hierarchies
Deep Classfcaton n Large-scale Text Herarches Gu-Rong Xue Dkan Xng Qang Yang 2 Yong Yu Dept. of Computer Scence and Engneerng Shangha Jao-Tong Unversty {grxue, dkxng, yyu}@apex.sjtu.edu.cn 2 Hong Kong
More informationAn Image Fusion Approach Based on Segmentation Region
Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua
More informationLoop Permutation. Loop Transformations for Parallelism & Locality. Legality of Loop Interchange. Loop Interchange (cont)
Loop Transformatons for Parallelsm & Localty Prevously Data dependences and loops Loop transformatons Parallelzaton Loop nterchange Today Loop nterchange Loop transformatons and transformaton frameworks
More information