Overlapping Clustering with Sparseness Constraints


2012 IEEE 12th International Conference on Data Mining Workshops

Haibing Lu (OMIS, Santa Clara University), Yuan Hong (MSIS, Rutgers University), W. Nick Street (MS, The University of Iowa), Fei Wang (IBM T.J. Watson Research Center), Hanghang Tong (IBM T.J. Watson Research Center)

I. ABSTRACT

Overlapping clustering allows a data point to be a member of multiple clusters, which is more appropriate for modeling many real data semantics. However, much of the existing work on overlapping clustering simply assumes that a data point can be assigned to any number of clusters without any constraint. This assumption is not supported by many real contexts. In an attempt to reveal the true cluster structure of data, we propose sparsity constrained overlapping clustering, which incorporates sparseness constraints into the overlapping clustering process. To solve the resulting sparsity constrained overlapping clustering problems, efficient and effective algorithms are proposed. Experiments demonstrate the advantages of our overlapping clustering model.

II. INTRODUCTION

Overlapping clustering is a type of clustering technique that allows a data point to be a member of multiple clusters. Compared to partitional clustering techniques, which partition data into non-overlapping regions, overlapping clustering is more appropriate for modeling data relationships in many real applications. In biology, clustering techniques are common approaches to identifying functional groups in gene expression data by clustering genes with similar expression profiles into the same group. It is known that many genes are multi-functional and should belong to more than one functional group [1]. Partitional clustering techniques are therefore limited in their ability to discover the true cluster structure in gene expression data. Apart from biology, many other domains, including role-based access control and movie recommender systems, also motivate overlapping clustering.

Due to its importance, overlapping clustering has received much attention recently. However, much of the existing work goes to the opposite extreme of partitional clustering: it simply allows a data point to belong to as many clusters as needed, without considering any contextual information, which may result in too many cluster assignments. To illustrate, consider the biology application. It is true that a gene can participate in multiple processes; however, according to current biological understanding, it is unlikely that a gene participates in a very large number of processes. So when an overlapping clustering result assigns many genes to over 20 processes, its correctness would be highly doubted.

In an attempt to discover true overlapping cluster structures, we propose overlapping clustering with sparseness constraints. The basic idea is to incorporate available background knowledge about the dataset under study, such as the maximum number of clusters a data point can belong to, into the overlapping clustering process. Clustering results would then not only provide good descriptions of the input data, but also match the prior knowledge about the data. In this paper, we specifically look at the overlapping clustering technique proposed by Cleuziou [2], which we call the k-extended technique because its solution is derived from the well-known k-means algorithm. The k-extended technique can be described as follows: given a set of data points, group them into overlapping clusters while minimizing the sum of the distances between each point and the mean of the representatives of the clusters to which the point belongs.
Apart from the k-extended technique, there are many other overlapping clustering models, including the plaid model [3], fuzzy c-means clustering [4], and the probabilistic model of [1]. We chose k-extended for two reasons. The first reason is that the k-extended technique is a hard overlapping clustering technique, in which a data point either is a member of a cluster or is not, while many overlapping clustering techniques are soft (probabilistic), such as fuzzy c-means clustering [4], in which a data point belongs to a cluster with some probability. For many real applications, hard overlapping clustering results are more interpretable. For example, overlapping clustering techniques have been employed to discover roles for implementing a role-based access control mechanism [5]; in that setting, a user either assumes a role or does not. The second reason is that the k-extended technique represents a data point by the mean of the cluster representatives to which the data point belongs.

While some overlapping clustering techniques, e.g., [1], [6], represent a data point by the sum of the cluster representatives to which the data point belongs, we think the mean is more appropriate for representing the relationship between a data point and its associated cluster representatives.

Like many other overlapping clustering techniques, the k-extended technique simply assumes that a data point can belong to any number of clusters, without imposing any constraint on cluster assignments. This way of modeling might obtain an overlapping clustering result that describes the dataset very well; however, the clustering result could be far from the ground truth. To overcome this limitation, we propose sparsity constrained overlapping clustering, which incorporates prior knowledge about cluster memberships into the overlapping clustering process. Technically, our overlapping clustering technique decomposes a data matrix into two matrices, where one matrix consists of cluster representatives and the other is a binary coefficient matrix encoding cluster memberships, and the form of the binary coefficient matrix is regulated according to available prior knowledge. Mathematically, our problem boils down to a constrained optimization problem. As the decomposed coefficient matrix is binary, this problem is combinatorial in nature and very difficult to solve. We therefore propose an alternating minimization solution, which minimizes the objective function by fixing one of the decomposed matrices and proceeds in an alternating fashion. The derived subproblem of minimizing the objective function while fixing the cluster representative matrix is proven to be NP-hard. To solve it, we propose a branch-and-bound exact algorithm, which is suited for small problems, and a simulated annealing algorithm, which is applicable to large problems. To evaluate our technique of overlapping clustering with sparseness constraints, extensive experiments are conducted on both synthetic and real datasets.

III. PRIOR WORK

Overlapping clustering has recently attracted much attention from both the data mining and computational biology fields. However, sparsity constraints have received little attention in overlapping clustering. The modified nonnegative sparse coding model proposed by [7] is closely related to our work. Mathematically, it is formulated as the problem of minimizing (1/2)||A - XC||_F^2 + λ||X||_1, where A and λ are given, C is restricted to be nonnegative, and the rows of X are forced to have unit norm. The main difference in our work is that X is converted from the binary memberships S, such that each row X_i = S_i / ||S_i||_1. Therefore, in addition to the unit-norm constraint on the rows of X, the positive elements in each row must be equal, which coincides with most of the existing overlapping clustering models, including [3], [8], [2], [6]. Note that some of them are stated as probabilistic clustering approaches; however, they eventually boil down to matrix decomposition problems. Another closely related work is the model proposed by Zhu et al. [9]. It is based on the plaid model, a co-clustering model that attempts to approximate a data matrix with the sum of k submatrices. On the basis of the plaid model, their model minimizes the approximation error along with the size of the submatrices, in the hope of finding cohesive submatrices.

Imposing sparseness constraints in data analysis tasks in an attempt to discover real data patterns or relationships is not a new idea. One of the most important works is the Lasso model, a shrinkage and selection method for linear regression proposed by Tibshirani [10], which has been widely used in many fields.
The power of sparseness constraints has also been well appreciated by the machine learning community. One important work is the model proposed by Hoyer [11], which incorporates sparseness constraints into non-negative matrix factorization. Heiler et al. [12] further proposed a sequential cone programming approach to this sparsity constrained non-negative matrix factorization problem.

IV. PROBLEM DEFINITIONS

In this section, we present the formal definition of our sparsity constrained overlapping clustering model. Before doing that, we first introduce the k-extended overlapping clustering technique, as our model is built on it.

Definition 1 (k-extended [2]): Given m observations {A_1,..., A_m} in R^n, discover k clusters {S_1,..., S_k} with respective representatives {C_1,..., C_k} in R^n such that: an observation can belong to multiple clusters; and the sum of distances between each observation and the mean of its assigned cluster representatives is minimized.

Like many other overlapping clustering models, the k-extended technique can be described as a matrix decomposition problem: decompose a matrix A (m x n) into a binary matrix S (m x k), where S_ij = 1 means that data point i belongs to cluster j, and a real matrix C (k x n), where row j is the representative of cluster j. As the goal is to discover the decomposition that best describes the observed data, the k-extended technique can be formulated as the following optimization problem:

  min ||A - XC||_2^2,  s.t. X_i = S_i / |S_i|,  S_ij ∈ {0, 1},   (1)

where |S_i| denotes the number of clusters point i is assigned to (the row sum of S). If we fold the constraints into the objective function, the above optimization problem can be rewritten as

  min f_0 = ||A - (S/|S|) C||_2^2,  s.t. S_ij ∈ {0, 1},   (2)

where S/|S| denotes S with each row divided by its row sum.
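As a concrete reading of Equations (1) and (2), the short sketch below (our own illustration in Python, not code from the paper) evaluates the k-extended objective for a given binary membership matrix S and representative matrix C, assuming every point belongs to at least one cluster.

```python
import numpy as np

def kextended_objective(A, S, C):
    """Squared error of Eq. (2): each point is compared with the mean of the
    representatives of the clusters it is assigned to.
    A: (m, n) data matrix, S: (m, k) binary memberships, C: (k, n) representatives."""
    X = S / S.sum(axis=1, keepdims=True)   # X = S/|S|: each row divided by its number of assigned clusters
    return np.sum((A - X @ C) ** 2)        # ||A - XC||^2
```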

The k-extended technique is essentially an extension of the well-known k-means clustering technique, which assigns each observation to exactly one cluster; k-means can be recovered from Equation 2 by adding the constraint |S_i| = 1. By allowing a data point to belong to multiple clusters, the data description accuracy can indeed be significantly improved. However, this might cause the problem of overfitting the data. To address it, one straightforward solution is to take advantage of available prior knowledge about cluster memberships and regulate the form of the coefficient matrix S. In reality, some applications may have explicit prior knowledge about the maximum number of clusters a point can belong to, while others may not. To reflect these situations, we present two overlapping clustering techniques: explicit sparsity constrained overlapping clustering and implicit sparsity constrained overlapping clustering.

Problem 1 (Explicit Sparsity Constrained Overlapping Clustering):

  min f_1 = ||A - (S/|S|) C||_2^2,  s.t. |S_i| ≤ δ,  S_ij ∈ {0, 1}.   (3)

In explicit sparsity constrained overlapping clustering, there is an explicit constraint on the maximum number of clusters that a point can belong to, enforced by |S_i| ≤ δ. Different data points may have different limits on the maximum number of clusters they can be assigned to; however, given the optimization model of Equation 3, it is not difficult to extend it to this personalized case. So in this paper we only consider the case in which all data points share the same limit.

Problem 2 (Implicit Sparsity Constrained Overlapping Clustering):

  min f_2 = ||A - (S/|S|) C||_2^2 + λ ||S||_1,  s.t. S_ij ∈ {0, 1}.   (4)

As its name implies, implicit sparsity constrained overlapping clustering places no explicit restriction on the maximum number of clusters that a point can belong to. Instead, there is a penalty on the L1 norm of the coefficient matrix, λ||S||_1, where λ is a tuning parameter controlling the penalty level. In cases where no explicit prior knowledge is available, one can repeatedly adjust the tuning parameter λ and choose the value that gives a satisfactory clustering result. Both enforcing the constraint |S_i| ≤ δ and adding the penalty λ||S||_1 to the objective function limit the number of 1's in S; hence our technique is called sparsity constrained overlapping clustering.

V. ALTERNATING MINIMIZATION ALGORITHMS

In this section, we present alternating minimization algorithms for our sparsity constrained overlapping clustering problems. Alternating minimization is a method widely used to solve difficult problems in data mining and machine learning. The sparsity constrained overlapping clustering problems involve two groups of variables, S and C. It is difficult to optimize over all variables simultaneously, but it is not difficult to optimize the problem when either S or C is fixed. So we present an alternating minimization algorithm, which starts with a set of initial cluster representatives {C_1,..., C_k} and then repeats the following two-step procedure:

  Assignment step: minimize the objective function f_1/f_2 with {C_1,..., C_k} fixed and obtain the cluster memberships S;
  Update step: minimize the objective function f_1/f_2 with S fixed and obtain updated {C_1,..., C_k}.

VI. UPDATE STEP

In terms of the update step, the explicit and implicit sparsity constrained overlapping clustering problems are the same: given cluster memberships, update the cluster representatives. The explicit sparsity constrained overlapping clustering problem has the constraint |S_i| ≤ δ, but when the cluster memberships S are fixed this constraint is inactive and the problem reduces to minimizing ||A - (S/|S|) C||^2. For the implicit sparsity constrained overlapping clustering problem, when S is fixed, part of its objective function (Equation 4) becomes a constant and the problem likewise reduces to minimizing ||A - (S/|S|) C||^2.
For the update step, we therefore only need to look at the problem of minimizing ||A - (S/|S|) C||^2, which can be solved through linear least squares. To do that, we first replace S/|S| by X, where X_i = S_i / |S_i|. The problem then becomes minimizing ||A - XC||_2^2. This is a typical linear regression problem, where (X, A) can be viewed as the observations and C as the unknown parameters to be determined. ||A - XC||_2^2 can be expanded as tr(A^T A - 2 C^T X^T A + C^T X^T X C). Since this is a quadratic expression in C, the global minimum can be found by differentiating with respect to C and setting the derivative to zero, which gives C = (X^T X)^{-1} X^T A. Therefore, at each update step, given the new cluster memberships we update the cluster representatives to (X^T X)^{-1} X^T A.
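In code, the whole update step is thus a single least-squares solve per iteration. The following minimal sketch is ours; it uses numpy's lstsq rather than forming the inverse explicitly, which computes the same solution more robustly (including when X^T X is rank deficient).

```python
import numpy as np

def update_representatives(A, S):
    """Update step of the alternating minimization: with memberships S fixed,
    solve min_C ||A - X C||_F^2 where X = S/|S|. The closed form is
    C = (X^T X)^{-1} X^T A; lstsq returns the same least-squares solution."""
    X = S / S.sum(axis=1, keepdims=True)
    C, *_ = np.linalg.lstsq(X, A, rcond=None)
    return C
```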

VII. ASSIGNMENT STEP

The assignment step assigns observations to the given cluster representatives while minimizing the error. In this section, we study the complexity of the cluster assignment problems and propose both exact and heuristic algorithms.

A. Complexity Analysis

The assignment step in both explicit and implicit sparsity constrained overlapping clustering is NP-hard, which can be proved by a reduction from a known NP-hard problem, the subset sum problem [13], described as follows.

Definition 2 (Subset Sum Problem [13]): Given a set of integers {I_1,..., I_n}, does the sum of some non-empty subset equal exactly zero?

Theorem 1: The cluster assignment problem in explicit sparsity constrained overlapping clustering is NP-hard.

Proof. The cluster assignment problem is a minimization problem. Its decision version can be described as: given a vector set {C_1,..., C_k} in R^m, a point x in R^m, a cluster assignment threshold δ, and a real number b, is there some subset S of the vectors with |S| ≤ δ such that ||x - (Σ_{C∈S} C)/|S| ||_2^2 ≤ b, where |S| is the number of vectors in S? This decision problem belongs to NP, because for any instance it is easy to verify whether a candidate subset satisfies the bound. Next we show that every subset sum instance is polynomially reducible to a cluster assignment instance. For any subset sum instance {n_1,..., n_t}, we construct a corresponding cluster assignment instance in which all data points are 1-dimensional, the single data point is x = 0, the cluster representatives are {n_1,..., n_t}, δ is t, and b is 0. The constructed instance asks for a non-empty subset of {n_1,..., n_t} whose mean, and therefore whose sum, equals zero. Clearly the constructed cluster assignment instance is true if and only if the subset sum instance is true, so the theorem is proven.

Theorem 2: The assignment step in implicit sparsity constrained overlapping clustering is NP-hard.

Proof. It is not difficult to see that the decision version of the assignment step belongs to NP. A decision cluster assignment instance in implicit sparsity constrained overlapping clustering is: given a vector set {C_1,..., C_k}, a data point x, a penalty parameter λ, and a real number b, is there some subset S of the vectors such that ||x - (Σ_{C∈S} C)/|S| ||_2^2 + λ|S| ≤ b, where |S| is the number of vectors in S? For each subset sum instance {n_1,..., n_t}, we construct an assignment step instance in which all data points are 1-dimensional, the single data point is x = 0, the cluster representatives are {n_1,..., n_t}, λ = 0, and b is 0. Clearly the constructed cluster assignment instance is true if and only if the subset sum instance is true, so the theorem is proven.

B. Exact Algorithm

Although the assignment step in both explicit and implicit sparsity constrained overlapping clustering is NP-hard, it is still possible to apply an exact search algorithm in many real applications, because, unlike the number of data points, the number of clusters is usually not very large. A plain exhaustive search, however, can still be too computationally expensive. In this section we therefore propose an efficient branch-and-bound (B&B) exact algorithm. B&B algorithms have been widely used for finding optimal solutions of various optimization problems, especially in discrete and combinatorial optimization. A B&B algorithm consists of a systematic enumeration of all candidate solutions, where large subsets of candidate solutions are discarded using upper and lower bounds on the quantity being optimized.

1) Explicit Sparsity Constrained Overlapping Clustering: The cluster assignment subproblem in explicit sparsity constrained overlapping clustering can be formulated as the following optimization problem:

  min f_1 = ||A_i - (S_i C)/|S_i| ||_2^2,  s.t. |S_i| ≤ δ,  S_i ∈ {0, 1}^k.   (5)

For simplicity, we consider a single data point, so A_i is a vector.
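To make the subproblem concrete before introducing the exact algorithm, the following naive enumeration (our own illustrative baseline, not the algorithm proposed in the paper) solves Eq. (5) for one point by trying every non-empty subset of at most δ representatives.

```python
import numpy as np
from itertools import combinations

def assign_point_exhaustive(a, C, delta):
    """Brute-force solution of Eq. (5) for a single point a: try every non-empty
    subset of at most delta representatives and keep the subset whose mean is
    closest to a. The runtime is exponential in the number of clusters k."""
    k = C.shape[0]
    best_subset, best_err = None, np.inf
    for size in range(1, delta + 1):
        for subset in combinations(range(k), size):
            err = np.sum((a - C[list(subset)].mean(axis=0)) ** 2)
            if err < best_err:
                best_subset, best_err = subset, err
    s = np.zeros(k, dtype=int)
    s[list(best_subset)] = 1
    return s, best_err
```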
In other words, we want to assign a data point A_i to the given cluster representatives {C_1,..., C_k} appropriately. A straightforward approach to such a combinatorial problem, like the enumeration sketched above, is to search through the whole solution space S_i ∈ {0, 1}^k. The computational time would be O(2^k), which is prohibitive for large values of k. B&B is a strategy that reduces computational time by avoiding the search of solution subspaces in which the optimal solution is guaranteed not to exist. The outline of the B&B algorithm proposed for the cluster assignment problem in explicit sparsity constrained overlapping clustering is given in Algorithm 1 below and is explained as follows. Lines 1-3 indicate that the recursive algorithm terminates when all variables have been branched; in other words, the whole solution space has been searched. In lines 4-6, if the currently visited solution {s_1,..., s_{l-1} fixed, s_l = 0,..., s_k = 0} is better than all previously visited solutions, the best visited solution S* and the lowest objective value z* are updated accordingly. Line 8 gives an estimated lower bound on the minimum of f_1 in the solution subspace that will be searched later, namely s_i ∈ {0, 1} for i = l,..., k with s_i fixed at its current value for i = 1,..., l-1; we defer the discussion of how this lower bound is obtained until after the explanation of the algorithm.

Algorithm 1 Branch-and-Bound(l) for Explicit Sparsity Constrained Overlapping Clustering
Input: (i) l, the index of the next variable to branch on. (ii) The current partial solution, s_i = s̄_i for i = 1,..., l-1. (iii) The current solution subspace, s_i ∈ {0, 1} for i = l,..., k. (iv) The best solution visited so far, S*, and the current lowest objective value, z*.
1: if l > k then
2:   return;
3: else
4:   if f_1(s_1 = s̄_1,..., s_{l-1} = s̄_{l-1}, s_l = 0,..., s_k = 0) < z* then
5:     z* = f_1(s_1 = s̄_1,..., s_{l-1} = s̄_{l-1}, s_l = 0,..., s_k = 0);
6:     S* = {s_1 = s̄_1,..., s_{l-1} = s̄_{l-1}, s_l = 0,..., s_k = 0};
7:   end if
8:   Estimate a lower bound LB for the minimum of f_1 given s_i = s̄_i for i = 1,..., l-1.
9:   if LB < z* and Σ_{i=1}^{l-1} s̄_i < δ then
10:    s_l = 1, Branch-and-Bound(l + 1);
11:    s_l = 0, Branch-and-Bound(l + 1);
12:  end if
13: end if

Lines 9-12 are the essence of this branch-and-bound algorithm; without them, the algorithm is just an ordinary brute-force search. Two conditions determine whether to keep searching along the current branch: LB < z* means there might be some solution in this branch that outperforms the current best solution, and Σ_{i=1}^{l-1} s̄_i < δ means the current solution does not violate the cluster assignment constraint. If both conditions are satisfied, there might be a feasible solution outperforming the current best solution and the algorithm should proceed.

To further explain the branch-and-bound algorithm, consider the illustration in Figure 1, which gives a tree-like representation of the whole solution space. Assume that at the beginning we have an initial solution S* and its corresponding objective value z*. The algorithm first proceeds to {1, ...}, the solution subspace with S_1 fixed to 1 and the other components of S left binary, and estimates a lower bound of f_1 in that subspace. If we can ascertain that the subspace contains no solution better than S*, the current best solution, it is clearly unnecessary to proceed any further in that branch; the branch is discarded and the algorithm moves to other branches. The essence of a branch-and-bound algorithm is to save computational time by avoiding the search of unnecessary branches, by intelligently employing an upper bound (the best objective value visited so far) and a lower bound (the estimated best objective value in the solution subspace about to be searched).

Figure 1: Branch-and-Bound Illustration

In the B&B algorithm, at each branching point we need to estimate a lower bound of the objective value in the current solution subspace, and an accurate estimate improves the algorithm's performance significantly. Consider Equation 5 of the cluster assignment problem (writing the data point as x), and suppose that the values of a portion of S, {S_1,..., S_{l-1}}, have been determined. For notational convenience, assume that among {S_1,..., S_{l-1}}, the entries {S_1,..., S_{l'}} are 1 and the remaining {S_{l'+1},..., S_{l-1}} are 0. To obtain the exact lower bound of f_1, we would need to solve

  min f_1 = ||x - (Σ_j S_j C_j)/|S| ||_2^2,  s.t. S_j ∈ {0, 1},

with {S_1,..., S_{l-1}} fixed. This problem could be as hard as the original cluster assignment problem.
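For readers who prefer code, the recursion of Algorithm 1 can be summarized by the sketch below (our own rendering; the variable names are ours, and the bound of line 8 is left as a pluggable function, with the relaxation derived next being one possible choice). Unlike the pseudocode, this sketch evaluates the candidate before the termination check, so fully branched assignments are also scored.

```python
import numpy as np

def f1_point(a, C, s):
    """Explicit-case objective for one point: ||a - mean of the selected representatives||^2."""
    if s.sum() == 0:
        return np.inf                       # no cluster selected yet; treat as infinitely bad
    return np.sum((a - C[s.astype(bool)].mean(axis=0)) ** 2)

def branch_and_bound(a, C, delta, lower_bound):
    """Sketch of Algorithm 1. lower_bound(a, C, s, l) must return an estimate of the
    best objective reachable when s[:l] is fixed and s[l:] is still free."""
    k = C.shape[0]
    best = {"s": None, "z": np.inf}

    def recurse(s, l):
        z = f1_point(a, C, s)                         # candidate: fixed prefix, remaining entries 0
        if z < best["z"] and s.sum() <= delta:        # lines 4-7: update the incumbent
            best["s"], best["z"] = s.copy(), z
        if l >= k:                                    # lines 1-2: every variable has been branched
            return
        lb = lower_bound(a, C, s, l)                  # line 8: bound for this subspace
        if lb < best["z"] and s[:l].sum() < delta:    # line 9: prune hopeless or infeasible branches
            s1 = s.copy(); s1[l] = 1
            recurse(s1, l + 1)                        # line 10: branch with s_l = 1
            recurse(s.copy(), l + 1)                  # line 11: branch with s_l = 0

    recurse(np.zeros(k, dtype=int), 0)
    return best["s"], best["z"]
```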
Let us take a deeper look at f_1. Since part of S has been determined, f_1 can be reorganized as

  f_1 = || x - (Σ_{j=1}^{l'} C_j + Σ_{j=l}^{k} S_j C_j) / (l' + Σ_{j=l}^{k} S_j) ||_2^2
      = || Σ_{j=l}^{k} [S_j / (l' + Σ_{j=l}^{k} S_j)] C_j - (x - Σ_{j=1}^{l'} C_j / (l' + Σ_{j=l}^{k} S_j)) ||_2^2.

The function f_1 can then be viewed as discovering a linear combination of the vectors (C_l,..., C_k), with coefficients S_j / (l' + Σ_{j=l}^{k} S_j), that approximates the target x - Σ_{j=1}^{l'} C_j / (l' + Σ_{j=l}^{k} S_j). For convenience, in the following we denote this target by x'. The fact that S is binary makes the problem difficult to solve. To estimate a lower bound of f_1, we relax the binary coefficients to real values y_j and solve

  min Σ... that is,  min_y || Σ_{j=l}^{k} y_j C_j - x' ||_2^2,

where the y_j are real variables. Certainly, the optimal objective value of such a relaxed problem gives a lower bound on the minimum of f_1. The above relaxation problem is again a typical linear least squares problem.

For notational convenience, we rewrite the problem as min ||C Y - x'||_2^2, where C now denotes the matrix whose columns are C_l,..., C_k and Y = (y_l,..., y_k)^T. The optimal solution is Y = (C^T C)^{-1} C^T x', so the estimated lower bound for f_1 is

  ||C (C^T C)^{-1} C^T x' - x'||_2^2.   (6)

At step 8 of Algorithm 1, we compute the estimated lower bound of Equation 6 and use it to decide whether or not to continue searching the current branching subspace.

2) Implicit Sparsity Constrained Overlapping Clustering: We now look at the cluster assignment problem in implicit sparsity constrained overlapping clustering. Unlike the cluster assignment in the explicit case, the objective function in the implicit case minimizes both the approximation error and the L1 norm of the assignments. For simplicity, we consider one data point A_i. The problem of assigning A_i to the given cluster representatives {C_1,..., C_k} in the implicit case can be formulated as the following optimization problem:

  min f_2 = ||A_i - (S_i C)/|S_i| ||_2^2 + λ ||S_i||_1,   (7)

where λ is given and S_i, which denotes the cluster membership, is to be determined. A B&B algorithm for the cluster assignment subproblem of implicit sparsity constrained overlapping clustering is provided in Algorithm 2.

Algorithm 2 Branch-and-Bound(l) for Implicit Sparsity Constrained Overlapping Clustering
Input: (i) l, the index of the next variable to branch on. (ii) The current partial solution, s_i = s̄_i for i = 1,..., l-1. (iii) The current solution subspace, s_i ∈ {0, 1} for i = l,..., k. (iv) The best solution visited so far, S*, and the current lowest objective value, z*.
1: if l > k then
2:   return;
3: else
4:   if f_2(s_1 = s̄_1,..., s_{l-1} = s̄_{l-1}, s_l = 0,..., s_k = 0) < z* then
5:     z* = f_2(s_1 = s̄_1,..., s_{l-1} = s̄_{l-1}, s_l = 0,..., s_k = 0);
6:     S* = {s_1 = s̄_1,..., s_{l-1} = s̄_{l-1}, s_l = 0,..., s_k = 0};
7:   end if
8:   Estimate a lower bound LB for the minimum of f_2 given s_i = s̄_i for i = 1,..., l-1.
9:   if LB < z* then
10:    s_l = 1, Branch-and-Bound(l + 1);
11:    s_l = 0, Branch-and-Bound(l + 1);
12:  end if
13: end if

Since Algorithm 2 is similar to Algorithm 1, we skip the explanation of its main body and instead point out the differences: in lines 4 and 5 the objective function is f_2, which is ||A_i - (S_i C)/|S_i| ||_2^2 + λ||S_i||_1; in line 8 the lower bound of f_2 is estimated differently, since the objective function differs; and in line 9 the condition is LB < z* only, since there is no explicit constraint on the maximum number of cluster assignments in this case.

At step 8, we need to estimate the lower bound of f_2 at each branching point; here we present an estimation method. The objective function f_2 consists of two parts: ||A_i - (S_i C)/|S_i| ||_2^2, which is f_1, and λ||S_i||_1. Suppose that at branching point l, where {S_1,..., S_{l-1}} have been determined, {S_1,..., S_{l'}} are 1 and the remaining {S_{l'+1},..., S_{l-1}} are 0. According to the estimated lower bound for f_1 in Equation 6, we clearly have ||A_i - (S_i C)/|S_i| ||_2^2 ≥ ||C (C^T C)^{-1} C^T x' - x'||_2^2, where C again denotes the matrix whose columns are C_l,..., C_k and x' is the target defined above. We also have λ||S_i||_1 ≥ λ Σ_{j=1}^{l'} S_j. Therefore an estimated lower bound for f_2 is

  ||C (C^T C)^{-1} C^T x' - x'||_2^2 + λ Σ_{j=1}^{l'} S_j.   (8)

C. Heuristic

As the cluster assignment problems in both the explicit and implicit cases are NP-hard, exact search algorithms are not appropriate when the number of clusters is large. In this section, we therefore present efficient simulated annealing (SA) heuristics, which usually run fast and produce satisfactory results. Let us look at the cluster assignment problem in explicit sparseness constrained overlapping clustering first. Given an input vector x and cluster representatives C_1,..., C_k, the task is to assign x to clusters such that the mean of the assigned cluster representatives is as close as possible to x. The solution space of the cluster assignment S is {0, 1}^k.
Our simulated annealing heuristic for the cluster assignment problem is described as follows. First, we find the cluster representative C_j closest to x and initialize S by setting its j-th component to 1 and the others to 0. A candidate solution is then generated by randomly selecting one component of the current solution S and flipping its value from 0 to 1 or from 1 to 0; this is repeated if the candidate has more than δ elements equal to 1. If the new solution is closer to the target x, the current solution is updated to the new solution. Even if the new solution is not better, with a certain probability less than 1 the current solution is still updated to the new solution.

This property reduces the chance of getting stuck at a local optimum. The previous two steps are repeated until some terminating condition is satisfied, such as the maximum number of iterations being reached or the objective value not improving for a certain number of iterations. At the end, the best solution visited is chosen as the final solution. As mentioned above, when the candidate solution is worse than the current solution, there is still a certain probability of moving to that inferior solution. We adopt the transition probability formula proposed by Besag et al. [14], which is

  exp[-log(n + 1) max(0, f_1(S') - f_1(S))],   (9)

where S is the current solution, S' is the candidate solution, and n is the current iteration number. The complete pseudocode of the simulated annealing algorithm is provided in Algorithm 3.

Algorithm 3 Simulated Annealing for Explicit Sparseness Constrained Overlapping Clustering
Input: x, {C_1,..., C_k}, δ, and the iteration limit count_max;
Output: S ∈ {0, 1}^k;
1: j = arg min_j ||C_j - x||_2;
2: S(j) = 1 and S(i) = 0 for all i ≠ j;
3: count = 1;
4: while count ≤ count_max do
5:   Generate a random number t in {1,..., k};
6:   S' = S and S'(t) = 1 - S'(t);
7:   if f_1(S') < f_1(S) and Σ_i S'(i) ≤ δ then
8:     S = S';
9:   else
10:    Generate a random number r in [0, 1];
11:    if r < exp[-log(count + 1)(f_1(S') - f_1(S))] then
12:      S = S';
13:    end if
14:  end if
15:  count = count + 1;
16: end while

A simulated annealing algorithm for the cluster assignment problem in implicit sparseness constrained overlapping clustering can easily be obtained by making a few changes to Algorithm 3: at line 7, the acceptance condition is changed to f_2(S') < f_2(S); at line 11, the condition is changed to r < exp[-log(count + 1)(f_2(S') - f_2(S))].

VIII. EXPERIMENTAL STUDY

In this section, three experiments are designed to study our proposed explicit and implicit sparsity constrained overlapping clustering models. All experiments are implemented in Matlab and run on a Dell desktop with an Intel Core 2 Duo 3.00 GHz CPU and 2.96 GB of RAM.

Experiment 1. The first experiment evaluates the proposed exact B&B algorithm and the SA heuristic. For simplicity, we consider explicit sparsity constrained overlapping clustering, so we study Algorithm 1 and Algorithm 3. As both of these algorithms are designed for the cluster assignment step rather than the whole overlapping clustering problem, we compare the alternating minimization algorithm coupled with the exact B&B algorithm against the same alternating minimization algorithm coupled with the SA heuristic. Notice that although the B&B algorithm gives an optimal solution for the cluster assignment problem, the alternating minimization algorithm coupled with the exact B&B algorithm is not guaranteed to produce an optimal solution for an explicit sparsity constrained overlapping clustering problem. The experiment is conducted on seven synthetic datasets, including one large dataset. The data generation procedure is: (i) randomly generate k representative vectors of d attributes with element values ranging from 1 to 50; (ii) randomly generate a binary cluster membership matrix S with each row containing no more than δ elements with the value 1; (iii) construct a data matrix from the representative vectors and the cluster membership matrix S (a code sketch of this procedure is given after the dataset list below). The seven datasets are generated with different parameter settings:

  Dataset 1: n = 20, d = 5, k = 4, and δ = 2;
  Dataset 2: n = 40, d = 10, k = 6, and δ = 3;
  Dataset 3: n = 60, d = 15, k = 8, and δ = 4;
  Dataset 4: n = 80, d = 20, k = 10, and δ = 5;
  Dataset 5: n = 100, d = 25, k = 12, and δ = 6;
  Dataset 6: n = 120, d = 30, k = 14, and δ = 7;
  Dataset 7: n = 10,000, d = 100, k = 20, and δ = 10.
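The following is a minimal sketch of that generation procedure (our own code; the paper does not state whether noise is added to the constructed matrix, whether the representative values are integers, or exactly how many clusters each row receives, so this sketch assumes a noise-free reconstruction, real-valued representatives in [1, 50], and between 1 and δ assignments per row).

```python
import numpy as np

def make_synthetic(n, d, k, delta, seed=0):
    """Synthetic data following the procedure in Experiment 1:
    (i) random representatives, (ii) a binary membership matrix with at most
    delta ones per row, (iii) A reconstructed as the mean of each point's
    assigned representatives."""
    rng = np.random.default_rng(seed)
    C = rng.uniform(1, 50, size=(k, d))                  # step (i): representative vectors
    S = np.zeros((n, k), dtype=int)
    for i in range(n):                                   # step (ii): cluster memberships
        m_i = rng.integers(1, delta + 1)                 # between 1 and delta assignments
        S[i, rng.choice(k, size=m_i, replace=False)] = 1
    A = (S / S.sum(axis=1, keepdims=True)) @ C           # step (iii): data matrix
    return A, S, C

# e.g. Dataset 1 from the paper: A, S, C = make_synthetic(n=20, d=5, k=4, delta=2)
```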
Dataset 7 is of large size and is used to test the scalability of our proposed algorithms. We compare the alternating minimization algorithm coupled with the B&B algorithm and the same algorithm coupled with the SA heuristic in terms of approximation error and computational time. For both algorithms, we assume δ is known and use it to regulate the form of the decomposed binary coefficient matrix. For the SA heuristic, the maximum number of iterations is k^2. The approximation error is defined as error = ||A - Â||_2 / ||A||_2, where A is the original data matrix and Â is the reconstructed data matrix. Results are plotted in Figures 2 and 3. As the alternating minimization algorithm coupled with the B&B algorithm cannot return a result for Dataset 7 within a limited time, Figure 2 provides the comparison only on Datasets 1-6. We observe that the alternating minimization algorithm coupled with the B&B algorithm does outperform the alternating minimization algorithm coupled with the SA heuristic. However, the performance of the alternating minimization algorithm coupled with the SA heuristic is satisfactory; in many cases, the approximation error is very small.

Figure 2: Comparison w.r.t. Approximation Error
Figure 3: Comparison w.r.t. Computational Time
Figure 4: Relation Between λ and δ

Figure 3 provides the comparison of computational time for all datasets except Dataset 7. The alternating minimization algorithm coupled with the SA heuristic takes 1,849 seconds to cluster Dataset 7, which has 10,000 records, and the resulting approximation error is very small. This result validates the scalability of our proposed alternating minimization algorithm coupled with the SA heuristic. We also observe that when the data size is small, the two algorithms are comparable. However, the computational time required by the alternating minimization algorithm coupled with the B&B algorithm grows exponentially with the data size; for a data matrix with 120 records and 30 attributes, it takes about 60 seconds. For the alternating minimization algorithm coupled with the SA heuristic, the required computational time grows slowly with the data size. The underlying reason is that the SA heuristic runs in polynomial time, while the B&B algorithm is an exact algorithm and in the worst case can take as long as an exhaustive search. Figures 2 and 3 suggest that the B&B algorithm is suited for small-scale problems and the SA heuristic is good for large-scale problems.

Experiment 2. The second experiment investigates the relation between the explicit and implicit sparsity constrained overlapping clustering models. The experiment is conducted on a synthetic dataset generated in the same way as in the first experiment, with parameter setting n = 120, d = 30, k = 7, and δ = 3. We run the alternating minimization algorithm for implicit clustering, coupled with the SA heuristic, for each value of λ ranging from 0 to 1.8. The maximum number of iterations is set to k^2. For each λ value, we record the maximum number of cluster assignments in the clustering result, denoted δ', which is not necessarily the real cluster assignment limit δ. The results are plotted in Figure 4. There are two observations. First, δ' decreases as the value of λ increases, because a larger λ imposes a heavier penalty on the total number of cluster assignments. Second, for most values of λ ranging from 0.2 to 0.5, δ' is 3, which is the true cluster assignment limit; δ' goes to 4 when λ is 0.3 because the alternating minimization algorithm does not necessarily find the global optimum.

Experiment 3. The third experiment evaluates the soundness of our sparsity constrained overlapping clustering approach. Specifically, we compare our alternating minimization algorithm coupled with the SA heuristic for the explicit sparsity constrained overlapping clustering model against the k-extended algorithm for the conventional overlapping clustering model. We assume the true cluster assignment limit is known to our algorithm. In the SA heuristic, the maximum number of iterations is set to the square of the cluster assignment limit. The experiment is conducted on both synthetic datasets and a real dataset, the MovieLens dataset. The synthetic dataset generation procedure is the same as before, with the following parameter settings: (1) small-synthetic: n = 75, d = 30, k = 10, and δ = 3; (2) medium-synthetic: n = 200, d = 50, k = 10, and δ = 5; (3) large-synthetic: n = 1000, d = 150, k = 30, and δ = 10. The MovieLens dataset consists of ratings and tags for movies by users. We generate three rating matrices: (1) small-real: 100 movies and 38 users; (2) medium-real: 150 movies and 12 users; (3) large-real: 200 movies and 7 users.
Because most users rate only a small portion of the movies, when many movies are considered we can find only a few users who have rated all of the selected movies. The tags list the genres of every movie. According to the real data, a movie can belong to up to six genres, so we use six as the cluster assignment limit for our explicit sparsity constrained overlapping clustering model. We adopt the comparison measures employed in [6]: precision, recall, and F-measure, calculated over pairs of points.

For each pair of points that share at least one cluster in the overlapping clustering results, these measures estimate whether the prediction that the pair belongs to the same cluster is correct with respect to the underlying true categories in the data. Precision is calculated as the fraction of pairs correctly put in the same cluster, recall is the fraction of actual pairs that were identified, and F-measure is the harmonic mean of precision and recall. A comparison of the results is provided in Figure 5.

Figure 5: Comparison of results on all datasets (F-measure, precision, and recall of Sparse OC and K-Extended on the small, medium, and large synthetic and real datasets)

For the synthetic datasets, the explicit sparsity constrained overlapping clustering model outperforms the k-extended model with respect to every clustering comparison measure. For the real datasets, the performance of our model is significantly better than the k-extended model with respect to F-measure and recall, and is comparable to the k-extended model with respect to precision. The experimental results validate the soundness of our sparsity constrained overlapping clustering model.

IX. CONCLUSION

This paper studies the problem of overlapping clustering with sparseness constraints. Specifically, it proposes two new methods, explicit sparsity constrained overlapping clustering and implicit sparsity constrained overlapping clustering, which respectively incorporate explicit and implicit sparseness constraints into overlapping clustering. In addition, we propose alternating minimization algorithms to solve these two problems. Furthermore, as the cluster assignment step in both of these algorithms is NP-hard, we propose an efficient branch-and-bound exact algorithm and a simulated annealing heuristic. Experimental results show that our methods perform better than the existing overlapping clustering method.

REFERENCES

[1] E. Segal, A. Battle, and D. Koller, Decomposing gene expression into cellular processes, in Proc. of the 8th Pacific Symposium on Biocomputing (PSB).
[2] G. Cleuziou, An extended version of the k-means method for overlapping clustering, in Proc. International Conference on Pattern Recognition (ICPR), pp. 1-4.
[3] L. Lazzeroni and A. Owen, Plaid models for gene expression data, Statistica Sinica, vol. 12.
[4] J. C. Bezdek, Pattern Recognition with Fuzzy Objective Function Algorithms. Norwell, MA, USA: Kluwer Academic Publishers.
[5] H. Lu, J. Vaidya, and V. Atluri, Optimal Boolean matrix decomposition: Application to role engineering, in ICDE '08: Proceedings of the 2008 IEEE 24th International Conference on Data Engineering, Washington, DC, USA, IEEE Computer Society.
[6] A. Banerjee, C. Krumpelman, and J. Ghosh, Model-based overlapping clustering, in KDD, ACM Press.
[7] L. Badea and D. Tilivea, Sparse factorizations of gene expression data guided by binding data, in Pacific Symposium on Biocomputing.
[8] Q. Fu and A. Banerjee, Multiplicative mixture models for overlapping clustering, in ICDM.
[9] H. Zhu, G. Mateos, G. Giannakis, N. Sidiropoulos, and A. Banerjee, Sparsity-cognizant overlapping co-clustering for behavior inference in social networks, in Acoustics, Speech and Signal Processing (ICASSP), 2010 IEEE International Conference on, March 2010.
[10] R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, vol. 58.
[11] P. O. Hoyer, Non-negative matrix factorization with sparseness constraints, J. Mach. Learn. Res., vol. 5.
[12] M. Heiler, C. Schnorr, P. Bennett, and E. Parrado-Hernandez,
Learning sparse representations by non-negative matrix factorization and sequential cone programming, Journal of Machine Learning Research, vol. 7, 2006.
[13] T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein, Introduction to Algorithms. MIT Press and McGraw-Hill.
[14] J. Besag, P. Green, D. Higdon, and K. Mengersen, Bayesian computation and stochastic systems, Statistical Science, vol. 10, pp. 3-67.


More information

Optimal Workload-based Weighted Wavelet Synopses

Optimal Workload-based Weighted Wavelet Synopses Optmal Workload-based Weghted Wavelet Synopses Yoss Matas School of Computer Scence Tel Avv Unversty Tel Avv 69978, Israel matas@tau.ac.l Danel Urel School of Computer Scence Tel Avv Unversty Tel Avv 69978,

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

LECTURE : MANIFOLD LEARNING

LECTURE : MANIFOLD LEARNING LECTURE : MANIFOLD LEARNING Rta Osadchy Some sldes are due to L.Saul, V. C. Raykar, N. Verma Topcs PCA MDS IsoMap LLE EgenMaps Done! Dmensonalty Reducton Data representaton Inputs are real-valued vectors

More information

R s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes

R s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes SPH3UW Unt 7.3 Sphercal Concave Mrrors Page 1 of 1 Notes Physcs Tool box Concave Mrror If the reflectng surface takes place on the nner surface of the sphercal shape so that the centre of the mrror bulges

More information

Cost-efficient deployment of distributed software services

Cost-efficient deployment of distributed software services 1/30 Cost-effcent deployment of dstrbuted software servces csorba@tem.ntnu.no 2/30 Short ntroducton & contents Cost-effcent deployment of dstrbuted software servces Cost functons Bo-nspred decentralzed

More information

On Some Entertaining Applications of the Concept of Set in Computer Science Course

On Some Entertaining Applications of the Concept of Set in Computer Science Course On Some Entertanng Applcatons of the Concept of Set n Computer Scence Course Krasmr Yordzhev *, Hrstna Kostadnova ** * Assocate Professor Krasmr Yordzhev, Ph.D., Faculty of Mathematcs and Natural Scences,

More information

Wishing you all a Total Quality New Year!

Wishing you all a Total Quality New Year! Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma

More information

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT 3. - 5. 5., Brno, Czech Republc, EU APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT Abstract Josef TOŠENOVSKÝ ) Lenka MONSPORTOVÁ ) Flp TOŠENOVSKÝ

More information

Brave New World Pseudocode Reference

Brave New World Pseudocode Reference Brave New World Pseudocode Reference Pseudocode s a way to descrbe how to accomplsh tasks usng basc steps lke those a computer mght perform. In ths week s lab, you'll see how a form of pseudocode can be

More information

A fast algorithm for color image segmentation

A fast algorithm for color image segmentation Unersty of Wollongong Research Onlne Faculty of Informatcs - Papers (Arche) Faculty of Engneerng and Informaton Scences 006 A fast algorthm for color mage segmentaton L. Dong Unersty of Wollongong, lju@uow.edu.au

More information

Reducing Frame Rate for Object Tracking

Reducing Frame Rate for Object Tracking Reducng Frame Rate for Object Trackng Pavel Korshunov 1 and We Tsang Oo 2 1 Natonal Unversty of Sngapore, Sngapore 11977, pavelkor@comp.nus.edu.sg 2 Natonal Unversty of Sngapore, Sngapore 11977, oowt@comp.nus.edu.sg

More information

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems

A Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems A Unfed Framework for Semantcs and Feature Based Relevance Feedback n Image Retreval Systems Ye Lu *, Chunhu Hu 2, Xngquan Zhu 3*, HongJang Zhang 2, Qang Yang * School of Computng Scence Smon Fraser Unversty

More information

Machine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law)

Machine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law) Machne Learnng Support Vector Machnes (contans materal adapted from talks by Constantn F. Alfers & Ioanns Tsamardnos, and Martn Law) Bryan Pardo, Machne Learnng: EECS 349 Fall 2014 Support Vector Machnes

More information

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces

Range images. Range image registration. Examples of sampling patterns. Range images and range surfaces Range mages For many structured lght scanners, the range data forms a hghly regular pattern known as a range mage. he samplng pattern s determned by the specfc scanner. Range mage regstraton 1 Examples

More information

Topology Design using LS-TaSC Version 2 and LS-DYNA

Topology Design using LS-TaSC Version 2 and LS-DYNA Topology Desgn usng LS-TaSC Verson 2 and LS-DYNA Wllem Roux Lvermore Software Technology Corporaton, Lvermore, CA, USA Abstract Ths paper gves an overvew of LS-TaSC verson 2, a topology optmzaton tool

More information

Load Balancing for Hex-Cell Interconnection Network

Load Balancing for Hex-Cell Interconnection Network Int. J. Communcatons, Network and System Scences,,, - Publshed Onlne Aprl n ScRes. http://www.scrp.org/journal/jcns http://dx.do.org/./jcns.. Load Balancng for Hex-Cell Interconnecton Network Saher Manaseer,

More information

Empirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap

Empirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap Int. Journal of Math. Analyss, Vol. 8, 4, no. 5, 7-7 HIKARI Ltd, www.m-hkar.com http://dx.do.org/.988/jma.4.494 Emprcal Dstrbutons of Parameter Estmates n Bnary Logstc Regresson Usng Bootstrap Anwar Ftranto*

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

For instance, ; the five basic number-sets are increasingly more n A B & B A A = B (1)

For instance, ; the five basic number-sets are increasingly more n A B & B A A = B (1) Secton 1.2 Subsets and the Boolean operatons on sets If every element of the set A s an element of the set B, we say that A s a subset of B, or that A s contaned n B, or that B contans A, and we wrte A

More information

Load-Balanced Anycast Routing

Load-Balanced Anycast Routing Load-Balanced Anycast Routng Chng-Yu Ln, Jung-Hua Lo, and Sy-Yen Kuo Department of Electrcal Engneerng atonal Tawan Unversty, Tape, Tawan sykuo@cc.ee.ntu.edu.tw Abstract For fault-tolerance and load-balance

More information

Parallel matrix-vector multiplication

Parallel matrix-vector multiplication Appendx A Parallel matrx-vector multplcaton The reduced transton matrx of the three-dmensonal cage model for gel electrophoress, descrbed n secton 3.2, becomes excessvely large for polymer lengths more

More information

Hybrid Heuristics for the Maximum Diversity Problem

Hybrid Heuristics for the Maximum Diversity Problem Hybrd Heurstcs for the Maxmum Dversty Problem MICAEL GALLEGO Departamento de Informátca, Estadístca y Telemátca, Unversdad Rey Juan Carlos, Span. Mcael.Gallego@urjc.es ABRAHAM DUARTE Departamento de Informátca,

More information

Graph-based Clustering

Graph-based Clustering Graphbased Clusterng Transform the data nto a graph representaton ertces are the data ponts to be clustered Edges are eghted based on smlarty beteen data ponts Graph parttonng Þ Each connected component

More information

Support Vector Machines. CS534 - Machine Learning

Support Vector Machines. CS534 - Machine Learning Support Vector Machnes CS534 - Machne Learnng Perceptron Revsted: Lnear Separators Bnar classfcaton can be veed as the task of separatng classes n feature space: b > 0 b 0 b < 0 f() sgn( b) Lnear Separators

More information

Intra-Parametric Analysis of a Fuzzy MOLP

Intra-Parametric Analysis of a Fuzzy MOLP Intra-Parametrc Analyss of a Fuzzy MOLP a MIAO-LING WANG a Department of Industral Engneerng and Management a Mnghsn Insttute of Technology and Hsnchu Tawan, ROC b HSIAO-FAN WANG b Insttute of Industral

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition

Optimal Design of Nonlinear Fuzzy Model by Means of Independent Fuzzy Scatter Partition Optmal Desgn of onlnear Fuzzy Model by Means of Independent Fuzzy Scatter Partton Keon-Jun Park, Hyung-Kl Kang and Yong-Kab Km *, Department of Informaton and Communcaton Engneerng, Wonkwang Unversty,

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

Data Representation in Digital Design, a Single Conversion Equation and a Formal Languages Approach

Data Representation in Digital Design, a Single Conversion Equation and a Formal Languages Approach Data Representaton n Dgtal Desgn, a Sngle Converson Equaton and a Formal Languages Approach Hassan Farhat Unversty of Nebraska at Omaha Abstract- In the study of data representaton n dgtal desgn and computer

More information

ON SOME ENTERTAINING APPLICATIONS OF THE CONCEPT OF SET IN COMPUTER SCIENCE COURSE

ON SOME ENTERTAINING APPLICATIONS OF THE CONCEPT OF SET IN COMPUTER SCIENCE COURSE Yordzhev K., Kostadnova H. Інформаційні технології в освіті ON SOME ENTERTAINING APPLICATIONS OF THE CONCEPT OF SET IN COMPUTER SCIENCE COURSE Yordzhev K., Kostadnova H. Some aspects of programmng educaton

More information

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton

More information

Biostatistics 615/815

Biostatistics 615/815 The E-M Algorthm Bostatstcs 615/815 Lecture 17 Last Lecture: The Smplex Method General method for optmzaton Makes few assumptons about functon Crawls towards mnmum Some recommendatons Multple startng ponts

More information

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 48 CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 3.1 INTRODUCTION The raw mcroarray data s bascally an mage wth dfferent colors ndcatng hybrdzaton (Xue

More information

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers 62626262621 Journal of Uncertan Systems Vol.5, No.1, pp.62-71, 211 Onlne at: www.us.org.u A Smple and Effcent Goal Programmng Model for Computng of Fuzzy Lnear Regresson Parameters wth Consderng Outlers

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

K-means and Hierarchical Clustering

K-means and Hierarchical Clustering Note to other teachers and users of these sldes. Andrew would be delghted f you found ths source materal useful n gvng your own lectures. Feel free to use these sldes verbatm, or to modfy them to ft your

More information

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem

An Efficient Genetic Algorithm with Fuzzy c-means Clustering for Traveling Salesman Problem An Effcent Genetc Algorthm wth Fuzzy c-means Clusterng for Travelng Salesman Problem Jong-Won Yoon and Sung-Bae Cho Dept. of Computer Scence Yonse Unversty Seoul, Korea jwyoon@sclab.yonse.ac.r, sbcho@cs.yonse.ac.r

More information

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics Introducton G10 NAG Fortran Lbrary Chapter Introducton G10 Smoothng n Statstcs Contents 1 Scope of the Chapter... 2 2 Background to the Problems... 2 2.1 Smoothng Methods... 2 2.2 Smoothng Splnes and Regresson

More information

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation

Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation Intellgent Informaton Management, 013, 5, 191-195 Publshed Onlne November 013 (http://www.scrp.org/journal/m) http://dx.do.org/10.36/m.013.5601 Qualty Improvement Algorthm for Tetrahedral Mesh Based on

More information

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc.

BioTechnology. An Indian Journal FULL PAPER. Trade Science Inc. [Type text] [Type text] [Type text] ISSN : 0974-74 Volume 0 Issue BoTechnology 04 An Indan Journal FULL PAPER BTAIJ 0() 04 [684-689] Revew on Chna s sports ndustry fnancng market based on market -orented

More information

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research

More information