Binary classification posed as a quadratically constrained quadratic programming and solved using particle swarm optimization


Sādhanā Vol. 41, No. 3, March 2016, pp. 289–298. © Indian Academy of Sciences

Binary classification posed as a quadratically constrained quadratic programming and solved using particle swarm optimization

DEEPAK KUMAR and A G RAMAKRISHNAN
Medical Intelligence and Language Engineering (MILE) Laboratory, Department of Electrical Engineering, Indian Institute of Science, Bangalore 560012, India
e-mail: dpkmr@gmail.com; ramkiag@ee.iisc.ernet.in

MS received 3 October 2014; revised 9 August 2015; accepted 5 December 2015

Abstract. Particle swarm optimization (PSO) is used in several combinatorial optimization problems. In this work, particle swarms are used to solve quadratic programming problems with quadratic constraints. The central idea is to use PSO to move in the direction towards the optimal solution rather than searching the entire feasible region. Binary classification is posed as a quadratically constrained quadratic problem and solved using the proposed method. Each class in the binary classification problem is modeled as a multidimensional ellipsoid to form a quadratic constraint in the problem. Particle swarms help in determining the optimal hyperplane or classification boundary for a data set. Our results on the Iris, Pima, Wine, Thyroid, Balance, Bupa, Haberman, and TAE datasets show that the proposed method works better than a neural network and the performance is close to that of a support vector machine.

Keywords. Quadratic programming; particle swarm optimization; hyperplane; quadratic constraints; binary classification.

1. Introduction

A class of algorithms originated for minimizing or maximizing a function f(x) while satisfying some constraints g(x). In the early history of optimization, the function f(x) and the constraints g(x) were linear, and the problem was known as linear programming (LP). One of the earliest algorithms for solving the linear programming problem was given by Dantzig, popularly known as the simplex method [1]. As the number of dimensions and constraints increased, solving the LP problem using the simplex method became hard; the inability of the simplex method was that it could not solve the LP problem in polynomial time.
Khachiyan proposed the ellipsoid algorithm as an alternative to the simplex method and proved that it could reach the solution iteratively in polynomial time [2]. The practical infeasibility of the ellipsoid algorithm led to the evolution of several interior or barrier point methods. One of the well-known interior point methods is Karmarkar's method, proposed by Narendra Karmarkar [3].

Binary classification is one of the active research areas in machine learning [4, 5]. There are several ways to train a binary classifier. The features and the class labels of the training data set can be stored and retrieved during classification using the nearest neighbor approach [6]. A hyperplane can be learnt for classification by training a neural network, which may not always be optimal [7]. Vapnik and others formulated the problem of classification as optimization; this method is known as support vector machines (SVMs) [8]. Sequential minimal optimization (SMO) is a technique which solves the optimization problem in SVMs [9]. Decision trees, bagging, and boosting techniques are also used in binary classification [5, 10–12].

1.1 Motivation

The nearest neighbor method does not involve any modeling to reduce the storage of the training data set [6, 13]. On the other hand, a neural network and an SVM model the data with an objective function to estimate a hyperplane, which is used for classification. In the neural network approach, the objective function is a least squares criterion, which is quadratic in nature and is minimized for the given data set. The hyperplane obtained by training a neural network may or may not be optimal, since it depends on the number of layers and weights used to train the network. An SVM uses a quadratic programming formulation with linear constraints for minimizing the objective function [13]. Even though the objective function used in a neural network or an SVM is a quadratic programming problem, the constraints are linear. If there is a way to model linear constraints as quadratic constraints, then the objective function becomes a quadratically constrained quadratic program (QCQP).
In this paper, binary classification is posed as a QCQP problem and

a novel solution is proposed using particle swarm optimization (PSO). One of the advantages of this approach is that it solves the QCQP problem without the need for gradient estimation.

The paper is organized as follows: QCQP and PSO are described in the background section, and the solution of quadratically constrained quadratic programming using particle swarms is described in Section 3. The proposed method is compared with Khachiyan's and Karmarkar's algorithms for linear programming, and with neural networks and SVM for quadratic programming, in Section 4 on experiments and results. Section 5 concludes the paper with suggestions for future work.

2. Background

The formulations of QCQP and PSO are described in the following subsections.

2.1 Quadratically constrained quadratic programming

An optimization problem has the form [14]:

minimize f(x)
subject to g_i(x) <= b_i, i = 1, 2, ..., m,    (1)

where x = {x_1, x_2, ..., x_n} is the variable of the problem. The function f(x) : R^n -> R is the objective function. The functions g_i(x) : R^n -> R, i = 1, 2, ..., m are the constraint functions and the constants b_1, b_2, ..., b_m are the limits for the constraints. When the objective function is quadratic and the constraints are quadratic, the optimization problem becomes a quadratically constrained quadratic program. A general quadratically constrained quadratic programming problem is expressed as follows [14–16]:

minimize f(x) = x^T P_0 x + 2 q_0^T x + r_0
subject to x^T P_i x + 2 q_i^T x + r_i <= 0, i = 1, 2, ..., m,    (2)

where x = {x_1, x_2, ..., x_n} ∈ R^n. P_0 ∈ R^(n×n), q_0 ∈ R^n, and r_0 ∈ R define the quadratic function f(x). P_i ∈ R^(n×n), q_i ∈ R^n, and r_i ∈ R, i = 1, 2, ..., m are the constants of the quadratic constraints. If P_i are positive semi-definite matrices for i = 0, 1, 2, ..., m, then the problem becomes a convex QCQP. When P_0 is zero, the problem becomes linear programming with quadratic constraints.

2.2 Particle swarm optimization

Particle swarm optimization was proposed for optimizing the weights of a neural network [17]. PSO has been applied to numerous applications for optimizing non-linear functions [18–20].
PSO evolved by simulating bird flocking and fish schooling. The advantages of PSO are that it is simple in conception and easy to implement. Particles are deployed in the search space and each particle is evaluated against an optimization function. The best particle is chosen as a directing agent for the rest. The velocity of each particle is controlled by both the particle's personal best and the global best. During the movement of the particles, a few of them may reach the global best; several iterations are required to reach it.

Let X = {x_1, x_2, ..., x_k} be the particles deployed in the search space of the optimization function, where k is the number of particles, and let V = {v_1, v_2, ..., v_k} be the velocities of the respective particles, with x_i, v_i ∈ R^n for all the k particles. A simple PSO update is as follows.

Velocity update equation:

v_i^j = w v_i^(j-1) + c_1 r_1 (x_i^b - x_i^(j-1)) + c_2 r_2 (x^bg - x_i^(j-1)),    (3)

where w is the weight for the previous velocity; c_1, c_2 are constants and r_1, r_2 are random values varied in each iteration. x_i^b is the personal best value for particle i and x^bg is the global best value among all the particles. v_i^j is the updated velocity of the i-th particle in the j-th iteration and v_i^(j-1) is its velocity in the (j-1)-th iteration. x_i^(j-1) is the position of the i-th particle after the (j-1)-th iteration. For the position update, the updated velocity is added to the existing position of the particle. The position update equation is

x_i^j = x_i^(j-1) + v_i^j.    (4)

Table 1 presents a pseudo code of particle swarm optimization. Initially, the particle with the minimum f(x) is considered as the global best. After every iteration, the global best value is updated if necessary. Also, the personal best value of each particle is updated.

3. Proposed method

Our interest lies in determining the shortest path between two non-intersecting ellipsoids to perform binary classification. The shortest path between the two ellipsoids in R^n can be formulated as a convex QCQP problem.
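As a concrete illustration, the update rules (3) and (4), with the personal-best and global-best bookkeeping of Table 1, can be sketched in Python. This is an illustrative re-implementation rather than the authors' MATLAB code; the sphere objective, swarm size, and coefficient values are assumed for the demonstration:

```python
import random

def pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain PSO: velocity update (3) followed by position update (4)."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                 # personal best positions x_i^b
    pbest_f = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best x^bg
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()   # fresh random values each iteration
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]                # position update, Eq. (4)
            fx = f(X[i])
            if fx < pbest_f[i]:                   # update personal best
                pbest[i], pbest_f[i] = X[i][:], fx
                if fx < gbest_f:                  # update global best
                    gbest, gbest_f = X[i][:], fx
    return gbest, gbest_f

# Minimize the sphere function f(x) = sum of x_d^2, whose optimum is 0 at the origin.
best, val = pso(lambda x: sum(t * t for t in x), dim=2)
```

With these settings the swarm contracts onto the minimum at the origin; the same loop structure carries over to the constrained variants described later.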
3.1 Formulation

Arriving at the two end points of the shortest path, one on each ellipsoid, can be posed as minimization in a quadratic form and formulated as follows:

minimize f_1(x) = (x - y)^T P_0 (x - y)
subject to x^T P_1 x <= 1, x ∈ X,
           y^T P_2 y <= 1, y ∈ Y,    (5)

Table 1. Pseudo code of PSO.

Inputs: objective f(x), number of particles k and maximum number of iterations T; initialize the parameters w, c_1, c_2, the positions x_i and the velocities v_i, and set j = 1
Outputs: global best value x^bg
  for i = 1 to k
    if f(x_i) < f(x^bg) then x^bg = x_i end if
  end for
  while j <= T
    Velocity update step: randomly choose values for r_1, r_2 in the range 0 to 1; then update the velocity of each particle using Eq. (3).
    Position update step: add the updated velocity to the existing position, as in Eq. (4).
    for i = 1 to k
      if f(x_i) < f(x_i^b) then x_i^b = x_i end if
      if f(x_i) < f(x^bg) then x^bg = x_i end if
    end for
  end while

where x, y ∈ R^n and X, Y are non-intersecting regions in R^n, X ∩ Y = ∅. P_1 and P_2 are the matrices that characterize the ellipsoids modeling the two classes. In case the Euclidean distance metric is used for minimization of the path length, P_0 is the identity matrix. The modified equation is

minimize f_2(x) = ||x - y||^2
subject to x^T P_1 x <= 1, x ∈ X,
           y^T P_2 y <= 1, y ∈ Y.    (6)

Thus, in this paper, we address only this limited problem for binary classification, rather than the more general problem given by Eq. (2).

3.2 Solution using modified PSO

Suppose one end point (y) of the shortest path is known and fixed as a. Then, Eq. (6) can be reformulated as

minimize f_3(x) = ||a - x||^2
subject to x^T P_1 x <= 1, x ∈ X.    (7)

Figure 1 shows the ellipsoid with the covariance matrix P_1, with points x_i (inside), x_B (on the boundary) and a (outside). x_B is the boundary point nearest to the point a outside the ellipsoid. We need to determine the unknown x_B by minimizing f_3(x). The novelty of the proposed approach is the application of particle swarms for solving the QCQP problem. Particle swarms are deployed within the ellipsoid to determine x_B. The function f_3(x) = ||a - x||^2 needs to be evaluated for each particle in the search space. Instead of using f_3(x) = ||a - x||^2, we use f_4(x) = ||a - x|| as the modified objective function. The particle with the minimum f_4(x) is considered as the closest to the point a. Only one particle of the swarm is shown in figure 1 for ease of representation. The value of the function f_4(x) for the i-th particle at the j-th iteration is

f_4(x_i^j) = ||a - x_i^j||.    (8)
Let the present position of the i-th particle be split into the previous position and the velocity vector of the particle using Eq. (4):

f_4(x_i^j) = ||a - x_i^(j-1) - v_i^j||.    (9)

Here, a is fixed and the particle position x_i^(j-1) is known, and the value depends on the velocity vector v_i^j. So, the possible option for minimizing the function is to change the velocity

vector of the particle towards the direction of (a - x_i^(j-1)). The arrow (for representation purposes) shown in figure 1 is in the direction of minimizing the function f_4(x). The function f_3(x) is also minimized when minimizing the function f_4(x).

Figure 1. An ellipsoid with a particle at a point x_i inside and a point x_B on its boundary nearest to the point a outside it (boundary point on the other ellipsoid). The dotted lines connecting the point a to the points x_i and x_B are shown. The desired direction of movement from x_i, which is required to reach the point x_B, is also shown by an arrow.

PSO is a population based stochastic optimization technique, which takes several iterations to reach the optimal value. The velocity vector update of the PSO algorithm given by Eq. (3) is modified by including the direction term from the function f_4(x). The addition of the evaluation function restricts the particle from moving away from the actual course towards the global best position. This addition provides an advantage in computation. However, it also constrains the particle to move in a particular direction. To counter this effect, we add a craziness term in the velocity update equation. The modified velocity update equation is

v_i^j = w v_i^(j-1) + c_1 r_1 (x_i^b - x_i^(j-1)) + c_2 r_2 (x^bg - x_i^(j-1)) + c_3 r_3 (a - x_i^(j-1)) + c_4 r_4,    (10)

where c_3 and c_4 are constants, and r_3 is a random value that is varied during each iteration. The value of c_3 is chosen such that the term (a - x_i^(j-1)) does not take the particle outside the ellipsoid. r_4 is a random point on the surface of a sphere of dimension n with randomly varying radius; c_4 r_4 forms the craziness term.

On the assumption that one end point of the shortest path is known, particle swarms are placed in the search space of the other region and the other end point of the shortest path is determined. In order to evaluate the objective function f_2(x) in Eq. (6) for binary classification, we need to determine one end point from region X and the other from region Y.
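A minimal sketch of this modified update, Eq. (10), is given below. It is an illustrative Python version, not the paper's MATLAB implementation; the unit disk (P_1 = identity) stands in for a general ellipsoid, and the fixed end point a and all coefficient values are assumed for the demonstration. Moves that would leave the feasible region are rejected, as described above:

```python
import math
import random

def modified_pso_closest(a, k=20, iters=400, seed=1,
                         w=0.05, c1=0.05, c2=0.05, c3=0.05, c4=0.1):
    """Find the feasible point closest to a, with the modified update (10)."""
    rng = random.Random(seed)
    dim = len(a)
    f4 = lambda x: math.dist(a, x)                     # f4(x) = ||a - x||
    feasible = lambda x: sum(t * t for t in x) <= 1.0  # x^T P_1 x <= 1, P_1 = I
    X = []
    while len(X) < k:                                  # deploy particles inside
        x = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
        if feasible(x):
            X.append(x)
    V = [[0.0] * dim for _ in range(k)]
    pbest = [x[:] for x in X]
    gbest = min(pbest, key=f4)[:]
    for _ in range(iters):
        for i in range(k):
            r1, r2, r3 = rng.random(), rng.random(), rng.random()
            u = [rng.gauss(0.0, 1.0) for _ in range(dim)]
            nu = math.hypot(*u) or 1.0
            rad = rng.random()                         # randomly varying radius
            r4 = [rad * t / nu for t in u]             # random point on a sphere
            cand = []
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d])
                           + c3 * r3 * (a[d] - X[i][d])   # pull toward a
                           + c4 * r4[d])                  # craziness term
                cand.append(X[i][d] + V[i][d])
            if feasible(cand):                  # reject moves leaving the region
                X[i] = cand
                if f4(X[i]) < f4(pbest[i]):
                    pbest[i] = X[i][:]
                    if f4(X[i]) < f4(gbest):
                        gbest = X[i][:]
    return gbest

xB = modified_pso_closest([3.0, 0.0])   # nearest feasible point is (1, 0)
```

For this toy instance the boundary point x_B should approach (1, 0), the point of the unit disk closest to a = (3, 0).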
Two sets of particle swarms, one for each region, are placed within the search spaces of the respective regions. The objective function f_4(x) is evaluated based on the particles present in both the regions. In every iteration, the best position of a particle in one region is used as the known end point a of the shortest path in the velocity update equation (10) of the particles of the other region. The objective function f_4(x) reaches its optimal value after several iterations; thus, the objective function f_2(x) also reaches its optimal value.

In the process of optimization, some particles may go out of the search space. To limit the particles to the search space, we inspect after every tentative position update whether the particle lies within the search space, by carrying out a check on the QCQP constraints. If the intended new position of a particle would violate the constraint, then its position is not updated. In other words, the particles likely to go out of the search space are redeployed back to their previous positions.

The proposed solution may be used in control system problems such as optimization of sensor networks or collaborative data mining, which are based on multiple agents or gradient projection [21]. The general consensus problem, where multiple agents need to reach a common goal, may also be solved using the proposed method [22–24]. Consensus or distributed optimization is discussed in [25] as consensus and

sharing using the alternating direction method of multipliers (ADMM). ADMM uses one agent for each constraint. Such solvers are used for solving SVM in a distributed format [26].

3.3 The pseudo code of the proposed method

Table 2 presents a pseudo code for the algorithm. The parameters w, c_1, c_2, c_3, and c_4 are set to fixed values and the randomly varying parameters r_1, r_2, r_3, and r_4 are updated in each iteration. The position, velocity, personal best, and global best of each particle are stored. The maximum number of iterations for the algorithm is specified as T in the experiments and the size of the swarm used in the algorithm is 10. The proposed algorithm is implemented in MATLAB.

4. Experiments and results

In this section, linear and quadratic programming problems with quadratic constraints are solved using the proposed method.

4.1 Linear programming with quadratic constraints

An LP problem with quadratic constraints is chosen in order to compare the convergence performance of our method with those of Khachiyan's and Karmarkar's methods. The LP problem is given below:

max f_5(x_1, x_2) = x_1 + x_2
subject to x_1^2 + x_2^2 <= 1, x_1, x_2 ∈ X,    (11)

where X is the region satisfying the constraint. This LP problem is reformulated as a QCQP problem:

max f_5(x) = a^T x
subject to x^T A x <= 1, x ∈ X,    (12)

where the vector a = [1 1]^T, x = [x_1 x_2]^T and A is the identity matrix of size 2 (positive definite). The solution of this LP problem is x_1 = x_2 = 0.7071, with f_5(x) = 1.4142. This LP problem is solved using Khachiyan's ellipsoid algorithm, Karmarkar's algorithm and our method. Figure 2 shows the value of the function f_5(x) at each iteration for a typical independent run of the experiment with 50 iterations. The length vector in Karmarkar's method is scaled by a variable δ [3]. Two values of δ, namely 0.05 and 0.50, are used for the evaluation of the function. As the value of δ increases, the algorithm reaches the optimal value of f_5(x) in a smaller number of iterations.
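The same reject-infeasible-moves strategy can be tried directly on the toy problem (11)–(12). The sketch below is an illustrative Python version with assumed parameter values, not the paper's MATLAB code; the known optimum is x_1 = x_2 = 1/sqrt(2) with f_5 = 1.4142:

```python
import math
import random

def solve_lp_qc(k=20, iters=300, seed=2, w=0.6, c1=1.5, c2=1.5):
    """Maximize f5(x) = x1 + x2 subject to x1^2 + x2^2 <= 1 via PSO."""
    rng = random.Random(seed)
    f5 = lambda x: x[0] + x[1]                       # objective of Eq. (11)
    feasible = lambda x: x[0] ** 2 + x[1] ** 2 <= 1.0
    X = []
    while len(X) < k:                                # deploy inside the disk
        x = [rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)]
        if feasible(x):
            X.append(x)
    V = [[0.0, 0.0] for _ in range(k)]
    pbest = [x[:] for x in X]
    gbest = max(pbest, key=f5)[:]
    for _ in range(iters):
        for i in range(k):
            r1, r2 = rng.random(), rng.random()
            cand = []
            for d in range(2):
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                cand.append(X[i][d] + V[i][d])
            if feasible(cand):          # redeploy-to-previous-position rule
                X[i] = cand
                if f5(X[i]) > f5(pbest[i]):
                    pbest[i] = X[i][:]
                    if f5(X[i]) > f5(gbest):
                        gbest = X[i][:]
    return gbest, f5(gbest)

x_opt, f_opt = solve_lp_qc()   # optimum is x1 = x2 = 1/sqrt(2), f5 = sqrt(2)
```

Because infeasible moves are rejected, every reported value satisfies the constraint, so f_opt can never exceed sqrt(2); the swarm climbs along the boundary towards it.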
Table 3 shows the error value of the function f_5(x) for all the algorithms after the 25th iteration, as shown in figure 2. The error values for the ellipsoid algorithm and our method are less than 10 × 10^-3. In our algorithm, the values of the variables w, c_1, c_2, and c_3 are set to small fixed values, so that the vector added to the position keeps the particles within the region X. Only the value of c_4 is varied, since it scales the neighboring search space of a particle. The results for two different values of c_4 are shown in figure 2. We carried out 50 independent runs of this LP problem with the two different values of c_4. The average, minimum, maximum and standard deviation of the error value are tabulated in table 4. The error values for c_4 = 0.20 are less than those for c_4 = 0.05, indicating that as the neighboring search space is scaled to a higher value, the error value of the function fit becomes better in a smaller number of iterations.

Table 2. Proposed algorithm for QCQP using PSO.

Inputs: objective f_4(x), constraint matrix P_1 and maximum number of iterations T; set j = 1 and initialize the parameters w, c_1, c_2, c_3, c_4, the positions x_i and the velocities v_i
Outputs: global best value x^bg
  while j <= T
    Function evaluation step: calculate the function f_4(x_i) for each particle x_i.
    Velocity update step: randomly choose values for r_1, r_2, r_3 in the range 0 to 1 and a random point r_4; then update the velocity of each particle as in Eq. (10).
    Position update step: add the updated velocity to the existing position, checking the constraint x^T P_1 x <= 1 on all the particles:
    for i = 1 to k
      if x_i^T P_1 x_i > 1 then redeploy x_i to its previous position end if
    end for
  end while

Figure 2. A plot of the evaluated value of f_5(x) against the number of iterations for the ellipsoid algorithm, Karmarkar's algorithm (δ = 0.05 and δ = 0.50) and our method (c_4 = 0.05 and c_4 = 0.20), for the LP problem given by Eq. (12). At around 25 iterations, all the algorithms except Karmarkar's reach a value close to the solution. We observe that Karmarkar's algorithm takes more iterations to reach the solution.

4.2 Binary classification of simulated data

Several variants of PSO have been applied to classification problems [27]. Generally, PSO is deployed to learn the rules for classification. A suitable classifier is chosen, for example, a neural network [7], and parameters such as the network weights are tuned using particle swarms. There are also discrete versions of swarms, which can take a finite set of values [28]. Here, swarms learn the rule for classifying the test samples. In our method, binary classification is posed as QCQP and swarms are used to solve this QCQP problem. The input feature space is considered as the particle swarm space.

Figure 3 shows a simulated data set for two classes synthesized using two Gaussian distributions with means μ_1 = [8; 10] and μ_2 = [10; 8] and the same covariance matrix, Σ_1 = Σ_2 = [2 1; 1 2]. A generic decision boundary for a binary classification problem is a hypersurface. The classes in the data are linearly separable, thus reducing the hypersurface to a hyperplane. The equation of a hyperplane for this kind of data is given by

z = w_1 x_1 + w_2 x_2 + w_0,    (13)

where w_1, w_2 are the weights for the individual features and w_0 is the bias of the hyperplane. This hyperplane equation is used for classification:

If z >= 0, (x_1, x_2) ∈ Ω_1;
if z < 0, (x_1, x_2) ∈ Ω_2,    (14)

where Ω_1 and Ω_2 are the regions for classes 1 and 2, respectively. The MATLAB programming platform is used to implement and test our method, and also an SVM and a neural network for comparison. We trained the SVM with a linear kernel and a neural network on this synthetic data.
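Such a data set can be synthesized by transforming standard normal samples with a Cholesky factor of the covariance matrix. The sketch below is an illustrative Python version (the sample count and seed are arbitrary choices), using the means and covariance stated above:

```python
import math
import random

def sample_gaussian(mu, L, n, rng):
    """Draw n samples from N(mu, Sigma), where Sigma = L L^T (Cholesky factor)."""
    out = []
    for _ in range(n):
        z = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
        out.append([mu[0] + L[0][0] * z[0],
                    mu[1] + L[1][0] * z[0] + L[1][1] * z[1]])
    return out

rng = random.Random(4)
# Cholesky factor of Sigma = [[2, 1], [1, 2]]: L L^T = Sigma.
L = [[math.sqrt(2.0), 0.0],
     [1.0 / math.sqrt(2.0), math.sqrt(1.5)]]
class1 = sample_gaussian([8.0, 10.0], L, 2000, rng)   # mu_1 = [8; 10]
class2 = sample_gaussian([10.0, 8.0], L, 2000, rng)   # mu_2 = [10; 8]
mean1 = [sum(s[d] for s in class1) / len(class1) for d in range(2)]
```

The sample means recovered from the generated points should lie close to the specified μ_1 and μ_2, which is what the classifier's ellipsoid-fitting step relies on.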
The hyperplanes learnt by the SVM and the neural network are shown in figure 3. A single layer perceptron with 2 weights, a bias and a log sigmoid transfer function is used, with 100 epochs, for training the neural network. The SMO algorithm is used for training the SVM with a linear kernel [5].

We now explain the determination of the optimal hyperplane for binary classification by our method. The values of the sample mean and covariance for the two classes are calculated from the simulated samples. The Mahalanobis distance [5] is determined from the mean value of a class to the data

Table 3. Comparison of the error value of the function f_5(x) of our algorithm with those of two other methods for the LP problem given by Eq. (12), after the 25th iteration.

Algorithm: Ellipsoid | Karmarkar δ = 0.05 | Karmarkar δ = 0.50 | Our method c_4 = 0.05 | Our method c_4 = 0.20
Error value:

Table 4. The minimum, mean, maximum and standard deviation of error values for the function f_5(x) over 50 independent runs of our algorithm for the LP problem defined by Eq. (12).

c_4 | Minimum | Mean | Maximum | Standard deviation

points of the other class, and the closest data point of the other class label is found. This closest point is used to fix the reference boundary of the search space (ellipsoid) of the other class. This process is repeated from the other class, and the boundary of the first class is also determined. With the boundaries, ellipsoidal regions are formed with the estimated means of the classes as the centers. This is posed as the QCQP problem formulated in Eq. (6), with the estimated covariance matrices, normalized to the boundary points, as P_1 and P_2. Our algorithm is implemented by placing particle swarms near the mean values of the classes and evaluating the optimization function for 300 iterations. The shortest path and the closest point on each boundary are estimated:

minimize (x - y)^T (x - y)
subject to x^T E[(x - μ_1)(x - μ_1)^T]^(-1) x <= 1, x ∈ X,
           y^T E[(y - μ_2)(y - μ_2)^T]^(-1) y <= 1, y ∈ Y.    (15)

Eq. (15) is optimized using the proposed algorithm. The intersection between the two ellipsoidal regions is assumed to be empty, to eliminate a situation where the particles reach different solutions. Once the closest points on the boundaries are determined, the perpendicular bisector of the line joining the closest points is the hyperplane. The hyperplane obtained for binary classification is shown in figure 3. We can observe that the optimal hyperplanes calculated by the SMO algorithm and our method are closely placed, while the other hyperplanes are not optimal. The estimated weight and bias values for each method are tabulated in table 5. The estimated weight values can be normalized as

w̄_1 = w_1 / sqrt(w_1^2 + w_2^2),
w̄_2 = w_2 / sqrt(w_1^2 + w_2^2).    (16)

The normalized weights of the SVM and our method are equal.

4.3 Performance on real datasets

Our algorithm is also tested on some of the datasets available from the UCI ML repository [29].
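The final construction, from the two closest boundary points to the hyperplane of Eqs. (13), (14) and (16), reduces to a perpendicular bisector. In the sketch below the two boundary points are hypothetical values, not results from the paper's experiments:

```python
import math

def bisector_hyperplane(xB, yB):
    """Perpendicular bisector of the segment joining the closest boundary
    points xB (class 1) and yB (class 2): weights w = xB - yB and bias
    w0 = -w . midpoint, matching the hyperplane form of Eq. (13)."""
    w = [a - b for a, b in zip(xB, yB)]
    mid = [(a + b) / 2.0 for a, b in zip(xB, yB)]
    w0 = -sum(wi * mi for wi, mi in zip(w, mid))
    return w, w0

def classify(x, w, w0):
    """Decision rule (14): z >= 0 -> class 1, z < 0 -> class 2."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + w0
    return 1 if z >= 0 else 2

# Hypothetical closest points on the two class ellipsoids.
w, w0 = bisector_hyperplane([9.0, 9.5], [9.5, 9.0])
# Normalization of the weights as in Eq. (16).
norm = math.hypot(*w)
w_bar = [wi / norm for wi in w]
```

Points on the class-1 side of the bisector (e.g. near the class-1 mean) get z >= 0, and points on the class-2 side get z < 0, as in the decision rule (14).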
The datasets used in our experiments are Iris, Wine, Pima, Thyroid, Balance, Bupa, Haberman, and TAE. These eight datasets have been chosen based on the consideration of a minimal number of datasets with the maximum coverage of the different types of attributes (namely, binary, categorical values as integers, and real values). Further, the number of classes in each case is 2 or 3, so the maximum number of hyperplanes to be obtained for any of these datasets is limited to three.

Figure 3. Hyperplanes obtained by SVM, neural networks and our method for a synthetic dataset with two classes. The hyperplane estimated by our method is closely aligned with that obtained by the SVM with a linear kernel. The other hyperplanes are arrived at by two neural networks (perceptrons) and are not optimal.

Table 5. Weights and bias values calculated by the SVM with a linear kernel, the neural networks (perceptrons) and our method for the binary classification problem with synthetic data.

Algorithm | w_1 | w_2 | w_0
Neural network 1 |
Neural network 2 |
SVM |
Our method |

The main characteristics of these datasets are tabulated in table 6. The cross-validation performance on these datasets is compared with those of an SVM with linear and RBF kernels, a neural network, and GSVM. GSVM [30] estimates a classification hyperplane and is developed on the fundamentals of optimizing the SVM problem. GSVM modifies the bias (w_0) obtained from SVM by moving the hyperplane such that the geometric mean of recall and precision is improved.

4.3a Iris dataset: The Iris dataset consists of four different measurements on 150 samples of iris flowers. There are 50 samples of each of three different species of iris forming the dataset. The features, whose values are available from the dataset, are the lengths and widths of leaves and petals of different iris plants. Out of the three species, two are not linearly separable from each other, whereas the third is linearly separable from the rest of the species. The classification task is to determine the species of the iris plant, given the four-dimensional feature vector.

4.3b Wine dataset: The Wine dataset contains the different physical and chemical properties of three different types of wines derived from three different strains of plants. Some of the physical properties, such as hue and color intensity, have integer values, whereas chemical properties such as ash or phenol content have real values. The feature vector has 13 dimensions and there are a total of 178 samples. The classification task is to determine the type of wine, given the values of the content of the 13 physical and chemical components.

4.3c Pima Indians diabetes dataset: The Pima dataset contains eight different parameters measured from 768 adult females of Pima Indian heritage. Once again, some of them are integer valued, such as age and number of pregnancies. Certain other parameters, such as serum insulin, are real valued. It is a two-class classification problem of identifying normal and diabetic subjects, given the eight-dimensional feature vector as input.

4.3d Thyroid disease dataset: This dataset contains ten distinct databases of different dimensions. The particular database chosen for our study contains five different parameters measured from 215 individuals. Some of the variables have binary values, while others have real values. The classification task is to assign an individual to one of three classes, given the five-dimensional feature vector as input.

Table 6. The characteristics of datasets chosen for experimentation from the UCI ML repository [29] with varied nature of attributes.

Dataset | No. of samples | No. of attributes | No. of classes | Nature of attributes
Iris | 150 | 4 | 3 | Real
Wine | 178 | 13 | 3 | Real and integer
Pima | 768 | 8 | 2 | Real and integer
Thyroid | 215 | 5 | 3 | Real and binary
Balance | 625 | 4 | 3 | Integer
Bupa | 345 | 6 | 2 | Real and integer
TAE | 151 | 5 | 3 | Integer
Haberman | 306 | 3 | 2 | Integer

4.4 Balance scale weight and distance dataset

The dataset consists of 625 samples from a modeled psychological experiment. There are four attributes: the left weight, the left distance, the right weight, and the right distance. The classification task is to assign the sample to one of three classes, namely, balance tips to the right, balance tips to the left, and balanced.

4.5 BUPA liver disorders dataset

The Bupa dataset consists of 345 samples related to liver disorders. There are six attributes in the dataset, where the first five variables are all blood test values thought to be sensitive to liver disorders that might arise from excessive alcohol consumption. The last attribute is related to the number of drinks consumed. The classification task is to decide whether or not a sample indicates a liver disorder.

4.6 Teaching assistant evaluation dataset

This dataset contains the evaluations of 151 teaching assistants (TA) for their teaching performance over three regular semesters and two summer semesters at the Statistics Department of the University of Wisconsin-Madison. The attributes are categorical.
The classification task is to assign a teaching assistant to one of the following three performance categories: low, medium, and high.

4.7 Haberman's survival dataset

This dataset consists of 306 patients who underwent surgery for breast cancer. The attributes are numerical: age, year of operation, and the number of positive axillary nodes detected. The classification task is to determine whether or not the patient survived more than 5 years.

Table 7. Cross-validation (CV) error (in %) using our method for different datasets from the UCI ML repository [29], compared with those reported in [30] and our own implementations of SVM and neural network.

Dataset | CV error by our method | CV error in SVM with linear kernel | CV error in neural network | CV error in SVM with RBF kernel | CV error in GSVM
Iris |
Wine |
Pima |
Thyroid |
Balance |
Bupa |
TAE |
Haberman |

4.7a Data projection: We perform a preprocessing step on these real datasets. This step is necessary since some of the attributes of a dataset have larger variance than others, which may result in a skewed ellipsoid formed at the mean value of the dataset; moreover, the number of samples available in the datasets is small. To overcome these two problems, we perform the two steps given below.

Eigenvalue decomposition is performed on the dataset covariance matrix. The dataset is projected onto these eigenvectors, and scaling is performed in such a way that each component of the new projected dataset has unit variance. This step is performed independently on each of the subsets used in the cross-validation stage.

We project the subset of samples into two dimensions. The first is the direction of the vector joining the sample means of the classes. The equation of this vector is

p = (μ_1 - μ_2),    (17)

where p is the projection vector, and μ_1 and μ_2 are the sample means of classes 1 and 2, respectively. The second direction is that of the eigenvector corresponding to the largest eigenvalue of the covariance matrix of the subset. The projected samples are used in the estimation of new sample means and covariance matrices. The estimated hyperplane is used to classify the test samples.

4.7b Cross-validation: These datasets do not have separate training and test samples; hence, we perform cross-validation. In cross-validation, a small subset is used for testing while the remaining samples are used for training.
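These two projection directions can be sketched as follows. Power iteration is used here as a pure-Python stand-in for the full eigendecomposition step, and the toy covariance matrix is an assumed example:

```python
import math
import random

def dominant_eigvec(C, iters=200, seed=3):
    """Power iteration: eigenvector of the largest eigenvalue of a
    symmetric covariance matrix C."""
    rng = random.Random(seed)
    n = len(C)
    v = [rng.random() + 0.1 for _ in range(n)]
    for _ in range(iters):
        v = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(t * t for t in v)) or 1.0
        v = [t / norm for t in v]
    return v

def project_2d(sample, mu1, mu2, C):
    """Project a sample onto (i) the normalized mean-difference direction
    p = mu1 - mu2 of Eq. (17) and (ii) the dominant eigenvector of C."""
    p = [a - b for a, b in zip(mu1, mu2)]
    pn = math.sqrt(sum(t * t for t in p)) or 1.0
    p = [t / pn for t in p]
    e = dominant_eigvec(C)
    return (sum(pi * si for pi, si in zip(p, sample)),
            sum(ei * si for ei, si in zip(e, sample)))

# Toy 2-D covariance with the dominant variance along the first axis.
C = [[4.0, 0.0], [0.0, 1.0]]
e = dominant_eigvec(C)
```

For the diagonal toy covariance, the dominant eigenvector is the first coordinate axis, so the second projected coordinate of a sample is simply its first component.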
We use ten-fold cross-validation: the datasets are split into ten subsets, one of which is used for testing and the others for training at a time, and the subsets are then rotated. Equation (15) is used in the estimation of the hyperplane, which in turn is used to classify the test samples. Ten trials are performed and the average cross-validation errors are reported in table 7. The performance of our method is close to that of the variants of SVM and is superior to that of the neural network. We notice that on the Iris and Wine datasets, the cross-validation error obtained by our technique is better than those achieved by the SVM with linear and RBF kernels and also the neural network. On the Pima, Balance, Bupa, TAE, and Haberman datasets, the cross-validation error is higher than the best.

5. Conclusion and future work

We have developed a classification method and an optimization algorithm for solving QCQP problems. The novelty of this method is the application of particle swarms, a swarm intelligence technique, for optimization. The results indicate that our approach is a possible method for solving general QCQP problems without gradient estimation. We have shown the results of our algorithm under quadratic constraints by evaluating different optimization functions.

The issue with PSO based methods is their computational complexity and the need for parameter tuning. The number of function evaluations increases linearly with the number of particles employed and the number of iterations carried out. The results show that the newly proposed approach gives better solutions for some problems, but it is not always the case. Deeper analysis is needed to understand why the method performs poorly on certain problems and how this can be overcome. In future, we intend to learn multiple hyperplanes by placing multiple kernels in each class and evaluating the performance against multiple-kernel learning algorithms. The hyperplanes estimated for different kernels may reduce the cross-validation error for the Pima, Balance, Bupa, TAE and Haberman datasets.

References

[1] Dantzig G B 1963 Linear programming. Princeton, NJ: Princeton University Press
[2] Khachiyan L G 1979 A polynomial algorithm in linear programming. Doklady Akademii Nauk SSSR 244: 1093-1096
[3] Karmarkar N 1984 A new polynomial-time algorithm for linear programming. Combinatorica 4: 373-395
[4] Bishop C M 2006 Pattern recognition and machine learning. Berlin: Springer
[5] Duda R O, Hart P E and Stork D G 2001 Pattern classification. Wiley
[6] Derrac J, García S and Herrera F 2014 Fuzzy nearest neighbor algorithms: Taxonomy, experimental analysis and prospects. Inform. Sci. 260: 98-119
[7] Bishop C M 1995 Neural networks for pattern recognition. Oxford University Press
[8] Cortes C and Vapnik V 1995 Support-vector networks. Mach. Learn. 20(3): 273-297
[9] Platt J 1998 Fast training of support vector machines using sequential minimal optimization. In: Schoelkopf B, Burges C and Smola A (eds) Advances in kernel methods - support vector learning
[10] Fernández A, López V, Galar M, Jesus M J and Herrera F 2013 Analysing the classification of imbalanced data-sets with multiple classes: Binarization techniques and ad-hoc approaches. Knowledge-Based Syst.
[11] Galar M, Fernández A, Barrenechea E and Herrera F 2013 EUSBoost: Enhancing ensembles for highly imbalanced datasets by evolutionary undersampling. Pattern Recognit. 46(12)
[12] López V, Fernández A, García S, Palade V and Herrera F 2013 An insight into classification with imbalanced data: Empirical results and current trends on using data intrinsic characteristics. Information Sciences
[13] Gonzalez-Abril L, Velasco F, Angulo C and Ortega J A 2013 A study on output normalization in multiclass SVMs. Pattern Recognit. Lett.
[14] Boyd S and Vandenberghe L 2004 Convex optimization. Cambridge University Press
[15] Bomze I M 1998 On standard quadratic optimization problems. J. Global Optimiz. 13: 369-387
[16] Bomze I M and Schachinger W 2010 Multi-standard quadratic optimization: interior point methods and cone programming reformulation. Comput. Optimiz. Appl. 45
[17] Kennedy J and Eberhart R C 1995 Particle swarm optimization. In: Proceedings of IEEE International Conference on Neural Networks, vol. IV, pp 1942-1948
[18] Spadoni M and Stefanini L 2012 A differential evolution algorithm to deal with box, linear and quadratic-convex constraints for boundary optimization. J. Global Optimiz. 52(1): 171-192
[19] Zhan Z-H, Zhang J, Li Y and Chung H S-H 2009 Adaptive particle swarm optimization. IEEE Trans. Syst. Man Cybern. B: Cybern. 39(6): 1362-1381
[20] Kennedy J, Eberhart R C and Shi Y 2001 Swarm intelligence. San Francisco: Morgan Kaufmann Publishers
[21] Zanella F, Varagnolo D, Cenedese A, Pillonetto G and Schenato L 2012 Multidimensional Newton-Raphson consensus for distributed convex optimization. In: The 2012 American Control Conference (ACC)
[22] Matei I and Baras J S 2012 A performance comparison between two consensus-based distributed optimization algorithms. In: 3rd IFAC Workshop on Distributed Estimation and Control in Networked Systems
[23] Nedić A and Ozdaglar A 2009 Distributed subgradient methods for multi-agent optimization. IEEE Trans. Autom. Control 54(1): 48-61
[24] Nedić A, Ozdaglar A and Parrilo P A 2010 Constrained consensus and optimization in multi-agent networks. IEEE Trans. Autom. Control 55(4)
[25] Boyd S, Parikh N, Chu E, Peleato B and Eckstein J 2011 Distributed optimization and statistical learning via the alternating direction method of multipliers. Found. Trends Mach. Learn. 3(1): 1-122
[26] Forero P A, Cano A and Giannakis G B 2010 Consensus-based distributed support vector machines. J. Mach. Learn. Res. 11: 1663-1707
[27] Kennedy J and Eberhart R C 1997 A discrete binary version of the particle swarm algorithm. In: Proceedings of the World Multiconference on Systems, Cybernetics and Informatics
[28] Cervantes A, Galvan I M and Isasi P 2005 A comparison between the Pittsburgh and Michigan approaches for the binary PSO algorithm. In: Congress on Evolutionary Computation
[29] UC Irvine Machine Learning Repository, ics.uci.edu/ml/
[30] Gonzalez-Abril L, Nuñez H, Angulo C and Velasco F 2014 GSVM: An SVM for handling imbalanced accuracy between classes in bi-classification problems. Appl. Soft Comput. 17: 23-31
[31] Kumar D and Ramakrishnan A G 2014 Quadratically constrained quadratic programming for classification using particle swarms and applications. CoRR arXiv: abs/


More information

Elliptical Rule Extraction from a Trained Radial Basis Function Neural Network

Elliptical Rule Extraction from a Trained Radial Basis Function Neural Network Ellptcal Rule Extracton from a Traned Radal Bass Functon Neural Network Andrey Bondarenko, Arkady Borsov DITF, Rga Techncal Unversty, Meza ¼, Rga, LV-1658, Latva Andrejs.bondarenko@gmal.com, arkadjs.borsovs@cs.rtu.lv

More information

SVM-based Learning for Multiple Model Estimation

SVM-based Learning for Multiple Model Estimation SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:

More information

Parameter estimation for incomplete bivariate longitudinal data in clinical trials

Parameter estimation for incomplete bivariate longitudinal data in clinical trials Parameter estmaton for ncomplete bvarate longtudnal data n clncal trals Naum M. Khutoryansky Novo Nordsk Pharmaceutcals, Inc., Prnceton, NJ ABSTRACT Bvarate models are useful when analyzng longtudnal data

More information

Solving two-person zero-sum game by Matlab

Solving two-person zero-sum game by Matlab Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

Random Kernel Perceptron on ATTiny2313 Microcontroller

Random Kernel Perceptron on ATTiny2313 Microcontroller Random Kernel Perceptron on ATTny233 Mcrocontroller Nemanja Djurc Department of Computer and Informaton Scences, Temple Unversty Phladelpha, PA 922, USA nemanja.djurc@temple.edu Slobodan Vucetc Department

More information

Efficient Distributed Linear Classification Algorithms via the Alternating Direction Method of Multipliers

Efficient Distributed Linear Classification Algorithms via the Alternating Direction Method of Multipliers Effcent Dstrbuted Lnear Classfcaton Algorthms va the Alternatng Drecton Method of Multplers Caoxe Zhang Honglak Lee Kang G. Shn Department of EECS Unversty of Mchgan Ann Arbor, MI 48109, USA caoxezh@umch.edu

More information

Polyhedral Compilation Foundations

Polyhedral Compilation Foundations Polyhedral Complaton Foundatons Lous-Noël Pouchet pouchet@cse.oho-state.edu Dept. of Computer Scence and Engneerng, the Oho State Unversty Feb 8, 200 888., Class # Introducton: Polyhedral Complaton Foundatons

More information

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS Academc Research Internatonal ISS-L: 3-9553, ISS: 3-9944 Vol., o. 3, May 0 EVALUATIO OF THE PERFORMACES OF ARTIFICIAL BEE COLOY AD IVASIVE WEED OPTIMIZATIO ALGORITHMS O THE MODIFIED BECHMARK FUCTIOS Dlay

More information

An Improved Particle Swarm Optimization for Feature Selection

An Improved Particle Swarm Optimization for Feature Selection Journal of Bonc Engneerng 8 (20)?????? An Improved Partcle Swarm Optmzaton for Feature Selecton Yuannng Lu,2, Gang Wang,2, Hulng Chen,2, Hao Dong,2, Xaodong Zhu,2, Sujng Wang,2 Abstract. College of Computer

More information

A MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES

A MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES A MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES Aram AlSuer, Ahmed Al-An and Amr Atya 2 Faculty of Engneerng and Informaton Technology, Unversty of Technology, Sydney, Australa

More information

A Facet Generation Procedure. for solving 0/1 integer programs

A Facet Generation Procedure. for solving 0/1 integer programs A Facet Generaton Procedure for solvng 0/ nteger programs by Gyana R. Parja IBM Corporaton, Poughkeepse, NY 260 Radu Gaddov Emery Worldwde Arlnes, Vandala, Oho 45377 and Wlbert E. Wlhelm Teas A&M Unversty,

More information