Comparison of Robust Nearest Neighbour Fuzzy Rough Classifier (RNN-FRC) with KNN and NEC Classifiers
Bichitrananda Behera et al, / (IJCSIT) International Journal of Computer Science and Information Technologies, Vol. 5 (2), 2014

Comparison of Robust Nearest Neighbour Fuzzy Rough Classifier (RNN-FRC) with KNN and NEC Classifiers

Bichitrananda Behera 1, Sudhakar Sabat 1
1 M.Tech., Computer Science and Engineering, College of Engineering and Technology, Bhubaneswar, India

Abstract — Fuzzy rough sets, generalized from Pawlak's rough sets, were introduced for dealing with continuous or fuzzy data. The model has been widely discussed and applied in recent years. It has been shown, however, that fuzzy rough sets are sensitive to noisy samples, especially to mislabelled samples. As data are usually contaminated with noise in practice, a robust model is desirable, and many robust fuzzy rough set models have been designed to handle noisy datasets; the robust nearest neighbour fuzzy rough set model is the best among them. A classification algorithm based on this model, the robust nearest neighbour fuzzy rough classifier (RNN-FRC), has been mentioned in the literature, but no detailed description of it, nor a comparison with the KNN and NEC classifiers, has been given. In this paper we describe the classifier in detail and compare it with the KNN classifier, the NEC classifier, and the fuzzy rough classifier. Numerical experiments show that the robust nearest neighbour fuzzy rough classifier performs best among them.

Keywords — Fuzzy rough set, robustness, KNN, NEC classifiers.

I. INTRODUCTION

Classification is a form of data analysis that extracts models describing important data classes. Such models, called classifiers, predict categorical (discrete, unordered) class labels. Classification is a two-step process. In the first step, a classification model is built from training data. In the second step, the model's accuracy is estimated on testing data. If the accuracy is acceptable, the model is used to classify new data.
Many classification methods have been proposed in machine learning, pattern recognition, and statistics, and classification has numerous applications, including fraud detection, target marketing, performance prediction, manufacturing, and medical diagnosis. A number of classifiers have been discussed in the literature. The decision tree classifier is one of the best known: given an object X whose class label is unknown, the attribute values of the object are tested against the decision tree, and a path is traced from the root to a leaf node, which holds the class prediction for the object. Bayesian classifiers are statistical classifiers; they predict class membership probabilities, such as the probability that a given object belongs to a particular class. A rule-based classifier uses a set of IF-THEN rules for classification. A neural network is a set of connected input/output units in which each connection has an associated weight; the weights are adjusted during the learning phase to help the network predict the correct class label of the input objects. A support vector machine transforms the training data into a higher dimension, where it finds the hyperplane that separates the data by class using essential training objects called support vectors. The K-NN classifier assigns a new pattern to the class with the most members among its K nearest neighbours [1]. K-NN classifiers require computing all the distances between the training set and the test sample, which is time-consuming when the available samples are very numerous. Moreover, when the number of prototypes in the training set is not large enough, the K-NN rule is no longer optimal; this problem becomes more pronounced when there are few prototypes compared with the intrinsic dimensionality of the feature space. The neighbourhood rough set classifier (NEC) provides a uniform framework to understand and implement neighbourhood classifiers [2]; this algorithm integrates an attribute reduction technique with classification learning.
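As a concrete illustration of the K-NN rule described above, here is a minimal sketch in plain Python (Euclidean distance, majority vote; the helper name and toy data are ours, not from the paper):

```python
from collections import Counter
import math

def knn_classify(train, x, k=3):
    """Classify x by majority vote among its k nearest training samples.

    train is a list of (feature_vector, label) pairs.
    """
    # Sort training samples by Euclidean distance to the query point x.
    nearest = sorted(train, key=lambda s: math.dist(s[0], x))
    # Majority vote among the k nearest neighbours.
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]
```

Note the cost the paper points out: every call computes all n distances, which is what makes plain K-NN slow on large training sets.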
Rough set theory [3], and especially fuzzy rough set theory [4], which encapsulates the two kinds of uncertainty, fuzziness and roughness, in a single model, has attracted much attention in granular computing, machine learning, and uncertainty reasoning over the past decade. Fuzzy rough set theory has been successfully used in gene clustering, feature selection, attribute reduction, case generation, and rule extraction. The dependency function, defined as the ratio of the size of the lower approximation of the classification to the size of the universe, plays a key role in these applications; it underlies a number of learning algorithms, including feature selection, attribute reduction, rule extraction, and decision trees. The dependency function is, however, sensitive to noise, and to overcome this problem many robust fuzzy rough set models have been developed. In 2003, Salido and Murakami presented a β-precision fuzzy rough set model based on β-precision aggregation triangular operators [5]. In 2004, the model of variable precision fuzzy rough sets (VPFRS) was introduced in [6], where the fuzzy memberships of a sample in the lower and upper approximations are computed with fuzzy inclusion. In 2007, Hu et al. introduced another fuzzy rough set model based on fuzzy granulation and fuzzy inclusion [7]. Also in 2007, Cornelis et al. presented a model called vaguely quantified rough sets (VQRS) [8], which was used to construct a robust feature selection algorithm in 2008 [9]. In 2009, Zhao et al. constructed a new model, which is
called fuzzy variable precision rough sets (FVPRS), to handle the noise of misclassification and perturbation [10]. In 2010, Hu et al. introduced another robust model, soft fuzzy rough sets (SFRS), in which a soft threshold is used to compute the fuzzy lower and upper approximations [11]. It has many advantages, but in real-world applications the parameters used in these models interact with noise in complex ways, so an optimal value is usually difficult to obtain. In 2012, Hu et al. introduced the robust nearest neighbour fuzzy rough set (RNN-FRS) model and compared it with the other robust models, β-PFRS, VQRS, VPFRS, FVPRS, and SFRS [12]. In that paper the robust nearest neighbour fuzzy rough classifier (RNN-FRC) was mentioned and performed best among them, but no detailed description of RNN-FRC, nor a comparison with KNN and NEC, was given. In this paper we explain the RNN-FRC algorithm, compare it with the KNN, NEC, and FRC classifiers, and show that RNN-FRC is the best among them.

The remainder of this paper is organized as follows. Section II gives preliminary knowledge of rough sets and fuzzy rough sets; Section III discusses the robust nearest neighbour fuzzy rough set model; Section IV introduces the robust nearest neighbour fuzzy rough classifier (RNN-FRC); Section V presents the experimental analysis; Section VI draws conclusions.

II. PRELIMINARIES OF ROUGH SETS AND FUZZY ROUGH SETS

A. Rough Set

The rough set concept can be defined quite generally by means of topological operations, interior and closure, called approximations. I = <U, A> is called an information table, where U is a finite and nonempty set of objects and A is a set of features used to characterize the objects. For B ⊆ A, the B-indiscernibility relation is defined as

IND(B) = {(x, y) ∈ U × U | ∀a ∈ B, a(x) = a(y)}

The partition of U generated by IND(B) is denoted by U/IND(B) (or U/B).
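The B-indiscernibility relation can be realized directly: two objects fall into the same block of U/B exactly when they agree on every attribute in B. A minimal sketch (the function name and toy table are ours):

```python
def partition(U, B):
    """Group the objects of U into equivalence classes of IND(B).

    U maps object -> {attribute: value}; B is a set of attributes.
    """
    blocks = {}
    for x, desc in U.items():
        # Objects with identical values on all attributes of B are
        # B-indiscernible, so they share a key and land in one block.
        key = tuple(desc[a] for a in sorted(B))
        blocks.setdefault(key, set()).add(x)
    return list(blocks.values())
```

Coarsening B merges blocks: dropping an attribute can only make objects harder to tell apart, so U/B becomes coarser.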
The equivalence class of x induced by the B-indiscernibility relation is denoted by [x]_B. Given an arbitrary X ⊆ U and an equivalence relation R on U induced by a set of attributes, the lower approximation of X with respect to R is defined as

R X = {x | [x]_R ⊆ X}

The lower approximation of X with respect to R is the set of all objects that can be classified as certainly belonging to X with respect to R. The upper approximation of X with respect to R is defined as

R̄ X = {x | [x]_R ∩ X ≠ ∅}

the set of all objects that can possibly be classified as X with respect to R. The R-boundary region of X is defined as

BN_R(X) = R̄ X − R X

the set of all objects that can be classified neither as X nor as not-X with respect to R. The R-negative region of X is defined as

NEG_R(X) = U − R̄ X

which contains those elements that definitely do not belong to X. A set X is crisp (exact with respect to R) if its boundary region is empty, and rough (inexact with respect to R) if its boundary region is nonempty. The lower approximation is also called the R-positive region of X, denoted POS_R(X). Given a decision table DT = <U, A ∪ D>, where D is the decision attribute, and B ⊆ A, the positive region of decision D on B, denoted POS_B(D), is defined as

POS_B(D) = ∪_{X ∈ U/D} B X

where U/D is the set of equivalence classes generated by D. The dependency of decision D on B is defined as

γ_B(D) = |POS_B(D)| / |U|

Dependency is the ratio of the number of samples in the lower approximation to the size of the universe. As the lower approximation is the set of objects with consistent decisions, the dependency measures the classification performance of the attributes. Ideally all decisions of the objects are consistent with respect to the given attributes; in practice, inconsistency is widespread in data.

B. Fuzzy Rough Set

The rough set model is constructed under the assumption that only discrete features exist in the information system. In practice, most classification tasks are described with numerical features or fuzzy information.
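The crisp lower and upper approximations and the dependency γ_B(D) defined in Section II-A can be sketched directly on a list of equivalence blocks (helper names are ours; a toy example, not the paper's code):

```python
def lower_upper(blocks, X):
    """Lower/upper approximation of set X given the equivalence blocks.

    A block is in the lower approximation iff it is contained in X,
    and in the upper approximation iff it intersects X.
    """
    lower = set().union(*(b for b in blocks if b <= X))
    upper = set().union(*(b for b in blocks if b & X))
    return lower, upper

def dependency(blocks, decision_classes, n):
    """gamma = |POS_B(D)| / |U|: the fraction of objects whose block
    falls entirely inside one decision class (consistent decisions)."""
    pos = set()
    for X in decision_classes:
        pos |= lower_upper(blocks, X)[0]
    return len(pos) / n
```

A dependency of 1 means every object is certainly classified; noise or missing attributes push it below 1, which is exactly the inconsistency the text mentions.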
In this case, neighbourhood relations or fuzzy similarity relations are used, and neighbourhood or fuzzy granules are generated; these granules are then used to approximate the decision classes. Given a nonempty universe U, let R be a fuzzy binary relation on U. If R satisfies

(1) reflexivity: R(x, x) = 1
(2) symmetry: R(x, y) = R(y, x)
(3) sup-min transitivity: R(x, y) ≥ sup_z min{R(x, z), R(z, y)}

then R is a fuzzy equivalence relation. The fuzzy equivalence class [x]_R associated with x and R is a fuzzy set on U, where [x]_R(y) = R(x, y) for all y ∈ U. Let U be a nonempty universe, R a fuzzy equivalence relation on U, and F(U) the fuzzy power set of U. Given
a fuzzy set X ∈ F(U), the lower and upper approximations are defined as

R X(x) = inf_y max(1 − R(x, y), X(y))
R̄ X(x) = sup_y min(R(x, y), X(y))

These approximation operators have been discussed from both the constructive and the axiomatic viewpoints. In 1998, Morsi and Yakout replaced the fuzzy equivalence relation with a T-equivalence relation and built an axiom system for the model [13], where the lower and upper approximations of X ∈ F(U) are

R_S X(x) = inf_y S(N(R(x, y)), X(y))
R̄_T X(x) = sup_y T(R(x, y), X(y))

where T is a triangular norm, S its dual conorm, and N a negator. In 2002, Radzikowska and Kerre introduced another model, based on an implicator ϑ and an aggregation operator σ [14]:

R_ϑ X(x) = inf_y ϑ(R(x, y), X(y))
R̄_σ X(x) = sup_y σ(N(R(x, y)), X(y))

In classification learning, samples are assigned a class label and described by a group of features. Fuzzy equivalence relations can be generated from the numerical or fuzzy features, while the decision variable divides the samples into subsets; the task is then to approximate these decision classes with the fuzzy equivalence classes induced by the features. Given a decision system <U, R, D>, for a decision class D ∈ U/D, the membership of a sample y to D is

D(y) = 1 if y ∈ D, and D(y) = 0 if y ∉ D.

The membership of sample x to the fuzzy lower approximation of D is then

R D(x) = inf_y max{1 − R(x, y), D(y)}
       = min{ inf_{y∈D} max{1 − R(x, y), 1}, inf_{y∉D} max{1 − R(x, y), 0} }
       = inf_{y∉D} {1 − R(x, y)}
       = 1 − sup_{y∉D} R(x, y)

If we introduce the Gaussian function G(x, y) = exp(−‖x − y‖²/σ) to compute the similarity, 1 − G(x, y) can be considered a pseudo-distance function. Similarly, the membership of sample x to the fuzzy upper approximation of D is

R̄ D(x) = sup_y min{R(x, y), D(y)}
        = max{ sup_{y∈D} min{R(x, y), 1}, sup_{y∉D} min{R(x, y), 0} }
        = sup_{y∈D} R(x, y)

We can see that R D(x) is determined by the distance from x to its nearest sample from a different class, while R̄ D(x) is the similarity between x and its nearest sample in D.

III. ROBUST NEAREST NEIGHBOUR FUZZY ROUGH SET MODEL

It has been shown that both R D(x) and R̄_T D(x) depend on the nearest miss of x, i.e., the nearest sample from a class different from that of x.
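The derivation above reduces the fuzzy lower approximation of a crisp decision class to one minus the similarity to the nearest sample of a different class, and the upper approximation to the similarity to the nearest sample of the same class. A short sketch with Gaussian similarity (function names and the value σ = 0.15 are illustrative, taken from the experimental section):

```python
import math

def gauss_sim(x, y, sigma=0.15):
    # Gaussian kernel similarity; 1 - G(x, y) behaves as a pseudo-distance.
    return math.exp(-math.dist(x, y) ** 2 / sigma)

def fuzzy_lower(x, other_class):
    """R_D(x) = 1 - max similarity to samples outside the class of x."""
    return 1 - max(gauss_sim(x, y) for y in other_class)

def fuzzy_upper(x, same_class):
    """R^D(x) = max similarity to samples inside the class of x."""
    return max(gauss_sim(x, y) for y in same_class)
```

Both memberships hinge on a single extreme value (a sup or inf), which is precisely why the next section replaces them with robust statistics.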
As we know, the minimum and maximum statistics are very sensitive to noisy samples: just one noisy sample can change the minimum or maximum of a random variable. This sensitivity leads to the poor performance of fuzzy rough sets on noisy datasets. RNN-FRS introduces robust statistics to substitute for the minimum and maximum operators in the fuzzy rough set model, so that it performs better in noisy environments.

A. Basic Definitions

Given a random variable X and its n samples x_1, x_2, ..., x_n sorted in ascending order, the k-trimmed minimum of X is x_{k+1}; the k-trimmed maximum of X is x_{n−k}; the k-mean minimum of X is (Σ_{i=1}^{k} x_i)/k; the k-mean maximum of X is (Σ_{i=n−k+1}^{n} x_i)/k; the k-median minimum of X is median(x_1, x_2, ..., x_k); and the k-median maximum of X is median(x_{n−k+1}, ..., x_n). These are denoted min_{k-trimmed}(X), max_{k-trimmed}(X), min_{k-mean}(X), max_{k-mean}(X), min_{k-median}(X), and max_{k-median}(X), respectively.

Given a decision table DT = <U, A, D>, let R be a fuzzy similarity relation induced by an attribute subset B ⊆ C, where R(x, y) decreases monotonically with the distance ‖x − y‖. If d is the class of samples labelled with a given decision, the robust fuzzy rough operators based on the k-trimmed statistics are defined as

R_{k-trimmed} d(x) = min_{k-trimmed, y∉d} {1 − R(x, y)}
R̄_{k-trimmed} d(x) = max_{k-trimmed, y∈d} R(x, y)

together with the analogous S/T and ϑ/σ variants.
The k-mean and k-median operators are defined in the same way:

R_{k-mean} d(x) = min_{k-mean, y∉d} {1 − R(x, y)}
R̄_{k-mean} d(x) = max_{k-mean, y∈d} R(x, y)
R_{k-median} d(x) = min_{k-median, y∉d} {1 − R(x, y)}
R̄_{k-median} d(x) = max_{k-median, y∈d} R(x, y)

These models do not compute the lower and upper approximations with respect to the single nearest samples, as those might be outliers; instead they use the k-trimmed value, or the mean or median of the k nearest samples, to compute the membership of the fuzzy approximations. In this way the variation of the approximations caused by outliers is expected to be reduced, so the new models should be robust.

As an example, consider a binary classification task where x ∈ d1 is a normal sample and y1 ∈ d2 is an outlier close to x such that R(x, y1) = 0.9, while the normal sample y2 ∈ d2 is the second nearest sample of x from d2, with R(x, y2) = 0.2. Under the classical fuzzy rough set model, R d1(x) = 1 − R(x, y1) = 1 − 0.9 = 0.1. If we use the 1-trimmed model instead, R_{1-trimmed} d1(x) = 1 − R(x, y2) = 1 − 0.2 = 0.8: the noisy sample is ignored in the new model. At the same time, assume x1 ∈ d1 is the second nearest sample of y1, with R(x1, y1) = 0.9. According to the classical model, R d2(y1) = 1 − R(x, y1) = 0.1, and under the 1-trimmed model R_{1-trimmed} d2(y1) = 1 − R(x1, y1) = 0.1. Thus, although the nearest sample x is ignored, y1 still obtains a small membership value, and indeed its membership should be small, since y1 is a noisy sample. This example shows that the proposed model not only reduces the influence of noisy samples on the computation of the approximations of normal samples, but also recognizes the noisy samples and assigns them small memberships.

IV. ROBUST NEAREST NEIGHBOUR FUZZY ROUGH CLASSIFIER (RNN-FRC)

In this section we design a robust classifier with RNN-FRS approximations. The idea of the classifier comes from the nearest neighbour rule (NN): sample x is classified to the class of the nearest neighbour of x.
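The robust maxima of Section III-A, and the worked example above, can be checked with a short sketch (helper names are ours; we take the k-trimmed maximum of a sorted sample x_1 ≤ … ≤ x_n to be x_{n−k}):

```python
from statistics import mean, median

def k_trimmed_max(xs, k):   # discard the k largest values, take the next one
    return sorted(xs)[-k - 1]

def k_mean_max(xs, k):      # mean of the k largest values
    return mean(sorted(xs)[-k:])

def k_median_max(xs, k):    # median of the k largest values
    return median(sorted(xs)[-k:])

def lower_membership(sims_to_other_class, robust_max=max):
    """Membership of x in the lower approximation of its own class:
    1 - (robust maximum of similarities to the other class)."""
    return 1 - robust_max(sims_to_other_class)

# Worked example: y1 is a mislabelled outlier with similarity 0.9 to x,
# y2 the genuine nearest opposite-class sample with similarity 0.2.
sims = [0.9, 0.2]
classical = lower_membership(sims)                              # 1 - 0.9
robust = lower_membership(sims, lambda s: k_trimmed_max(s, 1))  # 1 - 0.2
```

The trimmed variant reproduces the numbers in the example: the classical membership collapses to 0.1 because of the single outlier, while the 1-trimmed version recovers 0.8.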
Here the nearest neighbour is determined by the k-trimmed, k-mean, and k-median statistics. The fuzzy rough classifier (FRC) is designed as follows. Given a set of training samples with m classes and an unseen sample x, we compute the m memberships of x to the fuzzy lower approximations of the m classes; x is then assigned to the class producing the maximal membership, as x belongs to that class with the greatest consistency. Replacing the fuzzy lower approximation with the robust fuzzy lower approximation yields a robust classifier, which we call the robust nearest neighbour fuzzy rough classifier (RNN-FRC). It works in a similar way to FRC: we compute the memberships of an unseen sample to the robust fuzzy lower approximations of each class.

A. RNN-FRC Algorithm

Given a set of training samples DT = <U, A, D> and a test sample x, we compute the fuzzy lower approximation of each candidate class with the different fuzzy rough set models. The decision function is

class(x) = arg max { R d_1(x), R d_2(x), ..., R d_m(x) }

where R is the chosen fuzzy lower approximation operator. Formally, the classification algorithm is given in Table 1. The algorithm assigns x the class label that achieves the largest fuzzy lower approximation. Suppose x ∈ class_i; we compute R class_i(x). If x really belongs to class_i, it will be far from the samples of the other classes.

Table 1: RNN-FRC Classifier

Input: training set X = {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)} and test set X' = {x'_1, x'_2, ..., x'_m}
Output: label
label ← φ
For i = 1 : m
    degree ← φ
    For class = 1 : classnum
        degree(class) ← k-min{1 − R(x'_i, y) : y ∉ class}
    End
    class(x'_i) ← arg max(degree)
    label(x'_i) ← class(x'_i)
End
Return label

where k-min denotes the chosen robust minimum (k-trimmed, k-mean, or k-median).
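The decision rule of Table 1 can be rendered as a runnable sketch. This is our own illustration of the k-trimmed variant (Gaussian similarity; function name, data, and default parameters are assumptions, not the paper's code):

```python
import math

def rnn_frc(train, x, k=3, sigma=0.15):
    """Assign x to the class with the largest robust (k-trimmed) fuzzy
    lower approximation membership.

    train is a list of (feature_vector, label) pairs.
    """
    labels = {y for _, y in train}
    best, best_deg = None, -1.0
    for c in labels:
        # Similarities from x to every training sample NOT in class c.
        sims = sorted(
            (math.exp(-math.dist(x, s) ** 2 / sigma)
             for s, y in train if y != c),
            reverse=True)
        # Discard the k nearest opposite-class samples as possible noise,
        # then take 1 minus the next-largest similarity.
        idx = min(k, len(sims) - 1)
        degree = 1 - sims[idx]
        if degree > best_deg:
            best, best_deg = c, degree
    return best
```

In the test below, the sample at 0.05 is deliberately mislabelled 'b'; with k = 1 the classifier ignores it and still assigns nearby points to 'a'.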
Thus R class_i(x) should be large; otherwise it is small. The algorithm is therefore rational in classifying x into the class label producing the largest fuzzy lower approximation when no class noise exists. When there are class-noisy samples, the algorithm may fail if the classical fuzzy rough set model is employed, while the robust model remains effective. Numerical experiments demonstrating the effectiveness of RNN-FRC are described in the experiments section.

B. Robustness Evaluation of Different Classifiers

Robustness evaluation is a comparison of classification performance: the robustness of the different classifiers is evaluated using average classification accuracies. Accuracy is the measure used to evaluate classifier performance:

Accuracy = (correctly classified instances) / (total no. of instances) × 100%

1. Accuracy = (TP + TN) / (TP + FP + TN + FN)
2. Sensitivity = TP / (TP + FN) × 100%
3. Specificity = TN / (TN + FP) × 100%

where TP = true positives, TN = true negatives, FP = false positives, and FN = false negatives.

In 10-fold cross-validation, the original sample is randomly partitioned into 10 equal-size subsamples. Of the 10 subsamples, a single subsample is retained as the validation data for testing the model, and the remaining 9 subsamples are used as training data. The cross-validation process is then repeated 10 times (the folds), with each of the 10 subsamples used exactly once as the validation data. The 10 results from the folds are then averaged (or otherwise combined) to produce a single estimate. The advantage of this method over repeated random subsampling is that all observations are used for both training and validation, and each observation is used for validation exactly once.

V. NUMERICAL EXPERIMENTS

The simulations were carried out on a machine with an Intel Core 2 Duo processor at 2.40 GHz and 2.00 GB of RAM.
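The evaluation protocol described above, confusion-matrix measures plus 10-fold cross-validation, can be sketched as follows (our own helper names; a sketch, not the paper's MATLAB code):

```python
import random

def accuracy(tp, tn, fp, fn):
    return (tp + tn) / (tp + fp + tn + fn)

def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

def kfold_indices(n, k=10, seed=0):
    """Split range(n) into k folds; each index serves as validation
    data exactly once, as in 10-fold cross-validation."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    return [(sorted(set(idx) - set(f)), sorted(f)) for f in folds]
```

Averaging the per-fold accuracies gives the single estimate the text describes; every observation is used for training in 9 folds and for validation in exactly 1.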
The MATLAB version used is 2012(a). The simulation was carried out with 4 datasets collected from the University of California, Irvine (UCI) Machine Learning Repository [15].

A. Datasets

Four datasets from the UCI Machine Learning Repository are used: iris, wdbc, wine, and wpbc. The information related to the datasets is shown in Table 2.

Table 2: Summaries of data sets (samples, features, and classes of the iris, wdbc, wine, and wpbc datasets)

B. Dataset Split

In the classification process, each dataset is split into ten parts: a randomly chosen 90% of the objects are used as the training set and the remaining 10% as the testing set.

C. Parameter Specification

We use a Gaussian kernel to compute the fuzzy similarity relations between samples, with kernel parameter σ = 0.15 and k = 3.

D. Simulation Results

In this section we report numerical experiments showing the robustness of the proposed classifier.

Table 3: Classification accuracy (%) on real-world datasets (iris, wine, wdbc, and wpbc at noise levels 0%, 5%, 10%, and 15%, for the KNN, NEC, FRC, k-mean, k-median, and k-trimmed classifiers, with averages)

The classification accuracies of the different classifiers, computed with 10-fold cross-validation, are shown in Table 3. First the raw datasets are taken, and their classification accuracies are shown at zero noise level. Since attribute noise has less impact on the computation of classification accuracy, we do not add attribute noise and consider class noise only. Datasets with 5%, 10%, and 15% class noise are then taken, and the classification accuracies are shown in Table 3. For the iris dataset we see that all the classifiers perform well when there is no noise, with the KNN classifier performing best; but as noise increases, only the FRC classifier degrades rapidly while the others remain nearly the same. So for the iris dataset the FRC classifier performs worst and the others perform better on the noisy data, as shown in Fig. 1. In Fig. 2, when there is no noise in the wine dataset, FRC performs best, KNN performs worst, and k-mean, k-median, and k-trimmed perform on average.
After introducing noise, however, k-trimmed, k-mean, and k-median perform best, NEC and FRC are average, and K-NN is worst. On the wdbc dataset, k-trimmed performs best in both the noiseless and the noisy settings; k-mean and NEC perform well, followed by k-median and K-NN, while FRC performs worst in the noisy setting. This is shown in Fig. 3.
Fig. 1 Variation of classification accuracy for the iris dataset

Fig. 2 Variation of classification accuracy for the wine dataset

Fig. 5 Average classification accuracy of the different classifiers

Fig. 5 shows the average classification performance: RNN-FRC, i.e., k-mean, k-trimmed, and k-median, performs best; NEC's performance lies between RNN-FRC and FRC; and the average classification performance of the KNN classifier is the worst among these classifiers.

VI. CONCLUSION AND FUTURE WORK

The KNN, NEC, and FRC classifiers are widely discussed and have been applied in many classification applications, and the robust nearest neighbour fuzzy rough classifier (RNN-FRC) has recently been discussed as well. All of these classifiers are good, but no comparison had been made to determine the best among them. In this paper we compared RNN-FRC with the KNN, NEC, and FRC classifiers on both raw and noisy datasets and showed that all three RNN-FRC classifiers perform best. Of the three RNN-FRC classifiers, the k-trimmed RNN-FRC shows the best result, both on average and on the individual datasets. After RNN-FRC, the NEC classifier is better than the FRC classifier, followed finally by the KNN classifier. Our future work is to use RNN-FRC for feature selection.

Fig. 3 Variation of classification accuracy for the wdbc dataset

Considering Fig. 4, we see that k-trimmed performs best on both the raw and the noisy dataset; k-mean, k-median, and NEC perform on average; the classification performance of KNN and FRC degrades rapidly with increasing noise, and FRC performs worst.

REFERENCES
[1] R. Duda and P. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973.
[2] Qinghua Hu, Daren Yu, Zongxia Xie, "Neighborhood classifiers," Expert Systems with Applications, 34 (2008).
[3] Z. Pawlak, "Rough sets," International Journal of Computer and Information Sciences, 11 (1982).
[4] D. Dubois, H. Prade, "Rough fuzzy sets and fuzzy rough sets," International Journal of General Systems, 17 (1990).
[5] J. M. F. Salido and S. Murakami, "Rough set analysis of a general type of fuzzy data using transitive aggregations of fuzzy similarity relations," Fuzzy Sets Syst., vol.
139, 2003.
[6] A. Mieszkowicz-Rolka and L. Rolka, "Variable precision fuzzy rough sets," Transactions on Rough Sets I, vol. LNCS-3100, Berlin, Germany: Springer, 2004.
[7] Q. H. Hu, Z. X. Xie, and D. R. Yu, "Hybrid attribute reduction based on a novel fuzzy-rough model and information granulation," Pattern Recognit., vol. 40, 2007.
[8] C. Cornelis, M. De Cock, and A. M. Radzikowska, "Vaguely quantified rough sets," in Proc. 11th Int. Conf. Rough Sets, Fuzzy Sets, Data Mining, Granular Comput., 2007.
[9] C. Cornelis and R. Jensen, "A noise-tolerant approach to fuzzy-rough feature selection," in Proc. IEEE Int. Conf. Fuzzy Syst., 2008.

Fig. 4 Variation of classification accuracy for the wpbc dataset
[10] S. Y. Zhao, E. C. C. Tsang, and D. G. Chen, "The model of fuzzy variable precision rough sets," IEEE Trans. Fuzzy Syst., vol. 17, no. 2, Apr. 2009.
[11] Q. H. Hu, S. An, and D. R. Yu, "Soft fuzzy rough sets for robust feature evaluation and selection," Inf. Sci., vol. 180, 2010.
[12] Qinghua Hu, Lei Zhang, Shuang An, David Zhang, Daren Yu, "On robust fuzzy rough set models," IEEE Transactions on Fuzzy Systems, 2012, 20(4).
[13] N. N. Morsi, M. M. Yakout, "Axiomatics for fuzzy rough sets," Fuzzy Sets and Systems, 100 (1998).
[14] A. M. Radzikowska, E. E. Kerre, "A comparative study of fuzzy rough sets," Fuzzy Sets and Systems, 126 (2002).
[15] A. Asuncion and D. J. Newman (2007). UCI machine learning repository, School Inf. Comput. Sci., Univ. California, Irvine. [Online].

AUTHORS

Bichitrananda Behera received his B.Tech in Computer Science and Engineering from Dhaneswar Rath Institute of Engineering and Management Studies, Cuttack, India in 2011 and completed his M.Tech. in Computer Science and Engineering at the College of Engineering and Technology, Bhubaneswar, India in 2013. His current research interests are soft computing, data mining, and algorithms.

Sudhakar Sabat obtained his B.Tech. in Computer Science and Engineering from C. V. Raman College of Engineering, Bhubaneswar, India in 2011 and received his M.Tech. in Computer Science and Engineering from the College of Engineering and Technology, Bhubaneswar, India in 2013. His fields of interest are software engineering and data mining.
Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,
More information6.854 Advanced Algorithms Petar Maymounkov Problem Set 11 (November 23, 2005) With: Benjamin Rossman, Oren Weimann, and Pouya Kheradpour
6.854 Advanced Algorthms Petar Maymounkov Problem Set 11 (November 23, 2005) Wth: Benjamn Rossman, Oren Wemann, and Pouya Kheradpour Problem 1. We reduce vertex cover to MAX-SAT wth weghts, such that the
More informationExperiments in Text Categorization Using Term Selection by Distance to Transition Point
Experments n Text Categorzaton Usng Term Selecton by Dstance to Transton Pont Edgar Moyotl-Hernández, Héctor Jménez-Salazar Facultad de Cencas de la Computacón, B. Unversdad Autónoma de Puebla, 14 Sur
More informationAn Optimal Algorithm for Prufer Codes *
J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,
More informationA MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES
A MODIFIED K-NEAREST NEIGHBOR CLASSIFIER TO DEAL WITH UNBALANCED CLASSES Aram AlSuer, Ahmed Al-An and Amr Atya 2 Faculty of Engneerng and Informaton Technology, Unversty of Technology, Sydney, Australa
More informationSkew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach
Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research
More informationThe Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique
//00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy
More informationLocal Quaternary Patterns and Feature Local Quaternary Patterns
Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents
More informationFeature Selection as an Improving Step for Decision Tree Construction
2009 Internatonal Conference on Machne Learnng and Computng IPCSIT vol.3 (2011) (2011) IACSIT Press, Sngapore Feature Selecton as an Improvng Step for Decson Tree Constructon Mahd Esmael 1, Fazekas Gabor
More informationClassification / Regression Support Vector Machines
Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM
More informationSum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints
Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan
More informationOutline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:
Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A
More informationA new paradigm of fuzzy control point in space curve
MATEMATIKA, 2016, Volume 32, Number 2, 153 159 c Penerbt UTM Press All rghts reserved A new paradgm of fuzzy control pont n space curve 1 Abd Fatah Wahab, 2 Mohd Sallehuddn Husan and 3 Mohammad Izat Emr
More informationA Similarity-Based Prognostics Approach for Remaining Useful Life Estimation of Engineered Systems
2008 INTERNATIONAL CONFERENCE ON PROGNOSTICS AND HEALTH MANAGEMENT A Smlarty-Based Prognostcs Approach for Remanng Useful Lfe Estmaton of Engneered Systems Tany Wang, Janbo Yu, Davd Segel, and Jay Lee
More informationA Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines
A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría
More informationMachine Learning. Topic 6: Clustering
Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess
More informationLecture 5: Multilayer Perceptrons
Lecture 5: Multlayer Perceptrons Roger Grosse 1 Introducton So far, we ve only talked about lnear models: lnear regresson and lnear bnary classfers. We noted that there are functons that can t be represented
More informationIncremental Learning with Support Vector Machines and Fuzzy Set Theory
The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and
More information(1) The control processes are too complex to analyze by conventional quantitative techniques.
Chapter 0 Fuzzy Control and Fuzzy Expert Systems The fuzzy logc controller (FLC) s ntroduced n ths chapter. After ntroducng the archtecture of the FLC, we study ts components step by step and suggest a
More informationHierarchical clustering for gene expression data analysis
Herarchcal clusterng for gene expresson data analyss Gorgo Valentn e-mal: valentn@ds.unm.t Clusterng of Mcroarray Data. Clusterng of gene expresson profles (rows) => dscovery of co-regulated and functonally
More informationCS 534: Computer Vision Model Fitting
CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust
More informationA Deflected Grid-based Algorithm for Clustering Analysis
A Deflected Grd-based Algorthm for Clusterng Analyss NANCY P. LIN, CHUNG-I CHANG, HAO-EN CHUEH, HUNG-JEN CHEN, WEI-HUA HAO Department of Computer Scence and Informaton Engneerng Tamkang Unversty 5 Yng-chuan
More informationR s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes
SPH3UW Unt 7.3 Sphercal Concave Mrrors Page 1 of 1 Notes Physcs Tool box Concave Mrror If the reflectng surface takes place on the nner surface of the sphercal shape so that the centre of the mrror bulges
More informationFuzzy Filtering Algorithms for Image Processing: Performance Evaluation of Various Approaches
Proceedngs of the Internatonal Conference on Cognton and Recognton Fuzzy Flterng Algorthms for Image Processng: Performance Evaluaton of Varous Approaches Rajoo Pandey and Umesh Ghanekar Department of
More informationAssociative Based Classification Algorithm For Diabetes Disease Prediction
Internatonal Journal of Engneerng Trends and Technology (IJETT) Volume-41 Number-3 - November 016 Assocatve Based Classfcaton Algorthm For Dabetes Dsease Predcton 1 N. Gnana Deepka, Y.surekha, 3 G.Laltha
More informationPerformance Evaluation of Information Retrieval Systems
Why System Evaluaton? Performance Evaluaton of Informaton Retreval Systems Many sldes n ths secton are adapted from Prof. Joydeep Ghosh (UT ECE) who n turn adapted them from Prof. Dk Lee (Unv. of Scence
More informationExtraction of Fuzzy Rules from Trained Neural Network Using Evolutionary Algorithm *
Extracton of Fuzzy Rules from Traned Neural Network Usng Evolutonary Algorthm * Urszula Markowska-Kaczmar, Wojcech Trelak Wrocław Unversty of Technology, Poland kaczmar@c.pwr.wroc.pl, trelak@c.pwr.wroc.pl
More informationImplementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status
Internatonal Journal of Appled Busness and Informaton Systems ISSN: 2597-8993 Vol 1, No 2, September 2017, pp. 6-12 6 Implementaton Naïve Bayes Algorthm for Student Classfcaton Based on Graduaton Status
More informationWishing you all a Total Quality New Year!
Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma
More informationBAYESIAN MULTI-SOURCE DOMAIN ADAPTATION
BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,
More informationS1 Note. Basis functions.
S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type
More informationA PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION
1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute
More information12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification
Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero
More informationINCOMPETE DATA IN FUZZY INFERENCE SYSTEM
XI Conference "Medcal Informatcs & Technologes" - 26 mssng data, ncomplete nformaton fuzz rules, fuzz operators, classfer Slwa POŚPIECH-KURKOWSKA * INCOMPETE DATA IN FUZZY INFERENCE SYSTEM The paper descrbes
More informationAnnouncements. Supervised Learning
Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples
More informationTN348: Openlab Module - Colocalization
TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages
More informationHelsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)
Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute
More informationMULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION
MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and
More informationOutline. Type of Machine Learning. Examples of Application. Unsupervised Learning
Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton
More informationDetermining Fuzzy Sets for Quantitative Attributes in Data Mining Problems
Determnng Fuzzy Sets for Quanttatve Attrbutes n Data Mnng Problems ATTILA GYENESEI Turku Centre for Computer Scence (TUCS) Unversty of Turku, Department of Computer Scence Lemmnkäsenkatu 4A, FIN-5 Turku
More informationSteps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices
Steps for Computng the Dssmlarty, Entropy, Herfndahl-Hrschman and Accessblty (Gravty wth Competton) Indces I. Dssmlarty Index Measurement: The followng formula can be used to measure the evenness between
More informationA MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS
Proceedngs of the Wnter Smulaton Conference M E Kuhl, N M Steger, F B Armstrong, and J A Jones, eds A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS Mark W Brantley Chun-Hung
More informationModule Management Tool in Software Development Organizations
Journal of Computer Scence (5): 8-, 7 ISSN 59-66 7 Scence Publcatons Management Tool n Software Development Organzatons Ahmad A. Al-Rababah and Mohammad A. Al-Rababah Faculty of IT, Al-Ahlyyah Amman Unversty,
More informationFace Recognition University at Buffalo CSE666 Lecture Slides Resources:
Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural
More informationConcurrent Apriori Data Mining Algorithms
Concurrent Apror Data Mnng Algorthms Vassl Halatchev Department of Electrcal Engneerng and Computer Scence York Unversty, Toronto October 8, 2015 Outlne Why t s mportant Introducton to Assocaton Rule Mnng
More informationBOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET
1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School
More informationQuerying by sketch geographical databases. Yu Han 1, a *
4th Internatonal Conference on Sensors, Measurement and Intellgent Materals (ICSMIM 2015) Queryng by sketch geographcal databases Yu Han 1, a * 1 Department of Basc Courses, Shenyang Insttute of Artllery,
More informationCompiler Design. Spring Register Allocation. Sample Exercises and Solutions. Prof. Pedro C. Diniz
Compler Desgn Sprng 2014 Regster Allocaton Sample Exercses and Solutons Prof. Pedro C. Dnz USC / Informaton Scences Insttute 4676 Admralty Way, Sute 1001 Marna del Rey, Calforna 90292 pedro@s.edu Regster
More informationComplex System Reliability Evaluation using Support Vector Machine for Incomplete Data-set
Internatonal Journal of Performablty Engneerng, Vol. 7, No. 1, January 2010, pp.32-42. RAMS Consultants Prnted n Inda Complex System Relablty Evaluaton usng Support Vector Machne for Incomplete Data-set
More informationA Post Randomization Framework for Privacy-Preserving Bayesian. Network Parameter Learning
A Post Randomzaton Framework for Prvacy-Preservng Bayesan Network Parameter Learnng JIANJIE MA K.SIVAKUMAR School Electrcal Engneerng and Computer Scence, Washngton State Unversty Pullman, WA. 9964-75
More informationTECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z.
TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS Muradalyev AZ Azerbajan Scentfc-Research and Desgn-Prospectng Insttute of Energetc AZ1012, Ave HZardab-94 E-mal:aydn_murad@yahoocom Importance of
More informationSSDR: An Algorithm for Clustering Categorical Data Using Rough Set Theory
Avalable onlne at www.pelagaresearchlbrary.com Pelaga Research Lbrary Advances n Appled Scence Research, 20, 2 (3): 34-326 ISSN: 0976-860 CODEN (USA): AASRFC SSDR: An Algorthm for Clusterng Categorcal
More informationProblem Definitions and Evaluation Criteria for Computational Expensive Optimization
Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty
More informationAn Image Fusion Approach Based on Segmentation Region
Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua
More informationOptimizing Document Scoring for Query Retrieval
Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng
More informationImpact of a New Attribute Extraction Algorithm on Web Page Classification
Impact of a New Attrbute Extracton Algorthm on Web Page Classfcaton Gösel Brc, Banu Dr, Yldz Techncal Unversty, Computer Engneerng Department Abstract Ths paper ntroduces a new algorthm for dmensonalty
More informationInvestigating the Performance of Naïve- Bayes Classifiers and K- Nearest Neighbor Classifiers
Journal of Convergence Informaton Technology Volume 5, Number 2, Aprl 2010 Investgatng the Performance of Naïve- Bayes Classfers and K- Nearest Neghbor Classfers Mohammed J. Islam *, Q. M. Jonathan Wu,
More informationNon-Negative Matrix Factorization and Support Vector Data Description Based One Class Classification
IJCSI Internatonal Journal of Computer Scence Issues, Vol. 9, Issue 5, No, September 01 ISSN (Onlne): 1694-0814 www.ijcsi.org 36 Non-Negatve Matrx Factorzaton and Support Vector Data Descrpton Based One
More informationCSE 326: Data Structures Quicksort Comparison Sorting Bound
CSE 326: Data Structures Qucksort Comparson Sortng Bound Steve Setz Wnter 2009 Qucksort Qucksort uses a dvde and conquer strategy, but does not requre the O(N) extra space that MergeSort does. Here s the
More informationCollaboratively Regularized Nearest Points for Set Based Recognition
Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,
More information2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements
Module 3: Element Propertes Lecture : Lagrange and Serendpty Elements 5 In last lecture note, the nterpolaton functons are derved on the bass of assumed polynomal from Pascal s trangle for the fled varable.
More informationISSN: International Journal of Engineering and Innovative Technology (IJEIT) Volume 1, Issue 4, April 2012
Performance Evoluton of Dfferent Codng Methods wth β - densty Decodng Usng Error Correctng Output Code Based on Multclass Classfcaton Devangn Dave, M. Samvatsar, P. K. Bhanoda Abstract A common way to
More information3D vector computer graphics
3D vector computer graphcs Paolo Varagnolo: freelance engneer Padova Aprl 2016 Prvate Practce ----------------------------------- 1. Introducton Vector 3D model representaton n computer graphcs requres
More informationLobachevsky State University of Nizhni Novgorod. Polyhedron. Quick Start Guide
Lobachevsky State Unversty of Nzhn Novgorod Polyhedron Quck Start Gude Nzhn Novgorod 2016 Contents Specfcaton of Polyhedron software... 3 Theoretcal background... 4 1. Interface of Polyhedron... 6 1.1.
More informationA mathematical programming approach to the analysis, design and scheduling of offshore oilfields
17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 A mathematcal programmng approach to the analyss, desgn and
More informationFast Feature Value Searching for Face Detection
Vol., No. 2 Computer and Informaton Scence Fast Feature Value Searchng for Face Detecton Yunyang Yan Department of Computer Engneerng Huayn Insttute of Technology Hua an 22300, Chna E-mal: areyyyke@63.com
More informationAn Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment
IJCSI Internatonal Journal of Computer Scence Issues, Vol. 7, Issue 5, September 2010 ISSN (Onlne): 1694-0814 www.ijcsi.org 374 An Evolvable Clusterng Based Algorthm to Learn Dstance Functon for Supervsed
More informationEYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS
P.G. Demdov Yaroslavl State Unversty Anatoly Ntn, Vladmr Khryashchev, Olga Stepanova, Igor Kostern EYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS Yaroslavl, 2015 Eye
More informationA Hybrid Text Classification System Using Sentential Frequent Itemsets
A Hybrd Text Classfcaton System Usng Sentental Frequent Itemsets Shzhu Lu, Hepng Hu College of Computer Scence, Huazhong Unversty of Scence and Technology, Wuhan 430074, Chna stoneboo@26.com Abstract:
More information