Relevance Assignment and Fusion of Multiple Learning Methods Applied to Remote Sensing Image Analysis
Peter Bajcsy, Wei-Wen Feng and Praveen Kumar
National Center for Supercomputing Applications (NCSA), University of Illinois at Urbana-Champaign, Urbana, Illinois, USA
{pbajcsy, fengww}@ncsa.uiuc.edu, kumar@uiuc.edu

Abstract

With the advances in remote sensing, various machine learning techniques could be applied to study variable relationships. Although prediction models obtained using machine learning techniques have proven to be suitable for predictions, they do not explicitly provide means for determining input-output variable relevance. We investigated the issue of relevance assignment for multiple machine learning models applied to remote sensing variables in the context of terrestrial hydrology. The relevance is defined as the influence of an input variable with respect to predicting the output result. We introduce a methodology for assigning relevance using various machine learning methods. The learning methods we use include Regression Tree, Support Vector Machine, and K-Nearest Neighbors. We derive the relevance computation scheme for each learning method and propose a method for fusing relevance assignment results from multiple learning techniques by averaging and voting mechanisms. All methods are evaluated in terms of relevance accuracy estimation with synthetic and measured data.

1. Introduction

The problem of understanding relationships and dependencies among geographic variables (features) is of high interest to scientists during many data-driven, discovery-type analyses. Various machine learning methods have been developed for data-driven analyses to build prediction models that represent input-output variable relationships. However, prediction models obtained using machine learning techniques vary in their underlying model representations and frequently do not provide a clear interpretation of input-output variable relationships. Thus, the goal of data-driven modeling is not only accurate prediction but also interpretation of the input-output relationships.
In this paper, we address the problem of data-driven model interpretation to establish relevance of input variables with respect to predicted output variables. First, we introduce the previous work in Section 2 and formulate an interpretation of data-driven models by assigning relevance to input variables in Section 3. Relevance assignments are derived at the sample (local) or model (global) levels based on co-linearity of input variable basis vectors with the normal of regression hyper-planes formed over model-defined partitions of data samples. Second, we propose algorithms for combining relevance assignments obtained from multiple data-driven models in Section 4. Finally, we evaluate the accuracy of relevance assignment using (a) three types of synthetic and one set of measured data, (b) three machine learning algorithms, namely Regression Tree (RT), Support Vector Machine (SVM), and K-Nearest Neighbors (KNN), and (c) two relevance assignment fusion methods, as documented in Section 5, and summarize our results in Section 6. The novelty of our work lies in developing a methodology for relevance assignment over a set of machine learning models, proposing relevance fusion methods, and demonstrating the accuracy of multiple relevance assignment methods with multiple synthetic and experimental data sets.

2. Previous Work

Interpretation of data-driven models to establish input-output variable relationships is also part of variable (feature) subset selection. Feature subset
selection is one of the classical research areas in machine learning [3][5][14]. Its goal is to pre-select the most relevant features for learning concepts, and to improve the accuracy of learning [9]. In the work of Pudil et al. [11], the authors use a sequential search through all possible combinations of feature subsets, and find an optimal feature subset that yields the best result. With a similar aim, Perner and his coworkers [10] compare the influence of different feature selection methods on the learning performance of decision trees. However, the exhaustive search over all possible combinations of feature subsets is not always computationally feasible for a large number of features, for instance, in the case of hyperspectral band selection. To address the feasibility aspect, Bajcsy and Groves [8][15] proposed to use a combination of unsupervised and supervised machine learning techniques. Other approaches in spectral classification were proposed by S. De Backer et al. [12]. Another challenge of relevance assignment is related to the multitude of relevance definitions. For example, the relevance of an input variable could be defined by traversing a regression tree model and summing its weighted occurrence in the tree structure, as proposed by White and Kumar [2]. In the survey by Blum et al. [1], the authors give a conceptual definition of a relevant input variable as a variable that will affect the prediction results if it is modified. In our work, while adhering to the conceptual definition of Blum et al. [1], we extend the definition of relevance assignment by numerically quantifying the relative importance of input variables. Our relevance assignment is related to the work of Heiler et al. [6], in which the authors used co-linearity of basis vectors with the normal of a separating hyper-plane obtained from the Support Vector Machine (SVM) method as the metric for relevance assignment. We use the co-linearity property in our relevance assignments derived from a set of regression hyper-planes formed over model-defined partitions of data samples.

3. Relevance
Assignments

To analyze input-output relationships in practice, one is presented with discrete data that could be described as a set of M examples (rows of a table) with N variables (columns of a table). Examples represent instances of multiple ground and remotely sensed measurements. Measurements are the variables (features) that might have relationships among themselves, and our goal is to obtain a better understanding of these relationships. In this work, we start with a mathematical definition of variable relevance that is consistent with the conceptual understanding of input-output relationships. We define the relevance of an input variable v_i, v = (v_1, v_2, ..., v_N), as the partial derivative of an output (predicted) function f(v_1, v_2, ..., v_N) with respect to the input variable v_i. Equation (1) shows a vector of relevance values for all input variables.

R = [ ∂f(v_1, v_2, ..., v_N)/∂v_1 ; ... ; ∂f(v_1, v_2, ..., v_N)/∂v_N ]    (1)

This definition assumes that the input and output variables are continuous, and that the output function f is C1 continuous (the first derivative exists). In order to follow this mathematical definition in practice, there arise challenges associated with (1) processing discrete samples, such as defining the neighborhood proximity of a sample in the manifold of input variables, (2) representing a data-driven model, such as deriving the analytical form of a predicted function f, (3) scaling input and output variables with different dynamic ranges of measurements, (4) removing dependencies on algorithmic parameters of machine learning techniques, (5) understanding the variability of relevance as a function of data quality parameters (e.g., sensor noise, cloud coverage during remote sensing), and (6) treating a mix of continuous and categorical variables, just to name a few.
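The partial-derivative definition in Equation (1) can be illustrated numerically. The sketch below is not part of the paper's implementation; it approximates the relevance vector R by central finite differences for a known function f, with the function and the sample point chosen purely for illustration.

```python
import numpy as np

def relevance(f, v, eps=1e-6):
    """Relevance of each input variable as the partial derivative of the
    predicted function f at the point v (central finite differences)."""
    v = np.asarray(v, dtype=float)
    R = np.zeros_like(v)
    for i in range(v.size):
        step = np.zeros_like(v)
        step[i] = eps
        R[i] = (f(v + step) - f(v - step)) / (2.0 * eps)
    return np.abs(R)

# f depends only on v1 and v2, so v3 and v4 should get near-zero relevance
R = relevance(lambda v: v[0] + v[1], [0.3, 0.7, 0.5, 0.2])
```

For the linear additive example, the derivative with respect to each relevant variable is 1 and the irrelevant variables contribute nothing, which the finite-difference estimate recovers to numerical precision.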
Our approach to the above challenges is (a) to use a mathematically well-defined (analytically described) model, like the multi-variate regression, on sample partitions obtained using a machine learning technique, (b) to scale all variables to the same dynamic range of [0, 1], (c) to investigate dependencies on algorithmic parameters of machine learning techniques and data quality parameters, and (d) to propose the fusion of multi-method relevance results to increase our confidence in the relevance results. Having an analytical description of a function f allows us to derive relevance according to the definition. We have currently constrained our work to processing only continuous variables and foresee the inclusion of categorical variables in our future work. Based on the above considerations, we define sample and model relevance assignments for a set of discrete samples with measurements of continuous input and output variables. Example relevance R_ij is the local relevance of each input variable v_i computed at the sample s_j. The computation is defined for three machine learning techniques in the next sub-sections.
Model relevance R_i is the global relevance of each input variable v_i over all examples in the entire data-driven model, computed by summing all sample relevances. To obtain comparable example and model relevances from multiple data-driven models, we normalize the relevance values by the sum of all model relevances over all input variables (see Equation (2)). The normalized relevance values are denoted with a tilde.

R̃_ij = R_ij / Σ_i R_i ;  R̃_i = R_i / Σ_i R_i ,  where R_i = Σ_j R_ij    (2)

In the next subsections, we introduce relevance assignment for Regression Tree (RT), Support Vector Machine (SVM), and K-Nearest Neighbors (KNN). The main reason for choosing these three methods comes from our requirement, for remote sensing data analysis, to process continuous input and output variables. Furthermore, the methods represent a set of machine learning techniques with different underpinning principles for data-driven modeling. The KNN method builds models using only close neighbors of a given sample. The SVM method builds models based on all input samples. As for the RT method, it builds its model by hierarchically subdividing all input samples and fitting a local regression model to the samples in each divided cell (leaf). Thus, these methods represent a spectrum of supervised machine learning techniques to be evaluated in terms of relevance assignment accuracy.

3.1. Regression Tree

The process of building a regression-tree-based model can be described by splitting input examples into sub-groups (denoted as cells or tree leaves) based on a sequence of criteria imposed on individual variables [14]. The splitting criteria, e.g., information gain or data variance, might depend on the problem type, and become one of the algorithmic parameters. In this work, we choose variance as our splitting criterion. Once all examples are partitioned into leaves, a multivariate linear regression model is used to predict output values for the examples that fall into the leaf. For any example e_j, the example relevance R_ij of the variable v_i is computed from the linear regression model associated with the regression tree leaf.
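The normalization in Equation (2) can be sketched directly. In the toy arrays below (values invented for illustration), rows are examples and columns are input variables; the model relevance is the column sum, and both example and model relevances are divided by the total over all variables so that comparable quantities come out of different models.

```python
import numpy as np

# Example relevances R[j, i]: rows are examples e_j, columns are variables v_i
R = np.array([[0.8, 0.1, 0.1],
              [0.6, 0.3, 0.1]])

R_model = R.sum(axis=0)                 # model relevance R_i = sum over examples
total = R_model.sum()                   # sum of all model relevances
R_model_norm = R_model / total          # normalized model relevance (tilde)
R_example_norm = R / total              # normalized example relevance (tilde)
```

With this normalization the model relevances sum to one across variables, so relevance vectors produced by RT, KNN and SVM models can be compared and later fused.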
The regression model at each regression tree leaf approximates the prediction function f locally with a linear model written as:

f(v) = w_1 v_1 + w_2 v_2 + ... + w_N v_N + β    (3)

The example (local) relevance assignment R_ij is then computed as the dot product between the normal w of the regression model hyper-plane and the unit vector u_i of input variable v_i, as described below:

R_ij = |w · u_i|    (4)

where |w · u_i| denotes the absolute value of the dot product of the vectors w and u_i.

3.2. K-Nearest Neighbors

K-Nearest Neighbors is a machine learning method that predicts output values based on the K closest examples to any chosen one, measured in the space of input variables v. Predicted values are formed as a weighted sum of those K nearest examples [13][14]. For any example e_j, the example relevance R_ij of the variable v_i is computed from the linear regression model obtained from the K nearest neighbors of the example e_j. The linear regression model approximates the prediction function f locally. The example relevance assignment is performed according to Equations (3) and (4).

3.3. Support Vector Machine

Support Vector Machine (SVM) is a machine learning method that can build a model for separating examples (classification problem) or for fitting examples (prediction problem). We use SVM as a prediction technique to model the input data. The SVM model can be linear or non-linear depending on the SVM kernel. Non-linear models are obtained by mapping the input data to a higher dimensional feature space, which is conveniently achieved by kernel mappings [7]. Thus, the prediction function f can be obtained from the mathematical descriptions of the linear and non-linear kernels. In this paper, we focus only on a linear model in order to simplify the math. For an SVM method with the linear kernel, the mathematical description of f becomes a hyper-plane as described in Equation (3). The difference between
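Equations (3) and (4) amount to fitting a hyper-plane over a partition's samples and reading off the absolute hyper-plane coefficients, since the dot product of w with the unit basis vector u_i is just w_i. A minimal sketch, with synthetic leaf samples and a dependence invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((50, 3))               # samples falling into one leaf / neighborhood
y = 2.0 * V[:, 0] - 0.5 * V[:, 1]     # output depends on v1 and v2 only

# Least-squares fit of f(v) = w·v + beta over the partition's samples (Eq. 3)
A = np.hstack([V, np.ones((50, 1))])  # append a column of ones for the offset
coeffs = np.linalg.lstsq(A, y, rcond=None)[0]
w = coeffs[:3]

# Example relevance of variable i is |w · u_i| = |w_i| (Eq. 4)
R = np.abs(w)
```

Because the toy output is exactly linear and noise-free, the fitted weights recover the generating coefficients, giving relevances 2.0, 0.5 and 0 for the three variables.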
the RT, KNN and SVM methods lies in the fact that SVM uses all examples to estimate the parameters of the hyper-plane, which leads to only one hyper-plane for the entire set of examples. The example relevance assignment for SVM follows Equation (4). In the case of SVM, the example relevance and the model relevance are identical.

4. Relevance Fusion

The goal of relevance fusion is to achieve more robust relevance assignment in the presence of noise and variable data quality (e.g., clouds during remote sensing), as well as to remove any bias introduced by a single machine learning method. The latter motivation is also supported by the no-free-lunch (NFL) theorem [16], which states that no single supervised method is superior over all problem domains; methods can only be superior for particular data sets. In this paper, we propose two different schemes to combine the results of relevance assignment: average fusion and rank fusion. Average fusion is based on taking the numerical average of the relevance values, and using it as the combined relevance value. Rank fusion, on the other hand, uses a voting scheme to combine the relative ranks of each input variable determined by each machine learning method.

4.1. Average Fusion

Average fusion is executed by taking the normalized relevance results from multiple machine learning methods and computing their average. Given R̃_ij^k as the normalized example relevance of the i-th variable at the example e_j obtained from the k-th machine learning method, the example relevance assignment R_ij^avg established using average fusion for the example e_j is defined as:

R_ij^avg = (1/L) Σ_{k=1}^{L} R̃_ij^k    (5)

where L is the number of machine learning methods used. The model relevance assignment R_i^avg using average fusion for an input variable v_i is the sum of all example relevances, as defined in Equation (2).

4.2. Rank Fusion

Rank fusion is executed in a similar manner as average fusion. The difference lies in assigning a relevance rank to each variable v_i based on its relevance ranking by the k-th machine learning model.
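Average fusion, Equation (5), is simply a per-variable mean over methods. The sketch below uses invented normalized relevances for L = 3 hypothetical methods (e.g., RT, KNN and SVM) on one example with N = 4 variables:

```python
import numpy as np

# Normalized example relevances from L = 3 methods; each row sums to 1.
R_methods = np.array([[0.5, 0.3, 0.1, 0.1],
                      [0.6, 0.2, 0.1, 0.1],
                      [0.4, 0.4, 0.1, 0.1]])

R_avg = R_methods.mean(axis=0)   # Equation (5), averaged over methods
```

Since each method's relevances are already normalized to sum to one, the fused vector also sums to one, so it can be compared directly against any single method's output.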
The absolute magnitude of relevance assignment is lost in rank fusion, since the meaning of magnitude is converted to a relative ranking during the process. The rank fusion approach is expected to be more robust than the average fusion approach when some of the machine learning methods create very incorrect models for various reasons and skew the correct results of the other machine learning methods. The example relevance assignment using rank fusion is described as follows. For each example e_j and its normalized example relevance R̃_ij, we define the rank of each variable v_i as its index in a list of relevances sorted from the smallest to the largest; rank{R̃_ij} ∈ {1, 2, ..., N}. The rank-fusion-based relevance assignment for variable v_i is then computed as shown below:

R_ij^rank = (1/(L·N)) Σ_{k=1}^{L} (N − rank{R̃_ij^k})    (6)

The model relevance assignment R_i^rank using rank fusion for an input variable v_i is the sum of all example relevances R_ij^rank over all examples, as defined in Equation (2).

5. Evaluation Setup

The evaluation of relevance assignments was performed using the GeoLearn software that was developed by NCSA and CEE UIUC. GeoLearn allows a user to model input-output variable relationships from multi-variate NASA remote sensing images over a set of boundaries. The machine learning algorithms in the GeoLearn system leverage five software packages: Im2Learn (remote sensing image processing), ArcGIS (georeferencing), D2K software (RT implementation), LibSVM [4] (SVM implementation), and KDTree [13] (KNN implementation). The KNN and SVM methods are sensitive to the scale of the input variables, and will favor variables with a wider scale of values. In order to avoid this type of bias, the dynamic range of all variables is always normalized to
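Equation (6) converts magnitudes into ranks before voting. The sketch below is one reading of that scheme, with the rank convention made explicit since it is easy to invert: here rank 1 is assigned to a method's most relevant variable, so a variable ranked first by every method receives the largest fused score, (N − 1)/N.

```python
import numpy as np

def rank_fusion(R_methods):
    """Rank-based fusion in the spirit of Equation (6).

    R_methods: L x N array of normalized relevances (L methods, N variables).
    Rank 1 = most relevant variable for a method (assumed convention), so the
    fused score sum_k (N - rank) / (L * N) is largest for variables that every
    method ranks first, and 0 for variables every method ranks last.
    """
    L, N = R_methods.shape
    # double argsort turns values into ranks; negate so the largest gets rank 1
    ranks = (-R_methods).argsort(axis=1).argsort(axis=1) + 1
    return (N - ranks).sum(axis=0) / (L * N)

# Two hypothetical methods that agree on the ordering v1 > v2 > v3 > v4
R_methods = np.array([[0.5, 0.3, 0.15, 0.05],
                      [0.6, 0.2, 0.15, 0.05]])
fused = rank_fusion(R_methods)
```

Only the ordering of each method's relevances enters the result, which is what makes rank fusion insensitive to a single method producing wildly mis-scaled magnitudes.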
the range [0, 1] according to the formula below:

NormalizedValue = (Value − MinValue) / (MaxValue − MinValue)    (7)

Evaluations were performed with both synthetic and measured data. Synthetic data allow us to simulate three categories of input-output relationships with known ground truth in order to understand relevance assignment accuracy and relevance dependencies. Measured data were used to demonstrate the application of relevance assignment to studying vegetation changes, and the results were verified based on our limited understanding of the phenomena. The next sub-sections describe the synthetic data simulations, the model building setup and the evaluation metrics used to assess relevance assignment accuracy.

5.1. Synthetic Data Simulations

Three sets of input-output relationships were simulated to represent (1) linear additive, (2) non-linear additive and (3) non-linear multiplicative categories of relationships. To introduce irrelevant input variables into the problem, we simulated the output using only two input variables v_1, v_2 (the relevant variables) while modeling relationships with four variables, where the additional two input variables v_3, v_4 have values drawn from a uniform distribution on [0, 1] (the irrelevant variables). The specific analytical forms for generating the three data sets are provided in Equations (8) linear additive, (9) non-linear additive and (10) non-linear multiplicative.

f(v_1, v_2, v_3, v_4) = v_1 + v_2    (8)

f(v_1, v_2, v_3, v_4) = sin(π v_1) + cos(π v_2)    (9)

f(v_1, v_2, v_3, v_4) = v_1 · v_2    (10)

In addition to simulating multiple input-output relationships and relevant-irrelevant variables, we added noise to the generated output values to test the noise robustness of the relevance assignments. Noise is simulated as an additive variable following a 1D Gaussian distribution with zero mean µ and standard deviation σ; N(µ = 0, σ). The standard deviation was parameterized as σ = αd, where α is the percentage of the dynamic range d of an output variable. In our experiments, we used α = 0.1 and α = 0.3 to generate a total of nine synthetic data sets (3 without noise, 3 with additive noise α = 0.1, and 3 with additive noise α = 0.3).

5.2.
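The simulation protocol above can be sketched compactly. This is an illustrative reconstruction, not the paper's code; in particular, the exact arguments of the trigonometric terms in Equation (9) are partly garbled in the transcription and are assumed to be sin(π v_1) + cos(π v_2) here.

```python
import numpy as np

rng = np.random.default_rng(42)
M = 1000
V = rng.random((M, 4))   # v1..v4 uniform on [0, 1]; v3 and v4 stay irrelevant

f_lin  = V[:, 0] + V[:, 1]                                  # Eq. (8)
f_add  = np.sin(np.pi * V[:, 0]) + np.cos(np.pi * V[:, 1])  # Eq. (9), assumed form
f_mult = V[:, 0] * V[:, 1]                                  # Eq. (10)

# Additive Gaussian noise with sigma = alpha * d, d = dynamic range of output
alpha = 0.1
sigma = alpha * (f_lin.max() - f_lin.min())
f_noisy = f_lin + rng.normal(0.0, sigma, size=M)
```

Repeating the noise step with alpha = 0.1 and alpha = 0.3 for each of the three relationships yields the nine data sets described above.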
Model Building Setup

The model building setup is concerned with the optimization of algorithmic parameters and cross validation. First, we set the algorithmic parameters to the following values: (1) RT - variance error as the criterion for splitting, the minimum number of examples per leaf set to eight, and a maximum tree depth limit; (2) KNN - K = N + 3, where N is the dimension of the input variables. The reason for setting K slightly larger than the input variable dimension is to meet the least-squares fitting requirements for estimating a hyper-plane from K examples; (3) SVM - linear kernel, cost factor C = 1.0, and a termination criterion epsilon. The optimization of KNN's parameter K and RT's maximum tree depth parameter was investigated experimentally. Second, we omitted cross validation of models in our experiments and rather computed input variable relevance based on all available examples. In the future, we will investigate the accuracy of input variable relevance assignment from examples selected by cross validation versus all available examples.

5.3. Evaluation Metrics

To evaluate the accuracy of input variable relevance assignment using multiple machine learning methods, we introduced two metrics: percentage of correctness and error distance. The evaluations are conducted only for the synthetic data, against the ground truth values of the normalized example relevance R̃_ij^GT and the normalized model relevance R̃_i^GT. The ground truth values are obtained by computing partial derivatives of Equations (8), (9) and (10) according to Equation (1). The percentage of correctness metric is defined in Equation (11) as:

PC = (Σ_{j=1}^{M} δ_j) / M    (11)

where δ_j is 1 if argmax_i R̃_ij = argmax_i R̃_ij^GT, and is 0 otherwise. The error distance metric is defined in Equation (12) as the Euclidean distance between the true model relevance derived from the partial derivatives and the relevance estimate from our methods. This
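Both metrics reduce to a few lines; the sketch below uses toy relevance values made up for illustration, with rows as examples and columns as variables for the percentage-of-correctness metric:

```python
import numpy as np

def percent_correct(R_est, R_gt):
    """Equation (11): fraction of examples whose most relevant variable
    (argmax over columns) agrees with the ground truth."""
    return float(np.mean(R_est.argmax(axis=1) == R_gt.argmax(axis=1)))

def error_distance(R_est, R_gt):
    """Equation (12): Euclidean distance between normalized model
    relevance vectors."""
    return float(np.linalg.norm(np.asarray(R_gt) - np.asarray(R_est)))

# Toy values: three examples, three variables, v1 is truly most relevant
R_gt  = np.array([[0.6, 0.3, 0.1]] * 3)
R_est = np.array([[0.5, 0.4, 0.1],
                  [0.3, 0.6, 0.1],
                  [0.7, 0.2, 0.1]])
pc = percent_correct(R_est, R_gt)   # 2 of the 3 argmax decisions match
```

The percentage of correctness only checks which variable wins, which is why it still applies to rank fusion, while the error distance compares magnitudes and therefore does not.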
metric does not apply to the relevance results obtained using rank fusion, since those results are categorical.

ErrorDist = sqrt( Σ_{i=1}^{N} (R̃_i^GT − R̃_i)^2 )    (12)

6. Experiment Results

In this section, we present evaluations with synthetic and measured data in two forms. First, we report a relevance image that shows the color of the input variable with the maximum relevance value at each pixel location. The color coding scheme maps the red, green, blue and yellow colors to (v_1, v_2, v_3, v_4). Second, we provide a relevance table with the model relevance value for each input variable.

6.1. Synthetic Data

The relevance assignment results using the RT, KNN and SVM methods on synthetic data without noise are summarized in Figure 1 and Table 1. The results obtained using the fusion methods for the same data are summarized in Figure 2 and Table 2. We also evaluated the methods with synthetic data with noise; those results are omitted for brevity.

Table 1: Relevance table obtained from synthetic data without noise by the RT, KNN and SVM methods. For each method (Regression Tree, K-Nearest Neighbors, Support Vector Machine with linear kernel) and each variable v_1 to v_4, the table reports % Correct and Error Distance under the linear, non-linear additive and non-linear multiplicative relationships; the irrelevant variables v_3 and v_4 receive near-zero relevance (on the order of 1E-04 or smaller).

6.2. Measured Data

We processed measured remotely sensed data from NASA acquired in 2003, at a spatial resolution of 1000 m per pixel and at the location (latitude x longitude) = [35.34, 36.35] x [-9.54, -93.3]. We modeled the output Fpar variable (Fraction of Photosynthetically Active Radiation) as a function of input variables consisting of LST (Land Surface Temperature), LAI (Leaf Area Index), Latitude, and Longitude. For this geo-spatial location, we anticipated LAI to be more relevant to Fpar than LST, and both Latitude and Longitude to be almost irrelevant. The relevance results are summarized in Figure 3 and Table 3.

Figure 1. Relevance images obtained from synthetic data without noise by the RT, KNN and SVM methods.
Rows correspond to categories of relationships (top - linear additive, middle - non-linear multiplicative, bottom - non-linear additive). Columns correspond to methods (left - ground truth, second left - RT, second right - KNN, and right - SVM).

7. Discussion

The results presented in the previous section are discussed by comparing the individual machine learning methods and the fusion methods.
Table 2: Relevance table obtained from synthetic data without noise by the average and rank fusion methods. For each fusion method and each variable v_1 to v_4, the table reports % Correct and Error Distance under the linear, non-linear additive and non-linear multiplicative relationships; the Error Distance entries are marked NA for rank fusion, to which that metric does not apply.

7.1. Comparison of Individual Methods

From the experimental results, we observe that RT is a reliable hybrid method that usually gives accurate relevance estimates. KNN is flexible across different data types and always gives good relevance estimates in terms of correctness. However, it is very sensitive to noise, and we observe significant performance drops when noise is present. SVM usually yields more robust results under noise, but is restricted in its expressivity since we use only a linear kernel here. There is no single best method for every dataset. The correctness of relevance assignment strongly depends on the type of data and the learning method used. The fusion methods are proposed based on this motivation.

Figure 2. Relevance images obtained from synthetic data without noise using the average and rank fusion methods. Rows correspond to categories of relationships (top - linear additive, middle - non-linear multiplicative, bottom - non-linear additive). Columns correspond to methods (left - ground truth, middle - average fusion, and right - rank fusion).

Figure 3. Relevance images for the NASA data obtained using RT (left top), KNN (left middle), SVM (left bottom), average fusion (right top) and rank fusion (right middle). The most relevant input variable at each pixel is encoded according to the color scheme in the legend (right bottom): LAI, LST, Latitude, Longitude. The majority of pixels is labeled red (LAI).

7.2. Comparison of Fusion Methods

The fusion methods usually outperform any single method in terms of relevance assignment correctness (% Correct) for the synthetic datasets without noise. Under this setting, the difference between average fusion and rank fusion is not obvious. Although we have not included the results for synthetic data with noise, for brevity reasons, we comment on the results obtained.
For the synthetic datasets with noise, the correctness of relevance assignment for the fusion methods is still more stable than for a single method (RT, KNN or SVM). However, the fusion methods do not give the optimal estimate compared to the best single method.

8. Conclusion and Future Work

In this paper, we proposed a framework for computing input variable relevance with respect to a
predicted output variable from multiple machine learning methods. Following the conceptual definition of relevance in the literature, we defined partial derivatives of input-output dependences as our relevance assignment approach. The estimation of two types of relevances, example and model relevances, was implemented for the Regression Tree, K-Nearest Neighbors, and Support Vector Machine methods. Additional fusion schemes for combining the relevance results from multiple methods were evaluated together with the single methods by using synthetic and measured data and two metrics. Based on three categories of synthetic input-output simulations, including linear additive, non-linear additive and non-linear multiplicative relationships, without or with noise added, we concluded that relevance assignment using the fusion approaches demonstrates more robust performance than assignment using a single machine learning method.

Table 3: Relevance assignment results for the remote sensing data set from NASA. For each method (Regression Tree, K-Nearest Neighbors, Support Vector Machine with linear kernel, average fusion, and rank fusion), the table lists the model relevance of LAI, LST, Latitude and Longitude; LAI receives the largest relevance in every case, while Latitude and Longitude are nearly irrelevant (as small as 1E-04 to 1E-05 for RT and SVM).

In the future, we would like to extend the fusion methods to include the results from other learning methods, and to understand the dependences of relevance assignment on model building setups.

9. References

[1] Blum A.L. and P. Langley, 1997. Selection of Relevant Features and Examples in Machine Learning. Artificial Intelligence 97, 1997, Elsevier Science.
[2] White A. and P. Kumar, 2005. Dominant Influences on Vegetation Greenness Over the Blue Ridge Ecoregion. Submitted to Ecosystems.
[3] Jain A. and D. Zongker, 1997. Feature selection: evaluation, application, and small sample performance. IEEE Trans. Pattern Anal. Machine Intell., 19:153-158, 1997.
[4] Lin C.J., 2006. LIBSVM - A Library for Support Vector Machines.
[5] Kudo M. and J. Sklansky, 2000. Comparison of algorithms that select features for pattern classifiers.
Pattern Recognition, 33:25-41, 2000.
[6] Heiler M., D. Cremers, and C. Schnorr, 2001. Efficient Feature Subset Selection for Support Vector Machines. Technical Report, Dept. of Math and Computer Science, University of Mannheim.
[7] Cristianini N. and J. Shawe-Taylor, 2000. An Introduction to Support Vector Machines and Other Kernel-based Learning Methods, Cambridge University Press, March 2000.
[8] Bajcsy P. and P. Groves, 2004. Methodology for Hyperspectral Band Selection, Photogrammetric Engineering and Remote Sensing journal, Vol. 70, No. 7, July 2004.
[9] Perner P. and C. Apte, 2000. Empirical Evaluation of Feature Subset Selection Based on a Real World Data Set, Principles of Data Mining and Knowledge Discovery, Springer Verlag, 2000.
[10] Perner P., 2001. Improving the Accuracy of Decision Tree Induction by Feature Pre-Selection, Applied Artificial Intelligence, vol. 15, no. 8, 2001.
[11] Pudil P., J. Novovicova, and J. Kittler, 1994. Floating search methods in feature selection, Pattern Recognition Letters, 15 (1994), 1119-1125.
[12] De Backer S., P. Kempeneers, W. Debruyn and P. Scheunders, 2005. "A Band Selection Technique for Spectral Classification," IEEE Geoscience and Remote Sensing Letters, Vol. 2, No. 3, July 2005.
[13] Levy S.D., 2006. A Java class for KD-tree search.
[14] Han, J., and Kamber, M., 2001. Data Mining: Concepts and Techniques, Morgan Kaufmann Publishers, San Francisco, CA.
[15] Groves P. and P. Bajcsy, 2003. Methodology for Hyperspectral Band and Classification Model Selection, IEEE Workshop on Advances in Techniques for Analysis of Remotely Sensed Data, Washington DC, October 2003.
[16] Duda, R., P. Hart and D. Stork, 2001. Pattern Classification, Second Edition, Wiley-Interscience.
More informationSum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints
Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan
More information12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification
Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero
More informationSVM-based Learning for Multiple Model Estimation
SVM-based Learnng for Multple Model Estmaton Vladmr Cherkassky and Yunqan Ma Department of Electrcal and Computer Engneerng Unversty of Mnnesota Mnneapols, MN 55455 {cherkass,myq}@ece.umn.edu Abstract:
More informationOptimizing Document Scoring for Query Retrieval
Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng
More informationOutline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1
4/14/011 Outlne Dscrmnatve classfers for mage recognton Wednesday, Aprl 13 Krsten Grauman UT-Austn Last tme: wndow-based generc obect detecton basc ppelne face detecton wth boostng as case study Today:
More informationFuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System
Fuzzy Modelng of the Complexty vs. Accuracy Trade-off n a Sequental Two-Stage Mult-Classfer System MARK LAST 1 Department of Informaton Systems Engneerng Ben-Guron Unversty of the Negev Beer-Sheva 84105
More informationTerm Weighting Classification System Using the Chi-square Statistic for the Classification Subtask at NTCIR-6 Patent Retrieval Task
Proceedngs of NTCIR-6 Workshop Meetng, May 15-18, 2007, Tokyo, Japan Term Weghtng Classfcaton System Usng the Ch-square Statstc for the Classfcaton Subtask at NTCIR-6 Patent Retreval Task Kotaro Hashmoto
More informationA Fast Content-Based Multimedia Retrieval Technique Using Compressed Data
A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,
More informationCS 534: Computer Vision Model Fitting
CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust
More informationFeature Reduction and Selection
Feature Reducton and Selecton Dr. Shuang LIANG School of Software Engneerng TongJ Unversty Fall, 2012 Today s Topcs Introducton Problems of Dmensonalty Feature Reducton Statstc methods Prncpal Components
More informationClassifying Acoustic Transient Signals Using Artificial Intelligence
Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)
More informationReview of approximation techniques
CHAPTER 2 Revew of appromaton technques 2. Introducton Optmzaton problems n engneerng desgn are characterzed by the followng assocated features: the objectve functon and constrants are mplct functons evaluated
More informationMeta-heuristics for Multidimensional Knapsack Problems
2012 4th Internatonal Conference on Computer Research and Development IPCSIT vol.39 (2012) (2012) IACSIT Press, Sngapore Meta-heurstcs for Multdmensonal Knapsack Problems Zhbao Man + Computer Scence Department,
More informationProblem Definitions and Evaluation Criteria for Computational Expensive Optimization
Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty
More informationSmoothing Spline ANOVA for variable screening
Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory
More informationAn Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation
17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed
More informationThree supervised learning methods on pen digits character recognition dataset
Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru
More informationMachine Learning 9. week
Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below
More informationA Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines
A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría
More informationMULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION
MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and
More informationIMAGE FUSION TECHNIQUES
Int. J. Chem. Sc.: 14(S3), 2016, 812-816 ISSN 0972-768X www.sadgurupublcatons.com IMAGE FUSION TECHNIQUES A Short Note P. SUBRAMANIAN *, M. SOWNDARIYA, S. SWATHI and SAINTA MONICA ECE Department, Aarupada
More informationAn Optimal Algorithm for Prufer Codes *
J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,
More informationLECTURE : MANIFOLD LEARNING
LECTURE : MANIFOLD LEARNING Rta Osadchy Some sldes are due to L.Saul, V. C. Raykar, N. Verma Topcs PCA MDS IsoMap LLE EgenMaps Done! Dmensonalty Reducton Data representaton Inputs are real-valued vectors
More information6.854 Advanced Algorithms Petar Maymounkov Problem Set 11 (November 23, 2005) With: Benjamin Rossman, Oren Weimann, and Pouya Kheradpour
6.854 Advanced Algorthms Petar Maymounkov Problem Set 11 (November 23, 2005) Wth: Benjamn Rossman, Oren Wemann, and Pouya Kheradpour Problem 1. We reduce vertex cover to MAX-SAT wth weghts, such that the
More informationEmpirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap
Int. Journal of Math. Analyss, Vol. 8, 4, no. 5, 7-7 HIKARI Ltd, www.m-hkar.com http://dx.do.org/.988/jma.4.494 Emprcal Dstrbutons of Parameter Estmates n Bnary Logstc Regresson Usng Bootstrap Anwar Ftranto*
More informationParallelism for Nested Loops with Non-uniform and Flow Dependences
Parallelsm for Nested Loops wth Non-unform and Flow Dependences Sam-Jn Jeong Dept. of Informaton & Communcaton Engneerng, Cheonan Unversty, 5, Anseo-dong, Cheonan, Chungnam, 330-80, Korea. seong@cheonan.ac.kr
More informationFeature Selection as an Improving Step for Decision Tree Construction
2009 Internatonal Conference on Machne Learnng and Computng IPCSIT vol.3 (2011) (2011) IACSIT Press, Sngapore Feature Selecton as an Improvng Step for Decson Tree Constructon Mahd Esmael 1, Fazekas Gabor
More informationData Representation in Digital Design, a Single Conversion Equation and a Formal Languages Approach
Data Representaton n Dgtal Desgn, a Sngle Converson Equaton and a Formal Languages Approach Hassan Farhat Unversty of Nebraska at Omaha Abstract- In the study of data representaton n dgtal desgn and computer
More informationFEATURE EXTRACTION. Dr. K.Vijayarekha. Associate Dean School of Electrical and Electronics Engineering SASTRA University, Thanjavur
FEATURE EXTRACTION Dr. K.Vjayarekha Assocate Dean School of Electrcal and Electroncs Engneerng SASTRA Unversty, Thanjavur613 41 Jont Intatve of IITs and IISc Funded by MHRD Page 1 of 8 Table of Contents
More informationUSING LINEAR REGRESSION FOR THE AUTOMATION OF SUPERVISED CLASSIFICATION IN MULTITEMPORAL IMAGES
USING LINEAR REGRESSION FOR THE AUTOMATION OF SUPERVISED CLASSIFICATION IN MULTITEMPORAL IMAGES 1 Fetosa, R.Q., 2 Merelles, M.S.P., 3 Blos, P. A. 1,3 Dept. of Electrcal Engneerng ; Catholc Unversty of
More informationHigh-Boost Mesh Filtering for 3-D Shape Enhancement
Hgh-Boost Mesh Flterng for 3-D Shape Enhancement Hrokazu Yagou Λ Alexander Belyaev y Damng We z Λ y z ; ; Shape Modelng Laboratory, Unversty of Azu, Azu-Wakamatsu 965-8580 Japan y Computer Graphcs Group,
More informationA Fast Visual Tracking Algorithm Based on Circle Pixels Matching
A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng
More informationData Mining For Multi-Criteria Energy Predictions
Data Mnng For Mult-Crtera Energy Predctons Kashf Gll and Denns Moon Abstract We present a data mnng technque for mult-crtera predctons of wnd energy. A mult-crtera (MC) evolutonary computng method has
More informationA Similarity-Based Prognostics Approach for Remaining Useful Life Estimation of Engineered Systems
2008 INTERNATIONAL CONFERENCE ON PROGNOSTICS AND HEALTH MANAGEMENT A Smlarty-Based Prognostcs Approach for Remanng Useful Lfe Estmaton of Engneered Systems Tany Wang, Janbo Yu, Davd Segel, and Jay Lee
More informationy and the total sum of
Lnear regresson Testng for non-lnearty In analytcal chemstry, lnear regresson s commonly used n the constructon of calbraton functons requred for analytcal technques such as gas chromatography, atomc absorpton
More informationObject-Based Techniques for Image Retrieval
54 Zhang, Gao, & Luo Chapter VII Object-Based Technques for Image Retreval Y. J. Zhang, Tsnghua Unversty, Chna Y. Y. Gao, Tsnghua Unversty, Chna Y. Luo, Tsnghua Unversty, Chna ABSTRACT To overcome the
More informationLocal Quaternary Patterns and Feature Local Quaternary Patterns
Local Quaternary Patterns and Feature Local Quaternary Patterns Jayu Gu and Chengjun Lu The Department of Computer Scence, New Jersey Insttute of Technology, Newark, NJ 0102, USA Abstract - Ths paper presents
More informationBOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET
1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School
More informationHelsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)
Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute
More informationUser Authentication Based On Behavioral Mouse Dynamics Biometrics
User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA
More informationDetermining Fuzzy Sets for Quantitative Attributes in Data Mining Problems
Determnng Fuzzy Sets for Quanttatve Attrbutes n Data Mnng Problems ATTILA GYENESEI Turku Centre for Computer Scence (TUCS) Unversty of Turku, Department of Computer Scence Lemmnkäsenkatu 4A, FIN-5 Turku
More informationA Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers
62626262621 Journal of Uncertan Systems Vol.5, No.1, pp.62-71, 211 Onlne at: www.us.org.u A Smple and Effcent Goal Programmng Model for Computng of Fuzzy Lnear Regresson Parameters wth Consderng Outlers
More informationNAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics
Introducton G10 NAG Fortran Lbrary Chapter Introducton G10 Smoothng n Statstcs Contents 1 Scope of the Chapter... 2 2 Background to the Problems... 2 2.1 Smoothng Methods... 2 2.2 Smoothng Splnes and Regresson
More informationSimulation Based Analysis of FAST TCP using OMNET++
Smulaton Based Analyss of FAST TCP usng OMNET++ Umar ul Hassan 04030038@lums.edu.pk Md Term Report CS678 Topcs n Internet Research Sprng, 2006 Introducton Internet traffc s doublng roughly every 3 months
More informationCS246: Mining Massive Datasets Jure Leskovec, Stanford University
CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd
More informationNovel Fuzzy logic Based Edge Detection Technique
Novel Fuzzy logc Based Edge Detecton Technque Aborsade, D.O Department of Electroncs Engneerng, adoke Akntola Unversty of Tech., Ogbomoso. Oyo-state. doaborsade@yahoo.com Abstract Ths paper s based on
More informationX- Chart Using ANOM Approach
ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are
More informationLoad-Balanced Anycast Routing
Load-Balanced Anycast Routng Chng-Yu Ln, Jung-Hua Lo, and Sy-Yen Kuo Department of Electrcal Engneerng atonal Tawan Unversty, Tape, Tawan sykuo@cc.ee.ntu.edu.tw Abstract For fault-tolerance and load-balance
More informationAn Improved Image Segmentation Algorithm Based on the Otsu Method
3th ACIS Internatonal Conference on Software Engneerng, Artfcal Intellgence, Networkng arallel/dstrbuted Computng An Improved Image Segmentaton Algorthm Based on the Otsu Method Mengxng Huang, enjao Yu,
More informationA Semi-parametric Regression Model to Estimate Variability of NO 2
Envronment and Polluton; Vol. 2, No. 1; 2013 ISSN 1927-0909 E-ISSN 1927-0917 Publshed by Canadan Center of Scence and Educaton A Sem-parametrc Regresson Model to Estmate Varablty of NO 2 Meczysław Szyszkowcz
More informationOutline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:
Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A
More informationIncremental Learning with Support Vector Machines and Fuzzy Set Theory
The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and
More informationNUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS
ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana
More informationOptimal Workload-based Weighted Wavelet Synopses
Optmal Workload-based Weghted Wavelet Synopses Yoss Matas School of Computer Scence Tel Avv Unversty Tel Avv 69978, Israel matas@tau.ac.l Danel Urel School of Computer Scence Tel Avv Unversty Tel Avv 69978,
More informationWishing you all a Total Quality New Year!
Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma
More informationBAYESIAN MULTI-SOURCE DOMAIN ADAPTATION
BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,
More informationLecture 4: Principal components
/3/6 Lecture 4: Prncpal components 3..6 Multvarate lnear regresson MLR s optmal for the estmaton data...but poor for handlng collnear data Covarance matrx s not nvertble (large condton number) Robustness
More informationAdaptive Transfer Learning
Adaptve Transfer Learnng Bn Cao, Snno Jaln Pan, Yu Zhang, Dt-Yan Yeung, Qang Yang Hong Kong Unversty of Scence and Technology Clear Water Bay, Kowloon, Hong Kong {caobn,snnopan,zhangyu,dyyeung,qyang}@cse.ust.hk
More informationAn Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices
Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal
More informationUB at GeoCLEF Department of Geography Abstract
UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department
More informationA Genetic Programming-PCA Hybrid Face Recognition Algorithm
Journal of Sgnal and Informaton Processng, 20, 2, 70-74 do:0.4236/jsp.20.23022 Publshed Onlne August 20 (http://www.scrp.org/journal/jsp) A Genetc Programmng-PCA Hybrd Face Recognton Algorthm Behzad Bozorgtabar,
More informationSteps for Computing the Dissimilarity, Entropy, Herfindahl-Hirschman and. Accessibility (Gravity with Competition) Indices
Steps for Computng the Dssmlarty, Entropy, Herfndahl-Hrschman and Accessblty (Gravty wth Competton) Indces I. Dssmlarty Index Measurement: The followng formula can be used to measure the evenness between
More informationMachine Learning: Algorithms and Applications
14/05/1 Machne Learnng: Algorthms and Applcatons Florano Zn Free Unversty of Bozen-Bolzano Faculty of Computer Scence Academc Year 011-01 Lecture 10: 14 May 01 Unsupervsed Learnng cont Sldes courtesy of
More informationLecture 13: High-dimensional Images
Lec : Hgh-dmensonal Images Grayscale Images Lecture : Hgh-dmensonal Images Math 90 Prof. Todd Wttman The Ctadel A grayscale mage s an nteger-valued D matrx. An 8-bt mage takes on values between 0 and 55.
More informationA Unified Framework for Semantics and Feature Based Relevance Feedback in Image Retrieval Systems
A Unfed Framework for Semantcs and Feature Based Relevance Feedback n Image Retreval Systems Ye Lu *, Chunhu Hu 2, Xngquan Zhu 3*, HongJang Zhang 2, Qang Yang * School of Computng Scence Smon Fraser Unversty
More informationSolving two-person zero-sum game by Matlab
Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by
More informationFusion Performance Model for Distributed Tracking and Classification
Fuson Performance Model for Dstrbuted rackng and Classfcaton K.C. Chang and Yng Song Dept. of SEOR, School of I&E George Mason Unversty FAIRFAX, VA kchang@gmu.edu Martn Lggns Verdan Systems Dvson, Inc.
More informationFrom Comparing Clusterings to Combining Clusterings
Proceedngs of the Twenty-Thrd AAAI Conference on Artfcal Intellgence (008 From Comparng Clusterngs to Combnng Clusterngs Zhwu Lu and Yuxn Peng and Janguo Xao Insttute of Computer Scence and Technology,
More informationLearning-Based Top-N Selection Query Evaluation over Relational Databases
Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **
More informationA PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION
1 THE PUBLISHING HOUSE PROCEEDINGS OF THE ROMANIAN ACADEMY, Seres A, OF THE ROMANIAN ACADEMY Volume 4, Number 2/2003, pp.000-000 A PATTERN RECOGNITION APPROACH TO IMAGE SEGMENTATION Tudor BARBU Insttute
More informationAn Accurate Evaluation of Integrals in Convex and Non convex Polygonal Domain by Twelve Node Quadrilateral Finite Element Method
Internatonal Journal of Computatonal and Appled Mathematcs. ISSN 89-4966 Volume, Number (07), pp. 33-4 Research Inda Publcatons http://www.rpublcaton.com An Accurate Evaluaton of Integrals n Convex and
More informationAn Image Fusion Approach Based on Segmentation Region
Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua
More informationMachine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law)
Machne Learnng Support Vector Machnes (contans materal adapted from talks by Constantn F. Alfers & Ioanns Tsamardnos, and Martn Law) Bryan Pardo, Machne Learnng: EECS 349 Fall 2014 Support Vector Machnes
More informationUsing Neural Networks and Support Vector Machines in Data Mining
Usng eural etworks and Support Vector Machnes n Data Mnng RICHARD A. WASIOWSKI Computer Scence Department Calforna State Unversty Domnguez Hlls Carson, CA 90747 USA Abstract: - Multvarate data analyss
More informationAn Evolvable Clustering Based Algorithm to Learn Distance Function for Supervised Environment
IJCSI Internatonal Journal of Computer Scence Issues, Vol. 7, Issue 5, September 2010 ISSN (Onlne): 1694-0814 www.ijcsi.org 374 An Evolvable Clusterng Based Algorthm to Learn Dstance Functon for Supervsed
More informationA Robust Method for Estimating the Fundamental Matrix
Proc. VIIth Dgtal Image Computng: Technques and Applcatons, Sun C., Talbot H., Ourseln S. and Adraansen T. (Eds.), 0- Dec. 003, Sydney A Robust Method for Estmatng the Fundamental Matrx C.L. Feng and Y.S.
More information