Adaptive Virtual Support Vector Machine for the Reliability Analysis of High-Dimensional Problems


Proceedings of the ASME 2011 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference
IDETC/CIE 2011, August 29-31, 2011, Washington, D.C., USA

Adaptive Virtual Support Vector Machine for the Reliability Analysis of High-Dimensional Problems

Hyeongjin Song, K.K. Choi*, Ikjin Lee and Liang Zhao
Department of Mechanical & Industrial Engineering
College of Engineering
The University of Iowa, Iowa City, IA 52242, USA
hyesong@engineering.uiowa.edu, kkchoi@engineering.uiowa.edu, lee@engineering.uiowa.edu, liazhao@engineering.uiowa.edu
*Corresponding author

David Lamb
US Army RDECOM/TARDEC
Warren, MI, USA
david.lamb@us.army.mil

ABSTRACT
In this study, an efficient classification methodology is developed for reliability analysis while maintaining an accuracy level similar to or better than that of existing response surface methods. Sampling-based reliability analysis requires only classification information, a success or a failure, whereas response surface methods provide real function values as their output, which requires more computational effort. The problem becomes even more challenging for high-dimensional problems due to the curse of dimensionality. In the newly proposed virtual support vector machine (VSVM), virtual samples are generated near the limit state function using linear or Kriging-based approximations. The exact function values are used in these approximations to improve the accuracy of the resulting VSVM. By introducing virtual samples, VSVM overcomes the deficiency of existing classification methods, in which only classified function values are used as input. The universal Kriging method is used to obtain virtual samples to improve the accuracy of the decision function for highly nonlinear problems. A sequential sampling strategy that chooses new samples near the true limit state function is integrated with VSVM to maximize the accuracy.
Examples show that the proposed adaptive VSVM yields better efficiency in terms of modeling time and the number of required samples while maintaining a similar or better level of accuracy, especially for high-dimensional problems.

KEYWORDS: Surrogate Model, Support Vector Machine (SVM), Sequential Sampling, Virtual Samples, Virtual Support Vector Machine (VSVM), High-dimensional Problem.

1. INTRODUCTION
Accurate reliability analysis is of great importance for solving engineering problems. Poor reliability analysis results can lead to unreliable or overly conservative designs. Currently, the most probable point (MPP) based methods are used to obtain reliability analysis results in many engineering problems where sensitivity information is available [1-3]. However, the sensitivity is often unavailable or difficult to obtain accurately in complex multi-physics or multidisciplinary simulation-based engineering design applications. Without the sensitivity, an alternative to the MPP-based method is to perform the probability integration numerically by carrying out computer simulations at the Monte Carlo simulation (MCS) sampling points [4]. However, this method requires a large number of response function evaluations and can be impractical in terms of computational cost. Therefore, surrogate-based methods are used to decrease the cost while requiring no sensitivity analysis. The main advantage of the surrogate-based method is that only a limited number of function evaluations are required to construct the surrogate model. Many different surrogates, such as the polynomial response surface (PRS), radial basis function (RBF), multivariate adaptive regression spline (MARS), support vector regression (SVR), moving least squares (MLS), and Kriging, have been developed and applied to engineering problems [5-12]. These surrogates provide approximations of otherwise expensive computer simulations. Once an accurate surrogate model is generated, direct MCS can be applied to the surrogate model to estimate the reliability with affordable computational cost. This method is called the sampling-based reliability analysis.
The sampling-based method requires the decision function to determine whether a prediction at a testing point is a success or a failure. That is, only the decision between a success and a failure is used instead of the function value. In this paper, the decision function is used to express an approximated limit state. However, surrogate-based approaches usually try to obtain accurate response function values over the entire given domain. Therefore, surrogate-based methods require many samples in unnecessary regions to reach the target accuracy (i.e., mean squared error or R²), and thus they actually solve a more complicated problem and become inefficient [13]. The computational burden becomes heavier in high-dimensional space due to the curse of dimensionality [14-16]. On the other hand, the support vector machine (SVM), which is a classification method, constructs only an explicit decision function [14-21]. The SVM with a sequential sampling strategy, called explicit design space decomposition (EDSD), has been tested and applied successfully to discontinuous problems [21, 22]. Even though EDSD can also be applied to continuous problems, it often converges very slowly and thus requires a large number of samples. One of the main reasons for the inefficiency of EDSD on continuous problems is that it uses only the classification of the response function values, rather than the function values themselves, to construct the decision function. In this paper, a virtual SVM (VSVM) is proposed to improve the efficiency of SVM, while maintaining its good features, by using the available true response function values. Unlike EDSD, VSVM is developed mainly for continuous problems. VSVM does not depend on the availability of accurate gradient information and constructs only the decision function rather than a surrogate model over the given domain. A proposed adaptive sampling method provides new samples in the vicinity of the limit state, which makes the method even more efficient. The VSVM decision function is used to evaluate the probability of failure at a given design. Basic concepts and important features of SVM are presented in Section 2.
In Section 3, the virtual sample generation method and the adaptive sampling strategy are explained. Stopping criteria are defined to stop the updating process when the decision function converges. In Section 4, the recently developed EDSD and dynamic Kriging methods are compared with the proposed VSVM to demonstrate the efficiency of VSVM while maintaining accuracy. An error measure is also defined to compare the accuracy of the results. The conclusion follows in Section 5.

2. SUPPORT VECTOR MACHINE
An SVM is a machine learning technique used for the classification of data in pattern recognition [14-22]. It has the ability to explicitly construct a multidimensional and complex decision function that optimally separates multiple classes of data. Even though SVM is able to deal with multi-class cases, only two classes, success or failure, are used in reliability analyses, and thus only a two-class classification problem is considered in this paper. Its good features for high-dimensional problems make SVM an appropriate method for the formulation of the explicit limit state function. In this section, a brief overview of SVM is presented, including basic ideas and some important features.

2.1 Linear SVM
For a given multidimensional problem, N samples are distributed within the local or global window. Each sample x_i is associated with one of two classes characterized by a value y_i = ±1, which represents a success (+1) or a failure (-1). The SVM algorithm constructs the decision function that optimally separates the two classes of samples. The corresponding explicit boundary function is expressed as

    s(x) = b + Σ_{i=1}^{N} α_i y_i K(x, x_i)    (1)

where b is the bias, α_i are the Lagrange multipliers obtained from the quadratic programming optimization problem used to construct the SVM, x is an arbitrary point to be predicted, and K is the kernel of the SVM. The classification of any arbitrary point x is given by the sign of s(x) in Eq. (1). The optimization process solves for the optimal decision function with a maximal margin. Figure 1 shows a linear SVM result, in which the notion of margin can be easily noticed.
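As an illustration (not part of the original paper), the SVM decision function s(x) = b + Σ α_i y_i K(x, x_i) with a Gaussian kernel can be evaluated with a short Python sketch; the function names and the toy values are ours:

```python
import numpy as np

def gaussian_kernel(x, xi, sigma=1.0):
    """Gaussian (RBF) kernel: K(x, x_i) = exp(-||x - x_i||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - xi) ** 2) / (2.0 * sigma ** 2))

def svm_decision(x, b, alphas, ys, samples, sigma=1.0):
    """Evaluate s(x) = b + sum_i alpha_i * y_i * K(x, x_i).
    The sign of the returned value classifies x (+: success, -: failure)."""
    return b + sum(a * y * gaussian_kernel(x, xi, sigma)
                   for a, y, xi in zip(alphas, ys, samples))
```

Only the support vectors contribute, since all other α_i are zero; in practice the multipliers and bias come from the quadratic programming step.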
In this case, the margin is the distance between the two parallel hyperplanes given by s(x) = ±1 in the design space. These hyperplanes are called support hyperplanes and pass through one or several samples, which are called support vectors. The SVM optimization process also does not allow any samples to exist within the margin.

Figure 1. Linear SVM decision function for a two-dimensional problem

The Lagrange multipliers associated with the support vectors are positive, while the other Lagrange multipliers are zero. This means that the explicit SVM uses only the support vectors in its formulation, and thus an SVM constructed with only the support vectors is identical to the one obtained with all samples. Typically, the number of support vectors is much smaller than the number of samples N.

2.2 Nonlinear SVM and Kernel Functions
To construct nonlinear decision functions, kernels are introduced in SVM. In the formulation of the SVM decision function, it is assumed that there always exists a higher-dimensional space where the transformed data can be linearly separated. The transformation from the original design space to the higher-dimensional space is based on the kernel function K in SVM. The kernel K in the SVM equation can have different

forms, such as polynomial, Gaussian, sigmoid, etc. A Gaussian kernel is used in this paper and is given as [15, 18, 19]:

    K(x, x_i) = exp( -||x - x_i||² / (2σ²) )    (2)

where σ is the parameter of the Gaussian kernel. Figure 2 shows an example of a nonlinear SVM with the Gaussian kernel for a two-dimensional problem. Even though the boundary is always linear in the transformed higher-dimensional space, it is nonlinear in the original design space. The SVM and Kernel Methods Matlab toolbox [23] is used for the formulation of SVM.

Figure 2. Nonlinear SVM decision function for a two-dimensional problem

The SVM can deal with high-dimensional problems and can separate two classes of data with the maximal margin. The SVM decision function has an explicit form, and thus predictions based on SVM are faster than those based on implicit surrogate methods such as Kriging. Prediction speed is important for sampling-based reliability analyses, since a very large number of MCS samples are required to evaluate the probability of failure. EDSD, which is an SVM with a sequential sampling strategy, yields good performance for discontinuous limit state functions. However, EDSD is slow to converge and requires many samples for continuous problems, since it does not use function values. This can be improved by inserting virtual samples generated based on the available function values.

3. VIRTUAL SUPPORT VECTOR MACHINE
3.1 Virtual Sample Generation and VSVM
For the construction of an SVM, initial samples, which include both success and failure samples, should be given. Initial samples are generated by Latinized Centroidal Voronoi Tessellation (LCVT), since it shows very good uniformity and randomness [26, 27]. Classification methods such as SVM deal only with the classification of responses, i.e., successes (+1) or failures (-1). The SVM decision function is located in the middle of oppositely signed samples, regardless of the function values of the given samples, as shown in Fig. 3(a). However, in reality, samples with small absolute function values are more likely to be located closer to the limit state function than those with large absolute function values.

Figure 3. (a) SVM decision function and (b) VSVM decision function with virtual samples (red solid line)

The basic idea of VSVM is to increase the probability of locating the decision function close to the limit state function by inserting two oppositely signed virtual samples between two given samples. These virtual samples play two major roles in VSVM. One is to make the predictions more accurate, and the other is to locate new sequential samples near the limit state function, which is presented in Section 3.2. In Fig. 3(b), the VSVM decision function is shifted towards the sample with a small absolute function value by the insertion of two virtual samples. The virtual samples with opposite signs should be near the limit state function and equally distanced from it to obtain the best decision function. In this paper, two types of samples are used. The first type is real samples, which include initial samples and

sequential samples. Sequential samples are inserted when the VSVM model is not accurate enough. These real samples require function evaluations. The second type is virtual samples, which are generated to improve the accuracy of the resulting VSVM. Such virtual samples do not require function evaluations and only have virtual signs.

3.1.1 Informative Sample Set and Valid Distance
Virtual samples are generated from approximations using a pair of samples. However, it is much more desirable to use two samples from opposite classes. If both samples have the same sign, then finding the decision function becomes an extrapolation problem, whose solution is often inaccurate and not located between the two given samples. If two existing samples have opposite signs (+1 and -1), then the decision function must pass between them for a continuous problem. Any pair of samples from different classes can be used in theory, but if the distance between the two given samples is large, or if both samples are far from the limit state function, then the zero point cannot be positioned accurately between them. Thus, one of the two points should be close to the limit state function, and both should be close to each other, to make the approximations more accurate and useful. Therefore, an informative sample set, from which virtual samples are generated, is defined first. Support vectors are located near the limit state function, and thus they are included in the informative set. The original SVM is constructed first, based on the existing samples, to identify the support vectors. It is highly probable that some samples with small absolute values are also located close to the limit state function, even though they may not be support vectors. All samples whose absolute response values are smaller than the maximum absolute response among the support vectors are chosen as members of the informative set.
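The informative-set rule just described, keeping every sample whose absolute response does not exceed the largest absolute response among the support vectors, can be sketched in a few lines of Python (the function name is ours, not the paper's):

```python
import numpy as np

def informative_set(y_vals, sv_idx):
    """Indices of samples whose |response| does not exceed the largest
    |response| among the support vectors; such samples are assumed to lie
    near the limit state function."""
    threshold = np.max(np.abs(np.asarray(y_vals)[sv_idx]))
    return [i for i, y in enumerate(y_vals) if abs(y) <= threshold]
```

For example, with responses [0.1, -0.5, 2.0, -0.2] and support vectors at indices 0 and 3, the threshold is 0.2 and only samples 0 and 3 are informative.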
This can be expressed as

    { x_i : |y(x_i)| ≤ max_{x*} |y(x*)|, i = 1, …, N }    (3)

where x_i is the i-th sample, x* are the support vectors, y is the function value at the given position, N is the number of samples, and {|y(x*)|} is the set of absolute response function values at the support vectors. From the chosen informative samples, the closest oppositely signed samples are paired to generate virtual samples between each pair. However, some pairs can generate important virtual samples even though they are not the closest oppositely signed samples to each other. To address this, a valid distance concept is introduced: pairs can generate virtual samples if the distance between them is shorter than the valid distance. If the valid distance is too large, there is a risk of including many unnecessary virtual samples and producing poor approximations. If it is too short, useful information may be excluded. Figure 4 shows the influence of the valid distance concept in a two-dimensional example. By inserting an additional pair of virtual samples between two existing virtual sample pairs, the accuracy is improved in the area near the new pair. The distances between the informative samples and their closest oppositely signed samples can be obtained; the maximum of these distances is defined as the valid distance in this paper.

Figure 4. VSVM decision functions (a) with the closest samples only (without the valid distance concept) and (b) with the valid distance concept

3.1.2 Approximations for Zero Positions
Two additional steps are needed for the generation of the virtual samples after the informative sample set and the valid distance are defined. First, since the true limit state function is not known in general, a zero position is approximated from two samples of different classes using approximation methods such as linear approximation, Kriging, or MLS. A zero position is the point where the approximation value is zero among all points on the line between two oppositely signed samples.
Linear approximation simply assumes that the function value between the two given samples varies linearly and finds the zero point accordingly. It is very fast and easy to apply but can be inaccurate for highly nonlinear functions. Since new samples are located near the true limit state function by the sequential sampling method, the Kriging or MLS methods, which are accurate near given samples, are appropriate for obtaining better approximations. In this paper, the universal Kriging method is used to approximate the zero

point between two oppositely signed samples, and the SURROGATES toolbox [24] is used for the construction of the universal Kriging model. The optimization problem for finding the zero position between two samples is expressed as

    min_x |Â(x)|
    s.t. x = x_i + t (x_j − x_i), 0 ≤ t ≤ 1    (4)

where x_i and x_j are original samples with opposite signs, x is a point on the straight line connecting x_i and x_j, and Â(x) is the approximated value at x obtained by the universal Kriging method. When a new sequential sample is inserted, the universal Kriging model is reconstructed based on the new sample set. In the Kriging model, the correlation function R(θ, x_i, x_j) should be estimated from the sample data, where x_i and x_j are given samples and θ is the process parameter. The influence of the parameter θ on the performance is significant, and thus its determination is important. To find the optimum θ, different methods such as Hooke & Jeeves (H-J), Levenberg-Marquardt (L-M), genetic algorithm (GA), and pattern search (PS) methods [24, 25] have been applied. Among them, the PS method is the most accurate, but it requires more computational effort than the other methods. However, with VSVM, fewer iterations can achieve a similar level of accuracy because the more accurate Kriging models locate new samples correctly. Therefore, time and resources can be saved by using the PS method. To make the estimation process more efficient, the history of parameter changes was investigated, showing that, in general, the new optimum θ is close to the previous optimum θ obtained with one less sample. If the current SVM model is similar to the previous SVM, then both optimum Kriging parameters are also close to each other. Therefore, the previous optimum Kriging parameter θ is used as the initial value for the PS method. By implementing this efficiency strategy, the elapsed time to find the optimum θ is reduced by 9% per iteration on average. It requires a fair amount of computational time to solve Eq. (4) accurately.
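One inexpensive way to solve this line-constrained problem approximately, mirroring the element-wise evaluation strategy the paper adopts, is to discretize the connecting segment and evaluate the surrogate at all points at once. The sketch below (our names; the linear surrogate in the usage note stands in for the universal Kriging predictor) illustrates this:

```python
import numpy as np

def zero_position(xi, xj, approx, n_elems=100):
    """Approximate the zero crossing on the segment between oppositely signed
    samples xi and xj: discretize the line, evaluate the surrogate 'approx'
    on all points at once, and return the point with minimum |value|."""
    t = np.linspace(0.0, 1.0, n_elems + 1)[:, None]
    pts = xi + t * (xj - xi)       # points on the connecting line
    vals = approx(pts)             # vectorized surrogate evaluation
    return pts[np.argmin(np.abs(vals))]
```

For instance, with the stand-in surrogate `approx = lambda P: P[:, 0] - 0.5` and the segment from (0, 0) to (1, 0), the recovered zero position is (0.5, 0).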
However, if the zero position falls within the virtual margin explained in Section 3.1.3, the resulting decision function is similar to the one obtained with the exact zero position. Also, Kriging approximations take a large amount of time if they are calculated one by one, due to the implicit formulation. Therefore, the line connecting the two oppositely signed samples x_i and x_j is divided into elements, the Kriging approximations are evaluated all at once, and the position with the minimum absolute function value is chosen. The number of elements used in this paper reflects that the virtual margin is 0.02 and the mean distance between existing sample pairs is 0.2 in the normalized variable space. With this approach, the elapsed time for generating virtual samples is reduced to about 2 sec. per iteration for the twelve-dimensional problem.

3.1.3 Generation of Virtual Samples from Zero Positions
Second, two oppositely signed virtual samples are generated near the zero point. One is located in the direction of the success sample and the other in the direction of the failure sample. These are virtual samples: the one shifted towards the success sample is virtually assigned as a success, and the other as a failure. Both virtual samples should lie between the two given oppositely signed samples and on the line that connects them. A new decision function based on the original and virtual samples will then be located between the virtual sample pairs, because the virtual samples in a pair have different signs and are close to each other. If the approximations of the zero points are accurate, then both the virtual samples and the new decision function will be near the limit state function. One important question is how closely a pair of virtual samples should be located. If the distance between a pair of virtual samples is too long, these virtual samples will not be chosen as support vectors and become meaningless. To make the virtual samples useful, the distance should be short enough that they are chosen as support vectors.
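The construction of a virtual pair around an approximated zero position can be sketched as follows; the function name and the default margin value are illustrative, not taken from the paper:

```python
import numpy as np

def virtual_pair(x_success, x_fail, x_zero, margin=0.02):
    """Generate two oppositely signed virtual samples straddling the
    approximated zero position x_zero on the line between the real pair.
    'margin' is the virtual margin (total distance between the pair)."""
    d = x_success - x_fail
    u = d / np.linalg.norm(d)            # unit vector from failure to success
    v_plus = x_zero + 0.5 * margin * u   # virtually assigned success (+1)
    v_minus = x_zero - 0.5 * margin * u  # virtually assigned failure (-1)
    return (v_plus, +1), (v_minus, -1)
```

Placing both virtual samples symmetrically about the zero position keeps the resulting decision function centered on the approximated limit state.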
However, due to the error of the sampling-based probability of failure evaluation [11], the virtual margin, i.e., the distance between a pair of virtual samples, should not be extremely small. Therefore, the size of the virtual margin should be decided based on the target error level. If many virtual samples are clustered within a small region, the additional information from the most closely located virtual samples is negligible and the computational time increases unnecessarily. In each virtual sample selection, both the amount of additional information and the computational cost should be considered. The first pair of virtual samples is generated between the sample with the smallest absolute function value and its closest oppositely signed sample, since they provide the most accurate approximation. After the first pair is chosen, the valid distance is defined based on the SVM with the initial sample set, and virtual sample candidates are generated from oppositely signed pairs within the valid distance. The candidate pair with the longest distance from both real and virtual samples is chosen as the next pair of virtual samples, to prevent clustering within a small region. In addition, the number of virtual samples is limited by a predefined number; otherwise, the process would generate unnecessarily many virtual samples. Once all virtual samples are generated, the new VSVM can be constructed using both the existing samples and the virtual samples.

3.2 Adaptive Strategy with Sampling and Stopping Criteria
3.2.1 Adaptive Sequential Sampling
Surrogate-based approaches construct a model that is accurate over the entire given domain, and thus samples tend to spread out within the domain to satisfy the target accuracy. However, since only an accurate decision function is required for the sampling-based methods, samples near the limit state function are more informative than samples far away from it.
Such efficiency cannot be achieved by a uniform sampling strategy, and thus a sequential sampling method is crucial for better efficiency and accuracy. In this paper, a new sample is selected such that it is located within the margin (|s(x)| < 1), which is narrow since each pair of virtual samples is closely located. In addition, a

new sample should have the maximum distance from the closest existing sample to maximize the additional information it provides. This strategy is similar to the sequential sampling method of Basudhar and Missoum [21], but the computational burden is reduced by using the within-the-margin constraint (|s(x)| < 1) rather than the on-the-decision-function constraint (s(x) = 0), which is more difficult to satisfy. A less strict constraint can be used with VSVM since, thanks to the virtual samples, new samples do not need to lie exactly on the limit state function. In other words, if new samples are located near the limit state function, accurate virtual samples close to the limit state function can be obtained. The optimization problem is defined as

    max_x ||x − x_nearest||
    s.t. |s(x)| < 1    (5)

where x_nearest is the existing sample closest to the new sample x. Since x_nearest changes as the position of the new sample candidate x moves, Eq. (5) is a moving target problem. In Fig. 5, a new sample is inserted into a region that is near the limit state function and has no existing sample nearby. The VSVM decision function is improved drastically near the sequential sample.

Figure 5. Changes of the VSVM decision function in the normalized design space: (a) the VSVM decision function and a sequential sample; (b) the VSVM decision function with the new sample

As explained above, an accurate solution of Eq. (5) is not necessary. Therefore, gradient-based optimization methods such as the trust-region-reflective algorithm [28, 29], active-set algorithm [30, 31], or interior-point algorithm [32, 33] can be used instead of the PS method, since they are faster than PS without sacrificing much accuracy.

3.2.2 Stopping Criteria
Stopping criteria are required to determine when the decision function has converged. Since the true limit state function is not known, the criterion is based on the variations of the approximated decision function. A set of N_stop testing points is generated using the input distributions, because the MCS samples are generated in the same way for the sampling-based reliability analysis.
In this paper, ten thousand testing samples were used for all examples. The fraction of testing points that show different signs from the previous surrogate is calculated as [21]

    Δ_k = ( Σ_{i=1}^{N_stop} I_k(x_i) / N_stop ) × 100 (%)    (6)

where k is the current iteration number and Δ_k is the fraction of testing points for which the sign of the SVM evaluation changes between the (k−1)-th and k-th iterations. I_k(x_i) in Eq. (6) is an indicator function defined as

    I_k(x_i) = 1 if sign(s_{k−1}(x_i)) ≠ sign(s_k(x_i)), 0 otherwise    (7)

where s_{k−1}(x_i) and s_k(x_i) represent the SVM values at x_i at the (k−1)-th and k-th iterations, respectively. The changes in the SVM decision function fluctuate and usually decrease as the number of iterations increases, as shown in Fig. 6.
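The sign-change fraction Δ_k, together with the exponential fit Δ̂_k = A·e^{Bk} the paper uses to stabilize the stopping decision, can be computed with a short sketch (our names; the log-linear regression is one simple way to obtain A and B, assuming all Δ_k > 0):

```python
import numpy as np

def sign_change_fraction(s_prev, s_curr, X_test):
    """Percentage of test points whose predicted sign differs between the
    previous and current decision functions."""
    changed = np.sign(s_prev(X_test)) != np.sign(s_curr(X_test))
    return 100.0 * np.mean(changed)

def fit_exponential(ks, deltas):
    """Fit delta_k ~ A * exp(B * k) by linear regression on log(delta_k)."""
    B, lnA = np.polyfit(ks, np.log(deltas), 1)
    return np.exp(lnA), B
```

With the fitted A and B, the slope of the curve at iteration k is A·B·e^{Bk}, which is the quantity the stopping criteria compare against ε₂.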

Figure 6. Changes of Δ_k over iterations k and the fitted exponential curve

To implement more stable stopping criteria, the fraction of testing points changing signs between successive iterations is fitted by an exponential curve as [21]

    Δ̂_k = A e^{Bk}    (8)

where Δ̂_k represents the fitted value of Δ_k, and A and B are the parameters of the exponential curve. The value of Δ_k and the slope of the fitted curve are calculated whenever a new sample is added. If Δ_k is large while Δ̂_k is small, a big change occurred in the model at the k-th iteration that the fitted curve did not capture. If Δ_k is small while Δ̂_k is large, the new sample was inserted into a region where the zero-position approximations are already accurate, so there is little change between the two most recent models, but the process may not have converged yet. Therefore, both Δ_k and Δ̂_k should be kept small for more robust results, and the slope of the fitted curve should also be kept close to zero for stable results. To stop the updating process, the maximum of Δ_k and Δ̂_k should be less than a small positive number ε₁; simultaneously, the absolute value of the slope of the fitted curve at convergence should be lower than ε₂. Thus, the stopping criteria can be defined as

    max(Δ_k, Δ̂_k) < ε₁ and |A B e^{Bk}| < ε₂    (9)

ε₁ and ε₂ are determined so that the target classification error level can be achieved. The target classification error is 2% in this paper; for a more accurate limit state function, smaller values can be applied. Generally, ε₂ should be smaller than ε₁ for more stable convergence. The overall procedure of VSVM with the sequential sampling strategy is shown in Fig. 7.

Figure 7. Flowchart of VSVM with a sequential sampling strategy

4. COMPARISON STUDY BETWEEN VSVM AND OTHER SURROGATES
4.1 Comparison Procedure
The two most recent surrogate modeling methods with sequential sampling schemes were selected for comparison with the proposed VSVM. One is the explicit design space decomposition (EDSD) method with an improved adaptive sampling scheme that uses SVM [21, 22].
The improved adaptive sampling method has two ways to choose a new sample: (1) select the sample that has the largest distance to the closest existing samples while maintaining s(x) = 0, and (2) choose the support vector x* that is farthest from the existing samples of the opposite class, and select the sample that is farthest from x* while maintaining the sign opposite to y* and lying on the hypersphere of radius R centered at x*. Here, y* is the function value at x*, and R is chosen as half the distance from x* to the closest oppositely signed sample. For a fair comparison between EDSD and VSVM, the same SVM parameters are used; therefore, the only differences between them are the sequential sampling strategy and the use of virtual samples. The other surrogate modeling method is dynamic Kriging (DKG) with a sequential sampling method [12]. Zhao

et al. showed that DKG is one of the most accurate response surface methods when the same number of samples is used. DKG was compared with the polynomial response surface, radial basis function, support vector regression, and universal Kriging methods. Therefore, dynamic Kriging is chosen to compare the accuracy of VSVM against one of the best response surface methods. The basic form of the dynamic Kriging prediction is expressed as

    ŷ(x) = f(x)ᵀλ + r(x)ᵀ R⁻¹ (Y − Fλ)    (10)

where R is the symmetric correlation matrix, r is the correlation vector between the prediction location x and all N samples x_i, i = 1, …, N, Y is the response vector, F is the design matrix of basis functions, and λ is the regression coefficient vector. In the dynamic Kriging method, F is not fixed; the best basis set is chosen by the genetic algorithm (GA). The sequential sampling method chooses a new sample where the prediction variance is largest. Three test examples are used to show the performance of the adaptive sampling-based VSVM. One example is a low-dimensional problem, and the other two are high-dimensional problems. SVM can be applied to both global and local windows. However, a global window usually requires unnecessarily many samples to achieve the target accuracy in reliability analyses. Therefore, SVM is applied to local windows of the original input domain, and the original functions are shifted appropriately so that the local windows contain samples of both signs and include the true limit state functions. In Sections 4.2, 4.3, and 4.4, the local windows are defined as hyper-cubes based on the respective lower and upper bounds. For the Gaussian kernel in Eq. (2), the parameter σ should be provided. The choice of an optimal σ is an ongoing research subject. In this paper, fixed σ values, which are small enough to maintain zero training error, are used. The training error is defined as the classification error with respect to the existing samples, not the testing samples. Since SVM is a classification method and constructs only the decision function, the mean squared error (MSE) and R², which are widely used in surrogate-based methods, cannot be used for comparison.
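The universal Kriging predictor of Eq. (10) has a convenient property: it interpolates the responses at the sample points for any regression coefficients λ, because r(x_i)ᵀR⁻¹ picks out the i-th row. A minimal sketch (our names; a Gaussian correlation and a linear basis f(x) = [1, x] are assumed for illustration, not taken from the paper):

```python
import numpy as np

def kriging_predict(x, X, Y, F, lam, theta):
    """Universal Kriging prediction:
    y_hat(x) = f(x)^T lam + r(x)^T R^{-1} (Y - F lam),
    with Gaussian correlation R_ij = exp(-theta * ||x_i - x_j||^2)."""
    corr = lambda a, b: np.exp(-theta * np.sum((a - b) ** 2))
    R = np.array([[corr(a, b) for b in X] for a in X])   # correlation matrix
    r = np.array([corr(x, a) for a in X])                # correlation vector
    f = np.concatenate(([1.0], x))                       # assumed linear basis
    return f @ lam + r @ np.linalg.solve(R, Y - F @ lam)
```

Evaluating at an existing sample returns the stored response exactly, which is why Kriging is well suited to locating zero positions near known samples.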
Therefore, the accuracy of the decision function should be judged by its closeness to the true limit state function. In real situations, the limit state function is often unavailable, and so is this error measure; however, it can be obtained for academic analytical test functions. One million testing points (N_test) are generated based on the input distributions, because the MCS samples are generated in the same way for the sampling-based reliability analysis. These testing points are used to calculate the classification error, which is the fraction of misclassified testing points over the total number of testing points. A test point for which the sign of the SVM does not match the sign provided by the true limit state function is considered misclassified [21]. Therefore, the classification error ε_c is

    ε_c = ( Σ_{i=1}^{N_test} I(x_test,i) / N_test ) × 100 (%)    (11)

where x_test represents a test sample. I(x_test) in Eq. (11) is an indicator function defined as

    I(x_test) = 1 if sign(s(x_test)) ≠ y_test, 0 otherwise    (12)

where y_test represents the corresponding classification value (±1) at x_test, and s(x_test) is the SVM approximation at x_test. Our purpose is to evaluate the probability of failure accurately. The relationship between the probability of failure measurement error and the classification error is approximately proportional; therefore, an accurate probability of failure can be obtained by keeping the classification error small. The classification error also represents the accuracy of the obtained limit state function, so it is used as the error measure for comparison in this paper.

4.2 2-D Example
The analytic function is a 4th-order polynomial, expressed as

    f(x) = (0.9063x₁ + 0.4226x₂ − 6)² + (0.9063x₁ + 0.4226x₂ − 6)³
           − 0.6(0.9063x₁ + 0.4226x₂ − 6)⁴ − (−0.4226x₁ + 0.9063x₂),
           4.5 ≤ x₁ ≤ 6.5, 5.5 ≤ x₂ ≤ …    (13)

The same number of initial samples is used for all 20 tests, and each test starts with a different initial profile. Parameters σ, ε₁, and ε₂ are 3, 0.8, and 0.3, respectively, for both EDSD and VSVM.
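The classification error ε_c defined above can be estimated with a short Monte Carlo sketch; the names are ours, and the toy decision functions below merely illustrate the sign-mismatch counting:

```python
import numpy as np

def classification_error(s_approx, g_true, X_test):
    """Percentage of MCS test points where the sign of the approximate
    decision function disagrees with the sign of the true limit state."""
    mismatch = np.sign(s_approx(X_test)) != np.sign(g_true(X_test))
    return 100.0 * np.mean(mismatch)
```

For example, comparing an approximate boundary at x₁ = 0.45 against a true boundary at x₁ = 0.5 counts only the test points falling between the two boundaries as misclassified.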
To compare the performances with respect to the same number of additional samples, VSVM is performed first, and DKG and EDSD are then performed using the same number of samples as VSVM. Each process is forced to stop when it reaches that number of samples. Each method has its own sequential sampling strategy, and thus all final sample profiles differ except for the initial samples. According to Table 1, which provides values averaged over 20 test cases, EDSD is the fastest, but its classification error is not accurate at all. This clearly shows that EDSD converges slowly due to its inability to use exact response function values. VSVM uses about the same amount of time as DKG and yields a better classification error.

Table 1. Average classification error (%) and elapsed time (sec) for DKG, EDSD, and VSVM over 20 tests

4.3 9-D Example
The nine-dimensional extended Rosenbrock function is used for the test, expressed as

    f(x) = Σ_{i=1}^{8} [ (1 − x_i)² + 100 (x_{i+1} − x_i²)² ] − 168,
           −3 ≤ x_i ≤ 2, i = 1, …, 9.    (14)

The initial sample size is 20, and 20 different initial sample profiles are used. For both EDSD and VSVM, σ, ε1, and ε2 are 5, 0.5, and 0.3, respectively. The same number of additional samples is used in the same way as in the previous two-dimensional problem. In Table 2, which provides values averaged over 20 test cases, EDSD is still the fastest, but its classification error is not accurate. VSVM uses about half the time of DKG and yields a better classification error. Therefore, VSVM is efficient and accurate for the nine-dimensional problem.

Table 2. Average classification error and elapsed time over 20 tests: classification error (%) and elapsed time (sec) for DKG, EDSD, and VSVM.

12-D Example

For a twelve-dimensional example, the Dixon-Price function is used, which is expressed as

  f(x) = (x_1 − 1)^2 + Σ_{i=2}^{12} i(2x_i^2 − x_{i−1})^2 − 36,  −3 ≤ x_i ≤ 4, i = 1, …, 12.

The initial sample size is 35 for the 20 tests. Parameters σ, ε1, and ε2 are 5, 0.25, and 0.5, respectively. The same number of additional samples is used for all three methods. In Table 3, which provides values averaged over 20 test cases, EDSD is the fastest, but its classification error is not accurate. VSVM uses less time than DKG and yields a better classification error.

Table 3. Average classification error and elapsed time over 20 tests: classification error (%) and elapsed time (sec) for DKG, EDSD, and VSVM.

As another way of comparison, EDSD is performed using the same stopping criteria as VSVM, so that EDSD can use more samples to construct the decision function. According to Table 4, the average number of additional samples for EDSD is 77.9, far more than the 33.3 of VSVM. EDSD also uses slightly less time than VSVM, but its classification error is still quite large. Clearly, VSVM is more accurate and efficient than EDSD.

Table 4. Average number of additional samples, classification error (%), and elapsed time (sec) with the same stopping criteria over 20 tests (EDSD and VSVM).

Since DKG and VSVM use different stopping criteria, a smaller stopping criterion is used for DKG to achieve a classification error similar to that of VSVM.
In Table 5, DKG can achieve a classification error level similar to that of VSVM only after it uses about 6 more samples. Furthermore, the elapsed time of DKG is larger than that of VSVM.

Table 5. Average number of additional samples, classification error (%), and elapsed time (sec) of DKG and VSVM when a similar classification error is achieved (20 tests).

VSVM is more efficient than DKG in terms of elapsed modeling time while maintaining a better accuracy level, especially in high-dimensional space. EDSD converges very slowly and is inefficient in terms of the number of additional samples, which is all the more problematic when the computer simulations at each sample point are very expensive. In the future, the efficiency strategies can be refined further to make VSVM faster while maintaining its accuracy. The adaptive VSVM will also be applied to sampling-based reliability-based design optimization (RBDO).

5. CONCLUSION

A sequential sampling-based virtual support vector machine method is proposed to efficiently construct an accurate decision function for reliability analysis, especially in high-dimensional space. Virtual samples are generated from real samples and their response function values to improve the accuracy of the SVM, and the sequential sampling method increases the efficiency of the algorithm by inserting new samples near the true limit state function. The proposed method is compared with other surrogate modeling methods, EDSD and DKG, each with its own sequential sampling strategy. DKG can construct accurate surrogates with a relatively small number of samples, but it is inefficient because the dynamic basis selection process requires significant computational effort [12]. For a low-dimensional problem, both VSVM and DKG are accurate and require similar modeling time. However, VSVM becomes more efficient than DKG and EDSD while maintaining the required accuracy for high-dimensional problems.
Therefore, both VSVM and DKG are recommended for low-dimensional problems, and the adaptive VSVM is recommended for high-dimensional problems. EDSD requires a large number of samples in all cases, since it does not use function values.

ACKNOWLEDGEMENT

Research is jointly supported by the ARO Project W911NF and the Automotive Research Center, which is sponsored by the U.S. Army TARDEC. These supports are greatly appreciated.

7. REFERENCES
[1] Haldar, A., and Mahadevan, S., "Probability, Reliability and Statistical Methods in Engineering Design," John Wiley & Sons, New York, 2000.
[2] Tu, J., Choi, K.K., and Park, Y.H., "A New Study on Reliability-Based Design Optimization," Journal of Mechanical Design, Vol.121, No.4, pp.557-564, 1999.
[3] Youn, B.D., Choi, K.K., and Du, L., "Enriched Performance Measure Approach for Reliability-Based Design Optimization," AIAA Journal, Vol.43, No.4, 2005.
[4] Rubinstein, R.Y., "Simulation and the Monte Carlo Method," Wiley, New York, 1981.
[5] Cressie, N.A.C., "Statistics for Spatial Data," John Wiley & Sons, New York, 1991.
[6] Barton, R.R., "Metamodeling: a State of the Art Review," WSC '94: Proceedings of the 26th Conference on Winter Simulation, Society for Computer Simulation International, San Diego, CA, USA, 1994.
[7] Jin, R., Chen, W., and Simpson, T., "Comparative Studies of Metamodelling Techniques Under Multiple Modelling Criteria," Structural and Multidisciplinary Optimization, Vol.23, No.1, pp.1-13, 2001.
[8] Simpson, T., Poplinski, J., and Koch, P., "Metamodels for Computer-Based Engineering Design: Survey and Recommendations," Engineering with Computers, Vol.17, No.2, pp.129-150, 2001.
[9] Wang, G.G., and Shan, S., "Review of Metamodeling Techniques in Support of Engineering Design Optimization," Journal of Mechanical Design, Vol.129, No.4, 2007.
[10] Forrester, A., Sobester, A., and Keane, A., "Engineering Design via Surrogate Modelling: A Practical Guide," John Wiley & Sons, United Kingdom, 2008.
[11] Forrester, A., and Keane, A., "Recent Advances in Surrogate-Based Optimization," Progress in Aerospace Sciences, Vol.45, No.1-3, pp.50-79, 2009.
[12] Zhao, L., Choi, K.K., and Lee, I., "A Metamodel Method Using Dynamic Kriging and Sequential Sampling," The 13th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, Fort Worth, TX, Sept. 13-15, 2010.
[13] Hurtado, J.E., and Alvarez, D.A., "Classification Approach for Reliability Analysis with Stochastic Finite-Element Modeling," Journal of Structural Engineering, Vol.129, No.8, pp.1141-1149, 2003.
[14] Vapnik, V.N., "Statistical Learning Theory," Wiley, New York, 1998.
[15] Cherkassky, V., and Mulier, F., "Learning from Data: Concepts, Theory, and Methods," John Wiley & Sons, New York, 1998.
[16] Burges, C.J.C., "A Tutorial on Support Vector Machines for Pattern Recognition," Data Mining and Knowledge Discovery, Vol.2, No.2, pp.121-167, 1998.
[17] Schölkopf, B., "Advances in Kernel Methods: Support Vector Learning," MIT Press, Cambridge, Mass., 1999.
[18] Vapnik, V.N., "The Nature of Statistical Learning Theory," Springer, New York, 2000.
[19] Kecman, V., "Learning and Soft Computing: Support Vector Machines, Neural Networks, and Fuzzy Logic Models," MIT Press, Cambridge, Mass., 2001.
[20] Schölkopf, B., and Smola, A.J., "Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond," MIT Press, Cambridge, Mass., 2002.
[21] Basudhar, A., and Missoum, S., "Adaptive Explicit Decision Functions for Probabilistic Design and Optimization using Support Vector Machines," Computers & Structures, Vol.86, No.19-20, pp.1904-1917, 2008.
[22] Basudhar, A., and Missoum, S., "An Improved Adaptive Sampling Scheme for the Construction of Explicit Boundaries," Structural and Multidisciplinary Optimization, Vol.42, No.4, pp.517-529, 2010.
[23] Canu, S., Grandvalet, Y., and Guigue, V., "SVM and Kernel Methods Matlab Toolbox."
[24] Viana, F.A.C., "SURROGATES Toolbox User's Guide," 2010.
[25] Martin, J.D., "Computational Improvements to Estimating Kriging Metamodel Parameters," Journal of Mechanical Design, Vol.131, No.8, 2009.
[26] Lewis, R.M., and Torczon, V., "Pattern Search Algorithms for Bound Constrained Minimization," SIAM Journal on Optimization, Vol.9, No.4, pp.1082-1099, 1999.
[27] Saka, Y., Gunzburger, M., and Burkardt, J., "Latinized, Improved LHS, and CVT Point Sets in Hypercubes," International Journal of Numerical Analysis and Modeling, Vol.4, No.3-4, 2007.
[28] Coleman, T.F., and Li, Y., "An Interior, Trust Region Approach for Nonlinear Minimization Subject to Bounds," SIAM Journal on Optimization, Vol.6, pp.418-445, 1996.
[29] Coleman, T.F., and Li, Y., "On the Convergence of Reflective Newton Methods for Large-Scale Nonlinear Minimization Subject to Bounds," Mathematical Programming, Vol.67, No.2, pp.189-224, 1994.
[30] Powell, M.J.D., "A Fast Algorithm for Nonlinearly Constrained Optimization Calculations," Numerical Analysis, ed. G.A. Watson, Lecture Notes in Mathematics, Springer Verlag, Vol.630, 1978.
[31] Powell, M.J.D., "The Convergence of Variable Metric Methods for Nonlinearly Constrained Optimization Calculations," Nonlinear Programming 3 (Mangasarian, O.L., Meyer, R.R., and Robinson, S.M., eds.), Academic Press, 1978.
[32] Byrd, R.H., Gilbert, J.C., and Nocedal, J., "A Trust Region Method Based on Interior Point Techniques for Nonlinear Programming," Mathematical Programming, Vol.89, No.1, pp.149-185, 2000.
[33] Waltz, R.A., Morales, J.L., Nocedal, J., and Orban, D., "An Interior Algorithm for Nonlinear Optimization That Combines Line Search and Trust Region Steps," Mathematical Programming, Vol.107, No.3, pp.391-408, 2006.


More information

Using Neural Networks and Support Vector Machines in Data Mining

Using Neural Networks and Support Vector Machines in Data Mining Usng eural etworks and Support Vector Machnes n Data Mnng RICHARD A. WASIOWSKI Computer Scence Department Calforna State Unversty Domnguez Hlls Carson, CA 90747 USA Abstract: - Multvarate data analyss

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

Learning-Based Top-N Selection Query Evaluation over Relational Databases

Learning-Based Top-N Selection Query Evaluation over Relational Databases Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **

More information

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques

More information

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique

The Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique //00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy

More information

Repeater Insertion for Two-Terminal Nets in Three-Dimensional Integrated Circuits

Repeater Insertion for Two-Terminal Nets in Three-Dimensional Integrated Circuits Repeater Inserton for Two-Termnal Nets n Three-Dmensonal Integrated Crcuts Hu Xu, Vasls F. Pavlds, and Govann De Mchel LSI - EPFL, CH-5, Swtzerland, {hu.xu,vasleos.pavlds,govann.demchel}@epfl.ch Abstract.

More information

Machine Learning. Topic 6: Clustering

Machine Learning. Topic 6: Clustering Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess

More information

Projection-Based Performance Modeling for Inter/Intra-Die Variations

Projection-Based Performance Modeling for Inter/Intra-Die Variations Proecton-Based Performance Modelng for Inter/Intra-De Varatons Xn L, Jayong Le 2, Lawrence. Plegg and Andrze Strowas Dept. of Electrcal & Computer Engneerng Carnege Mellon Unversty Pttsburgh, PA 523, USA

More information

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel Eolutonary Support Vector Regresson based on Mult-Scale Radal Bass Functon Kernel Tanasanee Phenthrakul and Boonserm Kjsrkul Abstract Kernel functons are used n support ector regresson (SVR) to compute

More information

Lecture 5: Multilayer Perceptrons

Lecture 5: Multilayer Perceptrons Lecture 5: Multlayer Perceptrons Roger Grosse 1 Introducton So far, we ve only talked about lnear models: lnear regresson and lnear bnary classfers. We noted that there are functons that can t be represented

More information

Data Mining: Model Evaluation

Data Mining: Model Evaluation Data Mnng: Model Evaluaton Aprl 16, 2013 1 Issues: Evaluatng Classfcaton Methods Accurac classfer accurac: predctng class label predctor accurac: guessng value of predcted attrbutes Speed tme to construct

More information

A New Token Allocation Algorithm for TCP Traffic in Diffserv Network

A New Token Allocation Algorithm for TCP Traffic in Diffserv Network A New Token Allocaton Algorthm for TCP Traffc n Dffserv Network A New Token Allocaton Algorthm for TCP Traffc n Dffserv Network S. Sudha and N. Ammasagounden Natonal Insttute of Technology, Truchrappall,

More information

Structural Optimization Using OPTIMIZER Program

Structural Optimization Using OPTIMIZER Program SprngerLnk - Book Chapter http://www.sprngerlnk.com/content/m28478j4372qh274/?prnt=true ق.ظ 1 of 2 2009/03/12 11:30 Book Chapter large verson Structural Optmzaton Usng OPTIMIZER Program Book III European

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers 62626262621 Journal of Uncertan Systems Vol.5, No.1, pp.62-71, 211 Onlne at: www.us.org.u A Smple and Effcent Goal Programmng Model for Computng of Fuzzy Lnear Regresson Parameters wth Consderng Outlers

More information

APPLICATION OF A COMPUTATIONALLY EFFICIENT GEOSTATISTICAL APPROACH TO CHARACTERIZING VARIABLY SPACED WATER-TABLE DATA

APPLICATION OF A COMPUTATIONALLY EFFICIENT GEOSTATISTICAL APPROACH TO CHARACTERIZING VARIABLY SPACED WATER-TABLE DATA RFr"W/FZD JAN 2 4 1995 OST control # 1385 John J Q U ~ M Argonne Natonal Laboratory Argonne, L 60439 Tel: 708-252-5357, Fax: 708-252-3 611 APPLCATON OF A COMPUTATONALLY EFFCENT GEOSTATSTCAL APPROACH TO

More information

Discriminative classifiers for object classification. Last time

Discriminative classifiers for object classification. Last time Dscrmnatve classfers for object classfcaton Thursday, Nov 12 Krsten Grauman UT Austn Last tme Supervsed classfcaton Loss and rsk, kbayes rule Skn color detecton example Sldng ndo detecton Classfers, boostng

More information

Hierarchical clustering for gene expression data analysis

Hierarchical clustering for gene expression data analysis Herarchcal clusterng for gene expresson data analyss Gorgo Valentn e-mal: valentn@ds.unm.t Clusterng of Mcroarray Data. Clusterng of gene expresson profles (rows) => dscovery of co-regulated and functonally

More information

Design for Reliability: Case Studies in Manufacturing Process Synthesis

Design for Reliability: Case Studies in Manufacturing Process Synthesis Desgn for Relablty: Case Studes n Manufacturng Process Synthess Y. Lawrence Yao*, and Chao Lu Department of Mechancal Engneerng, Columba Unversty, Mudd Bldg., MC 473, New York, NY 7, USA * Correspondng

More information

Three supervised learning methods on pen digits character recognition dataset

Three supervised learning methods on pen digits character recognition dataset Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru

More information

Machine Learning: Algorithms and Applications

Machine Learning: Algorithms and Applications 14/05/1 Machne Learnng: Algorthms and Applcatons Florano Zn Free Unversty of Bozen-Bolzano Faculty of Computer Scence Academc Year 011-01 Lecture 10: 14 May 01 Unsupervsed Learnng cont Sldes courtesy of

More information

Intra-Parametric Analysis of a Fuzzy MOLP

Intra-Parametric Analysis of a Fuzzy MOLP Intra-Parametrc Analyss of a Fuzzy MOLP a MIAO-LING WANG a Department of Industral Engneerng and Management a Mnghsn Insttute of Technology and Hsnchu Tawan, ROC b HSIAO-FAN WANG b Insttute of Industral

More information

Lobachevsky State University of Nizhni Novgorod. Polyhedron. Quick Start Guide

Lobachevsky State University of Nizhni Novgorod. Polyhedron. Quick Start Guide Lobachevsky State Unversty of Nzhn Novgorod Polyhedron Quck Start Gude Nzhn Novgorod 2016 Contents Specfcaton of Polyhedron software... 3 Theoretcal background... 4 1. Interface of Polyhedron... 6 1.1.

More information

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS

EVALUATION OF THE PERFORMANCES OF ARTIFICIAL BEE COLONY AND INVASIVE WEED OPTIMIZATION ALGORITHMS ON THE MODIFIED BENCHMARK FUNCTIONS Academc Research Internatonal ISS-L: 3-9553, ISS: 3-9944 Vol., o. 3, May 0 EVALUATIO OF THE PERFORMACES OF ARTIFICIAL BEE COLOY AD IVASIVE WEED OPTIMIZATIO ALGORITHMS O THE MODIFIED BECHMARK FUCTIOS Dlay

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

Radial Basis Functions

Radial Basis Functions Radal Bass Functons Mesh Reconstructon Input: pont cloud Output: water-tght manfold mesh Explct Connectvty estmaton Implct Sgned dstance functon estmaton Image from: Reconstructon and Representaton of

More information

Simulation Based Analysis of FAST TCP using OMNET++

Simulation Based Analysis of FAST TCP using OMNET++ Smulaton Based Analyss of FAST TCP usng OMNET++ Umar ul Hassan 04030038@lums.edu.pk Md Term Report CS678 Topcs n Internet Research Sprng, 2006 Introducton Internet traffc s doublng roughly every 3 months

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

The Comparison of Calibration Method of Binocular Stereo Vision System Ke Zhang a *, Zhao Gao b

The Comparison of Calibration Method of Binocular Stereo Vision System Ke Zhang a *, Zhao Gao b 3rd Internatonal Conference on Materal, Mechancal and Manufacturng Engneerng (IC3ME 2015) The Comparson of Calbraton Method of Bnocular Stereo Vson System Ke Zhang a *, Zhao Gao b College of Engneerng,

More information