A Naïve Support Vector Regression Benchmark for the NN3 Forecasting Competition


Sven F. Crone and Swantje Pietsch

Abstract — Support Vector Regression is one of the promising contenders in predicting the 111 time series of the NN3 Neural Forecasting Competition. As they offer substantial degrees of freedom in the modeling process, in selecting the kernel function and its parameters, cost and epsilon parameters, issues of model parameterization and model selection arise. In the absence of an established methodology or comprehensive empirical evidence on their modeling, a number of heuristics and ad-hoc rules have emerged that result in selecting different models, which show different performance. In order to determine a lower bound for Support Vector Regression accuracy in the NN3 competition, this paper seeks to compute benchmark results using a naïve methodology with a fixed parameter grid search and exponentially increasing step sizes for radial basis function kernels, estimating 43,725 candidate models for each of the 111 time series. The naïve approach attempts to mimic many of the common mistakes in model building, providing its error as a lower bound to support vector regression accuracy. The in-sample results and parameters are evaluated to estimate the impact of potential shortcomings in the grid search heuristic and the interaction effects of the parameters.

I. INTRODUCTION

Time series forecasting with computational intelligence has received increasing attention in theory and practice. However, in order to prove their efficacy in forecasting, the accuracy of these methods must be evaluated against established statistical forecasting methods on empirical datasets [1, 2]. The 2007 NN3 Forecasting Competition for computational intelligence methods provides this opportunity on a dataset of 111 empirical time series and a reduced subset of 11 time series, in order to establish the forecasting accuracy of computational intelligence methods on business data.
Recently, the method of Support Vector Regression (SVR) has shown promising performance in various scientific forecasting domains [3-7], offering a nonparametric, data-driven and self-adaptive method that learns linear or nonlinear functional relationships directly from training examples [2, 4]. However, recent experiments have demonstrated that, despite the promise of automatically estimating optimal predictors using statistical learning theory, SVR offers substantial degrees of freedom in forecasting, requiring a data-dependent selection of the kernel function and its parameters from a set of potential functions, and two metrically scaled parameters to control the cost and the epsilon-insensitive margin. Hence, support vector regression shares common and well-known problems with competing forecasting methods such as artificial neural networks, offering near endless degrees of freedom in the choice of architecture parameters to be selected based upon their performance on short and noisy time series. Due to the lack of an established methodology, a number of modeling heuristics and ad-hoc rules of thumb have emerged to guide architecture decisions in SVR modeling. As different heuristics may lead to distinct models and varying performance, they require a systematic evaluation. To further investigate the capability of SVR in time series forecasting, SVR is applied to forecast the 111 time series of the NN3 competition. In order to determine a lower bound for SVR predictions in the competition, this study seeks to compute benchmark results using a naïve methodology of a fixed parameter grid search with exponentially increasing step sizes.

Sven F. Crone is with the Department of Management Science, Lancaster University Management School, Lancaster LA1 4YX, United Kingdom (e-mail: sven.f.crone@crone.de). Swantje Pietsch is with the Institute of Information Systems, University of Hamburg, Hamburg, Germany (e-mail: mailing@swantjepietsch.de).
The heuristic is limited to a radial basis function kernel, using a simple preprocessing of the input data, and following a simple pick-the-best approach in model selection of the candidate with the lowest cross-validation mean squared error (MSE). Considering the naïve heuristics employed, the methodology actively neglects relevant modeling guidelines in SVR modeling, such as data-dependent kernel selection, candidate model selection using k-fold cross-validation for performance prediction, adequate scaling and preprocessing of data, etc. Hence we develop a naïve benchmark utilizing available computational power as a lower bound to SVR performance. The naïve grid search estimates 43,725 candidate models for each of the 111 time series, computing a total of 4,853,475 models for the NN3 datasets. In addition to providing benchmark results, we analyse the interaction of different parameter choices and interpret the results.

The paper is organised as follows. First we provide a brief introduction to ε-SVR and the relevant model parameters in forecasting, followed by a short discussion of alternative methodologies for modelling SVR. Section 3 outlines the experimental setup of data pre-processing, SVR parameter selection and SVR candidate selection, followed by the parameter results and an analysis of their interactions.

II. SUPPORT VECTOR REGRESSION

A. Method Background

The method of Support Vector Regression (SVR) is based on statistical learning theory by Vapnik [8] and estimates a function f(x) that minimizes the forecasting error on a training data set ((x_1, y_1), ..., (x_l, y_l)) ⊂ (X × Y) while keeping the functional form as flat as possible [4, 9, 10]. SVR is formulated as a convex optimization problem with slack variables ξ_i, ξ_i* that allow for model errors [11, 12]:

  minimize    (1/2) ||w||^2 + C Σ_{i=1..l} (ξ_i + ξ_i*)
  subject to  y_i − (w · x_i) − b ≤ ε + ξ_i
              (w · x_i) + b − y_i ≤ ε + ξ_i*
              ξ_i, ξ_i* ≥ 0                                  (1)

which controls the trade-off between overfitting and model complexity through the regularization parameter C > 0 [8]. We employ ε-SVR, using an ε-insensitive loss function that assigns an error only to those observations with ξ_i, ξ_i* ≥ 0 lying outside the ε-insensitive tube [13, 14], named support vectors [15], using the loss function

  |ξ|_ε := 0         if |ξ| ≤ ε
           |ξ| − ε   otherwise                               (2)

To handle nonlinear functional relationships in forecasting problems, the data are mapped by a kernel function φ into a high-dimensional feature space F, where a linear regression may be solved, which corresponds to a nonlinear regression in the lower-dimensional input space [5]. In this study the radial basis function (RBF) kernel is applied, as it is commonly used in ε-SVR and requires just one parameter γ that determines the RBF width a priori [3, 4, 16]. For the RBF, the number of centres, the location of the centres, the weights and the thresholds are all determined during training [17]. Various basic tutorials exist that provide an introduction to the background and the mathematical properties of support vector machines [17] and SVR in particular [4].

In forecasting with SVR, the input vector contains the lag structure of the time series, which results in dot products after combining it with the support vectors in the kernel function. The quadratic optimization problem is solved to determine the Lagrangian multipliers α_i, α_i* that define the weights υ_i = α_i − α_i* [4]. The dot products are then weighted by υ_i to calculate the one-step-ahead prediction value together with the threshold b [4]. The forecasting process can be visualised as in figure 2.

B. Methodologies for specifying SVR parameters

SVR is a data-driven and self-adaptive method, which is capable of approximating linear and nonlinear functional relationships from data, unlike traditional model-based statistical methods [2, 4].
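The ε-insensitive loss of equation (2) can be sketched in a few lines; the function name is illustrative, not part of the original method description:

```python
import numpy as np

def eps_insensitive_loss(residual, eps):
    """epsilon-insensitive loss of equation (2):
    zero inside the eps-tube, linear in |residual| - eps outside it."""
    return np.maximum(np.abs(residual) - eps, 0.0)
```

Only observations with a nonzero loss contribute slack (and become support vectors), which is why larger ε values yield sparser, flatter models.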
For applying ε-SVR in forecasting, a number of parameters must be determined a priori: the cost C, the width of the epsilon-insensitive loss function ε, the kernel function and its kernel parameter γ [18], and the number and lags of independent variables that specify the input vector as a rolling window of fixed size over a time series. Recent experiments have shown that the forecasting performance of ε-SVR is significantly impacted by an adequate a priori selection of these parameters [19]; they are therefore considered semi-parametric rather than non-parametric methods, similar to neural networks. Hence they require an expert to determine a priori the model parameters that impact the forecasting capabilities.

Fig. 2. Time series prediction with SVR

In order to determine the ε-SVR parameters, various modelling heuristics exist. Gao, Gunn, Harris and Brown determine SVR parameters using a Bayesian framework for Gaussian SVR [20]. Chu, Keerthi and Ong follow another Bayesian approach that combines the merits of SVR with the advantages of Bayesian methods for model adaptation [21]. Chang and Lin derived leave-one-out bounds for SVR parameters [22]. Heuristics can also lead to parameter combinations with inferior performance compared to even a simple parameter grid search [23], as in comparing a grid search with a Bayesian approach in Lin and Weng [24]. Momma and Bennett perform model selection by pattern search to reduce the number of parameter combinations that need to be tested [18, 25]. Kwok and Tsang [26] as well as Smola, Murata, Schölkopf and Müller [27] determine the parameter ε as a linear dependency on the noise of the training data, which requires a priori knowledge of the noise level [22]. In contrast, Cherkassky and Ma analyse the parameter interaction to limit the number of relevant parameters. They suggest that for a given ε, the value of C has only negligible effects on the generalization performance as long as C is larger than a certain threshold that can be determined from the training data [18].
A simple method to determine suitable ε-SVR parameters for each time series follows a systematic grid search over the parameter space [23, 28]. Instead of evaluating every possible parameter combination, which would be intractable for parameters of interval scale, a grid using equidistant steps in the parameter space limits the computational effort. However, different grids are applicable, using linear step sizes, exponentially increasing sequences as in Hsu [23] or Luxburg [29], or logarithmic sequences as in Chang and Lin [22]. In addition, stepwise refinements of the grid size in parameter space are feasible, leading to an analytically simple yet computationally expensive parameter selection approach. In this study we seek to explore the simple grid-based approach, using a brute-force, exhaustive enumeration of a representative parameter space.

III. EXPERIMENTAL DESIGN OF NAÏVE SVR

A. Specifying SVR input vectors

The forecasting accuracy of any method depends largely on providing adequate input information to learn from. In time series forecasting this takes the form of specifying significant time lags of the dependent variable y_{t−n} and excluding irrelevant ones, hence determining the length of the input vector. Multiple methods exist for specifying input vectors, based on simple heuristic rules, statistical autocorrelation analysis to determine the order of autoregressive (AR), integrated (I) and moving average (MA) processes or mixed ARIMA processes of lagged realisations of the dependent variable [2], or spectral analysis to detect multiple overlying seasonal patterns. To pursue a naïve modelling approach we select a simple heuristic decision rule based on the observation interval of the time series, using a constant lag structure of the past 12 monthly observations in a year to account for possible seasonality of the months or quarters. The same lag structure was used for all 111 time series, despite the possibility of different lag structures for different time series, the necessity to include 13 lags for seasonal integrated autoregressive processes SARIMA(p,d,0)(P,D,0)_s, or the approximation of MA processes of SARIMA(0,0,q)(0,0,Q)_s by extending the input vector to multiples of the yearly seasonality.

B. Data Pre-Processing

The dataset contains 111 monthly time series from the complete dataset of the NN3 competition. Of these series, 11 monthly time series form a reduced data subset of the competition. The NN3 time series are heterogeneous, show various seasonal and non-seasonal patterns and noise levels, and vary in length between 49 and 126 observations. The prediction objective is to forecast the next 18 observations in a multiple-step-ahead forecast as accurately as possible. Data is scaled before training in order to speed up the computation process and to avoid numerical difficulties [23]. Each time series observation x_t is linearly scaled as z_t into the interval [−0.5, 0.5], using the scaling function

  z_t = −0.5 + (x_t − x_{t,min}) / (x_{t,max} − x_{t,min}),   (3)

with the minimum x_{t,min} and maximum value x_{t,max} of x_t on the training and validation set, and by applying a headroom of 50% to avoid possible saturation effects of the nodes on instationary time series patterns in the unseen test data.

C. SVR training and parameters

We employ a simple grid search of the cost C, epsilon ε and the parameter γ of a Gaussian kernel, with exponentially growing sequences that cover a vast range of value combinations. Figure 2 shows an example grid with exponentially growing sequences.

Fig. 2. A grid with exponentially growing sequences

However, employing a grid search methodology requires the setting of valid and reliable lower and upper parameter bounds that define the search space of the grid. The regularisation parameter C determines the trade-off between the model capacity, reflected in the flatness of the approximated function, and the amount to which deviations larger than the ε-insensitive tube are tolerated [13]. A larger value for C reduces the error contribution but yields a more complex forecasting function that is more likely to overfit on the training data [18]. Hence it appears reasonable to evaluate values of C between a very small lower bound, to create SVR models with simple, flat functions that handle strong noise, and a large upper bound, to also consider SVR models that describe more complex time series structures. In other experiments, Chang and Lin used parameter bounds of e^−8 ≤ C ≤ e^8 [23], while Hsu and Lin used bounds of 2^−2 ≤ C ≤ 2^12 [30]. To follow an exhaustive approach, we set the lower bound to C_l = 2^−10 and the upper bound to C_u = 2^16, exceeding the parameter range of previous experiments in order to provide the capability for a sufficient trade-off for the different time series patterns.
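The input-vector construction of Section III.A and the linear scaling of equation (3) can be sketched as follows; the function names and the choice of NumPy are illustrative, not taken from the paper:

```python
import numpy as np

def scale(series):
    """Linearly scale a series into [-0.5, 0.5], as in equation (3)."""
    lo, hi = series.min(), series.max()
    return -0.5 + (series - lo) / (hi - lo)

def lag_embed(series, n_lags=12):
    """Build rolling-window input vectors of the past n_lags observations
    and the corresponding one-step-ahead targets."""
    X = np.array([series[t - n_lags:t] for t in range(n_lags, len(series))])
    y = series[n_lags:]
    return X, y
```

With the fixed 12-lag window used here, a series of length T yields T − 12 training patterns; the minimum and maximum are taken over the training and validation data only, so unseen test values may fall outside [−0.5, 0.5], which is the motivation for the 50% headroom.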
This implements an exponential grid with steps of 2^0.5, evaluating 53 parameter values of C = [2^−10, 2^−9.5, ..., 2^16]. The ε-parameter controls the size of the ε-insensitive tube and hence the number of support vectors and the error contributions of observations lying outside it [9, 15]. As ε corresponds to the level of noise in a time series, large values of ε allow an approximation of the structure of the underlying functional relationship of a time series with high noise, as opposed to overfitting to the noise. Chang and Lin [22] use margin values of e^−8 ≤ ε ≤ e^−1, with Lin and Weng using similar margins [24]. We extend these search spaces and use a lower margin of 2^−8 with an upper margin of 2^0. We use exponential grid steps of 2^0.25, evaluating 33 parameter values of ε = [2^−8, 2^−7.75, ..., 2^0] for different noise levels. The kernel parameter γ defines the width of the kernel to reflect the range of the training data in feature space and therefore the ability of an SVR to adapt to the data [17, 18, 31]. Chang and Lin [22] used parameter bounds of e^−8 ≤ γ ≤ e^8, and Lin and Weng [24] used bounds of 2^−8 ≤ γ ≤ 2^1. We select an exponential grid with steps of 2^0.5, evaluating 25 parameter values of γ = [2^−12, 2^−11.5, ..., 2^0] to provide feasible kernel parameters for the scaled time series data. In total we evaluate 43,725 parameter combinations for each time series. As a grid search of this magnitude is a time-intensive approach to parameter selection, we reduce the training time by applying a shrinking technique to speed up the decomposition used to solve the SVR optimization problem. It iteratively removes bounded components, so that reduced problems are solved [28, 30]; see Fan, Chen and Lin for details [32]. To summarise, a total of 4,853,475 ε-SVR candidate models are computed on the 111 time series using YALE [33] and the LIBSVM libraries [28].

D. Selection of SVR candidate models

Depending on the combination of model parameters, an SVR is capable of approximating the underlying data generating process of a time series to different degrees of accuracy, permitting overfitting to the training data through the combination of sub-optimal parameters and therefore limiting its ability to generalize on unseen data [4], see figure 3.

Fig. 3. The difference between overfitting and generalization

Hence the selection of a robust SVR candidate for each time series requires particular attention. To select the best SVR candidate model from the different parameter setups, which number 43,725 per time series in this study, each time series is split into two subsets of 65% training data and 35% validation data [34]. Considering the length of the series, the validation set is selected to roughly match the undisclosed test set in length, serving as a first estimate of a quasi-out-of-sample accuracy. Each candidate ε-SVR is trained exclusively on the training data and is selected on its validation dataset.
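The exponentially growing parameter grid can be enumerated directly from the stated bounds and step sizes; the three axes (53 × 33 × 25 values) multiply out to the 43,725 combinations evaluated per series:

```python
import numpy as np
from itertools import product

# Exponentially growing sequences over the stated bounds and step sizes.
C_grid   = 2.0 ** np.arange(-10, 16 + 0.25, 0.5)   # 2^-10 ... 2^16,  step 2^0.5
eps_grid = 2.0 ** np.arange(-8, 0 + 0.125, 0.25)   # 2^-8  ... 2^0,   step 2^0.25
gam_grid = 2.0 ** np.arange(-12, 0 + 0.25, 0.5)    # 2^-12 ... 2^0,   step 2^0.5

# Cartesian product of (C, epsilon, gamma) candidate settings.
combos = list(product(C_grid, eps_grid, gam_grid))
print(len(C_grid), len(eps_grid), len(gam_grid), len(combos))
```

Each tuple in `combos` corresponds to one candidate ε-SVR to be trained; over 111 series this gives the 4,853,475 models reported above. The half-step margin added to each `arange` endpoint guards against floating-point truncation of the final grid value.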
As only a short validation dataset is used for selecting the best candidate model for each time series, overfitting on the validation set frequently occurs if the validation subset does not adequately represent the true data generating process, which cannot be expected from small data sub-samples. Multiple approaches are feasible to avoid overfitting to the validation data set in model selection and to derive an unbiased estimator on unseen data. Comprehensive methods for data sub-sampling may be considered, including k-fold cross-validation using different numbers of data folds or leave-one-out cross-validation [35]. To adhere to the naïve approach, while avoiding the grave mistake of selecting the best candidate model on the training data itself, we compute only single cross-validation errors and select the best model on the prefixed validation set. So all SVR candidates are parameterised exclusively on the training set, while the forecasting capability of the models is evaluated on the validation set, and the candidate model with the lowest validation error is selected [36, 37]. Empirical simulation experiments have proven that error measures play an important role in calibrating and refining, model selection and ex post evaluation of competing forecasting models in order to determine the competitive accuracy and rank candidate models [36, 38]. Although they should be selected with care, we apply the quadratic error measure of the root mean squared error (RMSE), weighting each error deviation by its quadratic distance:

  RMSE = sqrt( (1/n) Σ_{t=1..n} e_t^2 )   (4)

Using quadratic error measures emphasises the influence of large forecast errors over small ones, e.g. from outliers, and should normally be avoided in the evaluation of model performance, but according to Armstrong and Collopy [38] practitioners and academicians use the RMSE frequently to draw conclusions about forecasting methods. Furthermore, it is frequently used due to its relation to conventional least-squares estimators and its mathematical simplicity.
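The pick-the-best selection on the validation RMSE of equation (4) can be sketched as below; the function names and candidate representation are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def rmse(errors):
    """Root mean squared error, equation (4)."""
    errors = np.asarray(errors, dtype=float)
    return np.sqrt(np.mean(errors ** 2))

def pick_best(candidates):
    """Return the candidate with the lowest validation RMSE.

    `candidates` is a list of (params, validation_errors) pairs, where
    validation_errors are the one-step-ahead errors on the hold-out set.
    """
    return min(candidates, key=lambda c: rmse(c[1]))
```

Because selection uses a single fixed validation split rather than k-fold cross-validation, a candidate can win merely by fitting the idiosyncrasies of that split, which is exactly the overfitting-to-the-validation-set risk discussed above.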
This is also noteworthy, as the selection criterion diverges from the final forecasting error metric of the NN3 competition, which uses the symmetric mean absolute percent error (SMAPE) [38, 39]:

  SMAPE = (1/n) Σ_{t=1..n} [ |y_t − ŷ_t| / ((y_t + ŷ_t)/2) ] · 100   (5)

The model with the lowest RMSE on 18 t+1 step-ahead forecasts on the validation set is selected and applied to predict the next 18 data points as multiple-step-ahead forecasts t+1, t+2, ..., t+18 on the NN3 competition data sets. It is apparent that this gives rise to another mismatch, as a method may show adequate accuracy in forecasting one step into the future, yet another set of parameters may perform better in forecasting multiple steps ahead. As this is commonly not aligned in previous studies, we comply with this malpractice in the naïve methodology, introducing further potential for misspecification errors. No true observations for the final 18 data points of the NN3 competition are available, so no evaluation of out-of-sample accuracy may be conducted. More thorough evaluations of the naïve methodology would be feasible by splitting the available data into training, validation and test sets, but these are not conducted due to the obvious sub-optimality of the naïve approach. This limits the following investigation to an analysis of the correlation between training and validation data and the ranges of optimal parameters.
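The SMAPE of equation (5) can be sketched as follows (function name illustrative):

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric mean absolute percent error, equation (5), in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean(np.abs(y_true - y_pred) / ((y_true + y_pred) / 2.0)) * 100.0
```

Note the mismatch made explicit in code: models are selected by the quadratic `rmse` on the validation set, yet ranked in the competition by this percentage-based metric, so the selected candidate need not be the SMAPE-optimal one.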

IV. EXPERIMENTAL RESULTS

To derive insight into the potential quality of the estimated SVR models, we investigate the one-step-ahead predictions for all time series on training and validation data using the SMAPE. Table 1 shows the SMAPE for the 111 NN3 time series, including the 11 time series of the reduced dataset from NN101 to NN111. An analysis of the errors on training and validation sets indicates various problems of high errors, overfitting in the training process, as indicated by significantly smaller errors on the training data than on the validation data set, and overfitting to the validation set, indicated by significantly smaller errors on the validation set than on the training set. The numerical analysis therefore confirms that various misspecifications may have occurred, which however cannot be attributed to a particular model choice or the uncontrollable structure of the dataset. To support the numerical evidence of the expected low validity and reliability of the naïve benchmark approach, we conduct a visual inspection of the highlighted time series to validate the assumptions. The visual inspection confirms that the SVR models for time series NN009, NN010, NN019 and NN059 overfit on the training data, showing limited generalisation through significantly lower accuracy on the validation set. Time series NN40 to NN49 show extreme noise and no apparent generalisation of the underlying structure. For most of these time series the SVR model is either very flat or significantly overfitted to the noise. Time series NN022, NN037, NN043, NN054, NN063 and NN069 show a significant error increase on the validation set, which can be explained by visually apparent outliers in the validation set. Time series NN093 and NN009 show a structural break, explaining the high forecasting error. The SVR models for time series NN027, NN029, NN043 and NN045 are very flat, which explains their limited fit to the noisy time series.

V. CONCLUSIONS

We compute a naïve heuristic, making use of the most frequent mistakes in SVR modeling for time series prediction, in order to establish a lower bound for support vector accuracy in the NN3 forecasting competition. The naïve heuristic evaluates an extensive grid search of 43,725 combinations of cost parameters, epsilon-parameters and kernel parameters for each time series, calculating a total of 4,853,475 ε-SVR candidate models. In aiming for a lower bound, we neglect the necessity to identify a significant input vector per time series, to evaluate different scaling schemes, to evaluate different kernel functions, to control for overfitting in model selection from the validation data using k-fold cross-validation, to conduct model selection and evaluation on a representative error metric for the ex post evaluation of the performance or the true cost of the decision, and we compute and evaluate one-step-ahead predictors instead of the multiple-step-ahead predictors required in the final test evaluation. The naïve heuristic identifies a set of parameters for each of the 111 time series, which are subsequently used to forecast the next 18 steps into the future for unseen data. While we hope to demonstrate the general ability of SVR to forecast linear and nonlinear time series with seasonal and non-seasonal patterns, the discussion of the naïve heuristic methodology also aims at drawing attention to the most common mistakes in SVR as well as neural network model building. Further research and systematic evaluations of forecasting accuracy on multiple empirical time series are required to establish a valid, reliable and robust methodology for automatic SVR model building. Until then, the naïve grid search heuristic may serve as a warning benchmark of which pitfalls to avoid in SVR model building.
TABLE 1
SMAPE PREDICTION ERROR ON TRAINING AND VALIDATION SET FOR THE 111 SVR MODELS ON THE NN3 COMPETITION DATASET

[Table 1 lists the training and validation SMAPE for each of the series NN001–NN111 in six column pairs (Series, Train, Valid); the numerical values were lost in extraction.]

Legend: high error e, with e_valid, e_train > x; overfitting on the training set, with e_valid > e_train; overfitting on the validation set in model selection, with e_valid < e_train.

ACKNOWLEDGMENT

To ensure objective and true ex ante results, all computations were conducted by the second author, without access to the undisclosed NN3 test set data at any time. Nonetheless, the paper is exempt from officially participating in the award of the NN3 competition.

REFERENCES

[1] K.-P. Liao and R. Fildes, "The Accuracy of a Procedural Approach to Specifying Feedforward Neural Networks for Forecasting," Computers & Operations Research, vol. 32.
[2] G. Zhang, B. E. Patuwo, and M. Y. Hu, "Forecasting with Artificial Neural Networks: The State of the Art," International Journal of Forecasting, vol. 14.
[3] C.-H. Wu, C.-C. Wei, D.-C. Su, M.-H. Chang, and J.-M. Ho, "Travel Time Prediction with Support Vector Regression," presented at the IEEE Intelligent Transportation Systems Conference.
[4] A. J. Smola and B. Schölkopf, "A Tutorial on Support Vector Regression," Statistics and Computing, vol. 14.
[5] K.-R. Müller, A. J. Smola, G. Rätsch, B. Schölkopf, J. Kohlmorgen, and V. Vapnik, "Predicting Time Series with Support Vector Machines," in Advances in Kernel Methods – Support Vector Learning, B. Schölkopf, C. J. C. Burges, and A. J. Smola, Eds. Cambridge: MIT Press, 1999.
[6] H. Yang, K. Huang, L. W. Chan, I. King, and M. R. Lyu, "Outliers Treatment in Support Vector Regression for Financial Time Series Prediction," Lecture Notes in Computer Science, vol. 3316.
[7] S. F. Crone, S. Lessmann, and S. Pietsch, "Forecasting with Computational Intelligence – An Evaluation of Support Vector Regression and Artificial Neural Networks for Time Series Prediction," presented at the 2006 IEEE WCCI, Vancouver (Canada).
[8] V. N. Vapnik, "An Overview of Statistical Learning Theory," IEEE Transactions on Neural Networks, vol. 10.
[9] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines and other kernel-based Learning Methods. Cambridge (United Kingdom): Cambridge University Press.
[10] M. Anthony and N. Biggs, Computational Learning Theory. Cambridge (United Kingdom): Cambridge University Press.
[11] M. Welling, "Support Vector Regression," Department of Computer Science, University of Toronto, Toronto (Canada).
[12] J. Bi and K. P. Bennett, "A Geometric Approach to Support Vector Regression," Neurocomputing, vol. 55.
[13] A. Smola, "Regression Estimation with Support Vector Learning Machines," Technische Universität München.
[14] S. R. Gunn, "Support Vector Machines for Classification and Regression," University of Southampton, Technical Report, Image, Speech and Intelligent Systems Group, January.
[15] B. Schölkopf, "Support Vector Learning." Berlin: Technische Universität.
[16] H. Yang, I. King, and L. Chan, "Non-Fixed and Asymmetrical Margin Approach to Stock Market Prediction Using Support Vector Regression," presented at the 9th Conference on Neural Information Processing, Singapore.
[17] C. J. C. Burges, "A Tutorial on Support Vector Machines for Pattern Recognition," in Data Mining and Knowledge Discovery, vol. 2, U. Fayyad, Ed. Boston (U.S.A.): Kluwer Academic, 1998.
[18] V. Cherkassky and Y. Ma, "Practical Selection of SVM Parameters and Noise Estimation for SVM Regression," Neural Networks, vol. 17.
[19] S. F. Crone, S. Lessmann, and S. Pietsch, "Parameter Sensitivity of Support Vector Regression and Neural Networks for Forecasting," presented at the International Conference on Data Mining, Las Vegas (U.S.A.).
[20] J. B. Gao, S. R. Gunn, C. J. Harris, and M. Brown, "A probabilistic framework for SVM regression and error bar estimation," Machine Learning, vol. 46.
[21] W. Chu, S. S. Keerthi, and C. J. Ong, "Bayesian Support Vector Regression Using a Unified Loss Function," IEEE Transactions on Neural Networks, vol. 15.
[22] M.-W. Chang and C.-J. Lin, "Leave-One-Out Bounds for Support Vector Regression Model Selection," Neural Computation, vol. 17.
[23] C.-W. Hsu, C.-C. Chang, and C.-J. Lin, "A Practical Guide to Support Vector Classification," National Taiwan University, Taipei.
[24] C.-J. Lin and R. C. Weng, "Simple probabilistic predictions for support vector regression," National Taiwan University, Taipei.
[25] M. Momma and K. P. Bennett, "A pattern search method for model selection of support vector regression," presented at the Second SIAM International Conference on Data Mining.
[26] J. T. Kwok and I. W. Tsang, "Linear Dependency Between epsilon and the Input Noise in epsilon-Support Vector Regression," IEEE Transactions on Neural Networks, ICANN 2001.
[27] A. Smola, N. Murata, B. Schölkopf, and K.-R. Müller, "Asymptotically optimal choice of epsilon-loss for support vector machines," presented at the Proceedings of the International Conference on Artificial Neural Networks.
[28] C.-C. Chang and C.-J. Lin, "LIBSVM: a Library for Support Vector Machines," National Science Council of Taiwan, Taipei (Taiwan), 17 April.
[29] U. von Luxburg, O. Bousquet, and B. Schölkopf, "A Compression Approach to Support Vector Model Selection," Journal of Machine Learning Research, vol. 5.
[30] C.-W. Hsu and C.-J. Lin, "A Comparison of Methods for Multiclass Support Vector Machines," IEEE Transactions on Neural Networks, vol. 13.
[31] C.-C. Hsu, C.-H. Wu, S.-C. Chen, and K.-L. Peng, "Dynamically Optimizing Parameters in Support Vector Regression: An Application of Electricity Load Forecasting," presented at the 39th International Conference on System Sciences.
[32] R.-E. Fan, P.-H. Chen, and C.-J. Lin, "Working Set Selection Using Second Order Information for Training Support Vector Machines," Journal of Machine Learning Research, vol. 6.
[33] I. Mierswa, M. Wurst, R. Klinkenberg, M. Scholz, and T. Euler, "YALE: Rapid Prototyping for Complex Data Mining Tasks," presented at the Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
[34] P. Cunningham, "Overfitting and Diversity in Classification Ensembles based on Feature Selection," Trinity College Dublin, Dublin (Ireland), Computer Science Technical Report TCD-CS.
[35] D. Opitz and R. Maclin, "Popular Ensemble Methods: An Empirical Study," Journal of Artificial Intelligence Research, vol. 11.
[36] S. Makridakis, S. C. Wheelwright, and R. J. Hyndman, Forecasting Methods and Applications, 3rd ed. New York: John Wiley & Sons.
[37] A. Foka, "Time Series Prediction Using Evolving Polynomial Neural Networks," Institute of Science and Technology, Manchester (United Kingdom): University of Manchester.
[38] J. S. Armstrong and F. Collopy, "Error Measures for Generalizing About Forecasting Methods: Empirical Comparisons," International Journal of Forecasting, vol. 8.
[39] R. J. Hyndman and A. B. Koehler, "Another Look at Measures of Forecast Accuracy," Monash University, Working Paper 13/05, 20 May 2005.


Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1 4/14/011 Outlne Dscrmnatve classfers for mage recognton Wednesday, Aprl 13 Krsten Grauman UT-Austn Last tme: wndow-based generc obect detecton basc ppelne face detecton wth boostng as case study Today:

More information

Classifier Selection Based on Data Complexity Measures *

Classifier Selection Based on Data Complexity Measures * Classfer Selecton Based on Data Complexty Measures * Edth Hernández-Reyes, J.A. Carrasco-Ochoa, and J.Fco. Martínez-Trndad Natonal Insttute for Astrophyscs, Optcs and Electroncs, Lus Enrque Erro No.1 Sta.

More information

Feature Reduction and Selection

Feature Reduction and Selection Feature Reducton and Selecton Dr. Shuang LIANG School of Software Engneerng TongJ Unversty Fall, 2012 Today s Topcs Introducton Problems of Dmensonalty Feature Reducton Statstc methods Prncpal Components

More information

Using Neural Networks and Support Vector Machines in Data Mining

Using Neural Networks and Support Vector Machines in Data Mining Usng eural etworks and Support Vector Machnes n Data Mnng RICHARD A. WASIOWSKI Computer Scence Department Calforna State Unversty Domnguez Hlls Carson, CA 90747 USA Abstract: - Multvarate data analyss

More information

Cluster Analysis of Electrical Behavior

Cluster Analysis of Electrical Behavior Journal of Computer and Communcatons, 205, 3, 88-93 Publshed Onlne May 205 n ScRes. http://www.scrp.org/ournal/cc http://dx.do.org/0.4236/cc.205.350 Cluster Analyss of Electrcal Behavor Ln Lu Ln Lu, School

More information

S1 Note. Basis functions.

S1 Note. Basis functions. S1 Note. Bass functons. Contents Types of bass functons...1 The Fourer bass...2 B-splne bass...3 Power and type I error rates wth dfferent numbers of bass functons...4 Table S1. Smulaton results of type

More information

CS 534: Computer Vision Model Fitting

CS 534: Computer Vision Model Fitting CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust

More information

A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS

A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS Proceedngs of the Wnter Smulaton Conference M E Kuhl, N M Steger, F B Armstrong, and J A Jones, eds A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS Mark W Brantley Chun-Hung

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z.

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z. TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS Muradalyev AZ Azerbajan Scentfc-Research and Desgn-Prospectng Insttute of Energetc AZ1012, Ave HZardab-94 E-mal:aydn_murad@yahoocom Importance of

More information

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel

Evolutionary Support Vector Regression based on Multi-Scale Radial Basis Function Kernel Eolutonary Support Vector Regresson based on Mult-Scale Radal Bass Functon Kernel Tanasanee Phenthrakul and Boonserm Kjsrkul Abstract Kernel functons are used n support ector regresson (SVR) to compute

More information

Smoothing Spline ANOVA for variable screening

Smoothing Spline ANOVA for variable screening Smoothng Splne ANOVA for varable screenng a useful tool for metamodels tranng and mult-objectve optmzaton L. Rcco, E. Rgon, A. Turco Outlne RSM Introducton Possble couplng Test case MOO MOO wth Game Theory

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics Introducton G10 NAG Fortran Lbrary Chapter Introducton G10 Smoothng n Statstcs Contents 1 Scope of the Chapter... 2 2 Background to the Problems... 2 2.1 Smoothng Methods... 2 2.2 Smoothng Splnes and Regresson

More information

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision

SLAM Summer School 2006 Practical 2: SLAM using Monocular Vision SLAM Summer School 2006 Practcal 2: SLAM usng Monocular Vson Javer Cvera, Unversty of Zaragoza Andrew J. Davson, Imperal College London J.M.M Montel, Unversty of Zaragoza. josemar@unzar.es, jcvera@unzar.es,

More information

y and the total sum of

y and the total sum of Lnear regresson Testng for non-lnearty In analytcal chemstry, lnear regresson s commonly used n the constructon of calbraton functons requred for analytcal technques such as gas chromatography, atomc absorpton

More information

Available online at ScienceDirect. Procedia Environmental Sciences 26 (2015 )

Available online at   ScienceDirect. Procedia Environmental Sciences 26 (2015 ) Avalable onlne at www.scencedrect.com ScenceDrect Proceda Envronmental Scences 26 (2015 ) 109 114 Spatal Statstcs 2015: Emergng Patterns Calbratng a Geographcally Weghted Regresson Model wth Parameter-Specfc

More information

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET

BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET 1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School

More information

A Statistical Model Selection Strategy Applied to Neural Networks

A Statistical Model Selection Strategy Applied to Neural Networks A Statstcal Model Selecton Strategy Appled to Neural Networks Joaquín Pzarro Elsa Guerrero Pedro L. Galndo joaqun.pzarro@uca.es elsa.guerrero@uca.es pedro.galndo@uca.es Dpto Lenguajes y Sstemas Informátcos

More information

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization

Problem Definitions and Evaluation Criteria for Computational Expensive Optimization Problem efntons and Evaluaton Crtera for Computatonal Expensve Optmzaton B. Lu 1, Q. Chen and Q. Zhang 3, J. J. Lang 4, P. N. Suganthan, B. Y. Qu 6 1 epartment of Computng, Glyndwr Unversty, UK Faclty

More information

Relevance Assignment and Fusion of Multiple Learning Methods Applied to Remote Sensing Image Analysis

Relevance Assignment and Fusion of Multiple Learning Methods Applied to Remote Sensing Image Analysis Assgnment and Fuson of Multple Learnng Methods Appled to Remote Sensng Image Analyss Peter Bajcsy, We-Wen Feng and Praveen Kumar Natonal Center for Supercomputng Applcaton (NCSA), Unversty of Illnos at

More information

Empirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap

Empirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap Int. Journal of Math. Analyss, Vol. 8, 4, no. 5, 7-7 HIKARI Ltd, www.m-hkar.com http://dx.do.org/.988/jma.4.494 Emprcal Dstrbutons of Parameter Estmates n Bnary Logstc Regresson Usng Bootstrap Anwar Ftranto*

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration

Improvement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,

More information

Machine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law)

Machine Learning. Support Vector Machines. (contains material adapted from talks by Constantin F. Aliferis & Ioannis Tsamardinos, and Martin Law) Machne Learnng Support Vector Machnes (contans materal adapted from talks by Constantn F. Alfers & Ioanns Tsamardnos, and Martn Law) Bryan Pardo, Machne Learnng: EECS 349 Fall 2014 Support Vector Machnes

More information

Lecture 5: Multilayer Perceptrons

Lecture 5: Multilayer Perceptrons Lecture 5: Multlayer Perceptrons Roger Grosse 1 Introducton So far, we ve only talked about lnear models: lnear regresson and lnear bnary classfers. We noted that there are functons that can t be represented

More information

EXTENDED BIC CRITERION FOR MODEL SELECTION

EXTENDED BIC CRITERION FOR MODEL SELECTION IDIAP RESEARCH REPORT EXTEDED BIC CRITERIO FOR ODEL SELECTIO Itshak Lapdot Andrew orrs IDIAP-RR-0-4 Dalle olle Insttute for Perceptual Artfcal Intellgence P.O.Box 59 artgny Valas Swtzerland phone +4 7

More information

Classifying Acoustic Transient Signals Using Artificial Intelligence

Classifying Acoustic Transient Signals Using Artificial Intelligence Classfyng Acoustc Transent Sgnals Usng Artfcal Intellgence Steve Sutton, Unversty of North Carolna At Wlmngton (suttons@charter.net) Greg Huff, Unversty of North Carolna At Wlmngton (jgh7476@uncwl.edu)

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr)

Helsinki University Of Technology, Systems Analysis Laboratory Mat Independent research projects in applied mathematics (3 cr) Helsnk Unversty Of Technology, Systems Analyss Laboratory Mat-2.08 Independent research projects n appled mathematcs (3 cr) "! #$&% Antt Laukkanen 506 R ajlaukka@cc.hut.f 2 Introducton...3 2 Multattrbute

More information

A mathematical programming approach to the analysis, design and scheduling of offshore oilfields

A mathematical programming approach to the analysis, design and scheduling of offshore oilfields 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 A mathematcal programmng approach to the analyss, desgn and

More information

6.854 Advanced Algorithms Petar Maymounkov Problem Set 11 (November 23, 2005) With: Benjamin Rossman, Oren Weimann, and Pouya Kheradpour

6.854 Advanced Algorithms Petar Maymounkov Problem Set 11 (November 23, 2005) With: Benjamin Rossman, Oren Weimann, and Pouya Kheradpour 6.854 Advanced Algorthms Petar Maymounkov Problem Set 11 (November 23, 2005) Wth: Benjamn Rossman, Oren Wemann, and Pouya Kheradpour Problem 1. We reduce vertex cover to MAX-SAT wth weghts, such that the

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE

SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE SHAPE RECOGNITION METHOD BASED ON THE k-nearest NEIGHBOR RULE Dorna Purcaru Faculty of Automaton, Computers and Electroncs Unersty of Craoa 13 Al. I. Cuza Street, Craoa RO-1100 ROMANIA E-mal: dpurcaru@electroncs.uc.ro

More information

Wishing you all a Total Quality New Year!

Wishing you all a Total Quality New Year! Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma

More information

TN348: Openlab Module - Colocalization

TN348: Openlab Module - Colocalization TN348: Openlab Module - Colocalzaton Topc The Colocalzaton module provdes the faclty to vsualze and quantfy colocalzaton between pars of mages. The Colocalzaton wndow contans a prevew of the two mages

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters Proper Choce of Data Used for the Estmaton of Datum Transformaton Parameters Hakan S. KUTOGLU, Turkey Key words: Coordnate systems; transformaton; estmaton, relablty. SUMMARY Advances n technologes and

More information

Artificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling

Artificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling Artfcal Intellgence Technques for Steam Generator Modellng Sarah Wrght and Tshldz Marwala Abstract Ths paper nvestgates the use of dfferent Artfcal Intellgence methods to predct the values of several contnuous

More information

Three supervised learning methods on pen digits character recognition dataset

Three supervised learning methods on pen digits character recognition dataset Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru

More information

Meta-heuristics for Multidimensional Knapsack Problems

Meta-heuristics for Multidimensional Knapsack Problems 2012 4th Internatonal Conference on Computer Research and Development IPCSIT vol.39 (2012) (2012) IACSIT Press, Sngapore Meta-heurstcs for Multdmensonal Knapsack Problems Zhbao Man + Computer Scence Department,

More information

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like:

Outline. Self-Organizing Maps (SOM) US Hebbian Learning, Cntd. The learning rule is Hebbian like: Self-Organzng Maps (SOM) Turgay İBRİKÇİ, PhD. Outlne Introducton Structures of SOM SOM Archtecture Neghborhoods SOM Algorthm Examples Summary 1 2 Unsupervsed Hebban Learnng US Hebban Learnng, Cntd 3 A

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers

A Simple and Efficient Goal Programming Model for Computing of Fuzzy Linear Regression Parameters with Considering Outliers 62626262621 Journal of Uncertan Systems Vol.5, No.1, pp.62-71, 211 Onlne at: www.us.org.u A Smple and Effcent Goal Programmng Model for Computng of Fuzzy Lnear Regresson Parameters wth Consderng Outlers

More information

Solving two-person zero-sum game by Matlab

Solving two-person zero-sum game by Matlab Appled Mechancs and Materals Onlne: 2011-02-02 ISSN: 1662-7482, Vols. 50-51, pp 262-265 do:10.4028/www.scentfc.net/amm.50-51.262 2011 Trans Tech Publcatons, Swtzerland Solvng two-person zero-sum game by

More information

Wavelets and Support Vector Machines for Texture Classification

Wavelets and Support Vector Machines for Texture Classification Wavelets and Support Vector Machnes for Texture Classfcaton Kashf Mahmood Rapoot Faculty of Computer Scence & Engneerng, Ghulam Ishaq Khan Insttute, Top, PAKISTAN. kmr@gk.edu.pk Nasr Mahmood Rapoot Department

More information

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms

Course Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques

More information

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for

More information

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines

A Modified Median Filter for the Removal of Impulse Noise Based on the Support Vector Machines A Modfed Medan Flter for the Removal of Impulse Nose Based on the Support Vector Machnes H. GOMEZ-MORENO, S. MALDONADO-BASCON, F. LOPEZ-FERRERAS, M. UTRILLA- MANSO AND P. GIL-JIMENEZ Departamento de Teoría

More information

GSLM Operations Research II Fall 13/14

GSLM Operations Research II Fall 13/14 GSLM 58 Operatons Research II Fall /4 6. Separable Programmng Consder a general NLP mn f(x) s.t. g j (x) b j j =. m. Defnton 6.. The NLP s a separable program f ts objectve functon and all constrants are

More information

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points;

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points; Subspace clusterng Clusterng Fundamental to all clusterng technques s the choce of dstance measure between data ponts; D q ( ) ( ) 2 x x = x x, j k = 1 k jk Squared Eucldean dstance Assumpton: All features

More information

Parallelism for Nested Loops with Non-uniform and Flow Dependences

Parallelism for Nested Loops with Non-uniform and Flow Dependences Parallelsm for Nested Loops wth Non-unform and Flow Dependences Sam-Jn Jeong Dept. of Informaton & Communcaton Engneerng, Cheonan Unversty, 5, Anseo-dong, Cheonan, Chungnam, 330-80, Korea. seong@cheonan.ac.kr

More information

Data Mining: Model Evaluation

Data Mining: Model Evaluation Data Mnng: Model Evaluaton Aprl 16, 2013 1 Issues: Evaluatng Classfcaton Methods Accurac classfer accurac: predctng class label predctor accurac: guessng value of predcted attrbutes Speed tme to construct

More information

Some Advanced SPC Tools 1. Cumulative Sum Control (Cusum) Chart For the data shown in Table 9-1, the x chart can be generated.

Some Advanced SPC Tools 1. Cumulative Sum Control (Cusum) Chart For the data shown in Table 9-1, the x chart can be generated. Some Advanced SP Tools 1. umulatve Sum ontrol (usum) hart For the data shown n Table 9-1, the x chart can be generated. However, the shft taken place at sample #21 s not apparent. 92 For ths set samples,

More information

Air Transport Demand. Ta-Hui Yang Associate Professor Department of Logistics Management National Kaohsiung First Univ. of Sci. & Tech.

Air Transport Demand. Ta-Hui Yang Associate Professor Department of Logistics Management National Kaohsiung First Univ. of Sci. & Tech. Ar Transport Demand Ta-Hu Yang Assocate Professor Department of Logstcs Management Natonal Kaohsung Frst Unv. of Sc. & Tech. 1 Ar Transport Demand Demand for ar transport between two ctes or two regons

More information

Feature Selection as an Improving Step for Decision Tree Construction

Feature Selection as an Improving Step for Decision Tree Construction 2009 Internatonal Conference on Machne Learnng and Computng IPCSIT vol.3 (2011) (2011) IACSIT Press, Sngapore Feature Selecton as an Improvng Step for Decson Tree Constructon Mahd Esmael 1, Fazekas Gabor

More information

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION

CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 48 CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 3.1 INTRODUCTION The raw mcroarray data s bascally an mage wth dfferent colors ndcatng hybrdzaton (Xue

More information

Announcements. Supervised Learning

Announcements. Supervised Learning Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples

More information

An Entropy-Based Approach to Integrated Information Needs Assessment

An Entropy-Based Approach to Integrated Information Needs Assessment Dstrbuton Statement A: Approved for publc release; dstrbuton s unlmted. An Entropy-Based Approach to ntegrated nformaton Needs Assessment June 8, 2004 Wllam J. Farrell Lockheed Martn Advanced Technology

More information

User Authentication Based On Behavioral Mouse Dynamics Biometrics

User Authentication Based On Behavioral Mouse Dynamics Biometrics User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA

More information

Backpropagation: In Search of Performance Parameters

Backpropagation: In Search of Performance Parameters Bacpropagaton: In Search of Performance Parameters ANIL KUMAR ENUMULAPALLY, LINGGUO BU, and KHOSROW KAIKHAH, Ph.D. Computer Scence Department Texas State Unversty-San Marcos San Marcos, TX-78666 USA ae049@txstate.edu,

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

Review of approximation techniques

Review of approximation techniques CHAPTER 2 Revew of appromaton technques 2. Introducton Optmzaton problems n engneerng desgn are characterzed by the followng assocated features: the objectve functon and constrants are mplct functons evaluated

More information

A Multivariate Analysis of Static Code Attributes for Defect Prediction

A Multivariate Analysis of Static Code Attributes for Defect Prediction Research Paper) A Multvarate Analyss of Statc Code Attrbutes for Defect Predcton Burak Turhan, Ayşe Bener Department of Computer Engneerng, Bogazc Unversty 3434, Bebek, Istanbul, Turkey {turhanb, bener}@boun.edu.tr

More information

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status Internatonal Journal of Appled Busness and Informaton Systems ISSN: 2597-8993 Vol 1, No 2, September 2017, pp. 6-12 6 Implementaton Naïve Bayes Algorthm for Student Classfcaton Based on Graduaton Status

More information

Incremental Learning with Support Vector Machines and Fuzzy Set Theory

Incremental Learning with Support Vector Machines and Fuzzy Set Theory The 25th Workshop on Combnatoral Mathematcs and Computaton Theory Incremental Learnng wth Support Vector Machnes and Fuzzy Set Theory Yu-Mng Chuang 1 and Cha-Hwa Ln 2* 1 Department of Computer Scence and

More information

Discrimination of Faulted Transmission Lines Using Multi Class Support Vector Machines

Discrimination of Faulted Transmission Lines Using Multi Class Support Vector Machines 16th NAIONAL POWER SYSEMS CONFERENCE, 15th-17th DECEMBER, 2010 497 Dscrmnaton of Faulted ransmsson Lnes Usng Mult Class Support Vector Machnes D.hukaram, Senor Member IEEE, and Rmjhm Agrawal Abstract hs

More information

Synthesizer 1.0. User s Guide. A Varying Coefficient Meta. nalytic Tool. Z. Krizan Employing Microsoft Excel 2007

Synthesizer 1.0. User s Guide. A Varying Coefficient Meta. nalytic Tool. Z. Krizan Employing Microsoft Excel 2007 Syntheszer 1.0 A Varyng Coeffcent Meta Meta-Analytc nalytc Tool Employng Mcrosoft Excel 007.38.17.5 User s Gude Z. Krzan 009 Table of Contents 1. Introducton and Acknowledgments 3. Operatonal Functons

More information

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection

Spam Filtering Based on Support Vector Machines with Taguchi Method for Parameter Selection E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton We-Chh Hsu, Tsan-Yng Yu E-mal Spam Flterng Based on Support Vector Machnes wth Taguch Method for Parameter Selecton

More information

SURFACE PROFILE EVALUATION BY FRACTAL DIMENSION AND STATISTIC TOOLS USING MATLAB

SURFACE PROFILE EVALUATION BY FRACTAL DIMENSION AND STATISTIC TOOLS USING MATLAB SURFACE PROFILE EVALUATION BY FRACTAL DIMENSION AND STATISTIC TOOLS USING MATLAB V. Hotař, A. Hotař Techncal Unversty of Lberec, Department of Glass Producng Machnes and Robotcs, Department of Materal

More information

A Semi-parametric Regression Model to Estimate Variability of NO 2

A Semi-parametric Regression Model to Estimate Variability of NO 2 Envronment and Polluton; Vol. 2, No. 1; 2013 ISSN 1927-0909 E-ISSN 1927-0917 Publshed by Canadan Center of Scence and Educaton A Sem-parametrc Regresson Model to Estmate Varablty of NO 2 Meczysław Szyszkowcz

More information

Data Mining For Multi-Criteria Energy Predictions

Data Mining For Multi-Criteria Energy Predictions Data Mnng For Mult-Crtera Energy Predctons Kashf Gll and Denns Moon Abstract We present a data mnng technque for mult-crtera predctons of wnd energy. A mult-crtera (MC) evolutonary computng method has

More information

Using the Correlation Criterion to Position and Shape RBF Units for Incremental Modelling

Using the Correlation Criterion to Position and Shape RBF Units for Incremental Modelling Internatonal Journal of Automaton and Computng 2 (2006) 0-0 Usng the Correlaton Crteron to Poston and Shape RBF Unts for Incremental Modellng Xun-xan Wang Neural Computng Research Group, Aston Unversty,

More information
