Deep Neural Network Bottleneck Features For Generalized Variable Parameter HMMs

Xurong Xie 1,3, Rongfeng Su 1,3, Xunying Liu 2 & Lan Wang 1,3

1 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences
2 Cambridge University Engineering Dept., Trumpington St., Cambridge, CB2 1PZ, U.K.
3 The Chinese University of Hong Kong, Hong Kong, China

xr.xie@siat.ac.cn, rf.su@siat.ac.cn, xl207@cam.ac.uk, lan.wang@siat.ac.cn

Abstract

Recently deep neural networks (DNNs) have become increasingly popular for acoustic modelling in automatic speech recognition (ASR) systems. As the bottleneck features they produce are inherently discriminative and contain rich hidden factors that influence the surface acoustic realization, the standard approach is to augment the conventional acoustic features with the bottleneck features in a tandem framework. In this paper, an alternative approach to incorporating bottleneck features is investigated. The complex relationship between acoustic features and DNN bottleneck features is modelled using generalized variable parameter HMMs (GVP-HMMs). The optimal GVP-HMM structural configuration and model parameters are automatically learnt. Significant error rate reductions of 48% and 8% relative were obtained over the baseline multi-style HMM and tandem HMM systems respectively on Aurora 2.

Index Terms: generalized variable parameter HMM, deep neural network, bottleneck features, robust speech recognition

1. Introduction

Recently deep neural networks (DNNs) have become increasingly popular for acoustic modelling in automatic speech recognition (ASR) systems [1, 2, 3, 4, 5, 6, 7, 8]. In order to incorporate DNNs, or multi-layer perceptrons (MLPs) in general, into HMM based acoustic models, two approaches can be used. The first uses a hybrid architecture that estimates the HMM state emission probabilities using DNNs [9]. The second approach uses an MLP or DNN as a feature extractor, trained to produce phoneme posterior probabilities. The resulting probabilistic features [10], or bottleneck features [11], are used to train standard GMM-HMMs in a tandem fashion.
As these features capture additional information complementary to standard front-ends, they are often combined in tandem systems. One important issue associated with the tandem HMM approach is the appropriate method used to combine the conventional and bottleneck features. The precise nature of the relationship between the two is highly complex. Compared with the standard front-ends, bottleneck features provide a different view of the same speech signals. Certain correlation can therefore exist between the two. At the same time, complementary information characterizing the underlying hidden factors influencing the surface acoustic realization is also implicitly learnt by bottleneck features. It is propagated into HMMs as additional cues and constraints to improve discrimination. The standard approach augments the conventional front-ends with bottleneck features in a concatenated form. More advanced approaches that explicitly approximate the correlation between them using linear, affine transformations have also been proposed [12, 13]. In order to better capture the complex relationship between standard acoustic and bottleneck features, techniques motivated by speech production that can fully exploit the hidden variability in the bottleneck features may be used. Along this line, an alternative method to incorporate bottleneck features into a tandem system is proposed in this paper. DNN bottleneck features are used as influence factors to directly introduce controllability to the underlying generative acoustic models, which are based on generalized variable parameter HMMs (GVP-HMMs) [14, 15, 16, 17, 18]. The continuous trajectories of optimal HMM parameters against the time-varying hidden factors in the bottleneck features are modelled using polynomial functions.

(This work is supported by the National Natural Science Foundation of China (NSFC), the National Fundamental Research Grant of Science and Technology (973 Project: 2013CB329305) and the Shenzhen Fundamental Research Program JC A, JCYJ.)
Their effects on the acoustic parameters are automatically learnt via locally optimized polynomial parameters and degrees. Using the proposed GVP-HMM tandem approach, significant error rate reductions of 48% and 8% relative were obtained over the multi-style baseline HMM and tandem HMM systems respectively on Aurora 2.

The rest of this paper is organized as follows. Generalized variable parameter HMMs and an associated efficient complexity control technique are introduced in section 2. Deep neural networks and bottleneck features are reviewed in section 3. A range of GVP-HMM systems using various modelling configurations are described in section 4. In section 5 various GVP-HMM systems using DNN bottleneck features are evaluated on Aurora 2. Section 6 draws conclusions and discusses future research.

2. Generalized Variable Parameter HMMs

Generalized variable parameter HMMs (GVP-HMMs) [14, 15, 16, 17] explicitly model the parameter trajectories of optimal Gaussian components, or of more compact tied linear transformations, that vary with respect to some influence factors. In this paper, trajectories of Gaussian means and variances are used.

2.1. Model Definition

For a D dimensional observation o_t emitted from Gaussian mixture component m, assuming Pth order polynomials modelling a total of N regression variables are used, the form of

GVP-HMMs considered in this paper is given by

    p(\mathbf{o}_t \mid m) = \mathcal{N}\big(\mathbf{o}_t;\ \boldsymbol{\mu}^{(m)}(\mathbf{v}_t),\ \boldsymbol{\Sigma}^{(m)}(\mathbf{v}_t)\big)    (1)

where v_t is a (PN + 1) dimensional Vandermonde vector [19],

    \mathbf{v}_t = \big[1,\ \mathbf{f}_{t,1},\ \ldots,\ \mathbf{f}_{t,p},\ \ldots,\ \mathbf{f}_{t,P}\big]^\top    (2)

and its N dimensional pth order subvector is defined as \mathbf{f}_{t,p} = [v_{t,1}^p, \ldots, v_{t,j}^p, \ldots, v_{t,N}^p], where v_{t,j} is the jth element of the N dimensional factor vector that the Gaussian parameters are conditioned on at frame t, for example the DNN bottleneck features,

    \mathbf{f}_t = \big[v_{t,1}, \ldots, v_{t,j}, \ldots, v_{t,N}\big]    (3)

\boldsymbol{\mu}^{(m)}(\cdot) and \boldsymbol{\Sigma}^{(m)}(\cdot) are the Pth order mean and covariance trajectory polynomials of component m respectively. When diagonal covariances are used, the trajectories of the ith dimension of the mean and variance parameters are computed as

    \mu_i^{(m)}(\mathbf{v}_t) = \mathbf{v}_t^\top \mathbf{c}_i^{(\mu,m)}, \qquad \sigma_i^{(m)2}(\mathbf{v}_t) = \check{\sigma}_i^{(m)2}\, \mathbf{v}_t^\top \mathbf{c}_i^{(\sigma,m)}    (4)

where \mathbf{c}_i^{(\cdot)} is a (PN + 1) dimensional polynomial coefficient vector and \check{\sigma}_i^{(m)2} is the conventional HMM variance estimate.

As a natural form of generative model inspired by speech production, a range of factors influencing the acoustic realization of speech have been investigated in previous research using GVP-HMMs, or their precursors based on more restricted forms of parameter trajectories, such as multiple regression HMMs (MR-HMMs) [20] and variable parameter HMMs (VP-HMMs) [21, 22]. These acoustic factors include prosodic features [20], the environment noise condition represented by the signal-to-noise ratio (SNR) [14, 15, 16, 17, 18, 21, 22], and more recently articulatory features for speech synthesis [23]. GVP-HMMs share the same instantaneous adaptation power and good controllability as MR-HMMs and VP-HMMs. For any variability indicated by the factor vector, e.g. the bottleneck features or the SNR level, present or unseen in the training data, GVP-HMMs can instantly produce the matching HMM model parameters by design, without requiring any multi-pass decoding and adaptation process.
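The factor-dependent Gaussian parameters of equations (2)-(4) can be sketched in a few lines of NumPy. This is an illustrative reimplementation, not the paper's code; all function names are invented here.

```python
import numpy as np

def vandermonde_vector(f_t, P):
    """Build the (P*N + 1)-dimensional Vandermonde vector of equation (2),
    v_t = [1, f_t, f_t**2, ..., f_t**P], from an N-dimensional factor
    vector f_t (e.g. the frame's bottleneck features)."""
    return np.concatenate([[1.0]] + [f_t ** p for p in range(1, P + 1)])

def trajectory_mean(v_t, C_mu):
    """Mean trajectory of equation (4): row i of C_mu holds the coefficient
    vector c_i, so mu_i(v_t) = v_t . c_i for each feature dimension i."""
    return C_mu @ v_t

def trajectory_var(v_t, C_sigma, base_var):
    """Variance trajectory of equation (4):
    sigma_i^2(v_t) = base_var_i * (v_t . c_i), with base_var the
    conventional HMM variance estimate."""
    return base_var * (C_sigma @ v_t)
```

At decoding time, each frame's factor vector is expanded into a Vandermonde vector once and every Gaussian's mean and variance are then read off by cheap dot products, which is what gives the model its instantaneous adaptation behaviour.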
2.2. Parameter Estimation for GVP-HMMs

For the form of GVP-HMMs of equation (1), the associated ML auxiliary function is given by [14, 15, 24],

    Q_{\mathrm{GVP}}(\theta, \hat{\theta}) = \sum_{m,t} \gamma_m(t) \log p\big(\mathbf{o}_t;\ \boldsymbol{\mu}^{(m)}(\mathbf{v}_t),\ \boldsymbol{\Sigma}^{(m)}(\mathbf{v}_t)\big)    (5)

where \gamma_m(t) is the posterior probability of frame o_t being emitted from component m at time instance t. Combining the above with equations (1) and (4), the parts of the auxiliary function associated with the polynomial coefficient vectors of the Gaussian mean and variance trajectories can be re-arranged into convex quadratic forms,

    Q_i^{(\mu,m)}(\theta, \hat{\theta}) = -\tfrac{1}{2}\, \mathbf{c}_i^{(\mu,m)\top} \mathbf{U}_i^{(\mu,m)} \mathbf{c}_i^{(\mu,m)} + \mathbf{k}_i^{(\mu,m)\top} \mathbf{c}_i^{(\mu,m)} + \mathrm{const}
    Q_i^{(\sigma,m)}(\theta, \hat{\theta}) = -\tfrac{1}{2}\, \mathbf{c}_i^{(\sigma,m)\top} \mathbf{U}_i^{(\sigma,m)} \mathbf{c}_i^{(\sigma,m)} + \mathbf{k}_i^{(\sigma,m)\top} \mathbf{c}_i^{(\sigma,m)} + \mathrm{const}    (6)

where the constant terms independent of the coefficient vectors \mathbf{c}_i^{(\cdot)} can be ignored. Setting the gradients with respect to the respective polynomial coefficient vectors to zero, the following ML solutions of the coefficient vectors can be derived,

    \hat{\mathbf{c}}_i^{(\mu,m)} = \mathbf{U}_i^{(\mu,m)-1} \mathbf{k}_i^{(\mu,m)}, \qquad \hat{\mathbf{c}}_i^{(\sigma,m)} = \mathbf{U}_i^{(\sigma,m)-1} \mathbf{k}_i^{(\sigma,m)}    (7)

and the sufficient statistics are

    \mathbf{U}_i^{(\mu,m)} = \sum_t \gamma_m(t)\, \sigma_i^{(m)-2}(\mathbf{v}_t)\, \mathbf{v}_t \mathbf{v}_t^\top, \qquad \mathbf{k}_i^{(\mu,m)} = \sum_t \gamma_m(t)\, \sigma_i^{(m)-2}(\mathbf{v}_t)\, o_{t,i}\, \mathbf{v}_t
    \mathbf{U}_i^{(\sigma,m)} = \sum_t \gamma_m(t)\, \check{\sigma}_i^{(m)2}\, \mathbf{v}_t \mathbf{v}_t^\top, \qquad \mathbf{k}_i^{(\sigma,m)} = \sum_t \gamma_m(t)\, \big(o_{t,i} - \mu_i^{(m)}(\mathbf{v}_t)\big)^2\, \mathbf{v}_t    (8)

2.3. Model Complexity Control for GVP-HMMs

An important issue associated with GVP-HMMs is the appropriate polynomial degree to use. The use of higher degree polynomials can result in severe over-fitting and oscillation [25]. In addition, the precise form of individual parameter trajectories should be in line with the nature of the distinct effects imposed on them by the influencing factors. In order to more flexibly capture these complex, potentially locally varying effects and improve robustness, the optimal polynomial degrees of Gaussian mean and variance trajectories can be automatically determined at a local level using complexity control techniques [18].
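The closed-form updates of equations (7)-(8), and the local order selection just mentioned, reduce to accumulating the statistics U and k and solving small linear systems. The sketch below is an illustrative NumPy reimplementation covering the mean trajectories only, with the frame variance held fixed for simplicity; the function names and the exact penalty scaling are assumptions, not the paper's code.

```python
import numpy as np

def accumulate_mean_stats(gammas, V, obs_i, var_i):
    """Sufficient statistics for one mean-trajectory polynomial
    (equation (8), mean case, with the variance held fixed at var_i):
    U = sum_t gamma_t var^-1 v_t v_t^T, k = sum_t gamma_t var^-1 o_{t,i} v_t.
    gammas: (T,) posteriors, V: (T, P*N+1) Vandermonde vectors,
    obs_i: (T,) i-th feature dimension."""
    w = gammas / var_i
    U = (V * w[:, None]).T @ V
    k = V.T @ (w * obs_i)
    return U, k

def ml_coeffs(U, k):
    """Closed-form ML solution of equation (7): c = U^{-1} k."""
    return np.linalg.solve(U, k)

def select_order(U_max, k_max, N, P_max, T_m, rho=1.0):
    """BIC-style local order selection in the spirit of section 2.3.
    U_max, k_max are accumulated once at the highest order P_max; the
    statistics for a lower order p are their leading (1 + p*N) sub-blocks.
    Up to a constant shared by all candidates, the quadratic auxiliary
    function (6) at its optimum equals 0.5 * k^T U^{-1} k, so each order
    is scored by that value minus a penalty on the coefficient count."""
    best_p, best_score = 0, -np.inf
    for p in range(P_max + 1):
        d = 1 + p * N                      # leading sub-block size
        c = ml_coeffs(U_max[:d, :d], k_max[:d])
        score = 0.5 * k_max[:d] @ c - rho * d * np.log(T_m)
        if score > best_score:
            best_p, best_score = p, score
    return best_p
```

The key efficiency point is that the statistics are accumulated only once, at the highest candidate order, and every lower-order candidate is scored from sub-blocks of the same matrices.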
In Bayesian learning, when no prior knowledge over model structures {M} is available, the optimal model structure, or complexity, is determined by maximizing the evidence,

    p(\mathbf{O} \mid \mathcal{W}, \mathcal{M}) = \int p(\mathbf{O} \mid \theta, \mathcal{W}, \mathcal{M})\, p(\theta \mid \mathcal{M})\, d\theta    (9)

where θ is a parameterization of M, O = {o_1, ..., o_T} is a training data set of T frames, and W is the reference transcription. For standard HMMs and GVP-HMMs, it is computationally intractable to directly compute the evidence in equation (9). To handle this problem, an efficient approximation using the BIC style first order asymptotic expansion [26] of a lower bound [18, 27, 28, 29] of the evidence integral can be used. The optimal model complexity is determined by

    \hat{\mathcal{M}} = \arg\max_{\mathcal{M}} \big\{ Q^{(\mathcal{M})}(\hat{\theta}, \theta) - \rho\, k \log T \big\}    (10)

where the ML auxiliary functions associated with the Gaussian mean and variance trajectory parameters given in equation (6) are evaluated at the optimal model parameters θ̂ using the statistics given in equations (7) and (8), k denotes the number of free parameters in M, and ρ is a tunable penalty term [30]. When determining the optimal order for a particular polynomial associated with the ith dimension of the mth Gaussian component in the system, µ_i^{(m)}(·) for example, the above statistics in equation (8) are accumulated for the highest order P_max being considered. The corresponding statistics for any other order 0 ≤ P_i^{(µ,m)} < P_max can be derived by taking the associated submatrices or subvectors from the full matrix statistics accumulated for P_max. Using these statistics and the ML

solutions in equation (7), the ML auxiliary function associated with µ_i^{(m)}(·) in equation (6) can be efficiently evaluated at the optimum for each candidate polynomial degree. The number of free parameters (polynomial coefficients) in the BIC metric of equation (10) is k = P_i^{(µ,m)} N + 1. The number of frame samples for the current Gaussian is computed as the component level occupancy count T^{(m)} = \sum_t \gamma_m(t). The same approach can also be used to determine the optimal degree of the Gaussian variance polynomials, by evaluating the respective auxiliary functions with their respective sufficient statistics to compute the metric in equation (10).

3. DNN Bottleneck Features

Bottleneck features are normally generated from a narrow hidden layer of an MLP that is trained to predict phonemes or phoneme states. Compared with the sizes of the other layers, this hidden layer has a significantly smaller number of hidden units [11]. This narrow layer introduces a constriction in the network while retaining the information useful for classification in the resulting low dimensional features, extracted via a non-linear and discriminative transformation. In this paper the bottleneck features used for tandem HMM systems are extracted from deep neural network (DNN) multi-layer perceptrons (MLPs) [1, 2, 3, 4]. DNNs are MLPs with many hidden layers. The inputs are formed from a stacked set of adjacent frames of the acoustic features for each time instance. Within each hidden layer, the input to each unit is computed as a linearly weighted sum of the outputs from the previous layer. Each hidden node transforms its input with a sigmoid activation to achieve non-linearity. A softmax output activation function is used at the output layer to compute the posterior probability of the phoneme or phoneme state targets. In all the experiments of this paper, a pretrained DNN consisting of six hidden layers is used. The first five layers have 512 hidden nodes each, while the last bottleneck layer has 26 units. The network is trained on inputs formed by splicing 11 frames of 39 dimensional MFCC features together.
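As a concrete picture of the feature extractor just described, the following NumPy sketch runs a forward pass through such a bottleneck network and reads the bottleneck features off the narrow last hidden layer. The weights here are random stand-ins and the number of output targets (120) is purely illustrative; a real system would use the trained network.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward_bottleneck(x, weights, biases):
    """Forward pass through a bottleneck MLP: sigmoid hidden layers, with
    the activations of the last (narrow) hidden layer returned as the
    bottleneck features; a final softmax layer gives target posteriors.
    weights/biases: lists covering the hidden layers then the output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):   # hidden layers
        h = sigmoid(h @ W + b)
    bottleneck = h                                 # narrow last hidden layer
    logits = h @ weights[-1] + biases[-1]
    post = np.exp(logits - logits.max())           # stable softmax
    return bottleneck, post / post.sum()

# layer sizes matching the setup described above: 11 spliced frames of
# 39-dim MFCCs, five 512-unit layers, a 26-unit bottleneck, and an
# illustrative 120 output targets
sizes = [11 * 39, 512, 512, 512, 512, 512, 26, 120]
rng = np.random.default_rng(1)
Ws = [rng.normal(scale=0.1, size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(b) for b in sizes[1:]]
bn, post = forward_bottleneck(rng.normal(size=11 * 39), Ws, bs)
```

The softmax output is only needed during training; at feature extraction time the network is effectively cut at the bottleneck layer.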
The layer-by-layer RBM based pre-training implemented in the Kaldi toolkit [31] was used. Following DNN training, 26 dimensional bottleneck features are extracted and decorrelated using PCA. For the baseline tandem HMM systems, they are appended to the standard MFCC features to form the tandem feature vector. Prior to recognition, tandem GMM-HMMs are then trained on the new concatenated tandem features. For GVP-HMM systems, these features are used as the input factor vectors at each frame to estimate the continuous trajectories of the Gaussian mean and variance parameters. An extended version of the HTK toolkit [32] was used to train the various GVP-HMM systems.

4. Using DNN Bottleneck Features in GVP-HMMs and Tandem GVP-HMMs

In order to adjust the trade-off between modelling resolution, robustness and computational efficiency, a range of GVP-HMM configurations may be considered to incorporate DNN bottleneck features. Descriptions of these GVP-HMM variant system configurations and the numbers of parameters used for the standard Aurora 2 task are shown in table 1. 39 dimensional standard MFCC features including the first and second order differentials were used. All the baseline GVP-HMMs with no complexity control used 2nd degree polynomials for all parameter trajectories, as suggested in [21, 22]. The penalty term in the complexity control metric of equation (10) was fixed as ρ = 1 in all experiments. For all parameter polynomials the range of candidate degrees considered is [0, 5].

Baseline HMM and tandem HMM systems: In the first 3 lines of table 1, the numbers of parameters for the multi-style [33] trained baseline HMM system and two tandem HMM systems are shown. The second tandem HMM system, tandem2, used 128 Gaussians per state and thus has a model complexity comparable to the other complexity controlled GVP-HMM systems in the table. The Gaussian parameters of these baseline HMM and tandem HMM systems were trained on standard MFCC or tandem features, while no parameter trajectory modelling was used.
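The tandem feature preparation used by these baseline systems (PCA decorrelation of the bottleneck features, then concatenation with the MFCCs, as described in section 3) might be sketched as follows; this is an illustrative NumPy version, with all names invented here.

```python
import numpy as np

def pca_transform(X):
    """Estimate a PCA rotation on training bottleneck features X (T x d).
    Returns the mean and the eigenvector matrix of the sample covariance;
    projecting mean-removed features onto the eigenvectors decorrelates
    them (no dimensionality reduction is applied here)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / (len(X) - 1)
    _, eigvecs = np.linalg.eigh(cov)       # symmetric eigendecomposition
    return mu, eigvecs

def tandem_features(mfcc, bn, mu, eigvecs):
    """Decorrelate the bottleneck features with the PCA estimated above
    and append them to the MFCCs to form the tandem feature vector."""
    bn_pca = (bn - mu) @ eigvecs
    return np.hstack([mfcc, bn_pca])

# toy shapes matching the setup here: 39-dim MFCCs, 26-dim bottleneck features
rng = np.random.default_rng(2)
bn_train = rng.normal(size=(500, 26)) @ rng.normal(size=(26, 26))  # correlated
mu, R = pca_transform(bn_train)
tandem = tandem_features(rng.normal(size=(100, 39)), bn_train[:100], mu, R)
```

Decorrelation matters here because the GMM-HMMs use diagonal covariances, which model rotated, decorrelated features much better than raw correlated ones.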
Table 1: Description of the baseline multi-style HMM, tandem HMM, GVP-HMM and tandem GVP-HMM systems on Aurora 2, in terms of model configuration (mean/variance polynomial trajectories, complexity control) and number of parameters. Following the settings of previous works [21, 17, 18], all systems used 6 Gaussians per state except the tandem2 baseline system, which used 128 Gaussians per state.

GVP-HMM systems: In the second section of table 1, a total of four GVP-HMM modelling configurations, denoted mean and mv respectively, are shown from the 4th to 7th lines. These use trajectory modelling of the Gaussian component means with the DNN bottleneck features as the factor input in equations (1) to (3), with the further options of using variance trajectories conditioned on the SNR variable, and with or without applying the model selection technique presented in section 2.3. As expected, using the standard GVP-HMMs with no complexity control on the 26 dimensional bottleneck features results in a massive increase in model parameters. Determining the optimal degrees of the parameter trajectory polynomials using the model selection method of section 2.3 significantly reduced the model complexity, by over 80%.

Tandem GVP-HMM systems: In the last section of table 1, four comparable tandem variants of the above four GVP-HMM systems are shown. In these tandem GVP-HMM systems, the DNN bottleneck features are not only used as the input factor vectors to estimate the continuous trajectories of the Gaussian parameters in the acoustic feature subspace, but also used as normal features to train the standard mean and variance parameters in the bottleneck feature subspace. For example, the final mean vector of component m at time instance t is thus computed as

    \boldsymbol{\mu}_t^{(m)} = \big[\boldsymbol{\mu}_{\mathrm{GVP}}^{(m)}(\mathbf{v}_t^{\mathrm{BN}}),\ \boldsymbol{\mu}_{\mathrm{BN}}^{(m)}\big]
    (11)

where \boldsymbol{\mu}_{\mathrm{GVP}}^{(m)}(\mathbf{v}_t^{\mathrm{BN}}) is the mean subvector trajectory taking a Vandermonde vector input \mathbf{v}_t^{\mathrm{BN}} constructed using the 26 dimensional DNN bottleneck features, as described in section 2.1.

\boldsymbol{\mu}_{\mathrm{BN}}^{(m)} is the remaining static mean subvector estimated using the bottleneck features. These tandem GVP-HMMs are expected to draw strength from both the conventional tandem and GVP-HMM based approaches, to fully exploit the complementary information in the DNN bottleneck features.

5. Experiments and Results

In this section, the performance of various GVP-HMM systems using DNN bottleneck features is evaluated on the Aurora 2 task. The Aurora 2 database contains different noisy conditions. During the experiments, 40 utterances from each of four different SNR conditions (-5dB, 5dB, 15dB, 25dB) of the noise environments of subway, babble, car and exhibition were used to train all the systems, while 1000 utterances selected from each noise environment at 0dB, 5dB, 10dB, 15dB and 20dB SNR respectively were used for word error rate (WER) evaluation.

Table 2: WER performance of GVP-HMM systems using DNN bottleneck features on Aurora 2 test set A of four noise types. All systems use the same naming conventions as in table 1.

The WER performance of the multi-style HMM baseline, mcond, and the various GVP-HMM systems shown from the 4th to 7th lines of table 1 are given in table 2. The following trends can be found in the table. First, the use of DNN bottleneck features gave significant WER reductions for all GVP-HMM modelling configurations across the various noise types over the mcond HMM baseline. Second, as expected, using the model selection technique of section 2.3, in addition to the model size compression shown previously in table 1, an average WER reduction of 2.41% absolute (29% relative) was obtained over the various standard GVP-HMM systems with no complexity control. Third, combined with model complexity control, the use of variance trajectory polynomials gave further improvements over using mean trajectory modelling only.
Using the best GVP-HMM systems highlighted in bold in table 2, an average WER reduction of 3.64% absolute (40% relative) over the multi-style MFCC feature trained baseline mcond HMM system was obtained. However, all of these four GVP-HMM systems were outperformed by the baseline tandem HMM system shown in the 1st line of each noise specific section in table 3.

Table 3: WER performance of tandem GVP-HMM systems using DNN bottleneck features on Aurora 2 test set A. All systems use the same naming conventions as in table 1.

The WER performance of the two baseline multi-style trained tandem HMM systems, tandem and tandem2, and of the various tandem GVP-HMM systems shown from the 8th to 11th lines in the bottom section of table 1, are given in table 3. Consistent with the trends found in table 2, every complexity controlled tandem GVP-HMM system in table 3 outperformed its comparable baseline using no complexity control. The use of variance trajectory modelling also gave further small reductions in WER. Using the best complexity controlled tandem GVP-HMM mv system highlighted in bold in table 3, average WER reductions of 4.38% absolute (48% relative) and 0.4% absolute (8% relative) were obtained over the multi-style baseline mcond system of table 2 and the baseline tandem HMM system of table 3 respectively. Similar consistent improvements were also obtained over the more complex baseline tandem2 system with a comparable number of parameters, as shown in table 1, and over a third baseline tandem HMM system using bottleneck features extracted from a DNN trained on concatenated MFCC and SNR features.

6. Conclusion

An alternative approach to incorporating bottleneck features into a tandem system using generalized variable parameter HMMs is investigated in this paper.
The complementary information characterizing the hidden factors influencing the surface acoustic realization that is implicitly learnt by bottleneck features is exploited to improve controllability and robustness. The proposed technique significantly reduced the error rate, by 48% and 8% relative over the baseline multi-style HMM and tandem HMM systems respectively on Aurora 2. Future research will focus on using bottleneck features to model the trajectories of more efficient feature space transforms [17].

7. References

[1] F. Seide, G. Li, and D. Yu (2011). "Conversational speech transcription using context-dependent deep neural networks," in Proc. ISCA INTERSPEECH 2011, Florence, Italy.
[2] D. Yu and M. L. Seltzer (2011). "Improved bottleneck features using pretrained deep neural networks," in Proc. ISCA INTERSPEECH 2011, Florence, Italy.
[3] G. Dahl, D. Yu, L. Deng, and A. Acero (2012). "Context-dependent pre-trained deep neural networks for large-vocabulary speech recognition," IEEE Transactions on Audio, Speech, and Language Processing, vol. 20, no. 1, pp. 30-42, Jan. 2012.
[4] G. E. Hinton, L. Deng, D. Yu, G. E. Dahl, A.-R. Mohamed, N. Jaitly, A. Senior, V. Vanhoucke, P. Nguyen, T. N. Sainath, and B. Kingsbury (2012). "Deep neural networks for acoustic modeling in speech recognition," IEEE Signal Processing Magazine, Nov. 2012.
[5] M. Seltzer, D. Yu and Y. Wang (2013). "An investigation of deep neural networks for noise robust speech recognition," in Proc. IEEE ICASSP 2013, Vancouver, BC, Canada.
[6] S. Thomas, M. L. Seltzer, K. Church, and H. Hermansky (2013). "Deep neural network features and semi-supervised training for low resource speech recognition," in Proc. IEEE ICASSP 2013, Vancouver, BC, Canada.
[7] P. Bell, P. Swietojanski and S. Renals (2013). "Multi-level adaptive networks in tandem and hybrid ASR systems," in Proc. IEEE ICASSP 2013, Vancouver, BC, Canada.
[8] K. M. Knill, M. J. F. Gales, S. P. Rath, P. C. Woodland, C. Zhang and S.-X. Zhang (2013). "Investigation of multilingual deep neural networks for spoken term detection," in Proc. IEEE ASRU 2013, Olomouc, Czech Republic.
[9] H. A. Bourlard and N. Morgan (1993). Connectionist Speech Recognition: A Hybrid Approach, Kluwer Academic Publishers, Norwell, MA, USA.
[10] H. Hermansky, D. Ellis and S. Sharma (2000). "Tandem connectionist feature extraction for conventional HMM systems," in Proc. IEEE ICASSP 2000, vol. 3, Istanbul, Turkey.
[11] F. Grezl, M. Karafiat, S. Kontar and J. Cernocky (2007). "Probabilistic and bottle-neck features for LVCSR of meetings," in Proc. IEEE ICASSP 2007, vol. 4, Honolulu, Hawaii, USA.
[12] J. Zheng, O. Cetin, M. Y. Hwang, X. Lei, A. Stolcke and N. Morgan (2007). "Combining discriminative feature, transform, and model training for large vocabulary speech recognition," in Proc. IEEE ICASSP 2007, vol. 4, Honolulu, Hawaii, USA.
[13] T. Ng, B. Zhang, S. Matsoukas and L. Nguyen (2011). "Region dependent transform on MLP features for speech recognition," in Proc. ISCA INTERSPEECH 2011, Florence, Italy.
[14] N. Cheng, X. Liu and L. Wang (2011). "Generalized variable parameter HMMs for noise robust speech recognition," in Proc. ISCA INTERSPEECH 2011, Florence, Italy.
[15] N. Cheng, X. Liu and L. Wang (2011). "A flexible framework for HMM based noise robust speech recognition using generalized parametric space polynomial regression," Science China Information Sciences, vol. 54, 2011.
[16] Y. Li, X. Liu and L. Wang (2012). "Structured modeling based on generalized variable parameter HMMs and speaker adaptation," in Proc. IEEE ISCSLP 2012, Hong Kong, China.
[17] Y. Li, X. Liu and L. Wang (2013). "Feature space generalized variable parameter HMMs for noise robust recognition," in Proc. ISCA INTERSPEECH 2013, Lyon, France.
[18] R. Su, X. Liu and L. Wang (2013). "Automatic model complexity control for generalized variable parameter HMMs," in Proc. IEEE ASRU 2013, Olomouc, Czech Republic.
[19] A. Björck and V. Pereyra (1970). "Solution of Vandermonde systems of equations," Mathematics of Computation (American Mathematical Society), 24(112).
[20] K. Fujinaga, M. Nakai, H. Shimodaira and S. Sagayama (2001). "Multiple-regression hidden Markov model," in Proc. IEEE ICASSP 2001, vol. 1, Salt Lake City, Utah, USA.
[21] X. Cui and Y. Gong (2007). "A study of variable-parameter Gaussian mixture hidden Markov modeling for noisy speech recognition," IEEE Transactions on Audio, Speech and Language Processing, 15(4), 2007.
[22] D. Yu, L. Deng, Y. Gong and A. Acero (2009). "A novel framework and training algorithm for variable-parameter hidden Markov models," IEEE Transactions on Audio, Speech and Language Processing, 17(7), 2009.
[23] Z. Ling, K. Richmond and J. Yamagishi (2013). "Articulatory control of HMM-based parametric speech synthesis using feature space switched multiple regression," IEEE Transactions on Audio, Speech and Language Processing, vol. 21, no. 1, Jan. 2013.
[24] A. P. Dempster, N. M. Laird and D. B. Rubin (1977). "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, 39(1):1-39, 1977.
[25] C. Runge (1901). "Über empirische Funktionen und die Interpolation zwischen äquidistanten Ordinaten," Zeitschrift für Mathematik und Physik, 46:224-243.
[26] G. Schwarz (1978). "Estimating the dimension of a model," The Annals of Statistics, vol. 6, no. 2, 1978.
[27] X. Liu and M. J. F. Gales (2003). "Automatic model complexity control using marginalized discriminative growth functions," in Proc. IEEE ASRU 2003, pp. 37-42, St. Thomas, U.S. Virgin Islands.
[28] X. Liu and M. J. F. Gales (2004). "Model complexity control and compression using discriminative growth functions," in Proc. IEEE ICASSP 2004, vol. 1, Montreal, Quebec, Canada.
[29] X. Liu and M. J. F. Gales (2007). "Automatic model complexity control using marginalized discriminative growth functions," IEEE Transactions on Audio, Speech and Language Processing, vol. 15, no. 4, May 2007.
[30] W. Chou and W. Reichl (1999). "Decision tree state tying based on penalized Bayesian information criterion," in Proc. IEEE ICASSP 1999, vol. 1, Phoenix, Arizona, USA.
[31] The Kaldi speech recognition toolkit.
[32] S. Young et al. (2009). The HTK Book, Version 3.4.1.
[33] R. Lippmann, E. Martin and D. Paul (1987). "Multi-style training for robust isolated-word speech recognition," in Proc. IEEE ICASSP 1987, Dallas, Texas, USA.


More information

Modeling Waveform Shapes with Random Effects Segmental Hidden Markov Models

Modeling Waveform Shapes with Random Effects Segmental Hidden Markov Models Modelng Waveform Shapes wth Random Effects Segmental Hdden Markov Models Seyoung Km, Padhrac Smyth Department of Computer Scence Unversty of Calforna, Irvne CA 9697-345 {sykm,smyth}@cs.uc.edu Abstract

More information

The Research of Support Vector Machine in Agricultural Data Classification

The Research of Support Vector Machine in Agricultural Data Classification The Research of Support Vector Machne n Agrcultural Data Classfcaton Le Sh, Qguo Duan, Xnmng Ma, Me Weng College of Informaton and Management Scence, HeNan Agrcultural Unversty, Zhengzhou 45000 Chna Zhengzhou

More information

y and the total sum of

y and the total sum of Lnear regresson Testng for non-lnearty In analytcal chemstry, lnear regresson s commonly used n the constructon of calbraton functons requred for analytcal technques such as gas chromatography, atomc absorpton

More information

arxiv: v1 [eess.as] 2 Apr 2018

arxiv: v1 [eess.as] 2 Apr 2018 ADVERSARIAL TEACHER-STUDENT LEARNING FOR UNSUPERVISED DOMAIN ADAPTATION Zhong Meng 1,2, Jnyu L 1, Yfan Gong 1, Bng-Hwang (Fred) Juang 2 1 Mcrosoft AI and Research, Redmond, WA, USA 2 Georga Insttute of

More information

Lecture 4: Principal components

Lecture 4: Principal components /3/6 Lecture 4: Prncpal components 3..6 Multvarate lnear regresson MLR s optmal for the estmaton data...but poor for handlng collnear data Covarance matrx s not nvertble (large condton number) Robustness

More information

A Semi-parametric Regression Model to Estimate Variability of NO 2

A Semi-parametric Regression Model to Estimate Variability of NO 2 Envronment and Polluton; Vol. 2, No. 1; 2013 ISSN 1927-0909 E-ISSN 1927-0917 Publshed by Canadan Center of Scence and Educaton A Sem-parametrc Regresson Model to Estmate Varablty of NO 2 Meczysław Szyszkowcz

More information

Mixed Linear System Estimation and Identification

Mixed Linear System Estimation and Identification 48th IEEE Conference on Decson and Control, Shangha, Chna, December 2009 Mxed Lnear System Estmaton and Identfcaton A. Zymns S. Boyd D. Gornevsky Abstract We consder a mxed lnear system model, wth both

More information

Cluster Analysis of Electrical Behavior

Cluster Analysis of Electrical Behavior Journal of Computer and Communcatons, 205, 3, 88-93 Publshed Onlne May 205 n ScRes. http://www.scrp.org/ournal/cc http://dx.do.org/0.4236/cc.205.350 Cluster Analyss of Electrcal Behavor Ln Lu Ln Lu, School

More information

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION

MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION MULTISPECTRAL IMAGES CLASSIFICATION BASED ON KLT AND ATR AUTOMATIC TARGET RECOGNITION Paulo Quntlano 1 & Antono Santa-Rosa 1 Federal Polce Department, Brasla, Brazl. E-mals: quntlano.pqs@dpf.gov.br and

More information

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT

APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT 3. - 5. 5., Brno, Czech Republc, EU APPLICATION OF MULTIVARIATE LOSS FUNCTION FOR ASSESSMENT OF THE QUALITY OF TECHNOLOGICAL PROCESS MANAGEMENT Abstract Josef TOŠENOVSKÝ ) Lenka MONSPORTOVÁ ) Flp TOŠENOVSKÝ

More information

A CLASS OF TRANSFORMED EFFICIENT RATIO ESTIMATORS OF FINITE POPULATION MEAN. Department of Statistics, Islamia College, Peshawar, Pakistan 2

A CLASS OF TRANSFORMED EFFICIENT RATIO ESTIMATORS OF FINITE POPULATION MEAN. Department of Statistics, Islamia College, Peshawar, Pakistan 2 Pa. J. Statst. 5 Vol. 3(4), 353-36 A CLASS OF TRANSFORMED EFFICIENT RATIO ESTIMATORS OF FINITE POPULATION MEAN Sajjad Ahmad Khan, Hameed Al, Sadaf Manzoor and Alamgr Department of Statstcs, Islama College,

More information

Positive Semi-definite Programming Localization in Wireless Sensor Networks

Positive Semi-definite Programming Localization in Wireless Sensor Networks Postve Sem-defnte Programmng Localzaton n Wreless Sensor etworks Shengdong Xe 1,, Jn Wang, Aqun Hu 1, Yunl Gu, Jang Xu, 1 School of Informaton Scence and Engneerng, Southeast Unversty, 10096, anjng Computer

More information

Edge Detection in Noisy Images Using the Support Vector Machines

Edge Detection in Noisy Images Using the Support Vector Machines Edge Detecton n Nosy Images Usng the Support Vector Machnes Hlaro Gómez-Moreno, Saturnno Maldonado-Bascón, Francsco López-Ferreras Sgnal Theory and Communcatons Department. Unversty of Alcalá Crta. Madrd-Barcelona

More information

Some Advanced SPC Tools 1. Cumulative Sum Control (Cusum) Chart For the data shown in Table 9-1, the x chart can be generated.

Some Advanced SPC Tools 1. Cumulative Sum Control (Cusum) Chart For the data shown in Table 9-1, the x chart can be generated. Some Advanced SP Tools 1. umulatve Sum ontrol (usum) hart For the data shown n Table 9-1, the x chart can be generated. However, the shft taken place at sample #21 s not apparent. 92 For ths set samples,

More information

A Bilinear Model for Sparse Coding

A Bilinear Model for Sparse Coding A Blnear Model for Sparse Codng Davd B. Grmes and Rajesh P. N. Rao Department of Computer Scence and Engneerng Unversty of Washngton Seattle, WA 98195-2350, U.S.A. grmes,rao @cs.washngton.edu Abstract

More information

Artificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling

Artificial Intelligence (AI) methods are concerned with. Artificial Intelligence Techniques for Steam Generator Modelling Artfcal Intellgence Technques for Steam Generator Modellng Sarah Wrght and Tshldz Marwala Abstract Ths paper nvestgates the use of dfferent Artfcal Intellgence methods to predct the values of several contnuous

More information

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements

2x x l. Module 3: Element Properties Lecture 4: Lagrange and Serendipity Elements Module 3: Element Propertes Lecture : Lagrange and Serendpty Elements 5 In last lecture note, the nterpolaton functons are derved on the bass of assumed polynomal from Pascal s trangle for the fled varable.

More information

Machine Learning 9. week

Machine Learning 9. week Machne Learnng 9. week Mappng Concept Radal Bass Functons (RBF) RBF Networks 1 Mappng It s probably the best scenaro for the classfcaton of two dataset s to separate them lnearly. As you see n the below

More information

Adjustment methods for differential measurement errors in multimode surveys

Adjustment methods for differential measurement errors in multimode surveys Adjustment methods for dfferental measurement errors n multmode surveys Salah Merad UK Offce for Natonal Statstcs ESSnet MM DCSS, Fnal Meetng Wesbaden, Germany, 4-5 September 2014 Outlne Introducton Stablsng

More information

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics

NAG Fortran Library Chapter Introduction. G10 Smoothing in Statistics Introducton G10 NAG Fortran Lbrary Chapter Introducton G10 Smoothng n Statstcs Contents 1 Scope of the Chapter... 2 2 Background to the Problems... 2 2.1 Smoothng Methods... 2 2.2 Smoothng Splnes and Regresson

More information

Compiler Design. Spring Register Allocation. Sample Exercises and Solutions. Prof. Pedro C. Diniz

Compiler Design. Spring Register Allocation. Sample Exercises and Solutions. Prof. Pedro C. Diniz Compler Desgn Sprng 2014 Regster Allocaton Sample Exercses and Solutons Prof. Pedro C. Dnz USC / Informaton Scences Insttute 4676 Admralty Way, Sute 1001 Marna del Rey, Calforna 90292 pedro@s.edu Regster

More information

Synthesizer 1.0. User s Guide. A Varying Coefficient Meta. nalytic Tool. Z. Krizan Employing Microsoft Excel 2007

Synthesizer 1.0. User s Guide. A Varying Coefficient Meta. nalytic Tool. Z. Krizan Employing Microsoft Excel 2007 Syntheszer 1.0 A Varyng Coeffcent Meta Meta-Analytc nalytc Tool Employng Mcrosoft Excel 007.38.17.5 User s Gude Z. Krzan 009 Table of Contents 1. Introducton and Acknowledgments 3. Operatonal Functons

More information

EXTENDED BIC CRITERION FOR MODEL SELECTION

EXTENDED BIC CRITERION FOR MODEL SELECTION IDIAP RESEARCH REPORT EXTEDED BIC CRITERIO FOR ODEL SELECTIO Itshak Lapdot Andrew orrs IDIAP-RR-0-4 Dalle olle Insttute for Perceptual Artfcal Intellgence P.O.Box 59 artgny Valas Swtzerland phone +4 7

More information

The Codesign Challenge

The Codesign Challenge ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.

More information

An Optimal Algorithm for Prufer Codes *

An Optimal Algorithm for Prufer Codes * J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,

More information

Fusion Performance Model for Distributed Tracking and Classification

Fusion Performance Model for Distributed Tracking and Classification Fuson Performance Model for Dstrbuted rackng and Classfcaton K.C. Chang and Yng Song Dept. of SEOR, School of I&E George Mason Unversty FAIRFAX, VA kchang@gmu.edu Martn Lggns Verdan Systems Dvson, Inc.

More information

Corner-Based Image Alignment using Pyramid Structure with Gradient Vector Similarity

Corner-Based Image Alignment using Pyramid Structure with Gradient Vector Similarity Journal of Sgnal and Informaton Processng, 013, 4, 114-119 do:10.436/jsp.013.43b00 Publshed Onlne August 013 (http://www.scrp.org/journal/jsp) Corner-Based Image Algnment usng Pyramd Structure wth Gradent

More information

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data

A Fast Content-Based Multimedia Retrieval Technique Using Compressed Data A Fast Content-Based Multmeda Retreval Technque Usng Compressed Data Borko Furht and Pornvt Saksobhavvat NSF Multmeda Laboratory Florda Atlantc Unversty, Boca Raton, Florda 3343 ABSTRACT In ths paper,

More information

Learning a Class-Specific Dictionary for Facial Expression Recognition

Learning a Class-Specific Dictionary for Facial Expression Recognition BULGARIAN ACADEMY OF SCIENCES CYBERNETICS AND INFORMATION TECHNOLOGIES Volume 16, No 4 Sofa 016 Prnt ISSN: 1311-970; Onlne ISSN: 1314-4081 DOI: 10.1515/cat-016-0067 Learnng a Class-Specfc Dctonary for

More information

Available online at ScienceDirect. Procedia Environmental Sciences 26 (2015 )

Available online at   ScienceDirect. Procedia Environmental Sciences 26 (2015 ) Avalable onlne at www.scencedrect.com ScenceDrect Proceda Envronmental Scences 26 (2015 ) 109 114 Spatal Statstcs 2015: Emergng Patterns Calbratng a Geographcally Weghted Regresson Model wth Parameter-Specfc

More information

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning

Outline. Type of Machine Learning. Examples of Application. Unsupervised Learning Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton

More information

Fuzzy Filtering Algorithms for Image Processing: Performance Evaluation of Various Approaches

Fuzzy Filtering Algorithms for Image Processing: Performance Evaluation of Various Approaches Proceedngs of the Internatonal Conference on Cognton and Recognton Fuzzy Flterng Algorthms for Image Processng: Performance Evaluaton of Varous Approaches Rajoo Pandey and Umesh Ghanekar Department of

More information

An Image Fusion Approach Based on Segmentation Region

An Image Fusion Approach Based on Segmentation Region Rong Wang, L-Qun Gao, Shu Yang, Yu-Hua Cha, and Yan-Chun Lu An Image Fuson Approach Based On Segmentaton Regon An Image Fuson Approach Based on Segmentaton Regon Rong Wang, L-Qun Gao, Shu Yang 3, Yu-Hua

More information

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation

An Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation 17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed

More information

Unsupervised Learning and Clustering

Unsupervised Learning and Clustering Unsupervsed Learnng and Clusterng Why consder unlabeled samples?. Collectng and labelng large set of samples s costly Gettng recorded speech s free, labelng s tme consumng 2. Classfer could be desgned

More information

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach

Skew Angle Estimation and Correction of Hand Written, Textual and Large areas of Non-Textual Document Images: A Novel Approach Angle Estmaton and Correcton of Hand Wrtten, Textual and Large areas of Non-Textual Document Images: A Novel Approach D.R.Ramesh Babu Pyush M Kumat Mahesh D Dhannawat PES Insttute of Technology Research

More information

Reducing Frame Rate for Object Tracking

Reducing Frame Rate for Object Tracking Reducng Frame Rate for Object Trackng Pavel Korshunov 1 and We Tsang Oo 2 1 Natonal Unversty of Sngapore, Sngapore 11977, pavelkor@comp.nus.edu.sg 2 Natonal Unversty of Sngapore, Sngapore 11977, oowt@comp.nus.edu.sg

More information

Unsupervised Learning

Unsupervised Learning Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and

More information

Detection of an Object by using Principal Component Analysis

Detection of an Object by using Principal Component Analysis Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,

More information

A Binarization Algorithm specialized on Document Images and Photos

A Binarization Algorithm specialized on Document Images and Photos A Bnarzaton Algorthm specalzed on Document mages and Photos Ergna Kavalleratou Dept. of nformaton and Communcaton Systems Engneerng Unversty of the Aegean kavalleratou@aegean.gr Abstract n ths paper, a

More information

Optimizing Document Scoring for Query Retrieval

Optimizing Document Scoring for Query Retrieval Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng

More information

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS

NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana

More information

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching

A Fast Visual Tracking Algorithm Based on Circle Pixels Matching A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng

More information

X- Chart Using ANOM Approach

X- Chart Using ANOM Approach ISSN 1684-8403 Journal of Statstcs Volume 17, 010, pp. 3-3 Abstract X- Chart Usng ANOM Approach Gullapall Chakravarth 1 and Chaluvad Venkateswara Rao Control lmts for ndvdual measurements (X) chart are

More information

Classification Based Mode Decisions for Video over Networks

Classification Based Mode Decisions for Video over Networks Classfcaton Based Mode Decsons for Vdeo over Networks Deepak S. Turaga and Tsuhan Chen Advanced Multmeda Processng Lab Tranng data for Inter-Intra Decson Inter-Intra Decson Regons pdf 6 5 6 5 Energy 4

More information

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance

Tsinghua University at TAC 2009: Summarizing Multi-documents by Information Distance Tsnghua Unversty at TAC 2009: Summarzng Mult-documents by Informaton Dstance Chong Long, Mnle Huang, Xaoyan Zhu State Key Laboratory of Intellgent Technology and Systems, Tsnghua Natonal Laboratory for

More information

A Statistical Model Selection Strategy Applied to Neural Networks

A Statistical Model Selection Strategy Applied to Neural Networks A Statstcal Model Selecton Strategy Appled to Neural Networks Joaquín Pzarro Elsa Guerrero Pedro L. Galndo joaqun.pzarro@uca.es elsa.guerrero@uca.es pedro.galndo@uca.es Dpto Lenguajes y Sstemas Informátcos

More information

Programming in Fortran 90 : 2017/2018

Programming in Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Exercse 1 : Evaluaton of functon dependng on nput Wrte a program who evaluate the functon f (x,y) for any two user specfed values

More information

Parameter estimation for incomplete bivariate longitudinal data in clinical trials

Parameter estimation for incomplete bivariate longitudinal data in clinical trials Parameter estmaton for ncomplete bvarate longtudnal data n clncal trals Naum M. Khutoryansky Novo Nordsk Pharmaceutcals, Inc., Prnceton, NJ ABSTRACT Bvarate models are useful when analyzng longtudnal data

More information

Related-Mode Attacks on CTR Encryption Mode

Related-Mode Attacks on CTR Encryption Mode Internatonal Journal of Network Securty, Vol.4, No.3, PP.282 287, May 2007 282 Related-Mode Attacks on CTR Encrypton Mode Dayn Wang, Dongda Ln, and Wenlng Wu (Correspondng author: Dayn Wang) Key Laboratory

More information

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points;

Subspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points; Subspace clusterng Clusterng Fundamental to all clusterng technques s the choce of dstance measure between data ponts; D q ( ) ( ) 2 x x = x x, j k = 1 k jk Squared Eucldean dstance Assumpton: All features

More information

A New Approach For the Ranking of Fuzzy Sets With Different Heights

A New Approach For the Ranking of Fuzzy Sets With Different Heights New pproach For the ankng of Fuzzy Sets Wth Dfferent Heghts Pushpnder Sngh School of Mathematcs Computer pplcatons Thapar Unversty, Patala-7 00 Inda pushpndersnl@gmalcom STCT ankng of fuzzy sets plays

More information

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints

Sum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan

More information

High Dimensional Data Clustering

High Dimensional Data Clustering Hgh Dmensonal Data Clusterng Charles Bouveyron 1,2, Stéphane Grard 1, and Cordela Schmd 2 1 LMC-IMAG, BP 53, Unversté Grenoble 1, 38041 Grenoble Cede 9, France charles.bouveyron@mag.fr, stephane.grard@mag.fr

More information

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers

Content Based Image Retrieval Using 2-D Discrete Wavelet with Texture Feature with Different Classifiers IOSR Journal of Electroncs and Communcaton Engneerng (IOSR-JECE) e-issn: 78-834,p- ISSN: 78-8735.Volume 9, Issue, Ver. IV (Mar - Apr. 04), PP 0-07 Content Based Image Retreval Usng -D Dscrete Wavelet wth

More information

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1

Outline. Discriminative classifiers for image recognition. Where in the World? A nearest neighbor recognition example 4/14/2011. CS 376 Lecture 22 1 4/14/011 Outlne Dscrmnatve classfers for mage recognton Wednesday, Aprl 13 Krsten Grauman UT-Austn Last tme: wndow-based generc obect detecton basc ppelne face detecton wth boostng as case study Today:

More information

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status

Implementation Naïve Bayes Algorithm for Student Classification Based on Graduation Status Internatonal Journal of Appled Busness and Informaton Systems ISSN: 2597-8993 Vol 1, No 2, September 2017, pp. 6-12 6 Implementaton Naïve Bayes Algorthm for Student Classfcaton Based on Graduaton Status

More information

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices

An Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal

More information

CS246: Mining Massive Datasets Jure Leskovec, Stanford University

CS246: Mining Massive Datasets Jure Leskovec, Stanford University CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd

More information

Determining the Optimal Bandwidth Based on Multi-criterion Fusion

Determining the Optimal Bandwidth Based on Multi-criterion Fusion Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn

More information

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification

12/2/2009. Announcements. Parametric / Non-parametric. Case-Based Reasoning. Nearest-Neighbor on Images. Nearest-Neighbor Classification Introducton to Artfcal Intellgence V22.0472-001 Fall 2009 Lecture 24: Nearest-Neghbors & Support Vector Machnes Rob Fergus Dept of Computer Scence, Courant Insttute, NYU Sldes from Danel Yeung, John DeNero

More information

UB at GeoCLEF Department of Geography Abstract

UB at GeoCLEF Department of Geography   Abstract UB at GeoCLEF 2006 Mguel E. Ruz (1), Stuart Shapro (2), June Abbas (1), Slva B. Southwck (1) and Davd Mark (3) State Unversty of New York at Buffalo (1) Department of Lbrary and Informaton Studes (2) Department

More information

A Multivariate Analysis of Static Code Attributes for Defect Prediction

A Multivariate Analysis of Static Code Attributes for Defect Prediction Research Paper) A Multvarate Analyss of Statc Code Attrbutes for Defect Predcton Burak Turhan, Ayşe Bener Department of Computer Engneerng, Bogazc Unversty 3434, Bebek, Istanbul, Turkey {turhanb, bener}@boun.edu.tr

More information

Comparison Study of Textural Descriptors for Training Neural Network Classifiers

Comparison Study of Textural Descriptors for Training Neural Network Classifiers Comparson Study of Textural Descrptors for Tranng Neural Network Classfers G.D. MAGOULAS (1) S.A. KARKANIS (1) D.A. KARRAS () and M.N. VRAHATIS (3) (1) Department of Informatcs Unversty of Athens GR-157.84

More information

Empirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap

Empirical Distributions of Parameter Estimates. in Binary Logistic Regression Using Bootstrap Int. Journal of Math. Analyss, Vol. 8, 4, no. 5, 7-7 HIKARI Ltd, www.m-hkar.com http://dx.do.org/.988/jma.4.494 Emprcal Dstrbutons of Parameter Estmates n Bnary Logstc Regresson Usng Bootstrap Anwar Ftranto*

More information

Face Recognition University at Buffalo CSE666 Lecture Slides Resources:

Face Recognition University at Buffalo CSE666 Lecture Slides Resources: Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural

More information

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters

Proper Choice of Data Used for the Estimation of Datum Transformation Parameters Proper Choce of Data Used for the Estmaton of Datum Transformaton Parameters Hakan S. KUTOGLU, Turkey Key words: Coordnate systems; transformaton; estmaton, relablty. SUMMARY Advances n technologes and

More information

Classification / Regression Support Vector Machines

Classification / Regression Support Vector Machines Classfcaton / Regresson Support Vector Machnes Jeff Howbert Introducton to Machne Learnng Wnter 04 Topcs SVM classfers for lnearly separable classes SVM classfers for non-lnearly separable classes SVM

More information

Face Recognition Based on SVM and 2DPCA

Face Recognition Based on SVM and 2DPCA Vol. 4, o. 3, September, 2011 Face Recognton Based on SVM and 2DPCA Tha Hoang Le, Len Bu Faculty of Informaton Technology, HCMC Unversty of Scence Faculty of Informaton Scences and Engneerng, Unversty

More information

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION

BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,

More information

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification

Fast Sparse Gaussian Processes Learning for Man-Made Structure Classification Fast Sparse Gaussan Processes Learnng for Man-Made Structure Classfcaton Hang Zhou Insttute for Vson Systems Engneerng, Dept Elec. & Comp. Syst. Eng. PO Box 35, Monash Unversty, Clayton, VIC 3800, Australa

More information

Wishing you all a Total Quality New Year!

Wishing you all a Total Quality New Year! Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma

More information

Collaboratively Regularized Nearest Points for Set Based Recognition

Collaboratively Regularized Nearest Points for Set Based Recognition Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,

More information

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z.

TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z. TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS Muradalyev AZ Azerbajan Scentfc-Research and Desgn-Prospectng Insttute of Energetc AZ1012, Ave HZardab-94 E-mal:aydn_murad@yahoocom Importance of

More information

Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Effros.

Fitting & Matching. Lecture 4 Prof. Bregler. Slides from: S. Lazebnik, S. Seitz, M. Pollefeys, A. Effros. Fttng & Matchng Lecture 4 Prof. Bregler Sldes from: S. Lazebnk, S. Setz, M. Pollefeys, A. Effros. How do we buld panorama? We need to match (algn) mages Matchng wth Features Detect feature ponts n both

More information

Pruning Training Corpus to Speedup Text Classification 1

Pruning Training Corpus to Speedup Text Classification 1 Prunng Tranng Corpus to Speedup Text Classfcaton Jhong Guan and Shugeng Zhou School of Computer Scence, Wuhan Unversty, Wuhan, 430079, Chna hguan@wtusm.edu.cn State Key Lab of Software Engneerng, Wuhan

More information