A Binary Neural Decision Table Classifier
Victoria J. Hodge, Simon O'Keefe, Jim Austin
vicky@cs.york.ac.uk, sok@cs.york.ac.uk, austin@cs.york.ac.uk
Advanced Computer Architecture Group, Department of Computer Science, University of York, York, YO10 5DD, UK

Abstract

In this paper, we introduce a neural network-based decision table algorithm. We focus on the implementation details of the decision table algorithm when it is constructed using the neural network. Decision tables are simple supervised classifiers which, Kohavi demonstrated, can outperform state-of-the-art classifiers such as C4.5. We couple this power with the efficiency and flexibility of a binary associative-memory neural network. We demonstrate how the binary associative-memory neural network can form the decision table index to map between attribute values and data records. We also show how two attribute selection algorithms, which may be used to pre-select the attributes for the decision table, can easily be implemented within the binary associative-memory neural framework. The first attribute selector uses mutual information between attributes and classes to select the attributes that classify best. The second attribute selector uses a probabilistic approach to evaluate randomly selected attribute subsets.

Introduction

Supervised classifier algorithms aim to predict the class of an unseen data item. They induce a hypothesis using the training data to map inputs onto classified outputs (decisions). This hypothesis should then correctly classify previously unseen data items. There is a wide variety of classifiers including decision trees, neural networks, Bayesian classifiers, Support Vector Machines and k-nearest neighbour. We have previously developed a k-NN classifier [HA04] using an associative memory neural network called the Advanced Uncertain Reasoning Architecture (AURA) [A95].
In this paper, we extend the approach to encompass a decision table supervised classifier, coupling the classification power of the decision table with the speed and storage efficiency of an associative memory neural network. The decision table has two components: a schema and a body. The schema is the set of attributes pre-selected to represent the data and is usually a subset of the data's total attributes. There are various approaches for attribute selection; we discuss two later in this paper. The body is essentially a table of labelled data items where the attributes specified by the schema form the rows and the decisions (classifications) form the columns. Each column is mutually exclusive and represents an equivalence set of records as defined by the attributes of the schema. Kohavi [K95] uses a Decision Table Majority (DTM) for classification whereby, if an unseen item exactly matches a stored item in the body, the decision table assigns the stored item's decision to the unseen item. However, if there is no exact match then the decision table assigns the majority class across all items to the unseen item. Our decision table approach implements both DTM and proximity-based matching as implemented in our k-NN classifier, whereby if there is no exact match then the decision table assigns the class of the nearest stored item to the unseen item.

RAM-based Neural Networks

The AURA C++ library provides a range of classes and methods for rapid partial matching of large data sets [A95]. In this paper we define partial matching as the retrieval of those stored records that match some or all of the input record. In our AURA decision table, we use best partial matching to retrieve the records that are the top matches. AURA belongs to a class of neural networks called Random Access Memory (RAM-based) networks. RAM-based networks were first developed by Bledsoe & Browning [BB59] and Aleksander & Albrow [AA68] for pattern recognition and led to the WISARD pattern recognition machine [ATB84]. See also [A98] for a detailed compilation of RAM methods.
RAMs are founded on the twin principles of matrices (usually called Correlation Matrix Memories (CMMs)) and n-tupling. Each matrix accepts m inputs as a vector or tuple addressing m rows, and n outputs as a vector addressing n columns of the matrix. During the training phase, the matrix weights M_lk are incremented if both the input row I_l and output column O_k are set.
Therefore, training is a single epoch process with one training step for each input-output association, preserving the high speed. During recall, the presentation of vector I elicits the recall of vector O, as vector I contains all of the addressing information necessary to access and retrieve vector O. This training and recall makes RAMs computationally simple and transparent with well-understood properties. RAMs are also able to partially match records during retrieval; therefore, they can rapidly match records that are close to the input but do not match exactly.

AURA

AURA has been used in an information retrieval system [H01], high speed rule matching systems [AKL95], 3-D structure matching [TA00] and trademark searching [AA98]. AURA techniques have demonstrated superior performance with respect to speed compared to conventional data indexing approaches [HA01] such as hashing and inverted file lists, which may be used for a decision table body. AURA trains 20 times faster than an inverted file list and 6 times faster than a hashing algorithm. It is up to 24 times faster than the inverted file list for recall and up to 4 times faster than the hashing algorithm. AURA techniques have also demonstrated superior speed and accuracy compared to conventional neural classifiers [ZAK99]. The rapid training, computational simplicity, network transparency and partial match capability of RAM networks, coupled with our robust quantisation and encoding method to map numeric attributes from the data set onto binary vectors for training and recall, make AURA ideal to use as the basis of an efficient implementation. A more formal definition of AURA, its components and methods now follows. Correlation Matrix Memories (CMMs) are the building blocks for AURA systems. AURA uses binary input I and output O vectors to train records into the CMM and recall sets of matching records from the CMM, as in Equation 1 and Figure 1.
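The CMM training and recall just described can be sketched in a few lines. This is an illustrative pure-Python model, not the AURA C++ library; the function names are ours.

```python
# A CMM as an m x n binary matrix: training superimposes each input/output
# association with logical OR; recall is a dot product of the input with
# each column, producing the summed output vector.

def train(cmm, input_vec, output_vec):
    """Set weight M[l][k] wherever both input bit I[l] and output bit O[k] are set."""
    for l, i_bit in enumerate(input_vec):
        if i_bit:
            for k, o_bit in enumerate(output_vec):
                if o_bit:
                    cmm[l][k] = 1

def recall(cmm, input_vec):
    """Summed output vector: for each column, count its set weights on active rows."""
    n = len(cmm[0])
    return [sum(cmm[l][k] for l in range(len(cmm)) if input_vec[l])
            for k in range(n)]

# Single-epoch training: one step per input-output association.
m, n = 4, 3
cmm = [[0] * n for _ in range(m)]
train(cmm, [1, 0, 1, 0], [0, 1, 0])   # associate the record's bits with column 1
summed = recall(cmm, [1, 0, 1, 0])    # presenting I elicits O's column
```

Recalling with the same input vector drives the trained column to the maximum possible sum, which is what the thresholding stages later in the paper exploit.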
Equation 1:
CMM = V_all (I x O^T), where V is logical OR

Training is a single epoch process with one training step for each input-output association (each I x O^T in Equation 1), which equates to one step for each record in the data set.

Figure 1: A CMM with input vector i and output vector o. Four matrix locations are set following training: (i_0, o_0), (i_2, o_{n-2}), (i_{m-1}, o_0) and (i_m, o_n).

For the methodology described in this paper, we:
- Train the data set into the CMM (the decision table body CMM), which indexes all records in the data set and allows them to be matched.
- Select the attributes for the schema using schema CMMs. We describe two selection algorithms; one uses a single CMM but the second algorithm uses two coupled CMMs.
- Match and classify unseen items using the trained decision table.

Data

For the data sets: symbolic and numerical unordered attributes are enumerated, and each separate token maps onto an integer (Text -> Integer) which identifies the bit to set within the vector. For example, a SEX_TYPE attribute would map as (F -> 0) and (M -> 1). Kohavi's DTM methodology is principally aimed at symbolic attributes, but the AURA decision table can handle continuous numeric attributes. Any real-valued or ordered numeric attributes are quantised (mapped to discrete bins) and each individual bin maps onto an integer which identifies the bit to set in the input vector. A range of input values for attribute f maps onto each bin, which in turn maps to a unique integer to index the vector, as in Equation 2. The range of attribute values mapping to each bin is equal.

Equation 2:
R_f -> bins_fk |-> Integer_fk + offset_f, where cardinality(FV_f) >= cardinality(Integer_f) = cardinality(bins_f)
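A minimal sketch of this equal-width binning follows; the helper names are ours and each attribute's range [lo, hi] and bin count are assumed known, with offset_f the cumulative bit offset of attribute f within the input vector.

```python
# Equal-width binning per Equation 2: attribute f gets nbins_f bins over
# [lo, hi]; bin k of attribute f maps to bit (offset_f + k).

def make_offsets(nbins_per_attr):
    """Cumulative offsets: offset_{f+1} = offset_f + nbins_f."""
    offsets, total = [], 0
    for nb in nbins_per_attr:
        offsets.append(total)
        total += nb
    return offsets, total  # total = length of the binary input vector

def bit_for(value, lo, hi, nbins, offset):
    """Map a numeric value to its equal-width bin, then to a bit position."""
    width = (hi - lo) / nbins
    k = min(int((value - lo) / width), nbins - 1)  # clamp the top edge into the last bin
    return offset + k

offsets, vec_len = make_offsets([4, 3])           # two attributes: 4 bins and 3 bins
vec = [0] * vec_len
vec[bit_for(2.5, 0.0, 10.0, 4, offsets[0])] = 1   # attribute 0 -> bin 1
vec[bit_for(0.9, 0.0, 1.0, 3, offsets[1])] = 1    # attribute 1 -> bin 2
```

Each record thus becomes a binary vector with exactly one bit set per attribute, in that attribute's consecutive section of bits.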
In Equation 2, offset_f is a cumulative integer offset within the binary vector for each attribute f, and offset_{f+1} = offset_f + nBins_f, where nBins_f is the number of bins for attribute f. FV_f is the set of attribute values for attribute f, -> is a many-to-one mapping and |-> is a one-to-one mapping. This quantisation (binning) approach aims to subdivide the attributes uniformly across the range of each attribute. The range of values is divided into b bins such that each bin is of equal width. The even widths of the bins prevent distortion of the inter-bin distances. Once the bins and integer mappings have been determined, we map the records onto binary vectors. Each attribute maps onto a consecutive section of bits in the binary vector:

For each record in the data set
  For each attribute
    Calculate bin for attribute value;
    Set bit in vector as in Equation 2;

Each binary vector represents a record from the data set.

Body Training

The decision table body is an index of all contingencies and the decision to take for each. Input vectors represent quantised records and form an input I to the CMM during training. The CMM associates the input with a unique output vector O^T during training, which represents an equivalence set of records. This produces a CMM where the rows represent the attributes and their respective values, and the columns represent equivalence sets of records (where equivalence is determined by the attributes designated by the schema). We use an array of linked lists to store the equivalence sets of records, and a second array to store the counts of each class for each equivalence set as a histogram. The algorithm is:

1) Input vector to CMM;
2) Threshold at value nf;
3) If exact match
4)   Add the record to column list;
5)   Add class to histogram;
6) Else train record as next column;

nf is the number of attributes. Steps 1 and 2 are equivalent to testing for an exact match during body recall, as described next. Figure 3 shows a trained CMM where each row is an attribute value and each column represents an equivalence set.
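The body-training loop above can be sketched as follows. This is an illustrative model, not the AURA implementation: each column is represented by the set of CMM rows it was trained with, and the exact-match test is a recall thresholded at nf.

```python
# Body training: each CMM column is one equivalence set; a record joins an
# existing column only if it matches it exactly (summed recall >= nf).

def summed_recall(cmm_columns, bits):
    # Dot product of the input with each column: count shared set rows.
    return [len(bits & col) for col in cmm_columns]

def train_body(records, classes, nf):
    cmm_columns = []       # one set of row indices per equivalence-set column
    members = []           # per-column record lists (the linked-list array)
    hists = []             # per-column class histograms
    for rec_id, bits in enumerate(records):
        summed = summed_recall(cmm_columns, bits)
        hit = next((j for j, s in enumerate(summed) if s >= nf), None)
        if hit is not None:                      # exact match: reuse the column
            members[hit].append(rec_id)
            cls = classes[rec_id]
            hists[hit][cls] = hists[hit].get(cls, 0) + 1
        else:                                    # no match: train as next column
            cmm_columns.append(set(bits))
            members.append([rec_id])
            hists.append({classes[rec_id]: 1})
    return cmm_columns, members, hists

recs = [{0, 3}, {0, 3}, {1, 4}]                  # two attributes per record -> nf = 2
cols, members, hists = train_body(recs, ["a", "a", "b"], nf=2)
```

The first two records collapse into one equivalence set (one column), the third starts a new one, matching the single-epoch, one-step-per-record training described earlier.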
Body Recall

The decision table classifies by finding the set of matching records. To recall the matches for a query record, we firstly produce an input vector by quantising the target values for each attribute to identify the bins, and thus the CMM rows to activate, as in Equation 2. To retrieve the matching records for a particular record, AURA effectively calculates the dot product of the input vector I_k and the CMM, computing a positive integer-valued output vector O_k (the summed output vector), as in Equation 3 and Figures 2 and 3.

Equation 3: O_k^T = I_k x CMM

The AURA technique thresholds the summed output O_k to produce a binary output vector, as in Figure 2 for an exact match and Figure 3 for a partial match.

Figure 2: Diagram showing the CMM recall for an exact match. The left hand column is the input vector. The dot is the value for each attribute (a value for an unordered attribute or a bin for an ordered numeric attribute). AURA multiplies the input vector by the values in the matrix columns using the dot product, sums each column to produce the summed output vector, and then thresholds this vector at a value equivalent to the number of attributes in the input (6 here) to produce the thresholded attribute vector, which indicates the matching column (the middle column here).

For an exact match (as in Kohavi's DTM), we use the Willshaw threshold. It sets a bit in the thresholded output vector for every location in the summed output vector whose value is at least as high as a threshold value. The threshold value is set to the number of attributes, nf, for an exact match. If there is an exact match, there will be a bit set in the thresholded output vector indicating the matching equivalence set. It is then simply a case of looking up the class histogram for this equivalence set in the stored array and classifying the record by the majority class in the histogram. If there
are no bits set in the thresholded output vector, then we classify the unseen record according to the majority class across the data set.

Figure 3: Diagram showing the CMM recall for a partial match. The left hand column is the input vector. The dot is the value for each attribute (a value for an unordered attribute or a bin for an ordered numeric attribute). AURA multiplies the input vector by the values in the matrix columns using the dot product, sums each column to produce the summed output vector, and then thresholds this vector at a value equivalent to the highest value in the vector (5 here) to produce the thresholded attribute vector, which indicates the matching column (the middle column here).

For partial matching, we use the L-Max threshold. L-Max thresholding essentially retrieves at least L top matches. It sets a bit in the thresholded output vector for every location in the summed output vector whose value is at least as high as the threshold value. The AURA C++ library automatically sets the threshold value to the highest integer value that will retrieve at least L matches. For the AURA decision table, L is set to the value of 1. There will be a bit set in the thresholded output vector indicating the best matching equivalence set. It is then simply a case of looking up the class histogram for this equivalence set in the stored array and classifying the unseen record as the majority class. We note there may be more than one best matching equivalence set, so the majority class across all best matching sets will need to be calculated.

Schema Training

In the decision table body CMM, the rows represented attribute values and the columns represented equivalence sets. In the schema CMM used for the first attribute selection algorithm, the rows represent attribute values and the columns represent individual records. For our second attribute selection algorithm, we use two CMMs, where the first CMM (CMM1) indexes the second (CMM2). In the first CMM, the rows represent records and the columns represent attribute values.
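The Willshaw and L-Max thresholding used for body recall can be sketched together as follows; this is an illustrative model over the summed output vector, with `hists` standing in for the per-column class histogram array described above.

```python
# Exact-match (Willshaw) recall thresholds at nf; if nothing fires, the
# partial-match path (L-Max, L = 1) finds the best-matching equivalence sets
# and the majority class is voted across all of them.

def willshaw_threshold(summed, theta):
    """Fire every column whose summed value reaches the threshold."""
    return [j for j, s in enumerate(summed) if s >= theta]

def l_max_threshold(summed, l=1):
    """Highest threshold that still fires at least L columns."""
    for theta in sorted(set(summed), reverse=True):
        fired = willshaw_threshold(summed, theta)
        if len(fired) >= l:
            return fired
    return []

def classify(summed, hists, nf, default_class):
    fired = willshaw_threshold(summed, nf)        # exact match, as in DTM
    if not fired:
        fired = l_max_threshold(summed, l=1)      # fall back to best partial match
    if not fired:
        return default_class                      # empty table: data set majority
    votes = {}
    for j in fired:                               # majority across all best matches
        for cls, count in hists[j].items():
            votes[cls] = votes.get(cls, 0) + count
    return max(votes, key=votes.get)

hists = [{"a": 3, "b": 1}, {"b": 4}]
exact = classify([2, 1], hists, nf=2, default_class="b")    # column 0 matches exactly
partial = classify([1, 1], hists, nf=2, default_class="b")  # no exact match: L-Max fires both
```

Note the partial-match fallback here is the proximity-based variant; pure DTM would instead return the data-set majority class whenever the Willshaw stage fires nothing.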
In the second CMM, CMM2, the rows represent attribute values and the columns represent the records. This second CMM is therefore identical to the CMM used for the first attribute selection algorithm. During training for the first attribute selection algorithm, and for CMM2 of the second attribute selection algorithm, the input vectors I represent the attribute values in the data records. The CMM associates the input with a unique output vector O^T. Each output vector is orthogonal, with a single bit set corresponding to the record's position in the data set: the first record has the first bit set in the output vector, the second record the second bit, and so on. During training for CMM1, the records represent the input vectors I, with a single bit set, and the output vectors O^T represent the attribute values in the data records. The CMM training process is given in Equation 1.

Schema Attribute Selection

As with Kohavi, we assume that all records are to be used in the body and during attribute selection. There are two fundamental approaches to attribute selection used in classification: a filter approach, which selects the optimal set of attributes independently of the classifier algorithm, and the wrapper approach, which selects attributes to optimise classification using the algorithm. We examine two filter approaches, which are more flexible than wrapper approaches as they are not directly coupled to the classification algorithm. For a data set with N attributes there are O(N^M) possible combinations of M attributes, which is intractable to search exhaustively. In the following, we use one filter approach (mutual information attribute selection) that examines attributes on an individual basis, and another probabilistic filter approach (a probabilistic Las Vegas algorithm) that examines randomly selected subsets of attributes.

Mutual Information Attribute Selection

Wettschereck [W94] describes a mutual information attribute selection algorithm which calculates the mutual information between class C and each attribute F. The mutual information between two attributes is the reduction in uncertainty concerning the possible values of one attribute that is obtained when the value of the other attribute is determined. For unordered attributes, where nfv is the number of distinct attribute values f_i for attribute F and C is the number of classes:

Equation 4:
I(C,F) = sum_{i=1..nfv} sum_{c=1..C} p(C=c, F=f_i) log[ p(C=c, F=f_i) / ( p(C=c) p(F=f_i) ) ]

For ordered numeric attributes, the technique computes the mutual information between a discrete random variable (the class) and a continuous random variable (the attribute). It estimates the probability function of the attributes using density estimation. We assume attribute F has density f(x) and that the joint density of C and F is f(x, C=c). Then the mutual information is:

Equation 5:
I(C,F) = sum_{c=1..C} integral_x f(x, C=c) log[ f(x, C=c) / ( f(x) p(C=c) ) ] dx

Equation 5 requires an estimate of the density function f(x) and the joint density function f(x, C=c). To approximate f(x) and f(x, C=c), we utilise the binning to represent the density, which is analogous to the Trapezium Rule of using the areas of slices (trapezia) to represent the area under a graph for integration. We use the bins to represent strips of the probability density function and count the number of records mapping into each bin to estimate the density. In AURA, for unordered data, the mutual information is given by Equation 6:

Equation 6:
I(C,F) = sum_{i=1..nRowsFV} sum_{c=1..C} [ n(BV_fi AND BV_c) / n ] log{ [ n(BV_fi AND BV_c) / n ] / [ (nRow_fi / n)(nClass_c / n) ] }

where nRowsFV is the number of rows in the CMM for attribute F, n is the number of records in the data set, BV_fi is a binary vector (CMM row) for value f_i, BV_c is a binary vector with one bit set for each record in class c, n(BV_fi AND BV_c) is a count of the set bits when BV_c is logically ANDed with BV_fi, and nClass_c is the number of records in class c.
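Equation 6 can be sketched directly on bit sets standing in for CMM rows. The log base is not fixed by the paper; base 2 is used here so the result reads in bits.

```python
# Mutual information from CMM rows (Equation 6): BV_f is the binary row for
# one attribute value, BV_c marks the records in class c, and counting set
# bits in (BV_f AND BV_c) gives the joint counts.

import math

def mutual_information(rows, class_vecs, n):
    """rows: per-attribute-value record-bit sets; class_vecs: per-class record-bit sets."""
    mi = 0.0
    for bv_f in rows:
        for bv_c in class_vecs:
            joint = len(bv_f & bv_c)          # n(BV_f AND BV_c)
            if joint:                          # zero joint counts contribute nothing
                p_joint = joint / n
                p_f = len(bv_f) / n            # nRow_f / n
                p_c = len(bv_c) / n            # nClass_c / n
                mi += p_joint * math.log2(p_joint / (p_f * p_c))
    return mi

# An attribute that perfectly predicts the class of a balanced 2-class set
# carries 1 bit of mutual information.
rows = [{0, 1}, {2, 3}]        # attribute value -> set of record ids
classes = [{0, 1}, {2, 3}]     # class -> set of record ids
mi = mutual_information(rows, classes, n=4)
```

Scoring every attribute this way and keeping the top scorers implements the selector; how many to keep remains, as the paper notes, the user's choice.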
In AURA, for real-valued or discrete ordered numeric attributes, the mutual information is given by Equation 7:

Equation 7:
I(C,F) = sum_{b=1..nBins} sum_{c=1..C} [ n(BV_b AND BV_c) / n ] log{ [ n(BV_b AND BV_c) / n ] / [ (nRow_b / n)(nClass_c / n) ] }

where nBins is the number of bins in the CMM for attribute F, n is the number of records in the data set, BV_b is a binary vector (CMM row) for bin b, BV_c is a binary vector with one bit set for each record in class c, n(BV_b AND BV_c) is a count of the set bits when BV_c is logically ANDed with BV_b, and nClass_c is the number of records in class c. The technique assumes independence of attributes and ignores missing values. It is also the user's prerogative to determine the number of attributes to select.

Probabilistic Las Vegas Algorithm

Liu & Setiono [LS96] introduced a probabilistic Las Vegas algorithm which uses random search and inconsistency to evaluate attribute subsets. For each equivalence set of records (where the records match according to the attributes designated in the schema), inconsistency is defined as the number of matching records minus the largest number of matching records in any one class. The inconsistency scores are summed across all equivalence sets to produce an inconsistency score for the particular attribute selection. The technique uses random search to select attributes, as random search is less susceptible to local minima than heuristic searches such as forward search or backward search. Forward search works by greedily adding attributes to a subset of selected attributes until some termination condition is met, whereby adding new attributes to the subset does not increase the discriminatory power of the subset above a prespecified threshold value. Backward search works by greedily removing attributes from an initial set of all attributes until some termination condition is met, whereby removing an attribute from the subset decreases the discriminatory power of the subset by more than a prespecified threshold. A poor attribute choice at the beginning of a forward or backward search will adversely affect the final selection, whereas a random search does not rely on any initial choices. Liu and Setiono defined their algorithm as:

1) nF_best = N;
2) For i = 1 to MAX_TRIES
3)   S = randomAttributeSet(seed);
4)   nF = numberOfAttributes(S);
5)   If (nF < nF_best)
6)     If (InconCheck(S,D) < gamma)
7)       S_best = S; nF_best = nF;
8) End for

where D is the data set, N the number of attributes and gamma the permissible inconsistency score. Liu & Setiono recommend setting MAX_TRIES to 77 x N^5.

Figure 4: The two-CMM combination we use for Liu & Setiono's algorithm. In the first CMM (CMM1), the records index the rows (one row per record) and the attribute values index the columns. The outputs from CMM1 (the matching attribute values) feed straight into the second CMM (CMM2), where the attribute values index the rows and the records index the columns (one column per record).

Liu and Setiono's algorithm may be calculated simply using the AURA schema CMMs. We need to use two linked CMMs for the calculation, as in Figure 4. We rotate the schema CMM (CMM1) through 90 degrees: CMM1's rows index the records and CMM1's columns index the attribute values. If we feed the outputs from CMM1 (the activated attribute values) into CMM2, then we can calculate the inconsistency scores easily. Line 6 of Liu and Setiono's algorithm listed above then becomes:

Place all records in a queue Q;
While !empty(Q)
  Remove R, the head record, from Q;
  Activate row R in CMM1;
  Threshold CMM1 at value 1;
  Feed CMM1 output into CMM2;
  Threshold CMM2 at value nF_best;
  B = bits set in thresholded_vector;
  Max = cardinality of largest class;
  InconCheck(S,D) += B - Max;
End while

The queue effectively holds the unprocessed records. By activating the head record's row in CMM1 and Willshaw thresholding at value 1 (denoting all active columns, i.e., all attribute values in the record), we can determine that record's attribute values.
When these values are fed into CMM2, we effectively activate all records matching these values. This approach is the most efficient, as the CMMs store all attributes and their values but we only focus on those attributes under investigation during each iteration of the algorithm. An alternative approach would be to store just those attributes selected in the random subset each time we execute line 6 of Liu and Setiono's algorithm, but the CMMs would need to be retrained many times (up to 77 x N^5 times). After thresholding CMM2 at the value nF_best (the number of attributes), we retrieve the equivalence set of matching records, where equivalence is specified by the current attribute selection {S} in the algorithm. It is then simply a matter of counting the number of matching records (the number of bits set in the thresholded output vector), calculating the number of these matching records in each class, identifying the largest class membership, and subtracting the largest class membership from the number of records. The algorithm has now processed all of the records in this equivalence set, so it removes these records from the queue. If we repeat this process for each record at the head of the queue until the queue is empty, we will have processed all equivalence sets. We can then calculate InconCheck(S,D) for this attribute selection and compare it with the threshold value, as in line 6 of Liu and Setiono's algorithm. Once we have iterated through Liu and Setiono's algorithm MAX_TRIES times, we have selected an optimal attribute subset. We have not tried all combinations of all attributes, as this is intractable for a large data set; however, we have made a sufficient approximation.
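The search loop and the inconsistency count it relies on can be sketched as follows. This is an illustrative version, not the CMM pipeline: records are plain tuples, the equivalence-set grouping replaces the queue-and-threshold machinery, and MAX_TRIES is kept tiny for the example.

```python
# Liu & Setiono's probabilistic Las Vegas search: repeatedly draw a random
# attribute subset, keep the smallest one whose summed inconsistency over
# its equivalence sets stays below gamma.

import random
from collections import Counter, defaultdict

def incon_check(records, classes, selected):
    """Sum (set size - largest class count) over equivalence sets under `selected`."""
    groups = defaultdict(list)
    for rec, cls in zip(records, classes):
        groups[tuple(rec[a] for a in selected)].append(cls)
    return sum(len(g) - max(Counter(g).values()) for g in groups.values())

def lvf(records, classes, n_attrs, gamma, max_tries, seed=0):
    rng = random.Random(seed)
    best, n_best = list(range(n_attrs)), n_attrs      # nF_best = N
    for _ in range(max_tries):
        s = sorted(rng.sample(range(n_attrs), rng.randint(1, n_attrs)))
        if len(s) < n_best and incon_check(records, classes, s) < gamma:
            best, n_best = s, len(s)                  # S_best = S; nF_best = nF
    return best

# Attribute 0 (and attribute 2) alone separates the classes perfectly;
# attribute 1 alone does not.
data = [(0, 0, 1), (0, 1, 1), (1, 0, 0), (1, 1, 0)]
labels = ["a", "a", "b", "b"]
subset = lvf(data, labels, n_attrs=3, gamma=1, max_tries=50)
```

With gamma = 1 only perfectly consistent subsets pass, so the search settles on a single discriminating attribute rather than all three.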
Conclusion

In this paper we have introduced a binary neural decision table classifier. The AURA neural architecture, which underpins the classifier, has demonstrated superior training and recall speed compared to conventional indexing approaches such as hashing or inverted file lists, which may be used for a decision table. AURA trains 20 times faster than an inverted file list and 6 times faster than a hashing algorithm. It is up to 24 times faster than the inverted file list for recall and up to 4 times faster than the hashing algorithm. In this paper, we described the implementation details of the technique. Our next step is to evaluate the AURA decision table for speed and memory usage against a conventional decision table implementation. We have shown how two quite different attribute selection approaches may be implemented within the AURA decision table framework. We described a mutual information attribute selector that examines attributes on an individual basis and scores them according to their class discrimination ability. We also demonstrated a probabilistic Las Vegas algorithm which uses random search and inconsistency to evaluate attribute subsets. We feel the technique is flexible and easily extended to other attribute selection algorithms.

Acknowledgement

This work was supported by EPSRC Grant number GR/R550/0.

References

[AA68] Aleksander, I., & Albrow, R.C. Pattern Recognition with Adaptive Logic Elements. IEE Conference on Pattern Recognition, pp. 68-74, 1968.
[ATB84] Aleksander, I., Thomas, W.V., & Bowden, P.A. WISARD: A Radical Step Forward in Image Recognition. Sensor Review, pp. 120-124, 1984.
[AA98] Alwis, S., & Austin, J. A Novel Architecture for Trademark Image Retrieval Systems. In Electronic Workshops in Computing, 1998.
[A95] Austin, J. Distributed Associative Memories for High Speed Symbolic Reasoning. In IJCAI Working Notes of Workshop on Connectionist-Symbolic Integration: From Unified to Hybrid Approaches, pp. 87-93, 1995.
[A98] Austin, J. RAM-based Neural Networks, Progress in Neural Processing 9. Singapore: World Scientific Pub. Co., 1998.
[AKL95] Austin, J., Kennedy, J., & Lees, K. A Neural Architecture for Fast Rule Matching. In Artificial Neural Networks and Expert Systems Conference (ANNES 95), Dunedin, New Zealand, 1995.
[BB59] Bledsoe, W.W., & Browning, I. Pattern Recognition and Reading by Machine. In Proceedings of the Eastern Joint Computer Conference, 1959.
[H01] Hodge, V. Integrating Information Retrieval & Neural Networks. PhD Thesis, Department of Computer Science, The University of York, 2001.
[HA01] Hodge, V., & Austin, J. An Evaluation of Standard Retrieval Algorithms and a Binary Neural Approach. Neural Networks 14(3), Elsevier, 2001.
[HA04] Hodge, V., & Austin, J. A High Performance k-NN Approach Using Binary Neural Networks. To appear, Neural Networks, Elsevier, 2004.
[K95] Kohavi, R. The Power of Decision Tables. In Proceedings of the European Conference on Machine Learning, LNAI, Springer-Verlag, pp. 174-189, 1995.
[LS96] Liu, H., & Setiono, R. A Probabilistic Approach to Feature Selection - A Filter Solution. In 13th International Conference on Machine Learning (ICML'96), 1996.
[TA00] Turner, A., & Austin, J. Performance Evaluation of a Fast Chemical Structure Matching Method Using Distributed Neural Relaxation. In 4th International Conference on Knowledge Based Intelligent Engineering Systems, 2000.
[W94] Wettschereck, D. A Study of Distance-Based Machine Learning Algorithms. PhD Thesis, Dept. of Computer Science, Oregon State University, 1994.
[ZAK99] Zhou, P., Austin, J., & Kennedy, J. A High Performance k-NN Classifier Using a Binary Correlation Matrix Memory. Advances in Neural Information Processing Systems, Vol. 11, 1999.
More informationCS 534: Computer Vision Model Fitting
CS 534: Computer Vson Model Fttng Sprng 004 Ahmed Elgammal Dept of Computer Scence CS 534 Model Fttng - 1 Outlnes Model fttng s mportant Least-squares fttng Maxmum lkelhood estmaton MAP estmaton Robust
More informationUser Authentication Based On Behavioral Mouse Dynamics Biometrics
User Authentcaton Based On Behavoral Mouse Dynamcs Bometrcs Chee-Hyung Yoon Danel Donghyun Km Department of Computer Scence Department of Computer Scence Stanford Unversty Stanford Unversty Stanford, CA
More informationLearning-Based Top-N Selection Query Evaluation over Relational Databases
Learnng-Based Top-N Selecton Query Evaluaton over Relatonal Databases Lang Zhu *, Wey Meng ** * School of Mathematcs and Computer Scence, Hebe Unversty, Baodng, Hebe 071002, Chna, zhu@mal.hbu.edu.cn **
More informationClassifier Selection Based on Data Complexity Measures *
Classfer Selecton Based on Data Complexty Measures * Edth Hernández-Reyes, J.A. Carrasco-Ochoa, and J.Fco. Martínez-Trndad Natonal Insttute for Astrophyscs, Optcs and Electroncs, Lus Enrque Erro No.1 Sta.
More informationThe Codesign Challenge
ECE 4530 Codesgn Challenge Fall 2007 Hardware/Software Codesgn The Codesgn Challenge Objectves In the codesgn challenge, your task s to accelerate a gven software reference mplementaton as fast as possble.
More informationPerformance Evaluation of Information Retrieval Systems
Why System Evaluaton? Performance Evaluaton of Informaton Retreval Systems Many sldes n ths secton are adapted from Prof. Joydeep Ghosh (UT ECE) who n turn adapted them from Prof. Dk Lee (Unv. of Scence
More informationAn Optimal Algorithm for Prufer Codes *
J. Software Engneerng & Applcatons, 2009, 2: 111-115 do:10.4236/jsea.2009.22016 Publshed Onlne July 2009 (www.scrp.org/journal/jsea) An Optmal Algorthm for Prufer Codes * Xaodong Wang 1, 2, Le Wang 3,
More informationCorrelative features for the classification of textural images
Correlatve features for the classfcaton of textural mages M A Turkova 1 and A V Gadel 1, 1 Samara Natonal Research Unversty, Moskovskoe Shosse 34, Samara, Russa, 443086 Image Processng Systems Insttute
More informationThree supervised learning methods on pen digits character recognition dataset
Three supervsed learnng methods on pen dgts character recognton dataset Chrs Flezach Department of Computer Scence and Engneerng Unversty of Calforna, San Dego San Dego, CA 92093 cflezac@cs.ucsd.edu Satoru
More information6.854 Advanced Algorithms Petar Maymounkov Problem Set 11 (November 23, 2005) With: Benjamin Rossman, Oren Weimann, and Pouya Kheradpour
6.854 Advanced Algorthms Petar Maymounkov Problem Set 11 (November 23, 2005) Wth: Benjamn Rossman, Oren Wemann, and Pouya Kheradpour Problem 1. We reduce vertex cover to MAX-SAT wth weghts, such that the
More informationInvestigating the Performance of Naïve- Bayes Classifiers and K- Nearest Neighbor Classifiers
Journal of Convergence Informaton Technology Volume 5, Number 2, Aprl 2010 Investgatng the Performance of Naïve- Bayes Classfers and K- Nearest Neghbor Classfers Mohammed J. Islam *, Q. M. Jonathan Wu,
More informationWishing you all a Total Quality New Year!
Total Qualty Management and Sx Sgma Post Graduate Program 214-15 Sesson 4 Vnay Kumar Kalakband Assstant Professor Operatons & Systems Area 1 Wshng you all a Total Qualty New Year! Hope you acheve Sx sgma
More informationImprovement of Spatial Resolution Using BlockMatching Based Motion Estimation and Frame. Integration
Improvement of Spatal Resoluton Usng BlockMatchng Based Moton Estmaton and Frame Integraton Danya Suga and Takayuk Hamamoto Graduate School of Engneerng, Tokyo Unversty of Scence, 6-3-1, Nuku, Katsuska-ku,
More informationDetermining the Optimal Bandwidth Based on Multi-criterion Fusion
Proceedngs of 01 4th Internatonal Conference on Machne Learnng and Computng IPCSIT vol. 5 (01) (01) IACSIT Press, Sngapore Determnng the Optmal Bandwdth Based on Mult-crteron Fuson Ha-L Lang 1+, Xan-Mn
More informationA mathematical programming approach to the analysis, design and scheduling of offshore oilfields
17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 A mathematcal programmng approach to the analyss, desgn and
More informationGA-Based Learning Algorithms to Identify Fuzzy Rules for Fuzzy Neural Networks
Seventh Internatonal Conference on Intellgent Systems Desgn and Applcatons GA-Based Learnng Algorthms to Identfy Fuzzy Rules for Fuzzy Neural Networks K Almejall, K Dahal, Member IEEE, and A Hossan, Member
More informationComputer Animation and Visualisation. Lecture 4. Rigging / Skinning
Computer Anmaton and Vsualsaton Lecture 4. Rggng / Sknnng Taku Komura Overvew Sknnng / Rggng Background knowledge Lnear Blendng How to decde weghts? Example-based Method Anatomcal models Sknnng Assume
More informationPrivate Information Retrieval (PIR)
2 Levente Buttyán Problem formulaton Alce wants to obtan nformaton from a database, but she does not want the database to learn whch nformaton she wanted e.g., Alce s an nvestor queryng a stock-market
More informationFeature Selection as an Improving Step for Decision Tree Construction
2009 Internatonal Conference on Machne Learnng and Computng IPCSIT vol.3 (2011) (2011) IACSIT Press, Sngapore Feature Selecton as an Improvng Step for Decson Tree Constructon Mahd Esmael 1, Fazekas Gabor
More informationProgramming in Fortran 90 : 2017/2018
Programmng n Fortran 90 : 2017/2018 Programmng n Fortran 90 : 2017/2018 Exercse 1 : Evaluaton of functon dependng on nput Wrte a program who evaluate the functon f (x,y) for any two user specfed values
More informationA New Feature of Uniformity of Image Texture Directions Coinciding with the Human Eyes Perception 1
A New Feature of Unformty of Image Texture Drectons Concdng wth the Human Eyes Percepton Xng-Jan He, De-Shuang Huang, Yue Zhang, Tat-Mng Lo 2, and Mchael R. Lyu 3 Intellgent Computng Lab, Insttute of Intellgent
More informationTECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS. Muradaliyev A.Z.
TECHNIQUE OF FORMATION HOMOGENEOUS SAMPLE SAME OBJECTS Muradalyev AZ Azerbajan Scentfc-Research and Desgn-Prospectng Insttute of Energetc AZ1012, Ave HZardab-94 E-mal:aydn_murad@yahoocom Importance of
More informationA MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS
Proceedngs of the Wnter Smulaton Conference M E Kuhl, N M Steger, F B Armstrong, and J A Jones, eds A MOVING MESH APPROACH FOR SIMULATION BUDGET ALLOCATION ON CONTINUOUS DOMAINS Mark W Brantley Chun-Hung
More informationCollaboratively Regularized Nearest Points for Set Based Recognition
Academc Center for Computng and Meda Studes, Kyoto Unversty Collaboratvely Regularzed Nearest Ponts for Set Based Recognton Yang Wu, Mchhko Mnoh, Masayuk Mukunok Kyoto Unversty 9/1/013 BMVC 013 @ Brstol,
More informationDetection of an Object by using Principal Component Analysis
Detecton of an Object by usng Prncpal Component Analyss 1. G. Nagaven, 2. Dr. T. Sreenvasulu Reddy 1. M.Tech, Department of EEE, SVUCE, Trupath, Inda. 2. Assoc. Professor, Department of ECE, SVUCE, Trupath,
More informationSubspace clustering. Clustering. Fundamental to all clustering techniques is the choice of distance measure between data points;
Subspace clusterng Clusterng Fundamental to all clusterng technques s the choce of dstance measure between data ponts; D q ( ) ( ) 2 x x = x x, j k = 1 k jk Squared Eucldean dstance Assumpton: All features
More informationAssociative Based Classification Algorithm For Diabetes Disease Prediction
Internatonal Journal of Engneerng Trends and Technology (IJETT) Volume-41 Number-3 - November 016 Assocatve Based Classfcaton Algorthm For Dabetes Dsease Predcton 1 N. Gnana Deepka, Y.surekha, 3 G.Laltha
More informationCourse Introduction. Algorithm 8/31/2017. COSC 320 Advanced Data Structures and Algorithms. COSC 320 Advanced Data Structures and Algorithms
Course Introducton Course Topcs Exams, abs, Proects A quc loo at a few algorthms 1 Advanced Data Structures and Algorthms Descrpton: We are gong to dscuss algorthm complexty analyss, algorthm desgn technques
More informationVirtual Memory. Background. No. 10. Virtual Memory: concept. Logical Memory Space (review) Demand Paging(1) Virtual Memory
Background EECS. Operatng System Fundamentals No. Vrtual Memory Prof. Hu Jang Department of Electrcal Engneerng and Computer Scence, York Unversty Memory-management methods normally requres the entre process
More informationA Post Randomization Framework for Privacy-Preserving Bayesian. Network Parameter Learning
A Post Randomzaton Framework for Prvacy-Preservng Bayesan Network Parameter Learnng JIANJIE MA K.SIVAKUMAR School Electrcal Engneerng and Computer Scence, Washngton State Unversty Pullman, WA. 9964-75
More informationFace Recognition University at Buffalo CSE666 Lecture Slides Resources:
Face Recognton Unversty at Buffalo CSE666 Lecture Sldes Resources: http://www.face-rec.org/algorthms/ Overvew of face recognton algorthms Correlaton - Pxel based correspondence between two face mages Structural
More informationA Fast Visual Tracking Algorithm Based on Circle Pixels Matching
A Fast Vsual Trackng Algorthm Based on Crcle Pxels Matchng Zhqang Hou hou_zhq@sohu.com Chongzhao Han czhan@mal.xjtu.edu.cn Ln Zheng Abstract: A fast vsual trackng algorthm based on crcle pxels matchng
More informationModule Management Tool in Software Development Organizations
Journal of Computer Scence (5): 8-, 7 ISSN 59-66 7 Scence Publcatons Management Tool n Software Development Organzatons Ahmad A. Al-Rababah and Mohammad A. Al-Rababah Faculty of IT, Al-Ahlyyah Amman Unversty,
More informationOutline. Type of Machine Learning. Examples of Application. Unsupervised Learning
Outlne Artfcal Intellgence and ts applcatons Lecture 8 Unsupervsed Learnng Professor Danel Yeung danyeung@eee.org Dr. Patrck Chan patrckchan@eee.org South Chna Unversty of Technology, Chna Introducton
More informationFeature Reduction and Selection
Feature Reducton and Selecton Dr. Shuang LIANG School of Software Engneerng TongJ Unversty Fall, 2012 Today s Topcs Introducton Problems of Dmensonalty Feature Reducton Statstc methods Prncpal Components
More informationAnalysis of Non-coherent Fault Trees Using Ternary Decision Diagrams
Analyss of Non-coherent Fault Trees Usng Ternary Decson Dagrams Rasa Remenyte-Prescott Dep. of Aeronautcal and Automotve Engneerng Loughborough Unversty, Loughborough, LE11 3TU, England R.Remenyte-Prescott@lboro.ac.uk
More informationData Representation in Digital Design, a Single Conversion Equation and a Formal Languages Approach
Data Representaton n Dgtal Desgn, a Sngle Converson Equaton and a Formal Languages Approach Hassan Farhat Unversty of Nebraska at Omaha Abstract- In the study of data representaton n dgtal desgn and computer
More informationUnsupervised Learning
Pattern Recognton Lecture 8 Outlne Introducton Unsupervsed Learnng Parametrc VS Non-Parametrc Approach Mxture of Denstes Maxmum-Lkelhood Estmates Clusterng Prof. Danel Yeung School of Computer Scence and
More informationAvailable online at Available online at Advanced in Control Engineering and Information Science
Avalable onlne at wwwscencedrectcom Avalable onlne at wwwscencedrectcom Proceda Proceda Engneerng Engneerng 00 (2011) 15000 000 (2011) 1642 1646 Proceda Engneerng wwwelsevercom/locate/proceda Advanced
More informationHierarchical Semantic Perceptron Grid based Neural Network CAO Huai-hu, YU Zhen-wei, WANG Yin-yan Abstract Key words 1.
Herarchcal Semantc Perceptron Grd based Neural CAO Hua-hu, YU Zhen-we, WANG Yn-yan (Dept. Computer of Chna Unversty of Mnng and Technology Bejng, Bejng 00083, chna) chhu@cumtb.edu.cn Abstract A herarchcal
More informationCHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION
48 CHAPTER 3 SEQUENTIAL MINIMAL OPTIMIZATION TRAINED SUPPORT VECTOR CLASSIFIER FOR CANCER PREDICTION 3.1 INTRODUCTION The raw mcroarray data s bascally an mage wth dfferent colors ndcatng hybrdzaton (Xue
More informationAn Iterative Solution Approach to Process Plant Layout using Mixed Integer Optimisation
17 th European Symposum on Computer Aded Process Engneerng ESCAPE17 V. Plesu and P.S. Agach (Edtors) 2007 Elsever B.V. All rghts reserved. 1 An Iteratve Soluton Approach to Process Plant Layout usng Mxed
More informationExperiments in Text Categorization Using Term Selection by Distance to Transition Point
Experments n Text Categorzaton Usng Term Selecton by Dstance to Transton Pont Edgar Moyotl-Hernández, Héctor Jménez-Salazar Facultad de Cencas de la Computacón, B. Unversdad Autónoma de Puebla, 14 Sur
More informationAn Application of the Dulmage-Mendelsohn Decomposition to Sparse Null Space Bases of Full Row Rank Matrices
Internatonal Mathematcal Forum, Vol 7, 2012, no 52, 2549-2554 An Applcaton of the Dulmage-Mendelsohn Decomposton to Sparse Null Space Bases of Full Row Rank Matrces Mostafa Khorramzadeh Department of Mathematcal
More informationBioTechnology. An Indian Journal FULL PAPER. Trade Science Inc.
[Type text] [Type text] [Type text] ISSN : 0974-74 Volume 0 Issue BoTechnology 04 An Indan Journal FULL PAPER BTAIJ 0() 04 [684-689] Revew on Chna s sports ndustry fnancng market based on market -orented
More informationSignature and Lexicon Pruning Techniques
Sgnature and Lexcon Prunng Technques Srnvas Palla, Hansheng Le, Venu Govndaraju Centre for Unfed Bometrcs and Sensors Unversty at Buffalo {spalla2, hle, govnd}@cedar.buffalo.edu Abstract Handwrtten word
More informationSRBIR: Semantic Region Based Image Retrieval by Extracting the Dominant Region and Semantic Learning
Journal of Computer Scence 7 (3): 400-408, 2011 ISSN 1549-3636 2011 Scence Publcatons SRBIR: Semantc Regon Based Image Retreval by Extractng the Domnant Regon and Semantc Learnng 1 I. Felc Raam and 2 S.
More informationCompiler Design. Spring Register Allocation. Sample Exercises and Solutions. Prof. Pedro C. Diniz
Compler Desgn Sprng 2014 Regster Allocaton Sample Exercses and Solutons Prof. Pedro C. Dnz USC / Informaton Scences Insttute 4676 Admralty Way, Sute 1001 Marna del Rey, Calforna 90292 pedro@s.edu Regster
More informationOptimizing Document Scoring for Query Retrieval
Optmzng Document Scorng for Query Retreval Brent Ellwen baellwe@cs.stanford.edu Abstract The goal of ths project was to automate the process of tunng a document query engne. Specfcally, I used machne learnng
More informationEYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS
P.G. Demdov Yaroslavl State Unversty Anatoly Ntn, Vladmr Khryashchev, Olga Stepanova, Igor Kostern EYE CENTER LOCALIZATION ON A FACIAL IMAGE BASED ON MULTI-BLOCK LOCAL BINARY PATTERNS Yaroslavl, 2015 Eye
More informationy and the total sum of
Lnear regresson Testng for non-lnearty In analytcal chemstry, lnear regresson s commonly used n the constructon of calbraton functons requred for analytcal technques such as gas chromatography, atomc absorpton
More informationR s s f. m y s. SPH3UW Unit 7.3 Spherical Concave Mirrors Page 1 of 12. Notes
SPH3UW Unt 7.3 Sphercal Concave Mrrors Page 1 of 1 Notes Physcs Tool box Concave Mrror If the reflectng surface takes place on the nner surface of the sphercal shape so that the centre of the mrror bulges
More informationAnnouncements. Supervised Learning
Announcements See Chapter 5 of Duda, Hart, and Stork. Tutoral by Burge lnked to on web page. Supervsed Learnng Classfcaton wth labeled eamples. Images vectors n hgh-d space. Supervsed Learnng Labeled eamples
More informationBOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET
1 BOOSTING CLASSIFICATION ACCURACY WITH SAMPLES CHOSEN FROM A VALIDATION SET TZU-CHENG CHUANG School of Electrcal and Computer Engneerng, Purdue Unversty, West Lafayette, Indana 47907 SAUL B. GELFAND School
More informationComparison Study of Textural Descriptors for Training Neural Network Classifiers
Comparson Study of Textural Descrptors for Tranng Neural Network Classfers G.D. MAGOULAS (1) S.A. KARKANIS (1) D.A. KARRAS () and M.N. VRAHATIS (3) (1) Department of Informatcs Unversty of Athens GR-157.84
More informationMathematics 256 a course in differential equations for engineering students
Mathematcs 56 a course n dfferental equatons for engneerng students Chapter 5. More effcent methods of numercal soluton Euler s method s qute neffcent. Because the error s essentally proportonal to the
More informationThe Greedy Method. Outline and Reading. Change Money Problem. Greedy Algorithms. Applications of the Greedy Strategy. The Greedy Method Technique
//00 :0 AM Outlne and Readng The Greedy Method The Greedy Method Technque (secton.) Fractonal Knapsack Problem (secton..) Task Schedulng (secton..) Mnmum Spannng Trees (secton.) Change Money Problem Greedy
More informationCS246: Mining Massive Datasets Jure Leskovec, Stanford University
CS46: Mnng Massve Datasets Jure Leskovec, Stanford Unversty http://cs46.stanford.edu /19/013 Jure Leskovec, Stanford CS46: Mnng Massve Datasets, http://cs46.stanford.edu Perceptron: y = sgn( x Ho to fnd
More informationMachine Learning. Topic 6: Clustering
Machne Learnng Topc 6: lusterng lusterng Groupng data nto (hopefully useful) sets. Thngs on the left Thngs on the rght Applcatons of lusterng Hypothess Generaton lusters mght suggest natural groups. Hypothess
More informationJournal of Chemical and Pharmaceutical Research, 2014, 6(6): Research Article. A selective ensemble classification method on microarray data
Avalable onlne www.ocpr.com Journal of Chemcal and Pharmaceutcal Research, 2014, 6(6):2860-2866 Research Artcle ISSN : 0975-7384 CODEN(USA) : JCPRC5 A selectve ensemble classfcaton method on mcroarray
More informationSum of Linear and Fractional Multiobjective Programming Problem under Fuzzy Rules Constraints
Australan Journal of Basc and Appled Scences, 2(4): 1204-1208, 2008 ISSN 1991-8178 Sum of Lnear and Fractonal Multobjectve Programmng Problem under Fuzzy Rules Constrants 1 2 Sanjay Jan and Kalash Lachhwan
More informationFuzzy Filtering Algorithms for Image Processing: Performance Evaluation of Various Approaches
Proceedngs of the Internatonal Conference on Cognton and Recognton Fuzzy Flterng Algorthms for Image Processng: Performance Evaluaton of Varous Approaches Rajoo Pandey and Umesh Ghanekar Department of
More informationFuzzy Modeling of the Complexity vs. Accuracy Trade-off in a Sequential Two-Stage Multi-Classifier System
Fuzzy Modelng of the Complexty vs. Accuracy Trade-off n a Sequental Two-Stage Mult-Classfer System MARK LAST 1 Department of Informaton Systems Engneerng Ben-Guron Unversty of the Negev Beer-Sheva 84105
More informationCSCI 5417 Information Retrieval Systems Jim Martin!
CSCI 5417 Informaton Retreval Systems Jm Martn! Lecture 11 9/29/2011 Today 9/29 Classfcaton Naïve Bayes classfcaton Ungram LM 1 Where we are... Bascs of ad hoc retreval Indexng Term weghtng/scorng Cosne
More informationNUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS
ARPN Journal of Engneerng and Appled Scences 006-017 Asan Research Publshng Network (ARPN). All rghts reserved. NUMERICAL SOLVING OPTIMAL CONTROL PROBLEMS BY THE METHOD OF VARIATIONS Igor Grgoryev, Svetlana
More informationHarvard University CS 101 Fall 2005, Shimon Schocken. Assembler. Elements of Computing Systems 1 Assembler (Ch. 6)
Harvard Unversty CS 101 Fall 2005, Shmon Schocken Assembler Elements of Computng Systems 1 Assembler (Ch. 6) Why care about assemblers? Because Assemblers employ some nfty trcks Assemblers are the frst
More informationBAYESIAN MULTI-SOURCE DOMAIN ADAPTATION
BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION SHI-LIANG SUN, HONG-LEI SHI Department of Computer Scence and Technology, East Chna Normal Unversty 500 Dongchuan Road, Shangha 200241, P. R. Chna E-MAIL: slsun@cs.ecnu.edu.cn,
More informationQuery Clustering Using a Hybrid Query Similarity Measure
Query clusterng usng a hybrd query smlarty measure Fu. L., Goh, D.H., & Foo, S. (2004). WSEAS Transacton on Computers, 3(3), 700-705. Query Clusterng Usng a Hybrd Query Smlarty Measure Ln Fu, Don Hoe-Lan
More informationLearning Non-Linearly Separable Boolean Functions With Linear Threshold Unit Trees and Madaline-Style Networks
In AAAI-93: Proceedngs of the 11th Natonal Conference on Artfcal Intellgence, 33-1. Menlo Park, CA: AAAI Press. Learnng Non-Lnearly Separable Boolean Functons Wth Lnear Threshold Unt Trees and Madalne-Style
More informationProblem Set 3 Solutions
Introducton to Algorthms October 4, 2002 Massachusetts Insttute of Technology 6046J/18410J Professors Erk Demane and Shaf Goldwasser Handout 14 Problem Set 3 Solutons (Exercses were not to be turned n,
More informationELEC 377 Operating Systems. Week 6 Class 3
ELEC 377 Operatng Systems Week 6 Class 3 Last Class Memory Management Memory Pagng Pagng Structure ELEC 377 Operatng Systems Today Pagng Szes Vrtual Memory Concept Demand Pagng ELEC 377 Operatng Systems
More information