Optimal connection strategies in one- and two-dimensional associative memory models


Lee Calcraft, Rod Adams, and Neil Davey
School of Computer Science, University of Hertfordshire, College Lane, Hatfield, Hertfordshire AL10 9AB, United Kingdom
E-mail: l.calcraft@herts.ac.uk
(Submitted 31 May, 2007)

Abstract

This study examines the performance of sparsely-connected associative memory models built using a number of different connection strategies, applied to one- and two-dimensional topologies. Efficient patterns of connectivity are identified which yield high performance at relatively low wiring costs in both topologies. It is found that two-dimensional models are more tolerant of variations in connection strategy than their one-dimensional counterparts, though networks built with both topologies become less so as their connection density is decreased.

Keywords: associative memory models, efficient connection strategies, sparse connectivity, comparing 1D and 2D topologies

1. Introduction

Our studies of sparsely-connected one-dimensional associative memory models [1, 2], initially inspired by the work of Watts and Strogatz [3] on the small-world properties of sparsely-connected systems, demonstrated the importance of the pattern of connectivity between nodes in determining network performance. In a small step towards biological plausibility, we now extend our studies to encompass two-dimensional networks. Our associative memory models represent a 2D substrate of sparsely-connected neurons with a connection density of 0.01 or 0.1. We compare the performance of different connection strategies in our 2D networks with results obtained from earlier work using a 1D arrangement. This should prove instructive, since 1D treatments of associative memory do not tend to establish to what extent their findings are applicable to more biologically-plausible topologies [4-7]. We acknowledge, of course, that this study falls short of a full 3D treatment, which would require more processing power than is currently available to us.

As with our earlier 1D work, our 2D studies focus on exploring connection strategies which achieve good pattern completion for a minimum wiring length. We are encouraged in this pursuit by recent studies which suggest the importance of wiring optimisation in nature, from the point of view of the cortical volume taken up by axons and dendrites, the delays and attenuation imposed by long-distance connections, and the metabolic requirements of the connective tissue [8-10]. A connection strategy which minimises wiring length without impacting upon network performance could potentially mitigate these unwanted costs. It is the goal of the present work to identify such strategies, and to compare their realisations in 1D and 2D networks.

2. Network Dynamics and Training

Each unit in our networks is a simple, bipolar, threshold device, summing its net input and firing deterministically. The net input, or local field, of a unit i is given by

    h_i = Σ_j w_ij S_j

where S_j (= ±1) is the current state of unit j and w_ij is the weight on the connection from unit j to unit i. The dynamics of the network is given by the standard update

    S_i' = +1   if h_i > 0
    S_i' = -1   if h_i < 0
    S_i' = S_i  if h_i = 0

where S_i' is the new state of unit i. Unit states may be updated synchronously or asynchronously; here we use asynchronous, random-order updates.
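To make the recall dynamics concrete, here is a minimal Python/NumPy sketch of one asynchronous, random-order update sweep and of the convergence loop used during recall. The function names, array layout and sweep limit are our own assumptions added for illustration, not part of the original model description.

```python
import numpy as np

def asynchronous_sweep(weights, state, rng):
    """One sweep of asynchronous, random-order updates of a bipolar state vector.

    Each unit i computes its local field h_i = sum_j w_ij * S_j and fires
    deterministically: +1 if h_i > 0, -1 if h_i < 0, unchanged if h_i = 0.
    Returns the new state and the number of units that changed.
    """
    state = state.copy()
    changed = 0
    for i in rng.permutation(len(state)):
        h = weights[i] @ state
        new = 1 if h > 0 else (-1 if h < 0 else state[i])
        if new != state[i]:
            state[i] = new
            changed += 1
    return state, changed

def recall(weights, start_state, rng, max_sweeps=100):
    # Repeat sweeps until a fixed point is reached (no unit changes) or a limit is hit.
    state = start_state
    for _ in range(max_sweeps):
        state, changed = asynchronous_sweep(weights, state, rng)
        if changed == 0:
            break
    return state
```

In a pattern-correction experiment, recall would be started from a noisy version of a stored pattern and the final state compared with the original.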

If a training pattern, ξ^μ, is one of the fixed points of the network, then it is successfully stored and is said to be a fundamental memory. Given a training set {ξ^μ}, the training algorithm is designed to drive the local fields of each unit to the correct side of a learning threshold, T, for all the training patterns. This is equivalent to requiring that

    h_i^μ ξ_i^μ ≥ T   for all i and all μ

The learning rule is therefore given by:

    Begin with a zero weight matrix
    Repeat until all local fields are correct
        Set the state of the network to one of the ξ^p
        For each unit, i, in turn:
            Calculate h_i^p ξ_i^p. If this is less than T, then change the
            weights on connections into unit i according to:
                w_ij = w_ij + (C_ij ξ_i^p ξ_j^p) / k

where {C_ij} is the connection matrix. The form of the update is such that changes are only made to weights that are actually present in the connectivity matrix {C_ij} (where C_ij = 1 if w_ij is present, and 0 otherwise), and the learning rate is inversely proportional to the number of connections per unit, k. Earlier work has established that a learning threshold of T = 10 gives good results [11].
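The learning rule above can be sketched in the same style. A hedged Python/NumPy version is given below; the pattern and mask array layouts, the vectorisation over units and the epoch limit are our assumptions, but the update itself follows the pseudocode: weights change only where a connection exists in C, and the change is scaled by 1/k.

```python
import numpy as np

def train(patterns, mask, T=10, max_epochs=1000):
    """Drive h_i * xi_i above the threshold T for every unit and every pattern.

    patterns : array of shape (P, N) with entries +1/-1
    mask     : connection matrix C of shape (N, N); C[i, j] = 1 if the
               connection from unit j to unit i is present, 0 otherwise
    """
    P, N = patterns.shape
    k = mask.sum(axis=1)                    # number of connections per unit
    weights = np.zeros((N, N))
    for _ in range(max_epochs):
        all_correct = True
        for xi in patterns:                 # set the network state to one of the patterns
            h = weights @ xi                # local fields of all units at once
            wrong = (h * xi) < T            # units whose field is on the wrong side of T
            if wrong.any():
                all_correct = False
                # w_ij <- w_ij + C_ij * xi_i * xi_j / k_i, applied only to the wrong units
                weights[wrong] += (mask[wrong] * np.outer(xi, xi)[wrong]) / k[wrong, None]
        if all_correct:
            break
    return weights
```

Because only the weights into a wrong unit change, updating all wrong units of a pattern in one vectorised step is equivalent to visiting them one at a time.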

3. Measuring Performance

The ability to store patterns is not the only functional requirement of an associative memory: fundamental memories should also act as attractors in the state space of the dynamic system resulting from the recurrent connectivity of the network, so that pattern correction can take place. To measure this we use the Effective Capacity of the network, EC [7, 12]. The Effective Capacity of a network is a measure of the maximum number of patterns that can be stored in the network with reasonable pattern correction still taking place. We take a fairly arbitrary definition of reasonable as correcting the addition of 60% noise to within an overlap of 95% with the original fundamental memory. Varying these figures gives differing values for EC, but the values obtained with these settings are robust for comparison purposes. For large fully-connected networks the EC value is proportional to N, the total number of nodes in the network, and is approximately 0.1 of the maximum theoretical capacity of the network. For large sparse locally-connected networks, EC is proportional to the number of connections per node, while with other architectures it is dependent upon the actual connection matrix C.

The Effective Capacity of a particular network is determined as follows:

    Initialise the number of patterns, P, to 0
    Repeat
        Increment P
        Create a training set of P random patterns
        Train the network
        For each pattern in the training set
            Degrade the pattern randomly by adding 60% noise
            With this noisy pattern as start state, allow the network to converge
            Calculate the overlap of the final network state with the original pattern
        EndFor
        Calculate the mean pattern overlap over all final states
    Until the mean pattern overlap is less than 95%
    The Effective Capacity is P - 1

4. Network Architecture

The networks discussed here are based on one- and two-dimensional lattices of N nodes with periodic boundary conditions. Thus the 1D networks take the physical form of a ring, and the 2D implementations that of a torus. The networks are sparse, in that the input of each node is connected to a relatively small, but fixed, number, k, of other nodes. The main 2D networks examined consist of 4900 nodes arranged in a 70 x 70 array, with 49 afferent (incoming) connections per node, giving a connection density of 0.01; and of 484 nodes arranged in a 22 x 22 array, with 48 afferent connections per node, giving a connection density of 0.1. The 1D networks consist of 5000 nodes and of 500 nodes, both with 50 connections per node, again giving connection densities of 0.01 and 0.1, respectively. All references to spacing refer to the distance between nodes around the ring in the case of the 1D network, and across the surface of the torus in the 2D case.

Figure 1a. 1D sparsely-connected network with 14 nodes and 4 afferent connections per node, illustrating the connections to a single node: left, locally-connected; right, after rewiring.

Figure 1b. 2D sparsely-connected network with 64 nodes and 8 afferent connections per node, illustrating the connections to a single node: left, locally-connected; right, after rewiring.

We have already established for a 1D network that purely local connectivity results in networks with low wiring length but poor pattern-completion performance, while randomly-connected networks perform well but have high wiring costs [1]. In a search for a compromise between these two extremes we will examine three different connection strategies here, applying them to both 1D and 2D networks:

Progressively rewired. This is based on the strategy introduced by Watts and Strogatz [3] for generating small-world networks, and applied to a one-dimensional associative memory by Bohland and Minai [6], and subsequently by Davey et al. [13]. A locally-connected network is set up, and a fraction of the afferent connections to each node is rewired to other randomly-selected nodes. See figure 1a. It is found that rewiring a one-dimensional network in this way improves communication throughout the network, and that as the degree of rewiring is increased, pattern completion progressively improves, up to the point where about half the connections have been rewired. Beyond this point, further rewiring seems to have little effect [6].

Gaussian. Here the network is set up in such a way that the probability of a connection between any two nodes separated by a distance d is proportional to

    exp( -(d - 1)^2 / (2 σ^2) )

where d is defined as the distance between nodes and lies in the range 1 ≤ d < N/2. Network performance is tested for a wide range of values of σ.

Exponential. In this case the network is set up in such a way that the probability of a connection between any two nodes separated by a distance d (where 1 ≤ d < N/2) is proportional to

    exp( -λ(d - 1) )

Networks are tested over a wide range of λ.
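As an illustration of how such connectivity might be generated, the sketch below draws a fixed number k of afferent connections for each node of a 1D ring, with probability following either the Gaussian or the exponential profile above, and also includes a simple progressive-rewiring routine. The sampling-without-replacement scheme, function names and example parameter values are our assumptions rather than the exact procedure used in the paper; the 2D case is analogous, with d measured across the surface of the torus.

```python
import numpy as np

def ring_distances(n, i):
    # Distance from node i to every other node around a ring of n nodes.
    d = np.abs(np.arange(n) - i)
    return np.minimum(d, n - d)

def connection_profile(d, kind, param):
    # Unnormalised connection probability as a function of distance d >= 1.
    if kind == "gaussian":        # proportional to exp(-(d - 1)^2 / (2 sigma^2))
        return np.exp(-((d - 1.0) ** 2) / (2.0 * param ** 2))
    if kind == "exponential":     # proportional to exp(-lambda (d - 1))
        return np.exp(-param * (d - 1.0))
    raise ValueError(kind)

def build_connectivity(n, k, kind, param, seed=0):
    """Return an (n, n) 0/1 matrix C with exactly k afferent connections per node."""
    rng = np.random.default_rng(seed)
    C = np.zeros((n, n), dtype=int)
    for i in range(n):
        p = connection_profile(ring_distances(n, i), kind, param)
        p[i] = 0.0                                   # no self-connection
        sources = rng.choice(n, size=k, replace=False, p=p / p.sum())
        C[i, sources] = 1
    return C

def progressively_rewire(C, fraction, seed=0):
    """Rewire the given fraction of each node's afferent connections to random nodes."""
    rng = np.random.default_rng(seed)
    C = C.copy()
    n = C.shape[0]
    for i in range(n):
        sources = np.flatnonzero(C[i])
        for j in rng.choice(sources, size=int(round(fraction * len(sources))), replace=False):
            candidates = np.flatnonzero((C[i] == 0) & (np.arange(n) != i))
            new = rng.choice(candidates)             # randomly-selected new source node
            C[i, j] = 0
            C[i, new] = 1
    return C

# Example: a 500-node ring with 50 Gaussian-distributed afferent connections per node.
C = build_connectivity(500, 50, "gaussian", param=30.0)
```

A locally-connected network corresponds to choosing the k nearest nodes on each side; applying progressively_rewire to it with fraction 1.0 corresponds to the fully rewired case.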

5. Results and Discussion

5.1 Progressive rewiring

This connection strategy was introduced by Watts and Strogatz as a way to move in a controlled manner from a locally-connected network to a random one, and as discussed earlier, it involves the progressive rewiring of a locally-connected network to randomly-chosen connection sites. See figure 1. The results of applying this procedure in 1D and 2D networks of similar size are shown in figure 2. The networks are initially built with local-only connections, and their Effective Capacity is measured as the network is rewired in steps of 10%, until all connections have been rewired, at which point the network is randomly connected. As may be seen, both networks behave similarly, improving in pattern-completion performance as the rewiring is increased, up to around 40 or 50% rewiring, after which little further improvement is apparent. This echoes the results reported by Bohland and Minai [6] for a 1D network. There is, however, an important difference between the performance of the 1D and 2D networks here, since although both achieve the same Effective Capacity of 23 when fully rewired, their performances are very different when connected locally (i.e. when the rewiring is zero). In this configuration the 1D network has an Effective Capacity of 6 patterns, while the 2D network successfully recalls 12.

Figure 2. Effective Capacity vs degree of rewiring for a 1D network with 5000 units and 50 incoming connections per node, and a 2D network with 4900 units and 49 incoming connections per node. The 1D local network has an EC of just 6, while in the 2D network it is a much healthier 12. Once rewiring has reached around 40 or 50% there is little further improvement in performance.

In seeking an explanation for this considerable improvement when moving from the 1D network to the 2D representation, we would point to two aspects of the network which change as the dimensionality is changed. Firstly, the degree of clustering, the extent to which nodes connected to any given node are also connected to each other, decreases from 0.73 to 0.53 as we move from 1D to 2D in the above locally-connected networks; and we have previously found that very tightly clustered networks perform badly as associators [14]. Secondly, there is an improvement in communication across the network as we increase dimensionality. In the 1D network it takes a maximum of 99 steps to pass data between the furthest-separated nodes, whereas in its 2D counterpart this drops dramatically to just 9 steps; or, translated into terms of characteristic path length [3], the 1D network has a mean minimum path length of 48, while in the 2D network this drops to 6.5. We would also speculate that in a 3D implementation, a locally-connected network might perform even better.

The significant improvement in local performance experienced when moving from 1D to 2D networks has considerable implications when searching for optimal patterns of connectivity. The reason is that, since in the 2D topology there is a much smaller difference between the best and the worst performing architectures, the rewards for using optimal patterns of connectivity will be correspondingly smaller, and we would speculate that this is likely to be even more significant in 3D networks.
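The clustering and path-length figures quoted in this section can be computed for any connection matrix with standard graph tools. One possible sketch, using NetworkX and treating the connectivity as an undirected graph (our assumption, since these measures are normally reported for undirected graphs), is:

```python
import networkx as nx
import numpy as np

def graph_measures(C):
    """Mean clustering coefficient and characteristic path length of a 0/1 connection matrix."""
    A = np.maximum(C, C.T)                              # symmetrise the directed connectivity
    G = nx.from_numpy_array(A)
    clustering = nx.average_clustering(G)
    path_length = nx.average_shortest_path_length(G)    # assumes the graph is connected
    return clustering, path_length
```

For networks of several thousand nodes the all-pairs shortest-path computation is slow but tractable, and can be estimated by sampling source nodes if necessary.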

5.2 Optimal architectures in networks of connection density 0.01

In order to compare the performance of other connection strategies with that of progressively-rewired networks, we measured the Effective Capacity of networks whose patterns of connectivity were based on Gaussian and exponential probability distributions of varying σ and λ. The Effective Capacity of all three network types (Gaussian, exponential and progressively-rewired) was then plotted against the mean wiring length of the corresponding networks, providing us with an efficient way to evaluate pattern-completion performance and corresponding wiring costs. Figure 3a shows the results for a 1D network of 5000 nodes with 50 connections per node, while figure 3b depicts a 2D network of 4900 nodes with 49 connections per node.

Figure 3a. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 1D network with 5000 nodes and 50 connections per node. Note that the leftmost point on the rewired plot corresponds to a local-only network (zero rewiring), and the rightmost to a random network (100% rewiring). Results are averages over 50 runs.

Figure 3b. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 2D network with 4900 nodes and 49 connections per node. Again the leftmost point on the rewired plot corresponds to a local-only network, and the rightmost to a random network. Results are averages over 50 runs.
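The horizontal axes of these plots show the mean wiring length, i.e. the average distance spanned by a connection. One way it might be computed for the two topologies is sketched below; the Euclidean metric across the torus in the 2D case, and the row-major node indexing, are our assumptions.

```python
import numpy as np

def mean_wiring_length_ring(C):
    """Mean connection length on a ring of n nodes with periodic boundaries."""
    n = C.shape[0]
    idx = np.arange(n)
    d = np.abs(idx[:, None] - idx[None, :])
    d = np.minimum(d, n - d)                 # shorter way around the ring
    return d[C == 1].mean()

def mean_wiring_length_torus(C, side):
    """Mean connection length on a side x side torus, nodes indexed row-major."""
    idx = np.arange(side * side)
    rows, cols = idx // side, idx % side
    dr = np.abs(rows[:, None] - rows[None, :])
    dc = np.abs(cols[:, None] - cols[None, :])
    dr = np.minimum(dr, side - dr)           # wrap vertically
    dc = np.minimum(dc, side - dc)           # wrap horizontally
    d = np.sqrt(dr ** 2 + dc ** 2)           # Euclidean distance across the torus
    return d[C == 1].mean()
```

Plotting Effective Capacity against this quantity, for a sweep of σ, λ or rewiring fraction, gives the kind of trade-off curves shown in the figures.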

We can see from this that in both the 1D and the 2D networks, all three architectures achieve a maximum pattern-completion performance of around 23 patterns. And in both topologies the Gaussian and exponential architectures achieve this at a considerably lower mean wiring length than the progressively-rewired networks. But, largely because of the better performance of the local network in the 2D topology, the differences are not so large in the 2D network. Thus, comparing network configurations which achieve an Effective Capacity of 20 (a high value at a relatively low mean wiring length), using a Gaussian architecture in the 1D network would use only one quarter of the wiring of the equivalent progressively-rewired network. In the case of the 2D network, the corresponding saving in wiring drops to a half. Clearly, however, this is still far from a trivial saving, and the fact that connectivity between neurons in the cortex is believed to follow a Gaussian architecture [15] (i.e. the probability of any two neurons being connected decreases with distance according to a Gaussian distribution) bears witness to the continuing benefits of this architecture in real 3D systems.

5.3 Optimal architectures in networks of connection density 0.1

In our 1D studies using networks of connection density 0.1 we reported that the differences between the rewired network and the Gaussian and exponential distributions were noticeably less than at the lower connection density of 0.01 [1], but that differences were still in evidence. Once we move to a 2D topology, however, we see that whilst there continues to be a noticeable difference in performance between the rewired network and the Gaussian and exponential distributions at the lower, 0.01, connection density, this effectively disappears at a connection density of 0.1. See figure 4, which illustrates the performance of a 1D network of 500 nodes with 50 connections per node, and of a 2D network with 484 nodes and 48 connections per node.

Figure 4a. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 1D network with 500 nodes and 50 connections per node. Results are averages over 50 runs.

Figure 4b. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 2D network with 484 nodes and 48 connections per node. Results are averages over 50 runs.

However, the 2D network on which we are basing this conclusion differs from our low connection density 2D network in not one, but two respects. Its connection density is indeed ten times greater, at 0.1, but the total size of the network is also smaller by a similar factor. Thus it is not yet clear to what extent the merging of performance of the different architectures seen in the 484-node 2D network is the result of the higher connection density used here (0.1 against 0.01), or whether it is due to the smaller size of the network. In an attempt to distinguish between these two factors, we have repeated the experiment for the 2D network at a size of 4900 units, with 490 connections per node, thus retaining the higher connection density of 0.1, but increasing the network size to that used in the lower connection density experiments. The results appear in figure 5.

Figure 5. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 2D network with 4900 nodes and 490 connections per node. Results are averages over 50 runs.

Clearly, there is again very little to choose in terms of performance between the three architectures, and we must conclude that in 2D associative memory models with connection densities of 0.1 and above, whether the pattern of connectivity is based on a Gaussian or exponential probability distribution, or whether a progressively-rewired local network is used, the choice will have very little influence on the pattern-completion performance of the network or on the amount of wiring used. However, the particular parameters which we adopt (the value of σ for a Gaussian distribution, or of λ for an exponential, or the degree of rewiring used) will still have considerable influence on performance. These parameters determine the operating point of our network along the curve in figure 5. At the left-hand end of the curve, a completely local network will give us an Effective Capacity of around 150 patterns at a mean wiring length of around 8. At the right-hand end we obtain an Effective Capacity approaching 200 patterns at a mean wiring length of between 20 and 30. By contrast, in networks with a connection density of 0.01, the Gaussian and exponential architectures are clearly better performers than the progressively-rewired network, and because of the relatively steep rise in the Effective Capacity against mean wiring length curves for these architectures, it is easier to select an operating point along the curve which has both a high Effective Capacity and a low mean wiring length.

6. Conclusion

Using high capacity associative memory models we have examined the pattern-completion performance and corresponding wiring costs of networks based on a number of different connection strategies, built with a 1D topology. All experiments were repeated for similar networks built with a 2D topology, and comparisons drawn between the two sets of results.

In our first set of experiments we compared the performance of 1D and 2D networks of similar size, as they were progressively rewired from a state of local-only connectivity to a state of fully random connectivity. It was found that although both topologies yielded the same results in the case of random connectivity (as must be the case), there were important differences when connectivity was purely local. In this case the 2D network was able to recall twice the number of patterns achieved by the 1D network. It was suggested that this may be a consequence both of the decrease in clustering and of the much improved communication between distant nodes in the 2D network. It was also suggested that, for similar reasons, a 3D network might show even more pronounced effects.
We then compared plots of Effective Capacity against mean wiring length for Gaussian, exponential and progressively-rewired networks. Our initial tests used a connection density of 0.01. In both the 1D and 2D topologies the Gaussian and exponential networks consistently outperformed the progressively-rewired networks, though in moving from a 1D to a 2D topology, the benefits of using Gaussian or exponential connectivity were less pronounced. In networks of connection density 0.1 it was found that the small advantages of using Gaussian or exponential patterns of connectivity over the progressively-rewired network in the 1D topology all but disappeared in the 2D networks. Thus, while 2D associative memory models appear to be more tolerant of variations in connection strategy than their 1D counterparts, networks of both types become less so as their connection density is decreased.

In future work we will investigate whether these findings are also valid for networks in which the point of axonal arborisation is displaced a finite distance from the presynaptic node.

References

[1] L. Calcraft, R. Adams, and N. Davey, "Gaussian and exponential architectures in small-world associative memories," Proceedings of ESANN 2006: 14th European Symposium on Artificial Neural Networks, Advances in Computational Intelligence and Learning, 2006.
[2] L. Calcraft, R. Adams, and N. Davey, "High performance associative memory models with low wiring costs," Proceedings of the 3rd IEEE Conference on Intelligent Systems, University of Westminster, 4-6 September 2006.
[3] D. Watts and S. Strogatz, "Collective dynamics of 'small-world' networks," Nature, vol. 393, pp. 440-442, 1998.
[4] P. McGraw and M. Menzinger, "Topology and computational performance of attractor neural networks," Physical Review E, vol. 68, 047102, 2003.
[5] F. Emmert-Streib, "Influence of the neural network topology on the learning dynamics," Neurocomputing, vol. 69, 2006.
[6] J. Bohland and A. Minai, "Efficient associative memory using small-world architecture," Neurocomputing, vol. 38-40, 2001.
[7] L. Calcraft, R. Adams, and N. Davey, "Locally-connected and small-world associative memories in large networks," Neural Information Processing - Letters and Reviews, vol. 10, 2006.
[8] D. Chklovskii, "Synaptic connectivity and neuronal morphology: two sides of the same coin," Neuron, vol. 43, 2004.
[9] G. Mitchison, "Neuronal branching patterns and the economy of cortical wiring," Proceedings: Biological Sciences, vol. 245, 1991.
[10] D. Attwell and S. Laughlin, "An energy budget for signaling in the grey matter of the brain," Journal of Cerebral Blood Flow and Metabolism, vol. 21, 2001.
[11] N. Davey, S. P. Hunt, and R. G. Adams, "High capacity recurrent associative memories," Neurocomputing, vol. 62, 2004.
[12] L. Calcraft, "Measuring the performance of associative memories," University of Hertfordshire Technical Report (42), May 2005.
[13] N. Davey, B. Christianson, and R. Adams, "High capacity associative memories and small world networks," Proceedings of the IEEE International Joint Conference on Neural Networks, 2004.
[14] L. Calcraft, R. Adams, and N. Davey, "Efficient architectures for sparsely-connected high capacity associative memory models," Connection Science, vol. 19, 2007.
[15] B. Hellwig, "A quantitative analysis of the local connectivity between pyramidal neurons in layers 2/3 of the rat visual cortex," Biological Cybernetics, vol. 82, 2000.
