Optimal connection strategies in one- and two-dimensional associative memory models

Lee Calcraft, Rod Adams, and Neil Davey
School of Computer Science, University of Hertfordshire, College Lane, Hatfield, Hertfordshire AL10 9AB, United Kingdom
E-mail: l.calcraft@herts.ac.uk
(Submitted 31 May, 2007)

Abstract

This study examines the performance of sparsely-connected associative memory models built using a number of different connection strategies, applied to one- and two-dimensional topologies. Efficient patterns of connectivity are identified which yield high performance at relatively low wiring costs in both topologies. It is found that two-dimensional models are more tolerant of variations in connection strategy than their one-dimensional counterparts, though networks built with both topologies become less so as their connection density is decreased.

Keywords: Associative memory models, efficient connection strategies, sparse connectivity, comparing 1D and 2D topologies

1. Introduction

Our studies of sparsely-connected one-dimensional associative memory models [1, 2], initially inspired by the work of Watts and Strogatz [3] on the small-world properties of sparsely-connected systems, demonstrated the importance of the pattern of connectivity between nodes in determining network performance. In a small step towards biological plausibility, we extend our studies to encompass two-dimensional networks. Our associative memory models now represent a 2D substrate of sparsely-connected neurons with a connection density of 0.1 or 0.01. We will compare the performance of different connection strategies in our 2D networks with results obtained from earlier work using a 1D arrangement. This should prove instructive, since 1D treatments of associative memory do not tend to establish to what extent their findings are applicable to more biologically-plausible topologies [4-7]. In this pursuit we acknowledge of course that this study falls short of a full 3D treatment, which would require more processing power than is currently available to us.

As with our earlier 1D work, our 2D studies will focus on exploring connection strategies which achieve good pattern-completion for a minimum wiring length. We are encouraged in this pursuit by recent studies which suggest the importance of wiring optimisation in nature, from the point of view of the cortical volume taken up by axons and dendrites, the delays and attenuation imposed by long-distance connections, and the metabolic requirements of the connective tissue [8-10]. A connection strategy which minimises wiring length without impacting upon network performance could potentially mitigate these unwanted costs. It is the goal of the present work to identify such strategies, and to compare their realisations in 1D and 2D networks.

2. Network Dynamics and Training

Each unit in our networks is a simple, bipolar, threshold device, summing its net input and firing deterministically. The net input, or local field, of a unit is given by:

$$h_i = \sum_j w_{ij} S_j$$

where $S_j \in \{\pm 1\}$ is the current state of unit $j$, and $w_{ij}$ is the weight on the connection from unit $j$ to unit $i$. The dynamics of the network is given by the standard update:

$$S_i' = \begin{cases} 1 & \text{if } h_i > 0 \\ -1 & \text{if } h_i < 0 \\ S_i & \text{if } h_i = 0 \end{cases}$$

where $S_i'$ is the new state of $S_i$. Unit states may be updated synchronously or asynchronously. Here we use asynchronous, random order updates.
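As an illustration, the relaxation dynamics can be written in a few lines of Python (a minimal sketch, assuming a weight matrix W of shape N x N, a +/-1 state vector S, and a numpy Generator rng such as np.random.default_rng(0); this is not the code used in the experiments):

    import numpy as np

    def asynchronous_update(W, S, rng):
        # One sweep of asynchronous, random-order updates: each unit fires
        # deterministically on the sign of its local field h_i = sum_j w_ij S_j,
        # keeping its previous state when the field is exactly zero.
        changed = False
        for i in rng.permutation(len(S)):
            h = W[i] @ S
            new = S[i] if h == 0 else (1 if h > 0 else -1)
            if new != S[i]:
                S[i] = new
                changed = True
        return S, changed

    def converge(W, S, rng, max_sweeps=100):
        # Relax to a fixed point; a stored pattern is a fundamental memory
        # exactly when it is such a fixed point.
        for _ in range(max_sweeps):
            S, changed = asynchronous_update(W, S, rng)
            if not changed:
                break
        return S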

If a training pattern, $\xi^\mu$, is one of the fixed points of the network, then it is successfully stored and is said to be a fundamental memory. Given a training set $\{\xi^\mu\}$, the training algorithm is designed to drive the local fields of each unit to the correct side of a learning threshold, $T$, for all the training patterns. This is equivalent to requiring that:

$$\forall i, \mu: \quad h_i^{\mu} \xi_i^{\mu} \ge T$$

So the learning rule is given by:

    Begin with a zero weight matrix
    Repeat until all local fields are correct
        Set the state of the network to one of the $\xi^p$
        For each unit, $i$, in turn:
            Calculate $h_i^p \xi_i^p$. If this is less than $T$ then change the
            weights on connections into unit $i$ according to:
                $w_{ij} = w_{ij} + \dfrac{C_{ij}\,\xi_i^p \xi_j^p}{k}$ for all $j$

where $\{C_{ij}\}$ is the connection matrix. The form of the update is such that changes are only made on the weights that are actually present in the connectivity matrix $\{C_{ij}\}$ (where $C_{ij} = 1$ if $w_{ij}$ is present, and 0 otherwise), and the learning rate is inversely proportional to the number of connections per unit, $k$. Earlier work has established that a learning threshold $T = 10$ gives good results [11].

3. Measuring Performance

The ability to store patterns is not the only functional requirement of an associative memory: fundamental memories should also act as attractors in the state space of the dynamic system resulting from the recurrent connectivity of the network, so that pattern correction can take place. To measure this we use the Effective Capacity of the network, EC [7, 12]. The Effective Capacity of a network is a measure of the maximum number of patterns that can be stored in the network with reasonable pattern correction still taking place. We take a fairly arbitrary definition of reasonable as correcting the addition of 60% noise to within an overlap of 95% with the original fundamental memory. Varying these figures gives differing values for EC, but the values with these settings are robust for comparison purposes. For large fully-connected networks the EC value is proportional to N, the total number of nodes in the network, and has a value of approximately 0.1 of the maximum theoretical capacity of the network. For large sparse locally-connected networks, EC is proportional to the number of connections per node, while with other architectures it is dependent upon the actual connection matrix C. The Effective Capacity of a particular network is determined as follows:

    Initialise the number of patterns, P, to 0
    Repeat
        Increment P
        Create a training set of P random patterns
        Train the network
        For each pattern in the training set
            Degrade the pattern randomly by adding 60% of noise
            With this noisy pattern as start state, allow the network to converge
            Calculate the overlap of the final network state with the original pattern
        EndFor
        Calculate the mean pattern overlap over all final states
    Until the mean pattern overlap is less than 95%
    The Effective Capacity is P - 1
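The training and evaluation procedures can be made concrete in Python (an illustrative reading of the pseudocode above, not the authors' code; it reuses converge() from the earlier dynamics sketch, and both the batch-style weight update and the interpretation of "adding 60% noise" as randomising each bit with probability 0.6 are our assumptions):

    import numpy as np

    def train(C, patterns, T=10, max_epochs=1000):
        # Perceptron-style rule from Section 2. C is the 0/1 connection matrix
        # (C[i, j] = 1 if unit i receives a connection from unit j, diagonal 0),
        # patterns is a (P, N) array of +/-1 training patterns. max_epochs caps
        # the loop, since above capacity the rule need not converge.
        P, N = patterns.shape
        k = int(C.sum(axis=1).max())          # connections per unit
        W = np.zeros((N, N))
        for _ in range(max_epochs):
            all_correct = True
            for xi in patterns:
                h = W @ xi
                wrong = (h * xi) < T          # aligned local field below threshold
                if wrong.any():
                    all_correct = False
                    # w_ij += C_ij * xi_i * xi_j / k on the offending units' rows
                    # (a batch variant: fields are computed once per pattern)
                    W[wrong] += C[wrong] * np.outer(xi, xi)[wrong] / k
            if all_correct:
                break
        return W

    def effective_capacity(C, rng, noise=0.6, criterion=0.95):
        # Largest P for which degraded patterns are restored to a mean
        # overlap of at least 95% with the originals.
        N = C.shape[0]
        P = 0
        while True:
            P += 1
            patterns = rng.choice(np.array([-1, 1]), size=(P, N))
            W = train(C, patterns)
            overlaps = []
            for xi in patterns:
                probe = xi.copy()
                mask = rng.random(N) < noise   # randomise ~60% of the bits
                probe[mask] = rng.choice(np.array([-1, 1]), size=int(mask.sum()))
                final = converge(W, probe, rng)
                overlaps.append(final @ xi / N)
            if np.mean(overlaps) < criterion:
                return P - 1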

4. Network Architecture

The networks discussed here are based on one- and two-dimensional lattices of N nodes with periodic boundary conditions. Thus the 1D networks take the physical form of a ring, and the 2D implementations that of a torus. The networks are sparse, in that the input of each node is connected to a relatively small but fixed number, k, of other nodes. The main 2D networks examined consist of 4900 nodes arranged in a 70 x 70 array, with 49 afferent (incoming) connections per node, giving a connection density of 0.01; and of 484 nodes arranged in a 22 x 22 array, with 48 afferent connections per node, giving a connection density of 0.1. The 1D networks consist of 5000 nodes and of 500 nodes, both with 50 connections per node, again giving connection densities of 0.01 and 0.1, respectively. All references to spacing refer to the distance between nodes around the ring in the case of the 1D network, and across the surface of the torus in the 2D case.

Figure 1a. 1D sparsely-connected network with 14 nodes, and 4 afferent connections per node, illustrating the connections to a single node: left, locally-connected; right, after rewiring.

Figure 1b. 2D sparsely-connected network with 64 nodes, and 8 afferent connections per node, illustrating the connections to a single node: left, locally-connected; right, after rewiring.

We have already established for a 1D network that purely local connectivity results in networks with low wiring length, but with poor pattern-completion performance, while randomly-connected networks perform well, but have high wiring costs [1]. In a search for a compromise between these two extremes we will examine three different connection strategies here, applying them to both 1D and 2D networks (a sketch of each construction appears after this list):

Progressively rewired. This is based on the strategy introduced by Watts and Strogatz [3] for generating small-world networks, and applied to a one-dimensional associative memory by Bohland and Minai [6], and subsequently by Davey et al [13]. A locally-connected network is set up, and a fraction of the afferent connections to each node is rewired to other randomly-selected nodes. See figure 1a. It is found that rewiring a one-dimensional network in this way improves communication throughout the network, and that as the degree of rewiring is increased, pattern completion progressively improves, up to the point where about half the connections have been rewired. Beyond this point, further rewiring seems to have little effect [6].

Gaussian. Here the network is set up in such a way that the probability of a connection between any two nodes separated by a distance d is proportional to

$$\frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{(d-1)^2}{2\sigma^2}\right)$$

where d is defined as the distance between nodes, and lies in the range $1 \le d < N/2$. Network performance is tested for a wide range of values of $\sigma$.

Exponential. In this case the network is set up in such a way that the probability of a connection between any two nodes separated by a distance d (where $1 \le d < N/2$) is proportional to

$$\exp(-\lambda(d-1))$$

Networks are tested over a wide range of $\lambda$.
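The three constructions can be sketched as follows for the 1D (ring) case (illustrative Python under our assumptions, not the authors' code; the 2D case is identical except that ring distance is replaced by distance across the torus):

    import numpy as np

    def ring_distance(N):
        # Matrix of distances between nodes around a periodic ring.
        idx = np.arange(N)
        d = np.abs(idx[:, None] - idx[None, :])
        return np.minimum(d, N - d)

    def local_connections(N, k):
        # Each node receives afferents from its k nearest ring neighbours.
        d = ring_distance(N)
        C = np.zeros((N, N), dtype=int)
        for i in range(N):
            nearest = np.argsort(d[i])[1:k + 1]   # skip self at distance 0
            C[i, nearest] = 1
        return C

    def rewire(C, fraction, rng):
        # Watts-Strogatz-style progressive rewiring of afferent connections:
        # a given fraction of each node's incoming links is moved to
        # randomly-selected nodes, avoiding self- and duplicate connections.
        C = C.copy()
        N = C.shape[0]
        for i in range(N):
            targets = np.flatnonzero(C[i])
            n_rewire = int(round(fraction * len(targets)))
            for j in rng.permutation(targets)[:n_rewire]:
                candidates = np.flatnonzero((C[i] == 0) & (np.arange(N) != i))
                C[i, j] = 0
                C[i, rng.choice(candidates)] = 1
        return C

    def distance_profile_connections(N, k, weight, rng):
        # Sample k afferents per node with probability proportional to weight(d).
        d = ring_distance(N)
        C = np.zeros((N, N), dtype=int)
        for i in range(N):
            p = weight(np.maximum(d[i], 1).astype(float))
            p[i] = 0.0                            # no self-connection
            p /= p.sum()
            C[i, rng.choice(N, size=k, replace=False, p=p)] = 1
        return C

    # Gaussian and exponential distance profiles from the text:
    gaussian = lambda sigma: (lambda d: np.exp(-((d - 1) ** 2) / (2 * sigma ** 2)))
    exponential = lambda lam: (lambda d: np.exp(-lam * (d - 1)))

    # Example (parameter values are illustrative only):
    # rng = np.random.default_rng(0)
    # C = distance_profile_connections(5000, 50, gaussian(60.0), rng)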

5. Results and Discussion

5.1 Progressive rewiring

This connection strategy was introduced by Watts and Strogatz as a way to move in a controlled manner from a locally-connected network to a random one, and as discussed earlier, it involves the progressive rewiring of a locally-connected network to randomly-chosen connection sites. See figure 1. The results of applying this procedure in 1D and 2D networks of similar size are shown in figure 2. The networks are initially built with local-only connections, and their Effective Capacity is measured as the network is rewired in steps of 10%, until all connections have been rewired, at which point the network is randomly connected. As may be seen, both networks behave similarly, improving in pattern-completion performance as the rewiring is increased, up to around 40 or 50% rewiring, after which little further improvement is apparent. This echoes the results reported by Bohland and Minai [6] for a 1D network. There is, however, an important difference between the performance of the 1D and 2D networks here, since although both achieve the same Effective Capacity of 23 when fully rewired, their performances are very different when connected locally (i.e. when the rewiring is zero). In this configuration the 1D network has an Effective Capacity of 6 patterns, while the 2D network successfully recalls 12.

[Figure 2 plot: Effective Capacity (patterns) against degree of rewiring (%); curves for the 1D and 2D networks.]

Figure 2. Effective Capacity vs degree of rewiring for a 1D network with 5000 units and 50 incoming connections per node, and a 2D network with 4900 units and 49 incoming connections per node. The 1D local network has an EC of just 6, while in the 2D network it is a much healthier 12. Once rewiring has reached around 40 or 50% there is little further improvement in performance.

In seeking an explanation for this considerable improvement when moving from the 1D network to the 2D representation, we would point to two aspects of the network which change as the dimensionality is changed. Firstly, the degree of clustering (the extent to which nodes connected to any given node are also connected to each other) decreases from 0.73 to 0.53 as we move from 1D to 2D in the above locally-connected networks; and we have previously found that very tightly clustered networks perform badly as associators [14]. Secondly, there is an improvement in communication across the network as we increase dimensionality. In the 1D network it takes a maximum of 99 steps to pass data between the furthest-separated nodes, whereas in its 2D counterpart this has dramatically dropped to just 9 steps; or translated into terms of characteristic path length [3], the 1D network has a mean minimum path length of 48, while in the 2D network this drops to 6.5. Both measures can be computed directly from the connection matrix, as sketched below. We would also speculate that in a 3D implementation, a locally-connected network might perform even better.

The significant improvement in local performance experienced when moving from 1D to 2D networks has considerable implications when searching for optimal patterns of connectivity. The reason for this is that, since in the 2D topology there is a much smaller difference between the best and the worst performing architectures, the rewards for using optimum patterns of connectivity will be correspondingly less, and we would speculate that this is likely to be even more significant in 3D networks.
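Both graph-theoretic measures quoted above can be obtained from the connection matrix; a minimal sketch using the networkx library (an assumption on our part: any graph library with clustering and shortest-path routines would serve equally well):

    import networkx as nx
    import numpy as np

    def graph_measures(C):
        # Clustering coefficient and characteristic path length of the
        # connectivity graph, treating connections as undirected edges.
        G = nx.from_numpy_array(np.maximum(C, C.T))
        clustering = nx.average_clustering(G)
        path_length = nx.average_shortest_path_length(G)  # assumes G connected
        return clustering, path_length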

5.2 Optimal architectures in networks of connection density 0.01

In order to compare the performance of other connection strategies with that of progressively-rewired networks, we measured the Effective Capacity of networks whose patterns of connectivity were based on Gaussian and exponential probability distributions of varying $\sigma$ and $\lambda$. The Effective Capacity of all three network types (Gaussian, exponential and progressively-rewired) was then plotted against the mean wiring length of the corresponding networks, providing us with an efficient way to evaluate pattern-completion performance and corresponding wiring costs (a sketch of the wiring-length calculation follows at the end of this section). Figure 3a shows the results for a 1D network of 5000 nodes with 50 connections per node, while figure 3b depicts a 2D network of 4900 nodes with 49 connections per node.

[Figure 3a plot: Effective Capacity (patterns) against mean wiring length; curves for the three architectures on the 1D network.]

Figure 3a. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 1D network with 5000 nodes and 50 connections per node. Note that the leftmost point on the rewired plot corresponds to a local-only network (zero rewiring), and the rightmost to a random network (100% rewiring). Results are averages over 5 runs.

[Figure 3b plot: Effective Capacity (patterns) against mean wiring length; curves for the three architectures on the 2D network.]

Figure 3b. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 2D network with 4900 nodes and 49 connections per node. Again the leftmost point on the rewired plot corresponds to a local-only network, and the rightmost to a random network. Results are averages over 5 runs.

We can see from this that in both the 1D and the 2D networks, all three architectures achieve a maximum pattern-completion performance of around 23 patterns. In both topologies the Gaussian and exponential architectures achieve this at a considerably lower mean wiring length than the progressively-rewired networks. But, largely because of the better performance of the local network in the 2D topology, the differences are not so large in the 2D network. Thus, comparing network configurations which achieve an Effective Capacity of 20 (a high value at a relatively low mean wiring length), using a Gaussian architecture in the 1D network would use only one quarter of the wiring of the equivalent progressively-rewired network. In the case of the 2D network, the corresponding saving in wiring drops to a half. Clearly, however, this is still far from a trivial saving, and the fact that connectivity between neurons in the cortex is believed to follow a Gaussian architecture [15] (i.e. the probability of any two neurons being connected decreases with distance according to a Gaussian distribution) bears witness to the continuing benefits of this architecture in real 3D systems.
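Mean wiring length is simply the mean distance spanned by the connections present. A short sketch for both topologies (illustrative Python; the paper does not state the 2D distance metric, so Euclidean distance across the torus is our assumption):

    import numpy as np

    def mean_wiring_length_ring(C):
        # Mean ring distance spanned by the connections present in C.
        N = C.shape[0]
        idx = np.arange(N)
        d = np.abs(idx[:, None] - idx[None, :])
        d = np.minimum(d, N - d)                  # periodic wrap-around
        return d[C == 1].mean()

    def mean_wiring_length_torus(C, side):
        # Mean Euclidean distance across a side x side torus, with node n
        # placed at grid position (n // side, n % side).
        N = C.shape[0]
        assert N == side * side
        pos = np.stack(np.divmod(np.arange(N), side), axis=1)
        diff = np.abs(pos[:, None, :] - pos[None, :, :])
        diff = np.minimum(diff, side - diff)      # periodic wrap-around
        d = np.sqrt((diff ** 2).sum(axis=-1))
        return d[C == 1].mean()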

5.3 Optimal architectures in networks of connection density 0.1

In our 1D studies using networks of connection density 0.1 we reported that the differences between the rewired network and the Gaussian and exponential distributions were noticeably less than at the lower connection density of 0.01 [1], but that differences were still in evidence. Once we move to a 2D topology, however, we see that whilst there continues to be a noticeable difference in performance between the rewired network and the Gaussian and exponential distributions at the lower, 0.01, connection density, this effectively disappears at a connection density of 0.1. See figure 4, which illustrates the performance of a 1D network of 500 nodes, with 50 connections per node, and a 2D network with 484 nodes, and 48 connections per node.

[Figure 4a plot: Effective Capacity (patterns) against mean wiring length; curves for the three architectures on the 1D network.]

Figure 4a. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 1D network with 500 nodes and 50 connections per node. Results are averages over 5 runs.

[Figure 4b plot: Effective Capacity (patterns) against mean wiring length; curves for the three architectures on the 2D network.]

Figure 4b. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 2D network with 484 nodes and 48 connections per node. Results are averages over 5 runs.

However, the 2D network on which we are basing this conclusion differs from our low connection density 2D network in not one, but two respects. Its connection density is indeed ten times greater, at 0.1, but the total size of the network is also smaller by a similar factor. Thus it is not yet clear to what extent the merging of performance of the different architectures seen in the 484-node 2D network is the result of the higher connection density used here (0.1 against 0.01), or whether it is due to the smaller size of the network.

In an attempt to distinguish between these two factors, we have repeated the experiment for the 2D network at a size of 4900 units, with 490 connections per node, thus retaining the higher connection density of 0.1, but increasing the network size to that used in the lower connection density experiments. The results appear in figure 5.

[Figure 5 plot: Effective Capacity (patterns) against mean wiring length; curves for the three architectures on the larger, denser 2D network.]

Figure 5. Effective Capacity vs wiring length for Gaussian, exponential and progressively-rewired architectures on a 2D network with 4900 nodes and 490 connections per node. Results are averages over 5 runs.

Clearly, there is again very little to choose in terms of performance between the three architectures, and we must conclude that in 2D associative memory models with connection densities of 0.1 and above, whether the pattern of connectivity is based on a Gaussian or exponential probability distribution, or whether a progressively-rewired local network is used, the choice will have very little influence on the pattern-completion performance of the network, or the amount of wiring used. However, the particular parameters which we adopt (the value of $\sigma$ for a Gaussian distribution, or of $\lambda$ for an exponential, or the degree of rewiring used) will still have considerable influence on performance. These parameters will determine the operating point of our network along the curve in figure 5. At the left-hand end of the curve, a completely local network will give us an Effective Capacity of around 15 patterns, at a mean wiring length of around 8. At the right-hand end we obtain an Effective Capacity approaching 20 patterns at a mean wiring length of between 20 and 30. By contrast, in networks with a connection density of 0.01, the Gaussian and exponential architectures are clearly better performers than the progressively-rewired network, and because of the relatively steep rise in the Effective Capacity against mean wiring length curves for these architectures, it is easier to select an operating point along the curve which has both a high Effective Capacity and a low mean wiring length.

6. Conclusion

Using high capacity associative memory models we have examined the pattern-completion performance and corresponding wiring costs of networks based on a number of different connection strategies, built with a 1D topology. All experiments were repeated for similar networks built with a 2D topology, and comparisons drawn between the two sets of results.

In our first set of experiments we compared the performance of 1D and 2D networks of similar size, as they were progressively rewired from a state of local-only connectivity to a state of fully random connectivity. It was found that although both topologies yielded the same results in the case of random connectivity (as must be the case), there were important differences when connectivity was purely local. In this case the 2D network was able to recall twice the number of patterns achieved by the 1D network. It was suggested that this may be the consequence both of the decrease in clustering, and of the much improved communication between distant nodes in the 2D network. It was also suggested that for similar reasons, a 3D network might show even more pronounced effects.

We then compared plots of Effective Capacity against mean wiring length for Gaussian, exponential and progressively-rewired networks. Our initial tests used a connection density of 0.01.
In both the 1D and 2D topologies the Gaussian and exponential networks consistently outperformed the progressively-rewired networks, though in moving from a 1D to a 2D topology, the benefits of using Gaussian or exponential connectivity were less pronounced.

In networks of connection density 0.1 it was found that the small advantages of using Gaussian or exponential patterns of connectivity over the progressively-rewired network in the 1D topology all but disappeared in the 2D networks. Thus, while 2D associative memory models appear to be more tolerant of variations in connection strategy than their 1D counterparts, networks of both types become less so as their connection density is decreased. In future work we will investigate whether these findings are also valid for networks in which the point of axonal arborisation is displaced a finite distance from the presynaptic node.

References

[1] L. Calcraft, R. Adams, and N. Davey, "Gaussian and exponential architectures in small-world associative memories," Proceedings of ESANN 2006: 14th European Symposium on Artificial Neural Networks. Advances in Computational Intelligence and Learning, pp. 617-622, 2006.
[2] L. Calcraft, R. Adams, and N. Davey, "High performance associative memory models with low wiring costs," Proceedings of the 3rd IEEE Conference on Intelligent Systems, University of Westminster, 4-6 September 2006, pp. 612-616, 2006.
[3] D. Watts and S. Strogatz, "Collective dynamics of 'small-world' networks," Nature, vol. 393, pp. 440-442, 1998.
[4] P. McGraw and M. Menzinger, "Topology and computational performance of attractor neural networks," Physical Review E, vol. 68, 047102, 2003.
[5] F. Emmert-Streib, "Influence of the neural network topology on the learning dynamics," Neurocomputing, vol. 69, pp. 1179-1182, 2006.
[6] J. Bohland and A. Minai, "Efficient associative memory using small-world architecture," Neurocomputing, vol. 38-40, pp. 489-496, 2001.
[7] L. Calcraft, R. Adams, and N. Davey, "Locally-connected and small-world associative memories in large networks," Neural Information Processing - Letters and Reviews, vol. 10, pp. 19-26, 2006.
[8] D. Chklovskii, "Synaptic connectivity and neuronal morphology: two sides of the same coin," Neuron, vol. 43, pp. 609-617, 2004.
[9] G. Mitchison, "Neuronal branching patterns and the economy of cortical wiring," Proceedings: Biological Sciences, vol. 245, pp. 151-158, 1991.
[10] D. Attwell and S. Laughlin, "An energy budget for signaling in the grey matter of the brain," Journal of Cerebral Blood Flow and Metabolism, vol. 21, pp. 1133-1145, 2001.
[11] N. Davey, S. P. Hunt, and R. G. Adams, "High capacity recurrent associative memories," Neurocomputing, vol. 62, pp. 459-491, 2004.
[12] L. Calcraft, "Measuring the performance of associative memories," University of Hertfordshire Technical Report (420), May 2005.
[13] N. Davey, B. Christianson, and R. Adams, "High capacity associative memories and small world networks," Proceedings of the IEEE International Joint Conference on Neural Networks, 2004.
[14] L. Calcraft, R. Adams, and N. Davey, "Efficient architectures for sparsely-connected high capacity associative memory models," Connection Science, vol. 19, 2007.
[15] B. Hellwig, "A quantitative analysis of the local connectivity between pyramidal neurons in layers 2/3 of the rat visual cortex," Biological Cybernetics, vol. 82, pp. 111-121, 2000.