Improving the Interpretability of Data-Driven Evolving Fuzzy-Systems


Edwin Lughofer, Johannes Kepler University Linz
Eyke Hüllermeier, Otto-von-Guericke-University Magdeburg
Erich Peter Klement, Johannes Kepler University Linz

Abstract

This paper develops methods for reducing the complexity and, thereby, improving the linguistic interpretability of Takagi-Sugeno fuzzy systems that are learned online in a data-driven, incremental way. In order to ensure the transparency of the evolving fuzzy system at any time, complexity reduction must be performed in an online mode as well. Our methods are evaluated on high-dimensional data coming from an industrial measuring process.

Keywords: Incremental learning, evolving fuzzy systems, complexity reduction.

1 Introduction

Takagi-Sugeno (TS) fuzzy systems [10] play an important role in system modelling and identification, as they combine potentially high approximation accuracy [11] with linguistic interpretability. Recently, [5] has pointed out the importance of identifying TS fuzzy systems in a data-driven, online manner. Among other things, this requires incremental learning methods that update a model whenever new observations have been made, referring only to the current model and the new data but not to old observations. In contrast to batch learning algorithms, which train a model (from scratch) using all of the data observed so far (e.g. [1]), incremental methods guarantee fast training of a model and thus qualify for online applications.

When learning fuzzy models in a data-driven way, the focus is usually on high approximation accuracy. Unfortunately, accurate models are often complex at the same time, so aspects of transparency and readability necessarily suffer [5]. This motivates the consideration of methods for reducing the complexity and, thereby, improving the interpretability of fuzzy models. In this paper, corresponding methods will be developed for evolving TS fuzzy systems. Even though we shall focus on FLEXFIS [6, 5], a specific variant for incremental learning of TS models, our methods are of a more general nature. In principle, it should be possible to use them, perhaps in a slightly modified way, for other approaches as well.

FLEXFIS implements the Fuzzy Basis Function Network architecture, a special case of a TS fuzzy system with multi-dimensional input \vec{x} = (x_1, ..., x_p) and a single output variable y. The model output is given by

    \hat{f}(\vec{x}) = \hat{y} = \sum_{i=1}^{C} l_i(\vec{x}) \, \Psi_i(\vec{x})    (1)

where C is the number of rules, and the basis functions \Psi_i are normalized Gaussian kernels:

    \Psi_i(\vec{x}) = \frac{\exp\left(-\frac{1}{2} \sum_{j=1}^{p} \frac{(x_j - c_{ij})^2}{\sigma_{ij}^2}\right)}{\sum_{k=1}^{C} \exp\left(-\frac{1}{2} \sum_{j=1}^{p} \frac{(x_j - c_{kj})^2}{\sigma_{kj}^2}\right)}    (2)

Moreover, the consequent functions are defined as

    l_i(\vec{x}) = w_{i0} + w_{i1} x_1 + w_{i2} x_2 + ... + w_{ip} x_p.    (3)

Note that a multi-dimensional Gaussian kernel represents the premise part of a fuzzy rule, the antecedents of which (one-dimensional Gaussian fuzzy sets) are combined by means of the product t-norm.
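
To make the model structure concrete, here is a minimal Python sketch (not part of the original paper; the function name fbfn_output and the array shapes are our own assumptions) that evaluates equations (1)-(3) for given centers, widths, and consequent weights:

```python
import numpy as np

def fbfn_output(x, centers, widths, weights):
    """Evaluate a Fuzzy Basis Function Network, cf. eqs. (1)-(3).

    x       : input vector, shape (p,)
    centers : rule centers c_ij, shape (C, p)
    widths  : rule widths sigma_ij, shape (C, p)
    weights : consequent parameters w_i0, ..., w_ip, shape (C, p + 1)
    """
    # Unnormalized Gaussian rule activations (product t-norm over the p dimensions).
    act = np.exp(-0.5 * np.sum(((x - centers) / widths) ** 2, axis=1))
    # Normalized basis functions Psi_i, eq. (2).
    psi = act / np.sum(act)
    # Affine consequents l_i(x) = w_i0 + sum_j w_ij * x_j, eq. (3).
    l = weights[:, 0] + weights[:, 1:] @ x
    # Weighted aggregation of the rule consequents, eq. (1).
    return float(np.sum(psi * l))
```
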

Learning a fuzzy model as defined above means fitting all of its parameters to the data: the number of rules (C), the centers (c_{ij}) and widths (\sigma_{ij}) of the multivariate Gaussian kernels, and the parameters appearing in the rule consequents as output weights (w_{i0}, w_{i1}, ..., w_{ip}). As mentioned above, these parameter estimation problems have to be solved in an incremental manner.

2 Basics of FLEXFIS

FLEXFIS consists of two main components, one for estimating the parameters that specify the antecedent parts of the rules in model (1) and one for updating the parameters in the rule consequents. These two problems are handled separately, because they are of a quite different nature: the first one is inherently nonlinear, while the second one can be approached by linear regression techniques.

To solve the first problem, FLEXFIS exploits vector quantization in an incremental mode, combined with the idea of ART networks: whenever a new observation (data point) \vec{x} arrives, its distance to all existing cluster centers is compared to a so-called vigilance parameter \rho. The observation initializes a new cluster if all distances are larger than \rho. Otherwise, it is assigned to the nearest cluster center, \vec{c}_{win}, which is then updated as follows:

    \vec{c}_{win}^{(new)} = \vec{c}_{win}^{(old)} + \eta \, (\vec{x} - \vec{c}_{win}^{(old)}),    (4)

with \eta being the learning rate; the latter decreases with the number k_{win} of data points lying nearest to \vec{c}_{win} (this number is simply updated through counting). Moreover, the widths \sigma_{win,j} are adapted by exploiting a recursive variance formula:

    k_{win} \, \sigma_{win,j}^{2\,(new)} = (k_{win} - 1) \, \sigma_{win,j}^{2\,(old)} + k_{win} \, (\Delta c_{win,j})^2 + (c_{win,j} - x_j)^2,    (5)

where \Delta c_{win,j} is the distance between the jth entries of the vectors \vec{c}_{win}^{(old)} and \vec{c}_{win}^{(new)}.

Each new cluster gives rise to a new rule, i.e., there is a one-to-one correspondence between clusters and rules: the fuzzy sets that appear in the antecedent part of a rule are obtained by projecting a cluster onto the various dimensions of the input space, and these fuzzy sets are connected with the product t-norm, see (2). A new cluster, consisting of a single point \vec{x}, is initialized with this point as its center; moreover, the width in the jth dimension is set to \epsilon \, (u_j - l_j), where \epsilon is a small constant and [l_j, u_j] the range of the jth input variable.
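
As an illustration only (the helper name vq_update, the array layout, and the learning-rate schedule \eta = 1/k_{win} are assumptions; the paper only states that \eta decreases with k_{win}), a sketch of this incremental winner update might look as follows:

```python
import numpy as np

def vq_update(x, centers, widths, counts, rho, eps, lower, upper):
    """One incremental vector-quantization step, cf. eqs. (4)-(5).

    x            : new data point, shape (p,)
    centers      : cluster centers, shape (C, p)
    widths       : cluster widths sigma, shape (C, p)
    counts       : number of points won by each cluster, shape (C,)
    rho          : vigilance parameter
    eps          : small constant for the widths of newly created clusters
    lower, upper : per-dimension ranges [l_j, u_j], each shape (p,)
    """
    dists = np.linalg.norm(centers - x, axis=1)
    if dists.size == 0 or dists.min() > rho:
        # All distances exceed the vigilance rho: open a new cluster (= new rule).
        centers = np.vstack([centers, x])
        widths = np.vstack([widths, eps * (upper - lower)])
        counts = np.append(counts, 1)
        return centers, widths, counts

    win = int(dists.argmin())
    counts[win] += 1
    k = counts[win]
    eta = 1.0 / k                                  # assumed decreasing learning rate
    delta = eta * (x - centers[win])               # shift of the winning center, eq. (4)
    new_center = centers[win] + delta
    # Recursive variance update of the widths, eq. (5).
    widths[win] = np.sqrt(((k - 1) * widths[win] ** 2 + k * delta ** 2
                           + (new_center - x) ** 2) / k)
    centers[win] = new_center
    return centers, widths, counts
```
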
Updating the parameters in the rule consequents is accomplished by means of a recursive weighted least squares (RWLS) estimation. In our approach, the parameters are estimated for each rule individually, which turned out to have several advantages [5]. This is why the weighted version of the well-known RLS estimation [4] has to be used: the influence of a data point on the parameter estimation is proportional to its membership in the corresponding cluster. Finally, in order to improve the adaptation to non-steady functional relationships that change over time, a forgetting factor \lambda is used that limits the influence of old observations. This leads to the following update scheme for the linear consequent parameters:

    \hat{\vec{w}}_i(k+1) = \hat{\vec{w}}_i(k) + \gamma(k) \left( y(k+1) - \vec{r}^T(k+1) \, \hat{\vec{w}}_i(k) \right)    (6)

    \gamma(k) = \frac{P_i(k) \, \vec{r}(k+1)}{\frac{\lambda}{\Psi_i(\vec{x}(k+1))} + \vec{r}^T(k+1) \, P_i(k) \, \vec{r}(k+1)}    (7)

    P_i(k+1) = \left( I - \gamma(k) \, \vec{r}^T(k+1) \right) P_i(k) \, \frac{1}{\lambda}    (8)

with P_i(k) = (R_i(k)^T Q_i(k) R_i(k))^{-1} the weighted inverse Hessian matrix and \vec{r}(k+1) = [1, x_1(k+1), x_2(k+1), ..., x_p(k+1)]^T the regressor vector of the (k+1)th data point (which is identical for all C rules).

Note that incremental learning of the rule consequents by RWLS actually assumes stable rule premises. However, since the premise parts are adapted as well, the RWLS estimates would have to be corrected. Even though this can in principle be done by incorporating corresponding correction terms, it is not possible in online training, since these terms cannot be derived incrementally. Still, it can be shown that, when the correction terms are omitted (set to 0), the algorithm still converges for a typical parameter adjustment, especially with respect to \eta in (4), and delivers a suboptimal solution in the least squares sense [5, 6].
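
A sketch of this local RWLS update for a single rule i (illustrative only; the function name and the default forgetting factor of 0.99 are assumptions, and the correction terms discussed above are omitted, as in the algorithm itself):

```python
import numpy as np

def rwls_step(w, P, x, y, psi_i, lam=0.99):
    """One recursive weighted least squares update for rule i, cf. eqs. (6)-(8).

    w     : current consequent parameters of rule i, shape (p + 1,)
    P     : current weighted inverse Hessian matrix, shape (p + 1, p + 1)
    x     : new input vector, shape (p,)
    y     : new target value
    psi_i : normalized membership Psi_i(x) of the new point in rule i
    lam   : forgetting factor lambda
    """
    r = np.concatenate(([1.0], x))                           # regressor [1, x_1, ..., x_p]
    gamma = P @ r / (lam / psi_i + r @ P @ r)                # eq. (7)
    w_new = w + gamma * (y - r @ w)                          # eq. (6)
    P_new = (np.eye(len(r)) - np.outer(gamma, r)) @ P / lam  # eq. (8)
    return w_new, P_new

# A new rule is typically started with w = 0 and P = alpha * I for a large alpha (cf. Section 3.1).
```
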

3 Improving Interpretability

The main objective of FLEXFIS as outlined above is to learn highly accurate models. The aspect of interpretability, on the other hand, has been neglected so far. Consequently, there is a high danger of obtaining models that are hardly more understandable than black-box models such as, e.g., neural networks. The strategies and algorithms proposed in this section are meant to overcome this problem, at least to some extent.

3.1 Interpretability of Rule Consequents

To guarantee the interpretability of an individual rule consequent, the corresponding linear function should first of all provide a good local approximation to the data in the input/output space. Both formal and empirical investigations have shown that this requirement is often violated when using a global approach to parameter estimation. Here, global means that the parameters of all C rule consequents in (1) are adapted simultaneously to the data. This way, an optimal approximation quality can be achieved. However, due to the induced interaction between rules, a single rule consequent considered in isolation might no longer provide a good (local) approximation to the data. In the local estimation approach outlined in the previous section, the parameters are estimated separately for each rule consequent. On the one hand, the overall approximation quality, as induced by the complete rule base, might deteriorate. On the other hand, the local approach guarantees that individual rule consequents fit the data rather nicely. See [5] for a more detailed comparison between global and local estimation.

The aforementioned effects are illustrated in Fig. 1. As can be seen, the linear rule consequents (shown as line segments) break out in the global approach, some of them lying far away from the data and, hence, from the graph of the original function. In contrast, the local approach (with initial values of all linear parameters set to 0 and the corresponding inverse Hessian matrix set to \alpha I for a large enough \alpha) yields rule consequents in the immediate vicinity of the graph.

Figure 1: A sinusoidal relationship approximated with the local (left) and the global (right) approach.

3.2 Interpretability of Fuzzy Sets

In this section, an attempt is made to ensure a set of reasonable properties for the membership functions of the fuzzy sets appearing in the rule premises. In particular, these properties should guarantee linguistically interpretable partitions of the input variables. In this regard, we refer to four semantic properties found to be crucial in [7]: a moderate number of membership functions, distinguishability, normality and unimodality, and coverage. Normality and unimodality as well as coverage of the input space are obviously fulfilled when choosing Gaussian fuzzy sets. The number of fuzzy sets, however, usually depends on the degree of nonlinearity of the underlying functional relationship. In FLEXFIS, this number can be controlled, at least to some extent, by the vigilance parameter \rho (see Section 2): the larger \rho, the fewer clusters are created; fewer clusters in turn produce fewer rules and, hence, fewer fuzzy sets in each dimension.

The main problem of FLEXFIS is distinguishability. In fact, by projecting high-dimensional clusters onto the one-dimensional axes of the input space, strongly overlapping fuzzy sets might be produced.
For the two-dimensional case, this problem is illustrated in the left part of Fig. 2. A straightforward idea to avoid this problem is to merge very similar fuzzy sets.

Figure 2: Clusters causing two strongly overlapping fuzzy sets (left) and sets with close modal values (right).

Putting this idea into practice of course presupposes a suitable similarity measure. A standard measure in this regard is the so-called Jaccard index, which defines the similarity between two fuzzy sets A and B as follows:

    S(A, B) = \frac{\int (\mu_A \cap \mu_B)(x) \, dx}{\int (\mu_A \cup \mu_B)(x) \, dx},    (9)

where the intersection in the numerator is given by the pointwise minimum of the membership functions, and the union in the denominator by the pointwise maximum. (In practice, the integrals are of course approximated numerically.) On the basis of this measure, an algorithm for merging fuzzy sets has been proposed in [9]. Fortunately, this algorithm is directly applicable within our incremental framework, as it does not need any information about previous data points. In our approach, two Gaussian fuzzy sets are merged into a new Gaussian kernel with the following parameters:

    \mu_{new} = (\max(U) + \min(U)) / 2,    (10)
    \sigma_{new} = (\max(U) - \min(U)) / 2,    (11)

where U = \{\mu_A \pm \sigma_A, \mu_B \pm \sigma_B\}. The idea underlying this definition is to reduce the approximate merging of two Gaussian kernels to the exact merging of two of their \alpha-cuts, for a specific value of \alpha. Here, we choose \alpha = \exp(-1/2) \approx 0.6, which is the membership degree of the inflection points \mu \pm \sigma of a Gaussian kernel with parameters \mu and \sigma.

Next, we consider the situation where the modal values of adjacent fuzzy sets are close to each other. Since VQ-ART generates axis-parallel clusters, this situation can occur in the vicinity of very steep parts of the function to be approximated: the only chance to approximate such steep regions is to cover them by a relatively large number of such clusters (see Fig. 2). From Fig. 2 it becomes obvious that the narrow fuzzy sets on the right can be merged, as each of them triggers a rule consequent with almost the same slope. The FuZion algorithm is a routine that merges consecutive triangular membership functions whenever their modal values are close enough [2]. In FLEXFIS, we use this algorithm in a modified way. Firstly, two fuzzy sets A and B, both referring to the jth input variable, are merged only if the (partial) slopes of the consequents of the rules in which these fuzzy sets occur are approximately equal, i.e., if w_{ij} \approx w_{kj} whenever A resp. B occur in the premise of the ith resp. kth rule. Thus, a merging is prevented in the case of a highly fluctuating functional relationship. Secondly, we merge Gaussian instead of triangular fuzzy sets, which can be done by extending (10)-(11) in a canonical way.¹

¹ In the current implementation, the width of the new fuzzy set is still defined in a slightly different way, namely as \sigma_{new} = \max(|c_{new} - c_a|, |c_{new} - c_b|) + \max(\sigma_a, \sigma_b), where c_a is the leftmost center, c_b the rightmost center, and c_{new} the new one.
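
For illustration (the function names, the integration grid, and the merging threshold are assumptions, not taken from the paper), the similarity measure (9) and the merge (10)-(11) can be sketched as:

```python
import numpy as np

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def jaccard_similarity(mu_a, sig_a, mu_b, sig_b, n_grid=1000):
    """Approximate the Jaccard index (9) on a discretization of the relevant range."""
    lo = min(mu_a - 4 * sig_a, mu_b - 4 * sig_b)
    hi = max(mu_a + 4 * sig_a, mu_b + 4 * sig_b)
    x = np.linspace(lo, hi, n_grid)
    a, b = gaussian(x, mu_a, sig_a), gaussian(x, mu_b, sig_b)
    # Intersection = pointwise minimum, union = pointwise maximum.
    return np.trapz(np.minimum(a, b), x) / np.trapz(np.maximum(a, b), x)

def merge_gaussians(mu_a, sig_a, mu_b, sig_b):
    """Merge two Gaussian fuzzy sets via their alpha-cuts at the inflection points, eqs. (10)-(11)."""
    u = np.array([mu_a - sig_a, mu_a + sig_a, mu_b - sig_b, mu_b + sig_b])
    return (u.max() + u.min()) / 2.0, (u.max() - u.min()) / 2.0

# Example: merge two sets whenever their similarity exceeds an assumed threshold of 0.5.
if jaccard_similarity(0.0, 1.0, 0.4, 1.1) > 0.5:
    mu_new, sigma_new = merge_gaussians(0.0, 1.0, 0.4, 1.1)
```
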

Table 1: Comparison between FLEXFIS with and without interpretability improvements (columns: Method, Quality on the test data, Av. No. of Sets, Av. No. of Rules; rows: FLEXFIS, FLEXFIS*).

When applying this version of the FuZion algorithm to the fuzzy partition shown in Fig. 2 (right), one obtains the fuzzy sets (and the corresponding approximation) shown in Fig. 3. In this example, the approximation accuracy hardly suffered (compare the solid lines vs. the dotted lines), while the number of fuzzy sets could be reduced from 7 to 6 resp. 5 for different threshold values.

Figure 3: Approximation and fuzzy sets obtained by merging two sets (left) and three sets which are close to each other.

3.3 Rule Base Reduction

One possibility to reduce the number of rules is to delete rules that become redundant in the course of the iterative merging process of the fuzzy sets as outlined above. A well-known method for accomplishing this task is the fuzzy system simplification (FSS) algorithm [8]. Yet, this algorithm suffers from the drawback that, when removing redundant rules, it needs all of the training data for re-estimating the rule consequents of the replacing rule. It is hence not applicable in an online mode. For this reason, we developed a variant that builds models according to the two-layer architecture shown in Fig. 4. The first layer updates the original model, i.e., the model that has been trained using the techniques from Section 2. This model is maximally accurate, which might be of critical importance for applications in fields like fault detection, prediction, or control. The second layer improves the interpretability of the updated model whenever insight into the system behavior becomes a major issue. To this end, FSS is employed in combination with a merging strategy that can be used in an online mode:

    w_{(new)j} = \frac{w_{1j} k_1 + ... + w_{qj} k_q}{k_1 + ... + k_q}    (12)

for j = 0, ..., p, where w_{ij} is the parameter in the consequent of the ith redundant rule pertaining to the jth input variable, and k_i is the number of data points belonging to the corresponding cluster. Thus, the parameters of the new rule consequent are defined as a weighted average of the consequent parameters of the redundant rules, with the weights representing the relevancy of the rules. This merging strategy can obviously be applied in online mode, as it does not require any training data.

Figure 4: A two-layer architecture for model building.

Note that it would be possible to submit the improved model to the incremental learning process for new incoming data points (see the feedback path in Fig. 4, represented by the dotted line). However, this would entail a worse approximation quality, as the inverse Hessian matrix for the newly obtained rule, which is required for an accurate update of the rule consequent parameters, would have to be re-initialized to \alpha I (merging the Hessian matrices of the redundant rules is extremely difficult) [5].
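
A sketch of this consequent-merging step, eq. (12); the function name and array shapes are assumptions:

```python
import numpy as np

def merge_consequents(weights, counts):
    """Merge the consequents of q redundant rules into one new consequent, cf. eq. (12).

    weights : consequent parameters of the redundant rules, shape (q, p + 1)
    counts  : number of data points k_i per corresponding cluster, shape (q,)
    """
    # Weighted average, with the cluster sizes k_i acting as rule relevancies.
    return (counts[:, None] * weights).sum(axis=0) / counts.sum()
```
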

4 Empirical Evaluation

In order to examine the effectiveness of the improvements suggested in the previous section, we have compared the original version of FLEXFIS with the extended variant, FLEXFIS*. In the experiments we used data coming from a diesel engine, recorded at an engine test bench. The original data set, which was split into a training set of 1810 samples and a test set of 136 samples, contained 80 measurement channels. However, 18 channels were discarded due to missing data. For each of the remaining 62 channels, a fuzzy model was trained, using that channel as the output variable and a subset of at most 5 of the other channels as input variables. In each case, the subset was determined by means of a feature selection technique [3]. The accuracy of FLEXFIS and FLEXFIS* was measured in terms of the average of the r-squared-adjusted values obtained for the 62 fuzzy models. Likewise, complexity was measured in terms of the average number of fuzzy sets and rules.

From Table 1 it becomes obvious that the fuzzy set and rule merging strategies presented in the previous section lead to significant improvements regarding model complexity. In particular, the average number of fuzzy sets per input dimension could be decreased from about seven to four. At the same time, the approximation accuracy could be maintained at a high level. The complexity reduction was even stronger for the housing data from the UCI repository². By approximating the output variable in this data set with the five most relevant inputs, the average number of fuzzy sets was reduced from 13 to 5.8. The model quality, measured in terms of the mean squared error between predicted and measured outputs, decreased by only 0.87%. In this connection, let us note that for these data sets, FLEXFIS is also competitive with conventional batch learning methods (see [5]).

² mlearn/mlrepository.html

5 Conclusion

This paper has presented methods which aim at reducing the complexity of data-driven, incremental Takagi-Sugeno fuzzy models. In particular, we have addressed the problems of obtaining interpretable rule consequents, guaranteeing linguistically interpretable fuzzy partitions in each input dimension, and reducing the number of rules while maintaining a reasonably high approximation accuracy. Experiments with high-dimensional, real-world data sets have shown these methods to be effective in practice.

References

[1] R. Babuska. Fuzzy Modeling for Control. Kluwer Academic Publishers, Boston.

[2] J. Espinosa and J. Vandewalle. Constructing fuzzy models with linguistic integrity from numerical data - AFRELI algorithm. IEEE Transactions on Fuzzy Systems, 8.

[3] W. Groißböck, E. Lughofer, and E.P. Klement. A comparison of variable selection methods with the main focus on orthogonalization. In Proceedings of the SMPS Conference 2004, Oviedo, Spain.

[4] L. Ljung. System Identification: Theory for the User. Prentice Hall PTR, Upper Saddle River, New Jersey.

[5] E. Lughofer. Data-Driven Incremental Learning of Takagi-Sugeno Fuzzy Models. PhD thesis, Department of Knowledge-Based Mathematical Systems, Johannes Kepler University Linz, February.

[6] E. Lughofer and E.P. Klement. FLEXFIS: A variant for incremental learning of Takagi-Sugeno fuzzy systems. In Proceedings of FUZZ-IEEE 2005, Reno, Nevada, U.S.A.

[7] J. Valente de Oliveira. Semantic constraints for membership function optimization. IEEE Transactions on Systems, Man and Cybernetics - Part A: Systems and Humans, 29(1).

[8] M. Setnes. Simplification and reduction of fuzzy rules. In J. Casillas, O. Cordón, F. Herrera, and L. Magdalena, editors, Interpretability Issues in Fuzzy Modeling, volume 128 of Studies in Fuzziness and Soft Computing. Springer, Berlin.

[9] M. Setnes, R. Babuska, U. Kaymak, and H.R.v.N. Lemke. Similarity measures in fuzzy rule base simplification. IEEE Transactions on Systems, Man and Cybernetics - Part B, 28.

[10] T. Takagi and M. Sugeno. Fuzzy identification of systems and its applications to modeling and control. IEEE Transactions on Systems, Man and Cybernetics, 15(1).

[11] L.X. Wang. Fuzzy systems are universal approximators. In Proceedings of the IEEE International Conference on Fuzzy Systems.
