Lecture 2 :: Decision Tree Learning


1 Lecture 2 :: Decision Tree Learning 1 / 62

2 Designing a learning system What to learn? Learning setting. Learning mechanism. Evaluation. 2 / 62

3 Prediction task Figure 1: Prediction task :: Supervised learning 3 / 62

4 Lecture outline Motivation Decision trees in general Classification and regression trees (CART & ID3) Building a tree Binary questions, i.e. binary splits Best split selection Learning mechanism :: ID3 algorithm Over-fitting, pruning More than one decision tree 4 / 62

5 Motivation :: Medical diagnosis example Real-world objects: heart attack patients admitted to the hospital. Features: age and medical symptoms indicating the patient's condition (obtained by measurements), like blood pressure. Output values: high-risk patients (G) and not-high-risk patients (F). Goal: identification of G- and F-patients on the basis of the measurements. 5 / 62

6 Motivation :: Medical diagnosis example (2) Like a doctor, based on your experience, try to formulate criteria to identify the patients: 1 1st question: Is the minimum systolic blood pressure over the initial 24 hour period greater than 91? 2 If "no", then the patient is not a high-risk patient. 3 If "yes", 2nd question: Is age greater than 62.5? 4 If "no", then the patient is not a high-risk patient. 5 If "yes", 3rd question:... 6 / 62

7 Motivation :: Medical diagnosis example (3) Figure 2: Medical diagnosis problem: classification rules (Credits: Leo Breiman et al., 1984) 7 / 62

8 Motivation :: Medical diagnosis example (4) Like a machine learning expert, design a classifier based on the training data... 8 / 62

9 Decision trees in general A tree is a graph G = (V, E) in which any two vertices are connected by exactly one simple path. Figure 3: A tree in graph theory: any two vertices are connected by exactly one simple path. 9 / 62

10 Decision trees in general (2) A tree is called a rooted tree if one vertex has been designated the root, in which case the edges have a natural orientation, towards or away from the root. Figure 4: A rooted tree 10 / 62

11 Decision trees in general (3) A decision tree $G_{DT} = (V_{DT}, E_{DT})$ is a rooted tree where $V_{DT}$ is composed of internal decision nodes and terminal leaf nodes. Figure 5: A decision tree 11 / 62

12 Decision trees in general (4) Building a decision tree learner corresponds to building a decision tree $G^D_{DT} = (V^D_{DT}, E^D_{DT})$ based on a data set $D = \{\langle x, y \rangle : x \in X, y \in Y\}$. A decision node $t \in V^D_{DT}$ is a subset of D, and the root node is D itself. Each leaf node is designated by an output value. Building the learner $G^D_{DT} = (V^D_{DT}, E^D_{DT})$ corresponds to repeated splits of subsets of D into descendant subsets, beginning with D itself. Splits (decisions) are generated by questions. 12 / 62

13 Figure 6: A decision tree learner 13 / 62

14 Decision trees in general :: Geometrical point of view Tree-based methods partition the feature space into a set of regions, and then fit a simple model in each region, e.g. a majority vote for classification or a constant value for regression. 14 / 62

15 Classification trees :: Geometrical point of view (2) Figure 7: Decision trees: a geometrical point of view (Credits: Alpaydin, 2004) 15 / 62

16 Regression trees :: Geometrical point of view Figure 8: Regression trees: a geometrical point of view 16 / 62

17 Historical excursion Classification trees: Y is a categorical output feature. Regression trees: Y is a numerical output feature. CART: Classification And Regression Trees. ID3, C4.5: classification trees. 17 / 62

18 Historical excursion (2) Figure 9: Two decision tree conceptions (AID = Automatic Interaction Detection) 18 / 62

19 Once the decision tree learner is built, an unseen instance is classified by starting at the root node and moving down the tree branch corresponding to the feature values asked in questions. 19 / 62

20 Milestone #1 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. 20 / 62

21 Notation Features $Attr = \{A_1, A_2, \ldots, A_M\}$; $Values(A_i)$ is the set of all possible values of feature $A_i$. Instances (i.e. feature vectors) $x = \langle x_1, x_2, \ldots, x_M \rangle$. Output values Y. $D_{A_i = v} = \{\langle x, y \rangle \in D : x_i = v\}$. 21 / 62

22 CART :: Question type definition Let's define a set of binary questions Q as follows: Each split depends on the value of only a single feature, i.e. each question tests a feature. Then each branch corresponds to a feature value. For each categorical feature $A_m$ having values $Values(A_m) = \{b_1, b_2, \ldots, b_L\}$, Q includes all questions of the form "Is $x_m \in R$?" for $R \in 2^{Values(A_m)}$. For each numerical feature $A_m$, Q includes all questions of the form "Is $x_m \le k$?" for all $k$ ranging over $(-\infty, +\infty)$. 22 / 62
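Both question families can be enumerated mechanically. Below is a minimal Python sketch (not from the lecture; the feature values are illustrative): distinct subset questions for a categorical feature, and threshold questions for a numerical feature, where only midpoints between observed values need to be considered.

```python
from itertools import combinations

def categorical_questions(values):
    """Distinct questions "Is x_m in R?" for a categorical feature.
    A subset and its complement induce the same split, so only subsets
    containing the first value are kept: 2^(L-1) - 1 distinct questions."""
    values = sorted(values)
    first, rest = values[0], values[1:]
    questions = []
    for r in range(len(rest) + 1):
        for subset in combinations(rest, r):
            candidate = {first, *subset}
            if len(candidate) < len(values):   # the full set does not split the node
                questions.append(candidate)
    return questions

def numerical_questions(observed):
    """Questions "Is x_m <= k?": only midpoints between consecutive observed
    values can change the induced partition, so finitely many k suffice."""
    xs = sorted(set(observed))
    return [(a + b) / 2 for a, b in zip(xs, xs[1:])]

print(categorical_questions(["low", "normal", "high"]))   # 3 subset questions
print(numerical_questions([62.5, 91.0, 110.0, 140.0]))    # 3 threshold questions
```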

23 CART :: Question type definition (2) That is, we restrict attention to recursive binary partitions, i.e. the feature space is partitioned into a set of rectangles. 23 / 62

24 ID3 :: Question type definition Let's define a set of questions Q as follows: Each split depends on the value of only a single feature, i.e. each question tests a feature. Then each branch corresponds to a feature value. For each categorical feature $A_m$ having values $Values(A_m) = \{b_1, b_2, \ldots, b_L\}$, Q includes all questions of the form "Is $x_m = b_j$?" for $j = 1, \ldots, L$. For each categorical feature $A_m$ having values $Values(A_m) = \{b_1, b_2, \ldots, b_L\}$, Q includes all questions of the form "Is $x_m \in R$?" for $R \in 2^{Values(A_m)}$. For each numerical feature $A_m$, Q includes all questions of the form "Is $x_m \le k$?" for all $k$ ranging over $(-\infty, +\infty)$. 24 / 62

25 ID3 :: Question type definition (2) Figure 10: ID3: Non-binary splits 25 / 62

26 Milestone #2 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. 26 / 62

27 Best split generation :: Classification trees We say a data set is pure (or homogeneous) if it contains only a single class. If a data set contains several classes, then we say that the data set is impure (or heterogeneous). A decision tree classifier attempts to divide the M-dimensional feature space into homogeneous regions. The goal of adding new nodes to a tree is to split up the instance space so as to minimize the impurity of the training set. Example: a node with class counts 5 : 5 is heterogeneous (high level of impurity), whereas a node with class counts 9 : 1 is nearly homogeneous (low degree of impurity). 27 / 62

28 Best split generation :: Classification trees (2) 1. Define the node proportions $p(j|t)$, $j = 1, \ldots, c$, to be the proportion of the instances in node t belonging to class j, so that $\sum_{j=1}^{c} p(j|t) = 1$. 2. Define a measure $i(t)$ of the impurity of t as a nonnegative function $\Phi$ of $p(1|t), p(2|t), \ldots, p(c|t)$, $i(t) = \Phi(p(1|t), p(2|t), \ldots, p(c|t))$, (1) such that $\Phi(\frac{1}{c}, \frac{1}{c}, \ldots, \frac{1}{c}) = \max$, i.e. the node impurity is largest when all classes are equally mixed together in it, and $\Phi(1, 0, \ldots, 0) = \Phi(0, 1, \ldots, 0) = \ldots = \Phi(0, 0, \ldots, 1) = 0$, i.e. the node impurity is smallest when the node contains instances of only one class. 3. Define the goodness of split s to be the decrease in impurity $\Delta i(s, t) = i(t) - p_L\, i(t_L) - p_R\, i(t_R)$, where $p_L$ is the proportion of examples in t going to $t_L$, and similarly for $p_R$. 28 / 62

29 Best split generation :: Classification trees (3) Figure 11: Split in decision tree (Credits: Leo Breiman et al., 1984) 4. Define a candidate set S of splits at each node: the set Q of questions generates a set S of splits of every node t. Find the split $s^*$ with the largest decrease in impurity: $\Delta i(s^*, t) = \max_{s \in S} \Delta i(s, t)$. 29 / 62
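The following Python sketch illustrates steps 3-4 under stated assumptions: `impurity` applies a user-supplied $\Phi$ to the class proportions at a node, and `best_split` returns the candidate question with the largest decrease $\Delta i(s, t)$. The data, the feature name `age` and the candidate thresholds are made up for illustration.

```python
from collections import Counter

def impurity(labels, phi):
    """i(t) = Phi(p(1|t), ..., p(c|t)) computed from the labels reaching node t."""
    n = len(labels)
    return phi([count / n for count in Counter(labels).values()])

def goodness_of_split(labels, left_labels, right_labels, phi):
    """Delta i(s, t) = i(t) - p_L * i(t_L) - p_R * i(t_R)."""
    p_left = len(left_labels) / len(labels)
    p_right = len(right_labels) / len(labels)
    return (impurity(labels, phi)
            - p_left * impurity(left_labels, phi)
            - p_right * impurity(right_labels, phi))

def best_split(data, labels, candidate_questions, phi):
    """Return the question s* in S with the largest decrease in impurity."""
    best_q, best_decrease = None, float("-inf")
    for question in candidate_questions:
        left = [y for x, y in zip(data, labels) if question(x)]
        right = [y for x, y in zip(data, labels) if not question(x)]
        if not left or not right:              # degenerate split, skip it
            continue
        decrease = goodness_of_split(labels, left, right, phi)
        if decrease > best_decrease:
            best_q, best_decrease = question, decrease
    return best_q, best_decrease

def gini(proportions):                          # one possible impurity function Phi
    return 1.0 - sum(p * p for p in proportions)

data = [{"age": 70}, {"age": 50}, {"age": 65}, {"age": 40}]
labels = ["G", "F", "G", "F"]
questions = [lambda x, k=k: x["age"] <= k for k in (45.0, 57.5, 67.5)]
best_question, decrease = best_split(data, labels, questions, gini)
print(decrease)                                 # largest impurity decrease found
```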

30 Best split generation :: Classification trees (4) 5. Discuss four splitting criteria: Misclassification Error $i(t)_{MisclassificationError}$, Information Gain $i(t)_{InformationGain}$, Gain Ratio, Gini Index $i(t)_{GiniIndex}$. 30 / 62

31 Best split generation :: Classification trees (5) $i(t)_{MisclassificationError} = 1 - \max_{j=1,\ldots,c} p(j|t)$ (2) Example: for nodes with class counts 0 : 6, 1 : 5, 2 : 4 and 3 : 3, the misclassification error is 0, 1/6, 1/3 and 1/2, respectively. 31 / 62

32 Best split generation :: Classification trees (6) $i(t)_{InformationGain} = -\sum_{j=1}^{c} p(j|t) \log p(j|t)$. (3) $\Delta i(s_{A_k}, t)_{InformationGain} = i(t) - \sum_{v \in Values(A_k)} p_{t_v}\, i(t_v) = H(t) - \sum_{v \in Values(A_k)} \frac{|t_v|}{|t|} H(t_v)$ (4) where $t_v$ is the subset of t for which attribute $A_k$ has value v. $Gain(s, t) = \Delta i(s, t)_{InformationGain}$ (5) 32 / 62

33 Best split generation :: Classification trees (7) The information gain favors features with many values over those with few values (many values, many branches, impurity can be much less). Example: Let's consider the feature Date. It has so many values that it is bound to separate the training examples into very small subsets. As it has a high information gain relative to the training data, it is probably selected for the root node. But prediction beyond the training examples is poor. Let's incorporate an alternative measure, the Gain Ratio, that takes into account the number of values of a feature. 33 / 62

34 Best split generation :: Classification trees (8) $GainRatio(s_{A_k}, t) = \frac{Gain(s_{A_k}, t)}{SplitInformation(s_{A_k}, t)}$, (6) where $SplitInformation(s_{A_k}, t) = -\sum_{v \in Values(A_k)} \frac{|t_v|}{|t|} \log_2 \frac{|t_v|}{|t|}$, (7) i.e. $SplitInformation(s_{A_k}, t)$ is the entropy of t with respect to the split $s_{A_k}$, i.e. to the values of feature $A_k$. 34 / 62
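A minimal sketch of the entropy, Gain and GainRatio computations for a categorical split; the dataset and the feature name `Outlook` are illustrative, not from the lecture.

```python
import math
from collections import Counter

def entropy(labels):
    """H(t) = - sum_j p(j|t) * log2 p(j|t)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_and_gain_ratio(rows, labels, feature):
    """Information gain and gain ratio of splitting node t on one categorical feature."""
    n = len(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[feature], []).append(y)
    gain = entropy(labels) - sum(len(ys) / n * entropy(ys) for ys in by_value.values())
    split_info = -sum(len(ys) / n * math.log2(len(ys) / n) for ys in by_value.values())
    return gain, (gain / split_info if split_info > 0 else 0.0)

rows = [{"Outlook": "sunny"}, {"Outlook": "sunny"},
        {"Outlook": "rain"}, {"Outlook": "overcast"}]
labels = ["no", "no", "yes", "yes"]
print(gain_and_gain_ratio(rows, labels, "Outlook"))   # gain 1.0, ratio about 0.67
```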

35 Best split generation :: Classification trees (9) $i(t)_{GiniIndex} = 1 - \sum_j p^2(j|t)$. (8) 35 / 62

36 Best split generation :: Classification trees (10) Example: nodes with class counts 0 : 6, 1 : 5, 2 : 4 and 3 : 3 compared under the Gini index, the entropy and the misclassification error (M.E.). For two classes (c = 2), if p is the proportion in the class "1", the measures are: Misclassification error: $1 - \max(p, 1-p)$. Entropy: $-p \log p - (1-p) \log(1-p)$. Gini: $2p(1-p)$. 36 / 62
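The two-class formulas above can be evaluated directly; the sketch below compares the three measures at the class proportions corresponding to the example counts (all three are zero for a pure node and maximal at p = 0.5).

```python
import math

def misclassification_error(p):
    return 1.0 - max(p, 1.0 - p)

def entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def gini(p):
    return 2.0 * p * (1.0 - p)

# Class counts 0:6, 1:5, 2:4, 3:3 correspond to p = 0, 1/6, 2/6, 3/6.
for p in (0 / 6, 1 / 6, 2 / 6, 3 / 6):
    print(f"p={p:.3f}  M.E.={misclassification_error(p):.3f}  "
          f"entropy={entropy(p):.3f}  Gini={gini(p):.3f}")
```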

37 Best split generation :: Classification trees (11) Figure 12: Selecting the best split :: Misclassification vs. Entropy vs. Gini 37 / 62

38 Milestone #3 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits: classification trees, regression trees, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. 38 / 62

39 Best split generation :: Regression trees $i(t)_{LeastSquaredError} = \frac{1}{|t|}\sum_{x_i \in t} (y_i - f(x_i))^2$, (9) where $f(x_i) = \sum_{k=1}^{K} \hat{c}_k\, \delta(x_i \in R_k)$, (10) and $\hat{c}_k = \frac{1}{|R_k|}\sum_{x_i \in R_k} y_i$. (11) I.e. $i(t)_{LeastSquaredError} = \frac{1}{|t|}\sum_{x_i \in t} (y_i - \hat{c}_k)^2$ (12) 39 / 62

40 $\hat{c}_k$, the average of $y_i$ over all training instances $x_i$ falling into t (i.e. into $R_k$), minimizes $i(t)_{LeastSquaredError}$. The proof rests on the fact that the number a which minimizes $\sum_i (y_i - a)^2$ is $a = \frac{1}{N} \sum_i y_i$. 40 / 62

41 Best split generation :: Regression trees (2) 2. Define the goodness of split s to be the decrease in impurity $\Delta i(s, t) = i(t) - i(t_L) - i(t_R)$. (13) 3. Define a candidate set S of splits at each node: the set Q of questions generates a set S of splits of every node t. Find the split $s^*$ with the largest decrease: $\Delta i(s^*, t) = \max_{s \in S} \Delta i(s, t)$, equivalently the split that minimizes $i(t_L) + i(t_R)$. 41 / 62
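A small illustrative sketch of the regression-tree split search on a single numerical feature: each child is fitted with its mean $\hat{c}_k$, and the threshold minimizing $i(t_L) + i(t_R)$ (equivalently, maximizing the decrease) is kept. The data values are made up.

```python
def sse(ys):
    """Sum of squared errors around the node mean c_hat (0 for an empty node)."""
    if not ys:
        return 0.0
    c_hat = sum(ys) / len(ys)
    return sum((y - c_hat) ** 2 for y in ys)

def best_regression_split(xs, ys):
    """Return the threshold k minimizing SSE(t_L) + SSE(t_R)."""
    values = sorted(set(xs))
    candidates = [(a + b) / 2 for a, b in zip(values, values[1:])]
    best_k, best_total = None, float("inf")
    for k in candidates:
        left = [y for x, y in zip(xs, ys) if x <= k]
        right = [y for x, y in zip(xs, ys) if x > k]
        total = sse(left) + sse(right)
        if total < best_total:
            best_k, best_total = k, total
    return best_k, best_total

xs = [1.0, 2.0, 3.0, 8.0, 9.0, 10.0]   # made-up feature values
ys = [1.1, 0.9, 1.0, 4.8, 5.2, 5.0]    # made-up targets
print(best_regression_split(xs, ys))    # threshold 5.5 separates the two groups
```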

42 Milestone #4 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits: classification trees, regression trees, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. 42 / 62

43 Learning mechanism :: ID3 algorithm(D, Y, Attr) Create a Root node for the tree. If $\forall \langle x, y \rangle \in D: y = +1$, return the single-node tree Root with label = +. If $\forall \langle x, y \rangle \in D: y = -1$, return the single-node tree Root with label = -. If $Attr = \emptyset$, return the single-node tree Root with label = the most common value of Y in D, i.e. $n = \arg\max_{v \in Values(Y)} \sum_{\langle x, y \rangle \in D} \delta(v, Y(x))$. Otherwise: A := Splitting_Criterion(Attr); Root := A; for each $v \in Values(A)$: add a new tree branch below Root, corresponding to the test A(x) = v; if $D_v = \emptyset$, then below this new branch add a leaf node with label = $\arg\max_{v' \in Values(Y)} \sum_{\langle x, y \rangle \in D} \delta(v', Y(x))$; else below this new branch add the subtree ID3($D_v$, Y, Attr \ {A}). Return(Root). 43 / 62
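A compact Python sketch of the recursion above, using information gain as the splitting criterion and a dict-based tree; the dataset and feature names (`Outlook`, `Wind`) are illustrative. Branches are only created for values observed at the node, so the slide's empty-$D_v$ case does not arise here.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def id3(examples, labels, attrs):
    # Base cases: a pure node, or no attributes left -> most common label.
    if len(set(labels)) == 1:
        return labels[0]
    if not attrs:
        return Counter(labels).most_common(1)[0][0]

    def gain(a):                         # Splitting_Criterion: information gain
        n = len(labels)
        by_value = {}
        for x, y in zip(examples, labels):
            by_value.setdefault(x[a], []).append(y)
        return entropy(labels) - sum(len(ys) / n * entropy(ys)
                                     for ys in by_value.values())

    best = max(attrs, key=gain)
    node = {"attribute": best, "branches": {}}
    for v in {x[best] for x in examples}:      # one branch per observed value
        subset = [(x, y) for x, y in zip(examples, labels) if x[best] == v]
        sub_x, sub_y = [x for x, _ in subset], [y for _, y in subset]
        node["branches"][v] = id3(sub_x, sub_y, [a for a in attrs if a != best])
    return node

examples = [
    {"Outlook": "sunny", "Wind": "weak"}, {"Outlook": "sunny", "Wind": "strong"},
    {"Outlook": "rain", "Wind": "weak"},  {"Outlook": "overcast", "Wind": "strong"},
]
labels = ["no", "no", "yes", "yes"]
print(id3(examples, labels, ["Outlook", "Wind"]))
```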

44 Learning mechanism :: ID3 algorithm :: Comments ID3 is a recursive partitioning algorithm (divide & conquer) that performs top-down tree construction. ID3 performs a simple-to-complex, hill-climbing search through H guided by a particular splitting criterion. ID3 maintains only a single current hypothesis, so it is not able, for example, to determine alternative decision trees that are also consistent with the training data. ID3 does not employ backtracking. 44 / 62

45 Milestone #5 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits: classification trees, regression trees, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. 45 / 62

46 Over-Fitting The divide and conquer algorithm partitions the data until every leaf contains examples of a single value, or until further partitioning is impossible because two examples have the same values for each feature but belong to different classes. Consequently, if there are no noisy examples, the decision tree will correctly classify all training examples. Over-fitting can be avoided by a stopping criterion that prevents some sets of training examples from being subdivided, or by removing some of the structure of the decision tree after it has been produced. 46 / 62

47 Over-Fitting When a node t was reached such that no significant decrease in impurity was possible (i.e. $\max_s \Delta i(s, t) < \beta$), then t was not split and became a terminal node. The class of this terminal node is determined as follows: majority vote for classification and average value for regression. But... If β is set too low, then there is too much splitting and the tree is too large. This tree might overfit the data. Increasing β: there may be nodes t such that $\max_s \Delta i(s, t) < \beta$, but the descendant nodes $t_L$ and $t_R$ of t may have splits with large decreases in impurity. By declaring t terminal, one loses the good splits on $t_L$ or $t_R$. This tree might not capture the important structure. Preferred strategy: Grow a large tree $T_0$, stopping the splitting process only when some minimum node size (say 5) is reached. Then prune $T_0$ using some pruning criterion. 47 / 62

48 Pruning :: C4.5 :: Reduced error pruning Figure 13: Reduced error pruning. Subtrees $T_i$ have already been pruned; let $T_f$ be the subtree corresponding to the most frequent value of A in the data, and let L be a leaf labelled with the most frequent class in the data. 48 / 62

49 Pruning :: C4.5 :: Reduced error pruning (2) Split the data into training D and validation $D_v$ sets. Build a tree T to fit the data D. Evaluate the impact of pruning each possible node on the validation set $D_v$: let $E_T$ be the sample error rate of T on $D_v$, let $E_{T_f}$ be the sample error rate of $T_f$ on $D_v$, and let $E_L$ be the sample error rate of L on $D_v$. Let $U_{CF}(E_T, D_v)$, $U_{CF}(E_{T_f}, D_v)$, $U_{CF}(E_L, D_v)$ be the three corresponding estimated error rates (via confidence intervals). Depending on whichever is lower, C4.5 leaves T unchanged, replaces T by the leaf L, or replaces T by its subtree $T_f$. 49 / 62
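As an illustration, here is a sketch of the simpler reduced-error-pruning variant: every internal node is replaced by a leaf labelled with the most frequent class among the validation examples reaching it whenever this does not increase the validation error. It reuses the dict-based tree of the ID3 sketch above and does not model C4.5's $U_{CF}$ confidence-interval estimate.

```python
from collections import Counter

def classify(tree, x, default="?"):
    """Walk the dict-based tree; unseen branch values fall back to `default`."""
    while isinstance(tree, dict):
        tree = tree["branches"].get(x[tree["attribute"]], default)
    return tree

def prune(tree, val_x, val_y):
    """Bottom-up: replace a subtree by a leaf if validation error does not grow."""
    if not isinstance(tree, dict) or not val_x:
        return tree
    for v in list(tree["branches"]):
        subset = [(x, y) for x, y in zip(val_x, val_y)
                  if x[tree["attribute"]] == v]
        tree["branches"][v] = prune(tree["branches"][v],
                                    [x for x, _ in subset], [y for _, y in subset])
    errors_tree = sum(classify(tree, x) != y for x, y in zip(val_x, val_y))
    leaf = Counter(val_y).most_common(1)[0][0]
    errors_leaf = sum(leaf != y for y in val_y)
    return leaf if errors_leaf <= errors_tree else tree

tree = {"attribute": "Outlook",
        "branches": {"sunny": "no", "rain": "yes", "overcast": "yes"}}
val_x = [{"Outlook": "sunny"}, {"Outlook": "rain"}]
val_y = ["no", "yes"]
print(prune(tree, val_x, val_y))   # kept or collapsed, whichever validates better
```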

50 Milestone #6 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits: classification trees, regression trees, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. Going from ID3 to C4.5. 50 / 62

51 ID3 :: Incorporating continuous-valued features ID3 is originally designed with two restrictions: 1 Discrete-valued predicted feature. 2 Discrete-valued features tested in the decision tree nodes. Let's extend it for continuous-valued features. For a continuous-valued feature A, define a boolean-valued feature $A_c$ so that if $A(x) \le c$ then $A_c(x) = 1$, else $A_c(x) = 0$. 51 / 62

52 ID3 :: Incorporating continuous-valued features (3) How to select the best value for the threshold c? Criterion example: choose the c that produces the greatest information gain. Example data: six sorted Temperature measurements with EnjoySport labels No, No, Yes, Yes, Yes, No; the candidate thresholds $c_1$ and $c_2$ lie midway between the neighbouring values where the label changes. Gain(D, Temperature$_{c_1}$) = ?, Gain(D, Temperature$_{c_2}$) = ? 52 / 62
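A sketch of this criterion on made-up data (the slide's actual temperature values are not reproduced): candidate thresholds are placed midway between consecutive sorted values at which the class changes, and the one with the greatest information gain is returned.

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_threshold(values, labels):
    """Candidate c halfway between neighbouring values where the class changes;
    return the candidate with the greatest information gain."""
    pairs = sorted(zip(values, labels))
    candidates = [(a[0] + b[0]) / 2
                  for a, b in zip(pairs, pairs[1:]) if a[1] != b[1]]

    def gain(c):
        left = [y for v, y in pairs if v <= c]
        right = [y for v, y in pairs if v > c]
        n = len(pairs)
        return (entropy(labels)
                - len(left) / n * entropy(left)
                - len(right) / n * entropy(right))

    return max(candidates, key=gain)

temperature = [40, 48, 60, 72, 80, 90]               # hypothetical measurements
enjoy_sport = ["No", "No", "Yes", "Yes", "Yes", "No"]
print(best_threshold(temperature, enjoy_sport))       # 54.0 for this made-up data
```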

53 ID3 :: Handling training examples with missing feature values Consider the situation in which Gain(D, A) is to be calculated at node t in the decision tree. Suppose that $\langle x, c(x) \rangle$ is one of the training examples in D and that the value A(x) is unknown. Possible solutions: Assign the value that is most common among the training examples at node t. Alternatively, assign the most common value among the examples at node t that have the classification c(x). 53 / 62
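A minimal sketch of the first strategy (and its class-conditional variant): fill a missing value of feature A with the most common value of A among the training examples at node t, optionally restricted to examples with the same classification c(x). The data are made up.

```python
from collections import Counter

def fill_missing(node_examples, feature, target_class=None, missing=None):
    """Most common value of `feature` at the node, optionally restricted to
    examples whose classification equals `target_class`."""
    pool = [x[feature] for x, y in node_examples
            if x[feature] is not missing
            and (target_class is None or y == target_class)]
    return Counter(pool).most_common(1)[0][0]

node_examples = [({"Wind": "weak"}, "yes"),
                 ({"Wind": "weak"}, "no"),
                 ({"Wind": "strong"}, "yes"),
                 ({"Wind": None}, "yes")]          # this example's Wind is missing
print(fill_missing(node_examples, "Wind"))                        # 'weak'
print(fill_missing(node_examples, "Wind", target_class="yes"))    # most common among 'yes'
```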

54 Milestone #7 To learn a decision tree, one has to 1 define question types, 2 generate (best) splits: classification trees, regression trees, 3 design/apply a learning mechanism, 4 face troubles with over-fitting. Going from ID3 to C4.5. 5 Having a bag of decision trees. 54 / 62

55 More than one decision tree Some techniques, often called ensemble methods, construct more than one decision tree, e.g. Boosted Trees. 55 / 62

56 More than one decision tree :: Boosted trees In general, "the more data, the better" :: an approach of resampling. Same learning mechanism, different classifiers $h_1, h_2, \ldots, h_n$. Boosting is a procedure that combines the outputs of many weak classifiers to produce a powerful "committee". A weak classifier has an error rate only slightly better than random guessing. Schapire's strategy: Change the distribution over the training examples in each iteration, feed the resulting sample into the weak classifier, and then combine the resulting hypotheses into a voting ensemble, which, in the end, would have a boosted classification accuracy. 56 / 62

57 More than one decision tree :: Boosted trees (2) AdaBoost.M1 algorithm An iterative algorithm that selects $h_t$ given the performance obtained by the weak learners $h_1, \ldots, h_{t-1}$. At each step t it modifies the training data distribution in order to favour hard-to-classify examples according to the previous weak classifiers, trains a new weak classifier $h_t$, selects the new weight $\alpha_t$ of $h_t$ by optimizing the final classifier, stops when it is impossible to find a weak classifier better than chance, and outputs a final classifier that is the weighted sum of all weak classifiers. 57 / 62

58 More than one decision tree :: Boosted trees (3) Figure 14: AdaBoost (Credits: Hastie et al., 2009, p. 338) 58 / 62

59 More than one decision tree :: Boosted trees (4) AdaBoost.M1 algorithm $D_1(i) = \frac{1}{n}$. Given $D_t$ and $h_t$, update $D_t$: $D_{t+1}(i) = \frac{D_t(i)}{Z_t} e^{-\alpha_t}$ if $y_i = h_t(x_i)$ and $D_{t+1}(i) = \frac{D_t(i)}{Z_t} e^{\alpha_t}$ if $y_i \neq h_t(x_i)$, i.e. $D_{t+1}(i) = \frac{D_t(i)}{Z_t} \exp(-\alpha_t y_i h_t(x_i))$, (14) where $Z_t$ is the normalization constant $Z_t = \sum_i D_t(i) \exp(-\alpha_t y_i h_t(x_i))$, $\alpha_t = \frac{1}{2} \ln\frac{1-\epsilon_t}{\epsilon_t}$ and $\epsilon_t = \Pr_{D_t}[h_t(x_i) \neq y_i] = \sum_{i: h_t(x_i) \neq y_i} D_t(i)$. $\alpha_t$ measures the importance that is assigned to $h_t$. If $\epsilon_t \leq \frac{1}{2}$ (which is assumed without loss of generality), then $\alpha_t \geq 0$; $\alpha_t$ gets larger as $\epsilon_t$ gets smaller. $h_{final}(x) = \mathrm{sgn}\left(\sum_t \alpha_t h_t(x)\right)$: the final hypothesis is a weighted majority vote of the T hypotheses. 59 / 62
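A hedged Python sketch of AdaBoost.M1 for $y \in \{-1, +1\}$ that follows the formulas above, using threshold stumps on a single feature as the weak learners; the data are made up and the code is an illustration of the update rule, not the lecture's reference implementation.

```python
import math

def stump_learner(xs, ys, weights):
    """Weak learner: the best threshold/sign decision stump on a 1-D feature."""
    best, best_err = None, float("inf")
    for k in sorted(set(xs)):
        for sign in (+1, -1):
            preds = [sign if x <= k else -sign for x in xs]
            err = sum(w for w, p, y in zip(weights, preds, ys) if p != y)
            if err < best_err:
                best, best_err = (k, sign), err
    k, sign = best
    return (lambda x, k=k, s=sign: s if x <= k else -s), best_err

def adaboost_m1(xs, ys, rounds=10):
    n = len(xs)
    D = [1.0 / n] * n                       # D_1(i) = 1/n
    ensemble = []                           # list of (alpha_t, h_t)
    for _ in range(rounds):
        h, eps = stump_learner(xs, ys, D)
        if eps <= 0.0 or eps >= 0.5:        # stop: perfect, or no better than chance
            if eps <= 0.0:
                ensemble.append((1.0, h))
            break
        alpha = 0.5 * math.log((1.0 - eps) / eps)
        # D_{t+1}(i) proportional to D_t(i) * exp(-alpha_t * y_i * h_t(x_i))
        D = [d * math.exp(-alpha * y * h(x)) for d, x, y in zip(D, xs, ys)]
        Z = sum(D)
        D = [d / Z for d in D]
        ensemble.append((alpha, h))
    return ensemble

def h_final(ensemble, x):
    """Weighted majority: sign of sum_t alpha_t * h_t(x)."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

xs = [1.0, 2.0, 3.0, 6.0, 7.0, 8.0]         # made-up 1-D training data
ys = [+1, +1, -1, -1, +1, +1]
model = adaboost_m1(xs, ys, rounds=5)
print([h_final(model, x) for x in xs])       # ensemble predictions on the data
```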

60 Final remarks Decision tree based classifiers are readily interpretable by humans. Instability of trees: learned tree structure is sensitive to the training data, so that a small change to the data can result in a very different set of splits. 60 / 62

61 For more details refer to Hastie, T. et al. The Elements of Statistical Learning. Springer, 2009, Section 9.2 and Chapter 10. 61 / 62

62 References
Breiman, Leo; Friedman, Jerome H.; Olshen, Richard A.; Stone, Charles J. Classification and Regression Trees. Chapman & Hall/CRC, 1984.
Boosting: a sample example: http://
Hunt, E. B. Concept Learning: An Information Processing Problem. Wiley.
Morgan, J. N.; Sonquist, J. A. Problems in the analysis of survey data, and a proposal. Journal of the American Statistical Association 58.
Quinlan, J. R. Discovering rules from large collections of examples: A case study. In D. Michie (ed.), Expert Systems in the Micro Electronic Age. Edinburgh University Press.
Quinlan, J. R. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, California.
62 / 62
