Lecture 19: Decision trees


Slide 1: Lecture 19: Decision trees

Reading: Section 8.1
STATS 202: Data mining and analysis
November 10, 2017

Slides 2-4: Decision trees, 10,000-foot view

[Figure: the predictor space (X_1, X_2) cut at points t_1, ..., t_4 into regions R_1, ..., R_5, next to the equivalent decision tree.]

1. Find a partition of the space of predictors.
2. Predict a constant in each set of the partition.
3. The partition is defined by splitting the range of one predictor at a time.

The result can be represented as a decision tree. Not all partitions are possible.

Slide 5: Example: predicting a baseball player's salary

[Figure: a tree that splits first on Years and then on Hits, and the corresponding three regions R_1, R_2, R_3 in the (Years, Hits) plane.]

The prediction for a point in region R_i is the average of the training points in R_i. A sketch of this rule follows below.
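
The prediction rule is simple enough to sketch in code. This is a minimal illustration, not the lecture's code; the split points 4.5 (Years) and 117.5 (Hits) are assumed for concreteness, since the transcribed figure omits them.

```python
import numpy as np

# Hypothetical two-split tree for the baseball example; the thresholds
# 4.5 and 117.5 are illustrative assumptions, not taken from the slides.
def region(years, hits):
    """Map a (Years, Hits) pair to one of the three regions."""
    if years < 4.5:
        return "R1"
    return "R2" if hits < 117.5 else "R3"

def fit_region_means(X, y):
    """The prediction in R_i is the average of the training points in R_i."""
    means = {}
    for r in ("R1", "R2", "R3"):
        mask = np.array([region(yr, h) == r for yr, h in X])
        means[r] = float(y[mask].mean())
    return means

def predict(x, means):
    return means[region(*x)]
```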

Slides 6-8: How is a decision tree built?

Start with a single region R_1 (the entire input space), and iterate:

1. Select a region R_k, a predictor X_j, and a splitting point s, such that splitting R_k with the criterion X_j < s produces the largest decrease in RSS:

   $$\sum_{m=1}^{|T|} \sum_{x_i \in R_m} (y_i - \bar{y}_{R_m})^2$$

2. Redefine the regions with this additional split.

Terminate when there are 5 observations or fewer in each region (or use a different stopping criterion).

This grows the tree from the root towards the leaves: a top-down, greedy approach. A sketch of the greedy split search follows below.
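
Here is a minimal sketch (an assumption, not the lecture's code) of the greedy search in step 1: scan every predictor and every candidate split point, and keep the split that reduces the RSS the most.

```python
import numpy as np

def rss(y):
    """Residual sum of squares around the region mean."""
    return float(np.sum((y - y.mean()) ** 2)) if len(y) else 0.0

def best_split(X, y):
    """Greedily pick (predictor j, split s) minimizing RSS after the split."""
    best_j, best_s, best_rss = None, None, np.inf
    for j in range(X.shape[1]):
        for s in np.unique(X[:, j]):
            left, right = y[X[:, j] < s], y[X[:, j] >= s]
            if len(left) == 0 or len(right) == 0:
                continue  # the split must produce two nonempty regions
            total = rss(left) + rss(right)
            if total < best_rss:
                best_j, best_s, best_rss = j, s, total
    return best_j, best_s, best_rss

# Growing the tree applies best_split recursively to each new region,
# stopping e.g. when a region has 5 observations or fewer.
```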

Slide 9: A decision tree for baseball salaries

[Figure: a large tree for the baseball data, with splits including Years < 4.5, RBI < 60.5, Putouts < 82, Years < 3.5, Walks < 43.5, Walks < 52.5, and RBI < 80.5.]

Slides 10-13: How do we control overfitting?

Idea 1: Find the optimal subtree by cross validation.
- But there are too many possibilities, so we would still overfit.

Idea 2: Stop growing the tree when the RSS doesn't drop by more than a threshold with any new cut.
- But in our greedy algorithm, it is possible to find good cuts after bad ones.

Slides 14-17: How do we control overfitting?

Solution: Prune a large tree from the leaves to the root.

Weakest link pruning:
- Starting with the initial full tree T_0, replace a subtree with a leaf node to obtain a new tree T_1. Select the subtree to prune by minimizing

  $$\mathrm{RSS}(T_1) - \mathrm{RSS}(T_0).$$

  [Figure: the full tree T_0 and the pruned tree T_1.]

- Iterate this pruning to obtain a sequence T_0, T_1, T_2, ..., T_m, where T_m is the tree with a single leaf node.
- Select the optimal tree T_i by cross validation.

A pruning sketch follows below.
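
A minimal in-place sketch of weakest link pruning, under the assumption that a tree is a nested dict with "left"/"right" children at internal nodes and the training responses "y" (a NumPy array) at leaves. This representation is mine, not the lecture's.

```python
import copy
import numpy as np

def is_leaf(node):
    return "y" in node

def leaves(node):
    return 1 if is_leaf(node) else leaves(node["left"]) + leaves(node["right"])

def values(node):
    if is_leaf(node):
        return node["y"]
    return np.concatenate([values(node["left"]), values(node["right"])])

def tree_rss(node):
    if is_leaf(node):
        return float(np.sum((node["y"] - node["y"].mean()) ** 2))
    return tree_rss(node["left"]) + tree_rss(node["right"])

def weakest_link(tree):
    """Collapse the subtree whose removal increases the RSS the least."""
    best = {"score": np.inf, "node": None}

    def visit(node):
        if is_leaf(node):
            return
        y = values(node)
        # RSS(T_1) - RSS(T_0) restricted to this subtree.  (Some treatments
        # also divide by the number of leaves removed.)
        increase = float(np.sum((y - y.mean()) ** 2)) - tree_rss(node)
        if increase < best["score"]:
            best["score"], best["node"] = increase, node
        visit(node["left"])
        visit(node["right"])

    visit(tree)
    y = values(best["node"])
    best["node"].clear()
    best["node"]["y"] = y  # the pruned subtree becomes a single leaf
    return tree

# Pruning sequence T_0, T_1, ..., T_m (T_m has a single leaf):
# seq = [T0]
# while not is_leaf(seq[-1]):
#     seq.append(weakest_link(copy.deepcopy(seq[-1])))
```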

Slides 18-23: How do we control overfitting? ... or an equivalent procedure

Cost complexity pruning: minimize the following objective over all prunings T of T_0:

$$\sum_{R_m \in T} \sum_{x_i \in R_m} (y_i - \bar{y}_{R_m})^2 + \alpha |T|.$$

- When α = ∞, we select the null tree (the tree with a single leaf node).
- When α = 0, we select the full tree.
- Fun fact: the solution for each α is among the trees T_1, T_2, ..., T_m from weakest link pruning.
- Choose the optimal α (equivalently, the optimal T_i) by cross validation. A one-line sketch of the objective follows below.
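
Reusing tree_rss() and leaves() from the pruning sketch above (again a sketch, not the lecture's code), the objective is one line, and for a given α its minimizer can be searched among the weakest-link sequence:

```python
def cost_complexity(tree, alpha):
    """RSS(T) + alpha * |T|, where |T| is the number of leaves."""
    return tree_rss(tree) + alpha * leaves(tree)

# For a fixed alpha, per the "fun fact" above:
# best_tree = min(seq, key=lambda T: cost_complexity(T, alpha))
```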

Slides 24-28: Cross validation, the wrong way

1. Construct a sequence of trees T_0, ..., T_m for a range of values of α.
2. Split the training points into 10 folds.
3. For k = 1, ..., 10:
   - For each tree T_i, use every fold except the kth to estimate the averages in each region.
   - For each tree T_i, calculate the RSS in the kth fold.
4. For each tree T_i, average the 10 test errors, and select the value of α that minimizes the error.

This is the WRONG WAY to do cross validation: the trees were grown once on all the training points, so each test fold has already influenced the models it is meant to evaluate.

Slides 29-32: Cross validation, the right way

1. Split the training points into 10 folds.
2. For k = 1, ..., 10, using every fold except the kth:
   - Construct a sequence of trees T_1, ..., T_m for a range of values of α, and find the prediction for each region in each one.
   - For each tree T_i, calculate the RSS on the kth fold.
3. Select the parameter α that minimizes the average test error.

Note: we are doing all fitting, including the construction of the trees, using only the training data. A code sketch follows below.
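
A minimal sketch of the right way, using scikit-learn's cost-complexity pruning parameter ccp_alpha as a stand-in for the pruning above (the library choice is my assumption; the lecture does not prescribe one). All fitting, including growing the trees, happens inside each fold.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeRegressor

def cv_choose_alpha(X, y, alphas, n_folds=10, seed=0):
    """Return the alpha whose total 10-fold test RSS is smallest."""
    errors = np.zeros(len(alphas))
    folds = KFold(n_splits=n_folds, shuffle=True, random_state=seed)
    for train, test in folds.split(X):
        for i, alpha in enumerate(alphas):
            # Grow and prune using only the training folds.
            tree = DecisionTreeRegressor(ccp_alpha=alpha).fit(X[train], y[train])
            errors[i] += np.sum((y[test] - tree.predict(X[test])) ** 2)
    return alphas[int(np.argmin(errors))]
```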

Slide 33: Example: predicting baseball salaries

[Figure: the full tree for the baseball data, with splits on Years, Hits, RBI, Putouts, Walks, and Runs, shown next to training, cross-validation, and test mean squared error as a function of tree size.]

Slide 34: Example: predicting baseball salaries

[Figure: the pruned tree, which splits only on Years and Hits, shown next to the training, cross-validation, and test mean squared error curves.]

Slides 35-37: Classification trees

- They work much like regression trees.
- We predict the response by majority vote, i.e. we pick the most common class in every region.
- Instead of minimizing the RSS,

  $$\sum_{m=1}^{|T|} \sum_{x_i \in R_m} (y_i - \bar{y}_{R_m})^2,$$

  we minimize a classification loss function.

Slide 38: Classification losses

The 0-1 loss, or misclassification rate:

$$\sum_{m=1}^{|T|} \sum_{x_i \in R_m} 1(y_i \neq \hat{y}_{R_m})$$

The Gini index:

$$\sum_{m=1}^{|T|} q_m \sum_{k=1}^{K} \hat{p}_{mk} (1 - \hat{p}_{mk}),$$

where \hat{p}_{mk} is the proportion of class k within R_m, and q_m is the proportion of samples in R_m.

The cross-entropy:

$$-\sum_{m=1}^{|T|} q_m \sum_{k=1}^{K} \hat{p}_{mk} \log(\hat{p}_{mk}).$$
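
A minimal sketch (not from the slides) of the three per-region losses, given the vector p of class proportions \hat{p}_{m1}, ..., \hat{p}_{mK} in a region:

```python
import numpy as np

def misclassification(p):
    """0-1 loss rate in a region: 1 minus the majority-class proportion."""
    return 1.0 - float(np.max(p))

def gini(p):
    return float(np.sum(p * (1.0 - p)))

def cross_entropy(p):
    p = p[p > 0]  # treat 0 * log(0) as 0
    return float(-np.sum(p * np.log(p)))

# The tree-level loss weights each region by q_m, its share of the samples:
# total = sum(q[m] * gini(p[m]) for m in regions)
# Example: p = np.array([0.9, 0.05, 0.05])
#   -> misclassification 0.10, gini ~ 0.185, cross-entropy ~ 0.394
```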

Slides 39-41: Classification losses

- The Gini index and the cross-entropy are better measures of the purity of a region, i.e. they are low when the region contains mostly one category.
- Motivation for the Gini index: if, instead of predicting the most likely class, we predict a random sample from the distribution (\hat{p}_{m1}, \hat{p}_{m2}, ..., \hat{p}_{mK}), the Gini index is the expected misclassification rate (a short derivation follows below).
- It is typical to use the Gini index or the cross-entropy for growing the tree, and the misclassification rate when pruning it.
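
To spell out the Gini motivation (a one-step derivation, not on the slides): a point drawn from region R_m has class k with probability \hat{p}_{mk}, and the randomized prediction is drawn independently from the same distribution, so

```latex
% Expected misclassification under randomized prediction in region R_m:
\Pr(\text{error})
  = \sum_{k=1}^{K} \Pr(\text{truth}=k)\,\Pr(\text{prediction}\neq k)
  = \sum_{k=1}^{K} \hat{p}_{mk}\,\bigl(1-\hat{p}_{mk}\bigr),
```

which is exactly the Gini index term for region R_m.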

Slide 42: Example: the Heart dataset

[Figure: a large classification tree for the Heart data, with splits on Thal, Ca, Slope, Oldpeak, MaxHR, RestBP, Chol, ChestPain, Sex, Age, and RestECG and Yes/No leaves, shown next to training, cross-validation, and test error as a function of tree size.]

Slides 43-46: Some advantages of decision trees

- Very easy to interpret!
- Closer to human decision-making.
- Easy to visualize graphically.
- They easily handle qualitative predictors and missing data.
