Classification: Decision Trees
1 Classification: Decision Trees IST557 Data Mining: Techniques and Applications Jessie Li, Penn State University 1
2 Decision Tree Example Will a patient be at high risk based on the initial 24-hour observation? A high-risk patient usually cannot survive for more than 30 days. 2
3 Decision Tree Example Business marketing: predict whether a person will buy a computer?

age  income  student  credit_rating  buys_computer
21   high    no       fair           no
18   high    no       excellent      no
35   high    no       fair           no
45   medium  no       fair           yes
46   low     yes      fair           yes
50   low     yes      excellent      no
32   low     yes      excellent      yes
33   medium  no       fair           no
21   low     yes      fair           yes
50   medium  yes      fair           yes
22   medium  yes      excellent      yes
34   medium  no       excellent      no
33   high    yes      fair           yes
55   medium  no       excellent      no
3
4 Basic Concept Example tree: age? (<=40: student? no/yes; >40: credit rating? excellent/fair), with leaves labeled no/yes. The tree is constructed by repeatedly splitting the feature space into two subsets. Definitions: node, terminal node (leaf node), parent node, child node. The union of the regions occupied by two child nodes is the region occupied by their parent node. Every leaf node is assigned a class. A query is assigned the class of the leaf node it lands in. 4
7 The Three Elements to Grow a Tree The construction of a tree involves the following three elements: 1. The selection of the splits 2. The decisions when to declare a node terminal or to continue splitting it 3. The assignment of each terminal node to a class 7
8 The Three Elements The construction of a tree involves the following three elements: 1. The selection of the splits 2. The decisions when to declare a node terminal or to continue splitting it 3. The assignment of each terminal node to a class 8
9 Standard Set of Questions The input vector X = (X1, X2, ..., Xp) contains features of both categorical and ordered (numerical) types. Each split depends on the value of only a single variable. In the example below, the split lines are parallel to the coordinate axes. 9
10 Standard Set of Questions The questions in a splitting node are of the form {Is Xj <= c?} for all real-valued c. How many candidate questions? Infinite? 10
11 Standard Set of Questions The questions in a splitting node are of the form {Is Xj <= c?} for all real-valued c. Since the training data set is finite, there are only finitely many distinct splits that can be generated. 11
12 Standard Set of Questions Let A be a continuous-valued attribute. We must determine the best split point c for A: sort the values of A in increasing order; typically, the midpoint between each pair of adjacent values is considered as a possible split point c. Split: A <= c vs. A > c, where (a_i + a_{i+1})/2 is the midpoint between the values a_i and a_{i+1}. 12
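The midpoint rule above can be sketched in a few lines of Python (a sketch; the function name is ours, not from the slides):

```python
def candidate_splits(values):
    """Candidate split points for a continuous attribute: the midpoint
    (a_i + a_{i+1}) / 2 between each pair of adjacent distinct sorted values."""
    v = sorted(set(values))
    return [(a + b) / 2 for a, b in zip(v, v[1:])]

# A few ages from the buys_computer table above:
print(candidate_splits([21, 18, 35, 45, 21]))  # [19.5, 28.0, 40.0]
```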
13 Standard Set of Questions How about categorical features? If Xj is categorical, taking values, say, in {1, 2, ..., M}, the questions are of the form {Is Xj in A?}, where A is a subset of {1, 2, ..., M}. How many candidate questions for one variable? An exponential number of subsets: 2^M. Heuristic efficient algorithms exist: see this link. 13
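For small M the subsets can be enumerated directly; a sketch (function name ours). Note that complementary subsets A and its complement induce the same split, so there are 2^(M-1) - 1 distinct binary splits:

```python
from itertools import combinations

def candidate_subsets(categories):
    """All non-empty proper subsets A for questions {Is Xj in A?}."""
    cats = sorted(categories)
    return [set(c) for r in range(1, len(cats))
            for c in combinations(cats, r)]

subs = candidate_subsets({1, 2, 3})
print(len(subs))  # 6 subsets (2^3 - 2), i.e. 3 distinct binary splits
```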
14 Goodness of Split The goodness of a split is measured by an impurity function defined for each node. Intuitively, we want each leaf node to be pure, that is, one class dominates. 14
15 The Impurity Function Left region: x: 8; o: 2 (0.8, 0.2). Right region: x: 2; o: 12 (0.14, 0.86). All region: x: 10; o: 14 (0.42, 0.58). Figure and slides are from http://sites.stat.psu.edu/~jiali/course/stat557/ 15
16 Goodness of a split: i(t) - i(t_L) - i(t_R)? Left region: x: 8; o: 2, i(t_L) = Φ(0.8, 0.2). All region: x: 10; o: 14, i(t) = Φ(0.42, 0.58). Right region: x: 2; o: 12, i(t_R) = Φ(0.14, 0.86). 16
18 Left region: x: 8; o: 2, i(t_L) = Φ(0.8, 0.2), I(t_L) = i(t_L) * 10. All region: x: 10; o: 14, i(t) = Φ(0.42, 0.58), I(t) = i(t) * 24. Right region: x: 2; o: 12, i(t_R) = Φ(0.14, 0.86), I(t_R) = i(t_R) * 14. 18
19 All region: x: 10; o: 14, i(t) = Φ(0.42, 0.58), I(t) = i(t) * 24. Left region: x: 8; o: 2, i(t_L) = Φ(0.8, 0.2), I(t_L) = i(t_L) * 10. Right region: x: 2; o: 12, i(t_R) = Φ(0.14, 0.86), I(t_R) = i(t_R) * 14. ΔI(s,t) = Φ(0.42, 0.58)*24 - Φ(0.8, 0.2)*10 - Φ(0.14, 0.86)*14 = 24*(Φ(0.42, 0.58) - Φ(0.8, 0.2)*(10/24) - Φ(0.14, 0.86)*(14/24))
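With Gini impurity as Φ, the decrease ΔI(s, t) for the x/o counts above can be checked numerically (a sketch; the slides do not fix a particular Φ at this point, and the function names are ours):

```python
def gini(counts):
    """Gini impurity of a node given its class counts."""
    n = sum(counts)
    return 1.0 - sum((c / n) ** 2 for c in counts)

def impurity_decrease(parent, left, right, phi=gini):
    """Delta I(s, t) = N_t*phi(t) - N_L*phi(t_L) - N_R*phi(t_R)."""
    return (sum(parent) * phi(parent)
            - sum(left) * phi(left)
            - sum(right) * phi(right))

# (x, o) counts from the slide: all (10, 14), left (8, 2), right (2, 12)
print(round(impurity_decrease([10, 14], [8, 2], [2, 12]), 3))  # 5.038
```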
20 The Impurity Function: Entropy Entropy impurity: Φ(p_1, ..., p_K) = -Σ_k p_k log2 p_k. Algorithms using entropy: ID3 (Iterative Dichotomiser 3) and C4.5 (the successor of ID3). 20
21 The Impurity Function: Gini index Gini impurity: Φ(p_1, ..., p_K) = 1 - Σ_k p_k^2. Algorithm using the Gini index: CART (Classification And Regression Trees). Breiman, L., J. Friedman, R. Olshen, and C. Stone. Classification and Regression Trees. Boca Raton, FL: CRC Press, 1984. 21
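The two impurity functions can be written side by side; a minimal sketch over a class-probability vector p:

```python
from math import log2

def entropy(p):
    """Entropy impurity (ID3, C4.5): -sum_k p_k log2 p_k."""
    return -sum(pk * log2(pk) for pk in p if pk > 0)

def gini(p):
    """Gini impurity (CART): 1 - sum_k p_k^2."""
    return 1.0 - sum(pk * pk for pk in p)

print(entropy([0.5, 0.5]))  # 1.0  (maximal for two classes)
print(gini([1.0, 0.0]))     # 0.0  (a pure node)
```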
22 Murphy, Figure 16.3, ginidemo: node impurity measures (error rate, Gini, entropy) for binary classification; entropy has been rescaled. Consider 400 cases in each class. One split option: (300, 100) and (100, 300). Another split option: (200, 400) and (200, 0). The error rate treats them the same; entropy and the Gini index prefer the latter. 22
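The claim is easy to verify numerically; a sketch (helper names ours) comparing the size-weighted impurity of the two split options:

```python
def weighted_impurity(phi, nodes):
    """Average node impurity, weighted by node size."""
    total = sum(sum(n) for n in nodes)
    return sum(sum(n) / total * phi([c / sum(n) for c in n])
               for n in nodes)

error_rate = lambda p: 1.0 - max(p)
gini = lambda p: 1.0 - sum(q * q for q in p)

a = [(300, 100), (100, 300)]  # first split option
b = [(200, 400), (200, 0)]    # second split option

print(round(weighted_impurity(error_rate, a), 6),
      round(weighted_impurity(error_rate, b), 6))           # 0.25 0.25 (tied)
print(weighted_impurity(gini, a) > weighted_impurity(gini, b))  # True
```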
23 The Three Elements The construction of a tree involves the following three elements: 1. The selection of the splits 2. The decisions when to declare a node terminal or to continue splitting it 3. The assignment of each terminal node to a class 23
24 Stopping Criteria 1. The samples in a node all have the same label 2. The goodness of split is lower than a threshold 3. The number of samples in every leaf is below a threshold 4. The number of leaf nodes reaches a limit 5. The tree has exceeded the maximum desired depth 24
26 In this example, no first split is good on its own. Using a stopping criterion based on a goodness-of-split threshold, the tree will not split at all. 26
27 The Three Elements The construction of a tree involves the following three elements: 1. The selection of the splits 2. The decisions when to declare a node terminal or to continue splitting it 3. The assignment of each terminal node to a class 27
28 Class Assignment Rule 28
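The standard assignment rule, used by CART, labels each terminal node with the majority class among the training samples that land in it; a minimal sketch:

```python
from collections import Counter

def assign_class(leaf_labels):
    """Class assignment rule: majority vote over the training
    labels that reach this terminal node."""
    return Counter(leaf_labels).most_common(1)[0][0]

print(assign_class(["yes", "no", "yes", "yes"]))  # yes
```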
29 The Three Elements to Grow a Tree The construction of a tree involves the following three elements: 1. The selection of the splits Gini index (CART) Entropy (ID3, C4.5) Misclassification error 2. The decisions when to declare a node terminal or to continue splitting it 3. The assignment of each terminal node to a class 29
30 Example: Iris Flower Data Iris setosa, Iris versicolor, Iris virginica. 4 features: sepal length, sepal width, petal length, petal width. http://en.wikipedia.org/wiki/iris_flower_data_set 30
31 Murphy, Figure 16.4(a): Iris data (setosa, versicolor, virginica), showing only the first two features, sepal length and sepal width. 31
32 Murphy, Figure 16.5(a), dtreedemoiris: the full decision tree, plotted over sepal length and sepal width (setosa, versicolor, virginica). 32
33 Murphy, Figure 16.4(b): sepal width vs. sepal length (setosa, versicolor, virginica). 33
34 Example: Diabetes data 768 samples, 8 quantitative variables, 2 classes. R script: https://onlinecourses.science.psu.edu/stat857/node/133 CART in Matlab: http:// Decision tree in R: http://cran.r-project.org/web/packages/rpart/rpart.pdf 34
35 The Three Elements to Grow a Tree The construction of a tree involves the following three elements: 1. The selection of the splits Gini index (CART) Entropy (ID3, C4.5) Misclassification error 2. The decisions when to declare a node terminal or to continue splitting it grow to a full tree and then prune it 3. The assignment of each terminal node to a class 35
36 Bias and Variance (High) bias: high training error, under-fit, simple model. Examples: an over-pruned tree; one-degree (linear) regression. (High) variance: high testing error, over-fit, complex model. Examples: a full tree; a high-degree polynomial regression. 36
38 Pruning a Tree To prevent overfitting, we could stop growing the tree early, but such stopping criteria tend to be myopic. The standard approach is therefore to grow a full tree and then prune it. We can use any of the previous stopping criteria with a very low threshold to grow a full tree (a very large tree). 38
39 How to Evaluate a Tree (or, generally, a classification model)? All the labeled data → Train / Test. Randomly select a portion of the data for training (e.g., 80%) and use the remaining data for testing (e.g., 20%). Then compare different trees (or models). 39
40 How to Evaluate a Tree (or, generally, a classification model)? A random partition could be biased; cross-validation is preferred. Example: 5-fold cross-validation. Final result to report: average accuracy over the 5 rounds and the standard deviation of accuracy over the 5 rounds. 40
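A 5-fold split and the reported summary can be sketched with the standard library (a sketch; the per-fold accuracies below are placeholders, not results from the slides):

```python
import random
import statistics

def kfold_indices(n, k=5, seed=0):
    """Shuffle sample indices once, then deal them into k folds;
    in round i, fold i is the test set and the rest are training."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

folds = kfold_indices(20, k=5)
print([len(f) for f in folds])  # [4, 4, 4, 4, 4]

# Final result to report (placeholder per-fold accuracies):
accs = [0.80, 0.85, 0.75, 0.90, 0.80]
print(round(statistics.mean(accs), 3), round(statistics.stdev(accs), 3))
```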
41 Parameter Tuning by Random Partition of Train/Test All the labeled data → Train / Test. 1. Randomly choose a portion as train and the remainder as test. 2. Starting from a tree with one (terminal) node (m = 1): train a tree with m terminal nodes, test it to get the error rate, grow the tree with one more split (m ← m + 1), and repeat until the tree is full. 3. Pick the m* terminal nodes that give the lowest error. 41
42 Parameter Tuning by Cross-Validation A random partition could be biased; cross-validation (e.g., 5-fold) is preferred. 1. Use k-fold cross-validation on the training data. 2. In each validation round, (k-1)/k of the training data is used as train and 1/k as test: starting from a tree with one (terminal) node (m = 1), train a tree with m terminal nodes, test it to get the error rate, grow the tree with one more split (m ← m + 1), and repeat until the tree is full. 3. Pick the m* terminal nodes that give the lowest error. 42
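Step 3, picking m*, can be sketched given per-fold cross-validation errors for each tree size m (the numbers below are hypothetical); the "min + 1 std. err." rule marked in Murphy's Figure 16.5 is included as an option:

```python
import statistics

def pick_m_star(cv_errors, one_se=False):
    """cv_errors maps m (number of terminal nodes) to per-fold error
    rates.  Pick the m with the lowest mean CV error, or, with the
    1-SE rule, the smallest m whose mean error is within one standard
    error of that minimum."""
    means = {m: statistics.mean(e) for m, e in cv_errors.items()}
    best = min(means, key=means.get)
    if not one_se:
        return best
    se = statistics.stdev(cv_errors[best]) / len(cv_errors[best]) ** 0.5
    cutoff = means[best] + se
    return min(m for m in means if means[m] <= cutoff)

# hypothetical per-fold CV error rates, keyed by tree size m
errs = {1: [0.40, 0.45], 2: [0.20, 0.22], 3: [0.18, 0.22], 4: [0.19, 0.23]}
print(pick_m_star(errs))               # 3 (lowest mean CV error)
print(pick_m_star(errs, one_se=True))  # 2 (smallest tree within 1 SE)
```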
43 Evaluating a Tree with the Best Parameter Randomly pick 80% as train and 20% as test (if using 5-fold cross-validation, repeat this process 5 times). Within the 80% training data, pick m* terminal nodes; then use m* to train a model on all the training data. Note: there are 2 layers of train/test splitting! 43
44 Murphy, Figure 16.5, dtreedemoiris: cost (misclassification error) vs. number of terminal nodes, for cross-validation and the training set, marking the minimum + 1 std. err. and the best choice. 44
45 Murphy, Figure 16.6, dtreedemoiris 45
46 Murphy, Figure 16.6, dtreedemoiris: the pruned decision tree, plotted over the same two features (setosa, versicolor, virginica). 46
47 Example: Diabetes data Pick the minimum cross-validation error: nsplit = 5, i.e., 6 terminal nodes. R script: https://onlinecourses.science.psu.edu/stat857/node/133 47
48 Pros of Trees Easy to interpret Handle mixed discrete and continuous inputs Insensitive to monotone transformations of the inputs Because the split points are based on ranking the data points Perform automatic variable selection Relatively robust to outliers Scale well to large data sets Can be modified to handle missing features 48
49 Missing Values Certain variables are missing in some training samples; this often occurs in gene-expression microarray data. Suppose each variable has a 5% chance of being missing, independently. Then for a training sample with 50 variables, the probability of missing at least one variable is as high as 1 - 0.95^50 ≈ 92.3%. A sample to be classified may also have missing variables. One remedy is to find surrogate splits: suppose the best split for node t is s, which involves a question on X_m; find another split s' on a variable X_j, j ≠ m, which is most similar to s in a certain sense. Similarly, the second-best surrogate split, the third, and so on, can be found. Advantage over a generative model: not modeling the entire joint distribution of the inputs. Disadvantage: entirely ad hoc. For categorical variables, missing can also be coded as a new value. 49
50 Cons of Trees Trees do not predict very accurately compared to other kinds of models, due to the greedy nature of the tree-construction algorithm. Trees are unstable: small changes to the input data can have large effects on the structure of the tree, due to the hierarchical nature of the tree-growing process. Solution: bagging and boosting. 50
51 Summary Decision tree CART: Classification And Regression Tree Grow a tree Selection of splits Stopping criteria Class assignments in leaf nodes Prune a tree Grow to a full tree Use cross-validation to prune 51
More informationNearest Neighbor Classification
Nearest Neighbor Classification Professor Ameet Talwalkar Professor Ameet Talwalkar CS260 Machine Learning Algorithms January 11, 2017 1 / 48 Outline 1 Administration 2 First learning algorithm: Nearest
More informationINTRO TO RANDOM FOREST BY ANTHONY ANH QUOC DOAN
INTRO TO RANDOM FOREST BY ANTHONY ANH QUOC DOAN MOTIVATION FOR RANDOM FOREST Random forest is a great statistical learning model. It works well with small to medium data. Unlike Neural Network which requires
More informationComparing Univariate and Multivariate Decision Trees *
Comparing Univariate and Multivariate Decision Trees * Olcay Taner Yıldız, Ethem Alpaydın Department of Computer Engineering Boğaziçi University, 80815 İstanbul Turkey yildizol@cmpe.boun.edu.tr, alpaydin@boun.edu.tr
More informationSlides for Data Mining by I. H. Witten and E. Frank
Slides for Data Mining by I. H. Witten and E. Frank 7 Engineering the input and output Attribute selection Scheme-independent, scheme-specific Attribute discretization Unsupervised, supervised, error-
More informationNotes based on: Data Mining for Business Intelligence
Chapter 9 Classification and Regression Trees Roger Bohn April 2017 Notes based on: Data Mining for Business Intelligence 1 Shmueli, Patel & Bruce 2 3 II. Results and Interpretation There are 1183 auction
More informationUnivariate and Multivariate Decision Trees
Univariate and Multivariate Decision Trees Olcay Taner Yıldız and Ethem Alpaydın Department of Computer Engineering Boğaziçi University İstanbul 80815 Turkey Abstract. Univariate decision trees at each
More informationDecision tree learning
Decision tree learning Andrea Passerini passerini@disi.unitn.it Machine Learning Learning the concept Go to lesson OUTLOOK Rain Overcast Sunny TRANSPORTATION LESSON NO Uncovered Covered Theoretical Practical
More informationCART. Classification and Regression Trees. Rebecka Jörnsten. Mathematical Sciences University of Gothenburg and Chalmers University of Technology
CART Classification and Regression Trees Rebecka Jörnsten Mathematical Sciences University of Gothenburg and Chalmers University of Technology CART CART stands for Classification And Regression Trees.
More informationLecture 5: Decision Trees (Part II)
Lecture 5: Decision Trees (Part II) Dealing with noise in the data Overfitting Pruning Dealing with missing attribute values Dealing with attributes with multiple values Integrating costs into node choice
More informationBasic Data Mining Technique
Basic Data Mining Technique What is classification? What is prediction? Supervised and Unsupervised Learning Decision trees Association rule K-nearest neighbor classifier Case-based reasoning Genetic algorithm
More informationLecture 19: Decision trees
Lecture 19: Decision trees Reading: Section 8.1 STATS 202: Data mining and analysis November 10, 2017 1 / 17 Decision trees, 10,000 foot view R2 R5 t4 1. Find a partition of the space of predictors. X2
More informationEnsemble Methods: Bagging
Ensemble Methods: Bagging Instructor: Jessica Wu Harvey Mudd College The instructor gratefully acknowledges Eric Eaton (UPenn), Jenna Wiens (UMich), Tommi Jaakola (MIT), David Kauchak (Pomona), David Sontag
More informationA Systematic Overview of Data Mining Algorithms
A Systematic Overview of Data Mining Algorithms 1 Data Mining Algorithm A well-defined procedure that takes data as input and produces output as models or patterns well-defined: precisely encoded as a
More informationComputer Vision Group Prof. Daniel Cremers. 8. Boosting and Bagging
Prof. Daniel Cremers 8. Boosting and Bagging Repetition: Regression We start with a set of basis functions (x) =( 0 (x), 1(x),..., M 1(x)) x 2 í d The goal is to fit a model into the data y(x, w) =w T
More informationImproving Quality of Products in Hard Drive Manufacturing by Decision Tree Technique
Improving Quality of Products in Hard Drive Manufacturing by Decision Tree Technique Anotai Siltepavet 1, Sukree Sinthupinyo 2 and Prabhas Chongstitvatana 3 1 Computer Engineering, Chulalongkorn University,
More informationData Mining and Knowledge Discovery: Practice Notes
Data Mining and Knowledge Discovery: Practice Notes Petra Kralj Novak Petra.Kralj.Novak@ijs.si 2016/01/12 1 Keywords Data Attribute, example, attribute-value data, target variable, class, discretization
More informationCSCI567 Machine Learning (Fall 2014)
CSCI567 Machine Learning (Fall 2014) Drs. Sha & Liu {feisha,yanliu.cs}@usc.edu September 9, 2014 Drs. Sha & Liu ({feisha,yanliu.cs}@usc.edu) CSCI567 Machine Learning (Fall 2014) September 9, 2014 1 / 47
More informationData Mining Part 5. Prediction
Data Mining Part 5. Prediction 5.4. Spring 2010 Instructor: Dr. Masoud Yaghini Outline Using IF-THEN Rules for Classification Rule Extraction from a Decision Tree 1R Algorithm Sequential Covering Algorithms
More informationDecision Trees Oct
Decision Trees Oct - 7-2009 Previously We learned two different classifiers Perceptron: LTU KNN: complex decision boundary If you are a novice in this field, given a classification application, are these
More informationBIOINF 585: Machine Learning for Systems Biology & Clinical Informatics
BIOINF 585: Machine Learning for Systems Biology & Clinical Informatics Lecture 12: Ensemble Learning I Jie Wang Department of Computational Medicine & Bioinformatics University of Michigan 1 Outline Bias
More informationData Mining. Decision Tree. Hamid Beigy. Sharif University of Technology. Fall 1396
Data Mining Decision Tree Hamid Beigy Sharif University of Technology Fall 1396 Hamid Beigy (Sharif University of Technology) Data Mining Fall 1396 1 / 24 Table of contents 1 Introduction 2 Decision tree
More informationInput: Concepts, Instances, Attributes
Input: Concepts, Instances, Attributes 1 Terminology Components of the input: Concepts: kinds of things that can be learned aim: intelligible and operational concept description Instances: the individual,
More information