Introduction to Classification & Regression Trees
ISLR Chapter 8, November 8, 2017
Classification and Regression Trees

- Carseats data from the ISLR package
- Binary outcome High: 1 if Sales > 8, otherwise 0
- Fit a classification tree model to Price and Income
- Pick a predictor $X_j$ and a cutpoint $s$ to split the data into $X_j \le s$ and $X_j > s$ so as to minimize the deviance (or SSE for regression); this gives the root node of the tree
- Continue splitting/partitioning the data until a stopping criterion is reached (number of observations in a node > 10 and within-node deviance > 0.01 times the deviance of the root node)
- Prediction is the mean (regression) or the proportion of successes (classification) of the data in the terminal nodes
- Output is a decision tree
- The regression or classification function is nonlinear in the predictors
- Captures interactions
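As a minimal sketch of the greedy split search (my illustration, not the tree package's actual implementation), the regression case scans every numeric predictor and every candidate cutpoint and keeps the pair that minimizes the summed SSE of the two child nodes; best_split below is a hypothetical helper for illustration only.

# Greedy search for the best single split (regression case, SSE criterion).
# Assumes numeric predictors in the data frame X and numeric response y.
best_split <- function(X, y) {
  best <- list(sse = Inf, var = NA, cut = NA)
  for (j in names(X)) {
    for (s in sort(unique(X[[j]]))[-1]) {        # candidate cutpoints
      left  <- y[X[[j]] <  s]
      right <- y[X[[j]] >= s]
      sse   <- sum((left - mean(left))^2) + sum((right - mean(right))^2)
      if (sse < best$sse) best <- list(sse = sse, var = j, cut = s)
    }
  }
  best
}
# e.g. best_split(Carseats[, c("Price", "Income")], Carseats$Sales)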
Carseat Example

library(tree)
library(ISLR)
library(dplyr)
data(Carseats)
Carseats = mutate(Carseats,
                  High = factor(ifelse(Carseats$Sales <= 8, "No", "Yes")))
tree.carseats = tree(High ~ Price + Income, data = Carseats)
Carseat Example

plot(tree.carseats)
text(tree.carseats)

[Tree plot: root split at Price < 92.5 (left branch predicts Yes), then Price < 142, Income < 60.5, and Income < 62.5]
Partition

partition.tree(tree.carseats)
points(Carseats$Price, Carseats$Income, col = Carseats$High)

[Plot: the (Price, Income) plane partitioned into the tree's rectangular regions]
Splits

tree.carseats

## node), split, n, deviance, yval, (yprob)
##       * denotes terminal node
##
##  1) root ( )
##    2) Price < 92.5  Yes ( ) *
##    3) Price > 92.5  ( )
##      6) Price < 142 ( )
##       12) Income < 60.5 ( ) *
##       13) Income > 60.5 ( ) *
##      7) Price > 142 ( )
##       14) Income < 62.5 ( ) *
##       15) Income > 62.5 ( ) *

In this printout the children of node i are numbered 2i and 2i + 1.
Summary

summary(tree.carseats)

##
## Classification tree:
## tree(formula = High ~ Price + Income, data = Carseats)
## Number of terminal nodes:  5
## Residual mean deviance:  1.18 = 466.2 / 395
## Misclassification error rate: 0.325 = 130 / 400
All Variables

tree.carseats = tree(High ~ . - Sales, data = Carseats)
summary(tree.carseats)

##
## Classification tree:
## tree(formula = High ~ . - Sales, data = Carseats)
## Variables actually used in tree construction:
## [1] "ShelveLoc"   "Price"       "Income"      "CompPrice"   "Population"
## [6] "Advertising" "Age"         "US"
## Number of terminal nodes:  27
## Residual mean deviance:  0.4575 = 170.7 / 373
## Misclassification error rate: 0.09 = 36 / 400

Overfitting?
Tree

[Plot of the unpruned 27-node tree; splits involve ShelveLoc, Price, Income, CompPrice, Population, Advertising, Age, and US]
Classification Error

set.seed(2)
train = sample(1:nrow(Carseats), 200)
Carseats.test = Carseats[-train, ]
tree.carseats = tree(High ~ . - Sales, data = Carseats, subset = train)
tree.pred = predict(tree.carseats, Carseats.test, type = "class")
table(tree.pred, Carseats.test$High)

##
## tree.pred  No Yes
##   No   ( ) ( )
##   Yes  ( ) ( )

( )/200  # classification error
## [1] 0.285
Cost-Complexity Pruning

1. Grow a large tree on the training data, stopping when each terminal node has fewer than some minimum number of observations.
2. The prediction for region m is the class c that maximizes $\hat{\pi}_{mc}$.
3. Snip off the least important splits via cost-complexity pruning to obtain a sequence of best subtrees indexed by a cost parameter k, minimizing
   $\frac{N_{\text{miss}}}{N} + k\,|T|,$
   the misclassification error penalized by the number of terminal nodes $|T|$ (see the sketch after this list).
4. Using K-fold cross-validation, compute the average cost-complexity for each k.
5. Pick the subtree with the smallest penalized error.
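A hedged sketch of step 3 with the tree package: called without a best or k argument, prune.misclass() returns the whole cost-complexity sequence rather than a single pruned tree.

# Cost-complexity sequence for the carseat tree
seq.carseats = prune.misclass(tree.carseats)
seq.carseats$size   # number of terminal nodes of each subtree
seq.carseats$dev    # misclassification count of each subtree
seq.carseats$k      # cost parameter k at which each subtree becomes optimal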
Pruning via Cross Validation

set.seed(2)
cv.carseats = cv.tree(tree.carseats, FUN = prune.misclass)

[Plots of cv.carseats$dev against cv.carseats$size and against cv.carseats$k]
Pruned

prune.carseats = prune.misclass(tree.carseats, best = 9)

[Plot of the pruned 9-node tree: splits on ShelveLoc (Bad, Medium), Price < 142.5, ShelveLoc (Bad), Price < 86.5, Advertising < 6.5, Age < 37.5, and CompPrice < 118.5]
Misclassification after Selection

tree.pred = predict(prune.carseats, Carseats.test, type = "class")
table(tree.pred, Carseats.test$High)

##
## tree.pred  No Yes
##   No   94  ( )
##   Yes  ( )  60

(94 + 60)/200  # classified correctly
## [1] 0.77
Tree with Random Split of Data

[Tree fit to a random half of the data; splits on Price, CompPrice, ShelveLoc, Population, Age, Income, and Advertising]
Tree with another Random Split of Data

[Tree fit to a different random half; splits on ShelveLoc, Advertising, Price, Age, US, CompPrice, Income, and Population, quite different from the previous tree]
Bagging: Bootstrap Aggregation

- Splitting the data into random partitions and fitting a tree model on each half may lead to very different predictions (high variability).
- Reduce variability by averaging over multiple training sets!
- We do not have access to multiple training sets, so we create them via bootstrap samples (samples of size n drawn with replacement).
- Generate B bootstrap samples of observations from the single training data set.
- Calculate predictions for the bth sample: $\hat{f}^{b}(x)$.
- The bagging (bootstrap aggregation) estimate is $\hat{f}_{\text{bag}}(x) = \frac{1}{B} \sum_{b=1}^{B} \hat{f}^{b}(x)$ (see the sketch below).
- Trees are grown deep, so there is little bias (although one could prune).
- Variance is reduced by averaging many trees across the bootstrap samples.
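A minimal hand-rolled sketch of bagging regression trees, assuming the tree package and the Carseats data from the earlier slides are loaded (in practice one would use randomForest() with mtry set to the number of predictors):

set.seed(1)
B <- 100
preds <- matrix(NA, nrow = nrow(Carseats), ncol = B)
for (b in 1:B) {
  # bootstrap sample: n observations drawn with replacement
  boot <- sample(1:nrow(Carseats), replace = TRUE)
  fit  <- tree(Sales ~ . - High, data = Carseats[boot, ])
  preds[, b] <- predict(fit, newdata = Carseats)   # f-hat^b(x)
}
f.bag <- rowMeans(preds)   # bagged estimate: average over the B trees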
Next Class

- Trees are simple to understand, but not as competitive with other supervised learning approaches for prediction/classification.
- Ways to improve trees by combining multiple trees in an ensemble:
  - Bagging
  - Random Forests
  - Boosting
  - BART (Bayesian Additive Regression Trees)
- Combining trees yields improved prediction accuracy, but at a loss of interpretability.