Supervised Learning for Image Segmentation
1 Supervised Learning for Image Segmentation. Raphael Meier, MIA (52 slides).
2 References
- A. Ng, Machine Learning lecture, Stanford University.
- A. Criminisi, J. Shotton, E. Konukoglu, Decision Forests: A Unified Framework for Classification, Regression, Density Estimation, Manifold Learning and Semi-Supervised Learning, Foundations and Trends in Computer Graphics and Vision.
- A. Criminisi, Decision Forests for Computer Vision and Medical Image Analysis, Tutorial, projects/decisionforests/.
- S. J. D. Prince, Computer Vision: Models, Learning and Inference, Cambridge University Press.
- D. Barber, Bayesian Reasoning and Machine Learning.
- T. Hastie, R. Tibshirani, J. Friedman, The Elements of Statistical Learning: Data Mining, Inference and Prediction, Springer.
3 Part I: Supervised Learning. [Diagram: expert knowledge (manual segmentation) provides training data; training yields a general rule H(x); testing applies H(x) for fully automatic segmentation.]
4 Brain Tumor Segmentation. Brain tumors: glioma (glioblastoma). Clinical guidelines: bidimensional measures (RANO/AvaGlio). Desired: tumor volumetry (manual segmentation takes hours). Future: fully automatic segmentation.
5 Bidimensional measures fail (Reuter et al., 2014)
6 Motivation (Menze et al., 2014)
7 The Learning Problem. Training data are used to learn a hypothesis H(x); applied to new data x, it yields a prediction y. Training set: S. Input: x. Output: y. Hypothesis: H : x ↦ y.
8 Application: Image Segmentation. Aim: partition an image into disjoint, semantically meaningful regions; this can be seen as a learning (classification) problem. Input: image(s) consisting of voxels. Output: regions, indicated by voxel-wise numbers (usually integers: 1, 2, 3, ...).
9 Image Representation: Features. Definition: measurable attributes of image data. They can be either hand-crafted or automatically learned (e.g. via a Restricted Boltzmann Machine).
10 Taxonomy of Learning Scenarios. Defined by the nature of the training data.
- Unsupervised learning: given a set of unlabeled feature vectors S_u = {x^(i) : i = 1, ..., m}
- Supervised learning: given a set of fully labeled feature vectors S_l = {(x^(i), y^(i)) : i = 1, ..., m}
- Semi-supervised learning: given a set of partially labeled feature vectors S = S_u ∪ S_l
11 Taxonomy of Learning Problems. Defined by the learning scenario and the nature of the output.
- Unsupervised learning: given S_u, find interesting structure (clustering, density estimation); or, given S_u with x ∈ R^n, find H(x) = x̃ ∈ R^ñ such that ñ ≪ n (dimensionality reduction, manifold learning)
- Supervised learning: given S_l, find H : x ↦ y with x ∈ R^n and y ∈ {1, 2, 3, ...} (classification); or, given S_l, find H : x ↦ y with x ∈ R^n and y ∈ R (regression)
12 Image Segmentation via Classification. [Diagram: expert knowledge (manual segmentation) provides the training data from which a general rule H(x) is learned, enabling fully automatic segmentation.]
13 Training and Testing Phase. [Diagram: training uses expert knowledge (manual segmentation) as training data to learn a general rule H(x); testing applies H(x) for fully automatic segmentation.]
14 Learning (Training) Algorithm. Aim: construct a hypothesis H which relates a feature vector x to its most probable label y. Output: a hypothesis (model) parametrized by a set of parameters θ. Assume we know p(y|x, θ); then the mapping H : x ↦ y can be realized via the MAP rule:

ŷ = argmax_y p(y|x, θ).   (1)

How do we obtain p(y|x, θ)?
15 Generative vs. Discriminative Models. Bayes' rule:

p(y|x, θ) = p(x, y|θ) / p(x|θ) = p(x|y, θ) p(y|θ) / p(x|θ).   (2)

Generative models: estimate p(y|x) via the likelihood p(x|y) and the prior distribution p(y). Discriminative models: estimate the posterior distribution p(y|x) directly; they can also be non-probabilistic (e.g. support vector machines).
16 Logistic Regression. A classic (1940s), used extensively (1415 hits on PubMed). Supervised learning; solves binary classification problems (y ∈ {0, 1}). Discriminative approach: we model p(y|x) directly:

p(y = 1|x; θ) = h_θ(x) and p(y = 0|x; θ) = 1 − h_θ(x)   (Bernoulli)

More compactly:

p(y|x; θ) = (h_θ(x))^y (1 − h_θ(x))^(1−y)   (3)

Linear model, hence:

h_θ(x) = g(θ^T x),  y|x, θ ~ Bernoulli(h_θ(x))   (4)
17 Logistic Regression: Sigmoid Function. The logistic (sigmoid) function:

g(z) = e^z / (1 + e^z) = 1 / (1 + e^(−z))   (5)

As before, z = θ^T x. Motivation: restrict the values of our hypothesis to lie between zero and one (a probability).
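The squashing behavior of Eq. (5) is easy to verify numerically; a minimal sketch (function and variable names are illustrative, not from the slides):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function g(z) = 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

# Outputs lie strictly in (0, 1), so g(theta^T x) can serve as p(y=1|x).
print(sigmoid(0.0))                              # 0.5 at the decision boundary
print(sigmoid(np.array([-10.0, 0.0, 10.0])))     # saturates toward 0 and 1
```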
18 Logistic Regression: Decision Boundary. The set of points x for which p(y = 1|x; θ) = p(y = 0|x; θ) = 0.5 holds. Given by the hyperplane:

θ^T x = 0   (6)

For θ^T x > 0, feature vectors are classified as 1s; for θ^T x < 0, feature vectors are classified as 0s.
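The sign test in Eq. (6) is all that is needed at prediction time; a small sketch with made-up parameters and points:

```python
import numpy as np

def predict_label(theta, x):
    """Classify by which side of the hyperplane theta^T x = 0 the point lies on."""
    return 1 if theta @ x > 0 else 0

# Hypothetical 2-D example: boundary is the line x1 = x2.
theta = np.array([1.0, -1.0])
print(predict_label(theta, np.array([2.0, 1.0])))  # theta^T x = 1 > 0, class 1
print(predict_label(theta, np.array([1.0, 2.0])))  # theta^T x = -1 < 0, class 0
```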
19 Learning θ: Maximum Likelihood. Given a set of i.i.d. training pairs S = {(x^(i), y^(i)) : i = 1, ..., m},

θ_ML = argmax_θ L(θ) = argmax_θ ∏_{i=1}^m p(y^(i)|x^(i), θ)   (7)
     = argmax_θ ∏_{i=1}^m (h_θ(x^(i)))^{y^(i)} (1 − h_θ(x^(i)))^{1−y^(i)}   (8)

For simplification, we maximize log L(θ):

l(θ) = log L(θ) = ∑_{i=1}^m [ y^(i) log h(x^(i)) + (1 − y^(i)) log(1 − h(x^(i))) ]   (9)
20 Learning θ: Maximum Likelihood II. There is no closed-form solution for maximizing the log-likelihood l(θ). However, l(θ) is concave, so it has a global maximum and can be optimized via gradient ascent. Ascent method: θ^(t+1) := θ^(t) + α ∇_θ l(θ^(t)), with l(θ^(t+1)) > l(θ^(t)). Derivative w.r.t. θ_j:

∂l(θ)/∂θ_j = ∑_{i=1}^m (y^(i) − h(x^(i))) x_j^(i)
21 Learning Algorithm: Gradient Ascent.

initialization;
while convergence criteria not satisfied do
    for j = 0 to n do
        θ_j := θ_j + α ∑_{i=1}^m (y^(i) − h_θ(x^(i))) x_j^(i);
    end
end
Algorithm 1: Gradient ascent

Convergence: ∇_θ l(θ) ≈ 0. The magnitude of the update is proportional to the error in prediction: (y^(i) − h_θ(x^(i))).
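Algorithm 1 can be sketched in vectorized form, where the inner loop over j becomes a single matrix-vector update. This is a minimal illustration (learning rate, iteration count, and the toy data are all made up):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, alpha=0.1, n_iter=1000):
    """Batch gradient ascent on the log-likelihood l(theta).

    X: (m, n) design matrix (prepend a column of ones for the bias term),
    y: (m,) labels in {0, 1}.
    Vectorized update: theta <- theta + alpha * X^T (y - h_theta(X)).
    """
    theta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        h = sigmoid(X @ theta)
        theta += alpha * X.T @ (y - h)
    return theta

# Toy 1-D problem: negative feature values labeled 0, positive labeled 1.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
X = np.column_stack([np.ones_like(x), x])     # bias column + feature
y = np.array([0, 0, 0, 1, 1, 1])
theta = fit_logistic(X, y)
print((sigmoid(X @ theta) > 0.5).astype(int))  # recovers the training labels
```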
22 Multiple Classes. Logistic regression can be generalized to situations with y ∈ {1, ..., K}. The hypothesis changes (softmax function):

p(y = k|x) = exp(θ_k^T x) / ∑_{i=1}^K exp(θ_i^T x)   (10)
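Eq. (10) normalizes K linear scores into a probability distribution; a short sketch (the parameter matrix and input are invented for illustration, and subtracting the max score is a standard numerical-stability trick, not part of the slide):

```python
import numpy as np

def softmax_posterior(Theta, x):
    """p(y=k|x) = exp(theta_k^T x) / sum_i exp(theta_i^T x).

    Theta: (K, n) matrix whose rows are the per-class parameter vectors theta_k.
    """
    scores = Theta @ x
    scores = scores - scores.max()   # stability: softmax is shift-invariant
    p = np.exp(scores)
    return p / p.sum()

# Hypothetical parameters for K = 3 classes, n = 2 features:
Theta = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
x = np.array([2.0, 1.0])
p = softmax_posterior(Theta, x)
print(p, p.sum())   # a valid distribution; class 0 has the largest score here
```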
23 Binary Image Segmentation using Logistic Regression: Preprocessing → Feature Extraction → Logistic Regression → Spatial Regularization.
24 Generalization: Model Complexity. Errors in prediction are due to: bias (wrong assumptions in our model) and variance (limited sample size; sensitivity of the model to changes in the training data).
25 Generalization: Bias-Variance Trade-off. Generalization error = bias² + variance + irreducible error. How can we minimize the generalization error? First: employ an appropriate error measure. Second: vary the complexity of the model and choose the one with minimum error.
26 Generalization: Number of Samples. The generalization error decreases with an increasing number of training samples m. Dilemma: acquisition of training data (ground truth) is usually expensive.
27 Model Evaluation: Strategies. Always split the data: training (2/3) and testing (1/3) sets. K-fold cross-validation on the full data set: popular choices for K are 5 or 10. Alternative: leave-one-out cross-validation (LOOCV). CV is often used for tuning hyperparameters.
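The K-fold procedure above can be sketched generically: shuffle the indices, split them into K folds, and average the held-out error. The helper and the toy classifier below are hypothetical illustrations, not from the slides:

```python
import numpy as np

def kfold_error(X, y, train_and_predict, K=5, seed=0):
    """Estimate generalization error by K-fold cross-validation.

    train_and_predict(X_tr, y_tr, X_te) -> predicted labels for X_te.
    Returns the average misclassification rate over the K held-out folds.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, K)
    errors = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        y_hat = train_and_predict(X[train], y[train], X[test])
        errors.append(np.mean(y_hat != y[test]))
    return float(np.mean(errors))

# Made-up 1-D data and a toy rule that thresholds at the training-fold mean:
X = np.arange(20.0).reshape(-1, 1)
y = np.concatenate([np.zeros(10), np.ones(10)])
def threshold_rule(X_tr, y_tr, X_te):
    return (X_te[:, 0] > X_tr.mean()).astype(float)
print(kfold_error(X, y, threshold_rule, K=5))
```

The same loop, run over a grid of hyperparameter values, is the grid-search tuning mentioned later for forests.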
28 Model Evaluation: Real-world example: BRATS 2013. Overfitted on training data.
29 Part II Decision Forests for Image Classification
30 Linear vs. Non-linear. Logistic regression is a linear classifier. Real problems are very often non-linear!
31 Transitioning from a Linear to a Non-linear Classifier. Idea: combine simple classifiers into more complex ones. Each constituent unit h(x) = g(θ_i^T x) = 1 / (1 + e^(−θ_i^T x)) models p(y = 'red'|x), with 1 − h(x) = p(y = 'blue'|x). The final decision boundary is non-linear! [Diagram: several linear decision boundaries combined into a non-linear one.]
32 Decision tree
33 How to Decide? Weak Learner. A simple model which performs only slightly better than flipping a coin. It can be represented as (1{·} is the indicator function):

h_θ(x) = 1{g(x, θ) > τ}   (11)

Linear model: g(x, θ) = φ(x)^T θ (homogeneous coordinates). φ(x) selects a random subset of features (randomized node optimization); θ defines a geometric primitive.
34 Examples of weak learners: axis-aligned, oriented line, conic section.
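The axis-aligned case of Eq. (11) is a decision stump: φ(x) picks one coordinate and τ thresholds it. A minimal sketch (the feature index, threshold, and sample point are invented):

```python
import numpy as np

def stump(x, j, tau):
    """Axis-aligned weak learner: h(x) = 1{ x_j > tau }.

    One feature index j and one threshold tau define an axis-parallel split;
    randomized node optimization would draw candidate (j, tau) pairs at random
    and keep the best one.
    """
    return int(x[j] > tau)

x = np.array([0.3, 2.5, -1.0])
print(stump(x, j=1, tau=1.0))  # second feature exceeds the threshold -> 1
print(stump(x, j=0, tau=1.0))  # first feature does not -> 0
```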
35 How to Predict? Leaf Prediction Model. A feature vector is passed down the tree and ends up in a leaf. The leaf stores p(y|x) (a class-label histogram); apply the MAP rule to p(y|x).
36 How to predict? Leaf prediction model
37 Testing Phase. A new input x is pushed down each of the T trees; the final prediction is given by:

p(y|x) = (1/T) ∑_{t=1}^T p_t(y|x).   (12)
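Eq. (12) is just a mean over the per-tree leaf histograms; a small sketch (the three histograms are made-up values):

```python
import numpy as np

def forest_posterior(leaf_posteriors):
    """Average the per-tree leaf posteriors: p(y|x) = (1/T) * sum_t p_t(y|x).

    leaf_posteriors: (T, K) array, one class-label histogram per tree.
    """
    return np.mean(leaf_posteriors, axis=0)

# Three hypothetical trees voting over K = 2 classes:
p = forest_posterior(np.array([[0.9, 0.1],
                               [0.6, 0.4],
                               [0.7, 0.3]]))
print(p)                # averaged posterior, still sums to 1
print(int(p.argmax()))  # MAP prediction: class 0
```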
38 How to Train? Information Gain. [Figure: example splits with high vs. low information gain.]
39 How to Train? Information Gain. Optimization of the information gain

IG = H(S) − ∑_{i∈{L,R}} (|S_i| / |S|) H(S_i)   (13)

where

H(S) = −∑_{y∈Y} p(y) log p(y)   (14)

minimizes the impurity of the child distributions:

θ_j* = argmax_{θ_j ∈ Θ} IG_j.   (15)

Optimization procedure: exhaustive search over Θ.
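Eqs. (13)-(14) can be computed directly from label counts. A minimal sketch (using log base 2, so a balanced binary set has entropy 1; the toy label arrays are invented):

```python
import numpy as np

def entropy(labels):
    """Shannon entropy H(S) = -sum_y p(y) log2 p(y) of a label set."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(parent, left, right):
    """IG = H(S) - |S_L|/|S| * H(S_L) - |S_R|/|S| * H(S_R)."""
    n = len(parent)
    return entropy(parent) - (len(left) / n) * entropy(left) \
                           - (len(right) / n) * entropy(right)

parent = np.array([0, 0, 1, 1])
print(information_gain(parent, parent[:2], parent[2:]))    # pure children: IG = 1.0
print(information_gain(parent, parent[::2], parent[1::2])) # mixed children: IG = 0.0
```

Exhaustive search over Θ then amounts to evaluating this gain for every candidate (feature, threshold) split and keeping the maximizer, as in Eq. (15).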
40 How Does All of This Make Sense? Bias-Variance Trade-off. A decision tree is a low-bias, high-variance model. Two key aspects (Breiman, 2001): randomized node optimization (and bagging) de-correlates the trees, and the tree predictions are averaged. The variance of the average prediction is given by:

ρσ² + ((1 − ρ)/T) σ²   (16)

Hence, grow randomized trees sufficiently deep and combine them into an ensemble.
41 Forest Hyperparameters. Number of trees T; depth of trees D; number of candidate weak learners H; number of candidate thresholds. How to tune them? Grid search (cross-validation).
42 Forest hyperparameters
43 Forest hyperparameters
44 Toy example
45 (Binary) Image Segmentation using a Decision Forest: Preprocessing → Feature Extraction → Decision Forest → Spatial Regularization.
46 Real-world examples: MICCAI 2014
47 Real-world examples Brain Tumor Segmentation
48 Real-world Examples: Feature Importance. I_M denotes the voxel-wise intensity value extracted from modality M. [Table: the top-ranked intensity features (I_FLAIR, I_T1, I_T1c, I_T2) at each tree depth from 1 to 18, for three tasks: tumor vs. healthy, healthy tissues, and tumor core; I_T1c dominates at deeper levels.]
49 Summary: Decision Forest. A discriminative model with two main degrees of freedom: the weak learner and the objective function (information gain). Training: generation of de-correlated trees based on maximizing information gain. Testing: a new input is pushed down each tree, and the prediction is performed based on the model stored in the reached leaf.
50 A last note... Decision forests are a flexible multi-purpose framework. They can also solve regression, density estimation, and manifold learning problems.
51 Connection to deep learning
52 Thank you!
More informationMachine Learning (CSE 446): Concepts & the i.i.d. Supervised Learning Paradigm
Machine Learning (CSE 446): Concepts & the i.i.d. Supervised Learning Paradigm Sham M Kakade c 2018 University of Washington cse446-staff@cs.washington.edu 1 / 17 Review 1 / 17 Decision Tree: Making a
More informationComputer Vision Group Prof. Daniel Cremers. 8. Boosting and Bagging
Prof. Daniel Cremers 8. Boosting and Bagging Repetition: Regression We start with a set of basis functions (x) =( 0 (x), 1(x),..., M 1(x)) x 2 í d The goal is to fit a model into the data y(x, w) =w T
More informationBusiness Club. Decision Trees
Business Club Decision Trees Business Club Analytics Team December 2017 Index 1. Motivation- A Case Study 2. The Trees a. What is a decision tree b. Representation 3. Regression v/s Classification 4. Building
More informationClustering and The Expectation-Maximization Algorithm
Clustering and The Expectation-Maximization Algorithm Unsupervised Learning Marek Petrik 3/7 Some of the figures in this presentation are taken from An Introduction to Statistical Learning, with applications
More informationCSC 411 Lecture 4: Ensembles I
CSC 411 Lecture 4: Ensembles I Roger Grosse, Amir-massoud Farahmand, and Juan Carrasquilla University of Toronto UofT CSC 411: 04-Ensembles I 1 / 22 Overview We ve seen two particular classification algorithms:
More informationWeka ( )
Weka ( http://www.cs.waikato.ac.nz/ml/weka/ ) The phases in which classifier s design can be divided are reflected in WEKA s Explorer structure: Data pre-processing (filtering) and representation Supervised
More informationSupervoxel Classification Forests for Estimating Pairwise Image Correspondences
Supervoxel Classification Forests for Estimating Pairwise Image Correspondences Fahdi Kanavati 1, Tong Tong 1, Kazunari Misawa 2, Michitaka Fujiwara 3, Kensaku Mori 4, Daniel Rueckert 1, and Ben Glocker
More informationLecture 9: Support Vector Machines
Lecture 9: Support Vector Machines William Webber (william@williamwebber.com) COMP90042, 2014, Semester 1, Lecture 8 What we ll learn in this lecture Support Vector Machines (SVMs) a highly robust and
More informationConditional Random Fields - A probabilistic graphical model. Yen-Chin Lee 指導老師 : 鮑興國
Conditional Random Fields - A probabilistic graphical model Yen-Chin Lee 指導老師 : 鮑興國 Outline Labeling sequence data problem Introduction conditional random field (CRF) Different views on building a conditional
More informationMachine Learning. A. Supervised Learning A.7. Decision Trees. Lars Schmidt-Thieme
Machine Learning A. Supervised Learning A.7. Decision Trees Lars Schmidt-Thieme Information Systems and Machine Learning Lab (ISMLL) Institute for Computer Science University of Hildesheim, Germany 1 /
More informationClassification/Regression Trees and Random Forests
Classification/Regression Trees and Random Forests Fabio G. Cozman - fgcozman@usp.br November 6, 2018 Classification tree Consider binary class variable Y and features X 1,..., X n. Decide Ŷ after a series
More informationAn introduction to random forests
An introduction to random forests Eric Debreuve / Team Morpheme Institutions: University Nice Sophia Antipolis / CNRS / Inria Labs: I3S / Inria CRI SA-M / ibv Outline Machine learning Decision tree Random
More informationPreface to the Second Edition. Preface to the First Edition. 1 Introduction 1
Preface to the Second Edition Preface to the First Edition vii xi 1 Introduction 1 2 Overview of Supervised Learning 9 2.1 Introduction... 9 2.2 Variable Types and Terminology... 9 2.3 Two Simple Approaches
More informationIntroduction to Machine Learning
Introduction to Machine Learning Brown University CSCI 1950-F, Spring 2012 Prof. Erik Sudderth Lecture 2: Probability: Discrete Random Variables Classification: Validation & Model Selection Many figures
More informationLearning-based Neuroimage Registration
Learning-based Neuroimage Registration Leonid Teverovskiy and Yanxi Liu 1 October 2004 CMU-CALD-04-108, CMU-RI-TR-04-59 School of Computer Science Carnegie Mellon University Pittsburgh, PA 15213 Abstract
More informationLeveling Up as a Data Scientist. ds/2014/10/level-up-ds.jpg
Model Optimization Leveling Up as a Data Scientist http://shorelinechurch.org/wp-content/uploa ds/2014/10/level-up-ds.jpg Bias and Variance Error = (expected loss of accuracy) 2 + flexibility of model
More informationDeep Learning for Computer Vision
Deep Learning for Computer Vision Lecture 7: Universal Approximation Theorem, More Hidden Units, Multi-Class Classifiers, Softmax, and Regularization Peter Belhumeur Computer Science Columbia University
More informationBig Data Methods. Chapter 5: Machine learning. Big Data Methods, Chapter 5, Slide 1
Big Data Methods Chapter 5: Machine learning Big Data Methods, Chapter 5, Slide 1 5.1 Introduction to machine learning What is machine learning? Concerned with the study and development of algorithms that
More informationInstance-based Learning CE-717: Machine Learning Sharif University of Technology. M. Soleymani Fall 2015
Instance-based Learning CE-717: Machine Learning Sharif University of Technology M. Soleymani Fall 2015 Outline Non-parametric approach Unsupervised: Non-parametric density estimation Parzen Windows K-Nearest
More information