Texture Segmentation


Texture Segmentation
Introduction to Signal and Image Processing
Prof. Dr. Philippe Cattin, MIAC, University of Basel
22.02.2016

Contents

Abstract
1 Introduction
  What is Texture?
  What do we need it for?
  Characteristics of Textures
  Principle of Texture Segmentation Methods
  A Problem with Texture Segmentation: Scale
2 Feature Extraction
  2.1 First-Order Histogram Based Features
  2.2 Cooccurrence Matrices
  2.3 Fourier Features
  2.4 Filter Banks
    2.4.1 Laws Filters
    2.4.2 Gabor Filters
    2.4.3 Eigenfilters
3 Texture Discrimination
  3.1 Region Growing
  3.2 Other Methods
  3.3 Reduction of Dimensionality
4 Texture Classification
5 References


Introduction: What is Texture?

The term texture is difficult to define, but it refers to characteristic aspects of a surface pattern, such as coarseness, directionality, brightness, colour, and regularity. "Texture is a phenomenon that is widespread, easy to recognise and hard to define." (Forsyth/Ponce, 2002)

Introduction: What do we need it for?

Texture features can be used to segment scenes into parts. In remote sensing it is quite common to segment scenes into different types of land cover, such as forest, vineyards, urban areas, streets, lakes, ...
Fig 8.1: Segmentation example (forest, cultural land, other), ETH Zurich

Introduction: Characteristics of Textures

Even from this small set of textures some differences can be observed:
1. (a)+(b) are regular and contain some basic pattern (texel) that is repeated in some way (placement rule).
2. (a)'s texel is larger than (b)'s; (a) is thus said to be coarser.
3. It is even more subtle than that: texture (a) has subpatterns (micro-texture), i.e. a hierarchical organisation.
4. Texture (a) has a dominant diagonal organisation, (b) has a vertical orientation, and (c)+(d) have no apparent orientation.
Fig 8.2: Sample textures (a)-(d)

Introduction: Principle of Texture Segmentation Methods

Texture analysis methods typically probe for the aforementioned characteristics such as regularity, coarseness, and directionality. Texture analysis can be split into three fundamental steps, as shown in Fig 8.3:
1. Feature extraction: extract the texture measures that numerically describe the texture properties.
2. Texture discrimination: partition a textured image into regions, each corresponding to a perceptually homogeneous texture.
3. Texture classification: determine to which of a finite number of physically defined classes a homogeneous textured region belongs.
Fig 8.3: Fundamental steps in texture analysis

Introduction: A Problem with Texture Segmentation: Scale

Depending on the scale of observation, the segmentation goal changes:
(a) Segment forest, water, ...
(b) Segment fields, forest, streets, ...
(c) Segment small groups of trees, grass, ...
(d) Segment the leaves
(e) Segment the structure of the leaf
(f) Segment the cells

Feature Extraction

The approaches for extracting the texture measures are typically categorised into
1. statistical,
2. structural,
3. transform, and
4. model-based approaches.
Fig 8.4: Typology of the texture measures commonly used in texture analysis

First-Order Histogram Based Features

Given an image g(x, y) of two space variables x and y, the function g can take the discrete values i = 0, 1, \ldots, G-1, where G is the number of intensity levels in the image. The intensity level histogram is a function showing, for each intensity level, the number of pixels in the whole image with that level:

h(i) = \sum_x \sum_y \delta(g(x,y), i)    (8.1)

where \delta is the Kronecker delta function

\delta(j, i) = \begin{cases} 1 & j = i \\ 0 & j \neq i \end{cases}    (8.2)

Fig 8.5: (a) Texture, (b) corresponding histogram

The probability density function (PDF) can be approximated by dividing h(i) by the total number of pixels, thus P(i) \approx h(i) / (M N) for an M x N image.

First-Order Histogram Based Features: Histogram Features

The shape of the histogram can subsequently be used to characterise the image. For example, a narrowly distributed histogram indicates a low-contrast image, and a bimodal histogram often suggests an object against a background of differing intensity. To quantitatively describe the image, different parameters (image features) can be extracted from the histogram:

Mean: \mu = \sum_i i \, P(i)
Variance: \sigma^2 = \sum_i (i - \mu)^2 P(i)
Skewness: \mu_3 = \sigma^{-3} \sum_i (i - \mu)^3 P(i)
Kurtosis: \mu_4 = \sigma^{-4} \sum_i (i - \mu)^4 P(i) - 3
Energy: E = \sum_i P(i)^2
Entropy: H = -\sum_i P(i) \log_2 P(i)

First-Order Histogram Based Features: Properties of these Histogram Features

- The mean gives the average level of intensity in the image.
- The variance describes the variation of the intensity around the mean; it is a measure of contrast.
- The energy is a measure of peakedness.
- The skewness is zero if the histogram is symmetrical about the mean; it is an indicator of symmetry.
- The kurtosis is a measure of the flatness of the histogram; with the usual -3 offset it is zero for Gaussian-shaped histograms.
- The entropy is a measure of histogram uniformity.

The mean and variance do not actually carry direct information about the texture; they rather depend on the image acquisition process. One often gets better texture segmentation performance if the images are normalised to a mean of 0 and a standard deviation of 1.
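The features above can be computed directly from the normalised histogram. The following Python/NumPy sketch (the lecture's own code examples use MATLAB; the function name histogram_features is our own) implements all six:

```python
import numpy as np

def histogram_features(img, levels=256):
    """First-order texture features computed from the intensity histogram."""
    hist, _ = np.histogram(img, bins=levels, range=(0, levels))
    P = hist / img.size                     # normalised histogram, approx. PDF
    i = np.arange(levels)
    mean = np.sum(i * P)
    var = np.sum((i - mean) ** 2 * P)
    std = np.sqrt(var)
    skew = np.sum((i - mean) ** 3 * P) / std ** 3
    kurt = np.sum((i - mean) ** 4 * P) / std ** 4 - 3   # 0 for a Gaussian
    energy = np.sum(P ** 2)
    nz = P[P > 0]                           # drop empty bins to avoid log(0)
    entropy = -np.sum(nz * np.log2(nz))
    return {'mean': mean, 'variance': var, 'skewness': skew,
            'kurtosis': kurt, 'energy': energy, 'entropy': entropy}
```

For a two-valued image with equally many dark and bright pixels, for example, the energy is 0.5 and the entropy is exactly 1 bit.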

First-Order Histogram Based Features: First-Order Texture Feature Example

Fig 8.6: (a) Texture 1, (b) Texture 2, (c) histogram of texture 1, (d) histogram of texture 2, together with the mean, variance, skewness, kurtosis, energy and entropy of both textures

Cooccurrence Matrices

The histogram as used in the previous section captures intensity information in a highly compact, but also highly incomplete way. As Fig 8.7 shows, two textures with a completely different visual appearance can have the same gray-scale histogram; all information about the configurational aspects of the intensity distribution is lost.

The cooccurrence matrix, in contrast, is a related data structure that is based on second-order statistics and preserves some aspects of the spatial configuration.

Fig 8.7: (a) Texture 1, (b) histogram of texture 1, (c) Texture 2, (d) histogram of texture 2. Completely different textures can have the same gray-scale histogram.

Cooccurrence Matrices: The Cooccurrence Matrix

The cooccurrence matrix is also known in the literature as the GLCM (Gray-Level Cooccurrence Matrix) or the spatial dependence matrix. Mathematically, the cooccurrence matrix is defined over an image g, parametrised by an offset \Delta = (\Delta x, \Delta y), as

C_{\Delta}(i, j) = \sum_x \sum_y \delta(g(x,y), i) \, \delta(g(x + \Delta x, y + \Delta y), j)    (8.3)

i.e. it counts how often a pixel with intensity i has a pixel with intensity j at offset \Delta. The cooccurrence matrix is extracted for one specific offset vector \Delta.

Fig 8.8: Construction of the cooccurrence matrix
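A direct, unnormalised implementation of Eq. (8.3) simply counts intensity pairs at the given offset. A minimal Python sketch, assuming integer gray levels 0..levels-1 (the function name cooccurrence is our own; the lecture's code is MATLAB):

```python
import numpy as np

def cooccurrence(img, dx, dy, levels):
    """Count pairs (g(x,y)=i, g(x+dx,y+dy)=j) for a fixed offset (dx,dy)."""
    C = np.zeros((levels, levels), dtype=np.int64)
    h, w = img.shape
    # iterate only over positions where both pixels of the pair lie inside
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            C[img[y, x], img[y + dy, x + dx]] += 1
    return C
```

Normalising C by its sum gives the probability matrix P(i,j) used for the features below; in practice, vectorised or library implementations (e.g. scikit-image's graycomatrix) are preferable for speed.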

Cooccurrence Matrices: Cooccurrence Matrix Examples

Fig 8.9: Cooccurrence matrix of a texture, computed with an offset vector that corresponds to a period of the texture. The high values in the cooccurrence matrix are concentrated in a central location.

Cooccurrence Matrices: Cooccurrence Matrix Examples (2)

Fig 8.10: Cooccurrence matrix of a texture, computed with an offset vector that does not correspond to a period of the texture. In contrast to the previous example, the values in the cooccurrence matrix are smeared out.

Cooccurrence Matrices: Extensions to the Cooccurrence Matrix

Several intuitive extensions of the cooccurrence matrix are commonly used in practice:
1. To analyse a texture, one can build several such matrices, each for a different choice of offset vector; if the length of the vector is varied, a certain degree of scale invariance can be achieved.
2. One can also use several vectors at once and put the results into a single matrix; combining vectors of the same length but with different orientations results in a rotation-invariant cooccurrence matrix.
Fig 8.11: Rotation-invariant cooccurrence matrix

Cooccurrence Matrices: Cooccurrence Matrix Features

Retaining the entire cooccurrence matrix is not efficient. Typically, the widely used Haralick features are extracted from the normalised matrix P(i,j) for later classification:

Energy: \sum_{i,j} P(i,j)^2
Entropy: -\sum_{i,j} P(i,j) \log_2 P(i,j)
Contrast: \sum_{i,j} (i - j)^2 P(i,j)
Homogeneity: \sum_{i,j} \frac{P(i,j)}{1 + |i - j|}
Max probability: \max_{i,j} P(i,j)

Energy and entropy are both measures of how concentrated the entries are. The energy is high if P has strong peaks, and the entropy reaches its maximum if all entries are equally probable. Contrast and homogeneity are more specifically oriented towards the entries being concentrated near the diagonal (then the contrast is small and the homogeneity is large).
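Given a cooccurrence matrix of counts, the five features can be sketched in a few lines of Python/NumPy (the homogeneity weighting 1/(1+|i-j|) is one common variant; the function name haralick is our own):

```python
import numpy as np

def haralick(C):
    """Energy, entropy, contrast, homogeneity and max probability of a GLCM."""
    P = C / C.sum()                         # normalise counts to probabilities
    i, j = np.indices(P.shape)              # row/column index grids
    nz = P[P > 0]                           # skip empty cells in the entropy
    return {
        'energy':      np.sum(P ** 2),
        'entropy':     -np.sum(nz * np.log2(nz)),
        'contrast':    np.sum((i - j) ** 2 * P),
        'homogeneity': np.sum(P / (1 + np.abs(i - j))),
        'max_prob':    P.max(),
    }
```

A uniform 2x2 matrix, for instance, gives energy 0.25, entropy 2 bits, contrast 0.5 and homogeneity 0.75.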

Cooccurrence Matrices: Haralick Feature Examples

Example textures with their energy, entropy, contrast, homogeneity and max. probability values

Fourier Features

Frequency-related concepts, like a texture being periodic or coarse-grained, led to features extracted from the Fourier power spectrum. If a texture is highly regular (periodic), it can be expected to have dominant peaks in its power spectrum. To evaluate how strongly peaked a spectrum is, one can calculate its entropy. The term entropy originates from physics and was later adopted in information theory, where it measures the amount of information stored in a particular signal. It is defined as

H = -\sum_i p_i \log_b p_i    (8.4)

where p_i is the probability density function (PDF) estimated with the histogram of the power spectrum; the base b is usually 2, yielding the unit bit for the entropy.

Fig 8.12: Sample texture. Fig 8.13: Corresponding Fourier power spectrum

Fourier Features: Entropy Example

Example of the Fourier power spectra and their entropies for four different textures.
Fig 8.14: Power spectrum and entropy example

Fourier Features (2)

Other commonly used Fourier features use integrated power values to assess the texture's coarseness and its directionality. Integration is performed over two different kinds of regions in the Fourier domain, namely

- concentric rings around the origin, to assess the coarseness (Fig 8.15):

f_{r_1, r_2} = \sum_{r_1 \le \sqrt{u^2 + v^2} < r_2} |F(u,v)|^2    (8.5)

- sectors from the apex (origin), to assess the directionality (Fig 8.16):

f_{\theta_1, \theta_2} = \sum_{\theta_1 \le \arctan(v/u) < \theta_2} |F(u,v)|^2    (8.6)
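A sketch of these integrated power features in Python/NumPy, assuming a centred spectrum and angles in radians; the function name and the folding of the direction onto [0, pi) are our choices:

```python
import numpy as np

def fourier_features(img, r1, r2, th1, th2):
    """Power integrated over a ring r1 <= r < r2 and a wedge th1 <= theta < th2."""
    F = np.fft.fftshift(np.fft.fft2(img))   # centre the DC component
    power = np.abs(F) ** 2
    h, w = img.shape
    v, u = np.indices((h, w))
    u = u - w // 2                          # frequency coordinates relative
    v = v - h // 2                          # to the centred origin
    r = np.hypot(u, v)
    theta = np.arctan2(v, u) % np.pi        # direction is ambiguous by 180 deg
    ring = power[(r >= r1) & (r < r2)].sum()
    wedge = power[(theta >= th1) & (theta < th2)].sum()
    return ring, wedge
```

For a constant image all power sits at the DC origin, so any ring excluding r = 0 integrates to (numerically) zero.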

Fourier Features: Disadvantage of the Fourier Features

As the Fourier power spectrum is typically calculated over the entire image, these features collect global information instead of localised texture properties.

Filter Banks

A simple but very effective method for extracting texture information is to convolve the image with a series of convolution filters. The outputs of these filter banks are then combined into a feature vector used for classification.
Fig 8.17: Principle of filter banks

Filter Banks: Laws Filters

The Laws filters are a fixed set of ad hoc separable filters, created by the convolution of row and column filters with the coefficients specified in Table 8.1. Usually, the output of these filters is non-linearly transformed and then averaged over larger regions in order to get an energy measure. The energies for the different filters are then combined into a feature vector that can be used for segmentation or classification.
Tab 8.1: 1D Laws filter coefficients
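The coefficients of Table 8.1 did not survive this transcript; the widely cited 5-tap Laws kernels are L5 (level), E5 (edge), S5 (spot) and R5 (ripple), and each 2D mask is the outer product of a column and a row kernel, as the following Python sketch illustrates (an assumption about which kernels the lecture tabulates):

```python
import numpy as np

# Classic 1D Laws kernels: Level, Edge, Spot, Ripple
L5 = np.array([1, 4, 6, 4, 1])
E5 = np.array([-1, -2, 0, 2, 1])
S5 = np.array([-1, 0, 2, 0, -1])
R5 = np.array([1, -4, 6, -4, 1])

def laws_masks():
    """All 16 5x5 Laws masks as outer products of the 1D kernels."""
    kernels = {'L5': L5, 'E5': E5, 'S5': S5, 'R5': R5}
    return {a + b: np.outer(ka, kb)
            for a, ka in kernels.items()
            for b, kb in kernels.items()}
```

Convolving the image with each mask, squaring, and box-averaging the result yields one "texture energy" plane per mask, which together form the per-pixel feature vector.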

Filter Banks: Gabor Filters

In contrast to the ad hoc chosen Laws filters, the Gabor filters offer a more principled alternative with a solid mathematical background. A Gabor filter is constructed by modulating a Gaussian envelope with a cosine function. For vertically oriented Gabor filters this yields

g(x, y) = \exp\left(-\frac{x^2 + y^2}{2\sigma^2}\right) \cos(2\pi f x + \phi)    (8.7)

where the parameter \sigma determines the width of the Gaussian envelope, f specifies the frequency of the modulating cosine, and \phi specifies the phase.

Fig 8.18: A Gabor filter profile in the spatial domain

Filters with other orientations can easily be obtained by applying a rigid rotation of the (x, y)-plane.
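Eq. (8.7), extended with the rigid rotation of the (x, y)-plane mentioned above, can be sketched as follows (Python/NumPy; the kernel size and parameter names are our choices):

```python
import numpy as np

def gabor_kernel(size, sigma, f, phi=0.0, theta=0.0):
    """Gaussian envelope modulated by a cosine of frequency f along the
    rotated x-axis; theta = 0 gives a vertically oriented filter."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # rigid rotation of the plane
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * f * xr + phi)
```

Convolving an image with a small bank of such kernels (a few orientations theta and frequencies f) and taking the local response energy gives the per-pixel Gabor texture features.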

Filter Banks: Gabor Filters (2)

Guess what the power spectrum of the Gabor filter looks like.

Filter Banks: Gabor Filter Power Spectrum

The Gabor filter samples the Fourier domain, as can be seen from its Fourier transform. The Gabor filter being the product of a Gaussian and a cosine, its Fourier transform is known to be the convolution of a Gaussian with a pair of Dirac impulses, i.e. two Gaussian lobes centred at the frequencies \pm f of the modulating cosine:

G(u, v) = \tfrac{1}{2}\left[\hat{G}(u - f, v) + \hat{G}(u + f, v)\right]    (8.8)

where \hat{G} is the Fourier transform of the Gaussian envelope (itself a Gaussian). The corresponding power spectrum plot is depicted in Fig 8.19. If we rotate the filter, the positions of these two lobes undergo the same rotation in the frequency domain.
Fig 8.19: Fourier power spectrum of the Gabor filter

Filter Banks: Gabor Filter in the Frequency Domain

By increasing the number of parameters, the Gaussian lobes can be made elliptical. Fig 8.20 shows how the Gaussian profile can be made elliptical to optimally tessellate the frequency domain. Gabor filters provide the best possible simultaneous localisation in the spatial and the frequency domain.
Fig 8.20, Fig 8.21: Gabor filters in the frequency domain

Filter Banks: Gabor Filter Example

Fig 8.22: (a) Input texture, (b) output of a Gabor filter tuned to detect horizontal structures, (c) the same for vertical structures

Filter Banks: Eigenfilters

The Laws and Gabor filter banks have fixed coefficients. If a specific texture is to be analysed, it can be interesting to fine-tune the filter set for this texture.

Basic idea:
1. Consider a small neighbourhood, e.g. 3x3 pixels.
2. All 9 pixels in this mask are put into a 9-vector.
3. By shifting the mask over the image, many samples of this 9-vector are acquired.
4. Calculate the corresponding covariance matrix.
5. The principal components yield a new orthogonal basis for these 9-vectors.
6. Arranging the components of all eigenvectors (principal components) as masks yields in total 9 convolution filters of size 3x3.
Fig 8.23: Principle of the eigenfilter design
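The six steps above can be sketched directly with an eigendecomposition of the patch covariance matrix. This Python/NumPy version (the lecture's own example further below uses MATLAB; the function name eigenfilters is our own) uses a dense, non-sparse neighbourhood:

```python
import numpy as np

def eigenfilters(img, size=3):
    """PCA of all size x size patches; eigenvectors reshaped into filters."""
    h, w = img.shape
    # step 2+3: collect every size x size patch as a flattened sample vector
    patches = np.array([img[y:y + size, x:x + size].ravel()
                        for y in range(h - size + 1)
                        for x in range(w - size + 1)])
    patches = patches - patches.mean(axis=0)
    # step 4: sample covariance matrix of the patch vectors
    cov = patches.T @ patches / (len(patches) - 1)
    # step 5: eigh returns an orthonormal basis (ascending eigenvalues)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]          # strongest component first
    # step 6: reshape each eigenvector into a convolution mask
    return [vecs[:, k].reshape(size, size) for k in order]
```

The resulting filters are orthonormal by construction, so they form a texture-adapted basis for the local neighbourhoods.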

Eigenfilters: Generalisation of Eigenfilters

In most practical cases it is useful to also take the scale of the texture into account. This is done by extracting the periodicity of the pattern, e.g. with the autocorrelation, and then adapting the size of the filters accordingly. This generally results in large and therefore impractical convolution masks. A way out is to consider sparse masks, i.e. masks where most of the elements are set to zero, see Fig 8.24.
Fig 8.24: Structure of the eigenfilter convolution mask. The black squares indicate positions in the convolution mask that have non-zero coefficients.

Eigenfilters: Eigenfilter Example

Task: develop a computationally efficient filter that can reliably detect such flaws.
Fig 8.25: Textile sample with a flaw

Eigenfilters: Eigenfilter Example - Periodicity of the Texture

  % load the flawless image
  img = imread('textile.png');
  img = double(img(:,:,1));
  imshow(img, []);

  % autocorrelate the image with a small patch of itself
  r = imfilter(img, img(1:30,1:30));

  % plot the 1D graphs in x- and y-direction
  plot(r(:,100));
  plot(r(100,:));

  % read out the periodicity
  dy = 7;
  dx = 9;

As the horizontal periodicity was found to be dx = 9 pixels and the vertical periodicity dy = 7 pixels, a convolution mask of size 7x9 is used. To simplify the calculations we try a sparse convolution mask with the arrangement shown in Fig 8.24.

Fig 8.26: Correct textile sample. Fig 8.27: Autocorrelation result in x-direction. Fig 8.28: Autocorrelation result in y-direction

Eigenfilters: Eigenfilter Example - Extract the Sample Vectors

In the second step we shift the sparse mask over the entire flawless textile image and extract the 9-element sample vectors at each location. These sample vectors are then stored in the matrix X.

  [m n] = size(img);
  X = zeros((n-dx)*(m-dy), 9);
  w = [1 0 0 0 1 0 0 0 1;
       0 0 0 0 0 0 0 0 0;
       0 0 0 0 0 0 0 0 0;
       1 0 0 0 1 0 0 0 1;
       0 0 0 0 0 0 0 0 0;
       0 0 0 0 0 0 0 0 0;
       1 0 0 0 1 0 0 0 1];
  w = find(w > 0);
  pos = 1;
  for x = 1:n-dx
    for y = 1:m-dy
      tmp = img(y:y+dy-1, x:x+dx-1);
      X(pos,:) = tmp(w)';
      pos = pos + 1;
    end
  end

Fig 8.29: Extraction of the sample vectors

Eigenfilters: Eigenfilter Example - Calculation of the Eigenfilters

The eigenfilters can then easily be calculated with the MATLAB function princomp:

  [pc, score, lambda] = princomp(X, 'econ');

Fig 8.30: Construction of the eigenfilters

Eigenfilters: Eigenfilter Example - Eigenfilter the Flawed Textile

In this step we filter the flawed image with all the eigenfilters we previously found.

  flawed = imread('textile_flaw.png');
  flawed = double(flawed(:,:,1));
  flawed_flt = zeros(m, n, 9);
  for i = 1:9
    kern = zeros(7, 9);
    kern(w) = pc(:,i);
    flawed_flt(:,:,i) = imfilter(flawed, kern);
    figure, imshow(flawed_flt(:,:,i), []);
  end

It can clearly be seen that the flaw shows up in several of the eigenfiltered images (filtered with PC 1 through PC 9).

Eigenfilters: Eigenfilter Example - Smooth the Filter Output

The aim of this step is to generate images with more homogeneous intensities. This is achieved by squaring all values and box-filtering the result with a kernel of size 5x5 (a local energy):

  flawed_pwr = zeros(m, n, 9);
  for i = 1:9
    flawed_pwr(:,:,i) = imfilter(flawed_flt(:,:,i).^2, ones(5));
    figure, imshow(flawed_pwr(:,:,i), []);
  end

The resulting images show the local energy filtered with PC 1 through PC 9.

Eigenfilters: Eigenfilter Example - Mahalanobis Distance

In this last step, the energy values are combined into a single (Mahalanobis) distance to the typical energies. For this we need the energies of the flawless textile:

  intact_pwr = zeros(m, n, 9);
  for i = 1:9
    kern = zeros(7, 9);
    kern(w) = pc(:,i);
    tmp = imfilter(imfilter(img, kern).^2, ones(5));
    figure, imshow(tmp, []);
    intact_pwr(:,:,i) = tmp;
  end

Then we calculate the Mahalanobis distance of every point of the defective textile to the energy values seen in the flawless textile image:

  intact_pwr2 = reshape(intact_pwr, n*m, 9);
  flawed_pwr2 = reshape(flawed_pwr, n*m, 9);
  r = mahal(flawed_pwr2, intact_pwr2);
  out = reshape(r, m, n);
  figure, imshow(out, []);

Fig 8.31: Mahalanobis distance for the image in Fig 8.25

Texture Discrimination

In texture discrimination the extracted texture measures are analysed and the image is split into regions, each corresponding to a perceptually homogeneous texture. In texture segmentation we always face a trade-off between sample size and accuracy: the bigger the sample size, the better the accuracy of the feature estimation; however, a big sample size allows only a coarse segmentation.

Fig 8.32: Texture segmentation principle: texture discrimination. Fig 8.33: Texture measures are known for small regions or even for individual pixels. Fig 8.34: Regions with similar texture measures are grouped together.

Texture Discrimination: Region Growing Methods

Segmentation may be viewed as a process that partitions an image region R into n subregions R_1, R_2, \ldots, R_n such that

(a) \bigcup_{i=1}^{n} R_i = R,
(b) R_i is a connected region for i = 1, \ldots, n,
(c) R_i \cap R_j = \emptyset for all i and j, i \neq j,
(d) P(R_i) = TRUE for i = 1, \ldots, n, and
(e) P(R_i \cup R_j) = FALSE for adjacent regions R_i and R_j, i \neq j,

where P is a logical predicate over the points in the set.

Texture Discrimination: Region Growing by Pixel Aggregation

The simplest region growing approach is pixel aggregation, which starts with a set of seed points and grows regions from these seeds by appending to each seed point those neighbouring pixels that have similar properties, i.e. similar texture measures.

Fig 8.35: (a)-(d) Four steps of region growing by pixel aggregation, using an absolute difference of less than 3 between intensity levels and 4-connectivity
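A minimal Python sketch of pixel aggregation with 4-connectivity, assuming the similarity criterion is the absolute intensity difference to the seed value (the slide's criterion could also be read relative to the already grown region; the function name grow_region is our own):

```python
from collections import deque
import numpy as np

def grow_region(img, seed, threshold=3):
    """Grow a 4-connected region from `seed`, appending neighbours whose
    intensity differs from the seed intensity by less than `threshold`."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    seed_val = int(img[seed])
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                    and abs(int(img[ny, nx]) - seed_val) < threshold):
                mask[ny, nx] = True
                queue.append((ny, nx))
    return mask
```

For texture segmentation, the same loop is run on a per-pixel texture feature map (or feature vectors with a suitable distance) rather than on raw intensities.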

Texture Discrimination: Other Methods

Below is a list of other commonly used texture discrimination methods:
- Watershed
- k-means clustering
- Bayesian classification
- Artificial neural networks (ANN)
- Estimation theory (maximum likelihood)
- Split-and-merge

Texture Discrimination: Reduction of Dimensionality

In many cases it is desirable to first reduce the dimensionality of the extracted texture measures. PCA is the method of choice, as can be seen in Fig 8.36. Beware: in some special cases dimensionality reduction with PCA fails, see Fig 8.37.

Fig 8.36: Successful dimensionality reduction with PCA. Fig 8.37: Failed PCA dimensionality reduction

Texture Classification

Once the texture features have been extracted and regions with similar properties grouped, all that remains is to classify the different regions into known textures. The following methods are given as a reference but will not be discussed in this lecture:
- Bayesian decision theory
- Artificial neural networks (ANN)
- Support vector machines (SVM)
Fig 8.38: Texture segmentation principle: texture classification

References

Robert M. Haralick, K. Shanmugam, Its'hak Dinstein (1973). "Textural Features for Image Classification". IEEE Transactions on Systems, Man, and Cybernetics, SMC-3(6): 610-621.