EE795: Computer Vision and Intelligent Systems


EE795: Computer Vision and Intelligent Systems Spring 2012 TTh 17:30-18:45 WRI C225 Lecture 04 130131 http://www.ee.unlv.edu/~b1morris/ecg795/

2 Outline Review; Histogram Equalization; Image Filtering; Linear Filtering; Morphology; Connected Components

3 Histogram Processing The digital image histogram is the count of pixels in an image having a particular value in the range [0, L−1]: h(r_k) = n_k, where r_k is the kth gray level value and n_k is the number of pixels with the kth gray level. The set of r_k values are known as the bins of the histogram. The empirical probability of gray level occurrence is obtained by normalizing the histogram: p(r_k) = n_k / n, where n is the total number of pixels.
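The h(r_k) = n_k and p(r_k) = n_k/n definitions above can be sketched in NumPy (an illustrative sketch, not part of the lecture; function names are my own):

```python
import numpy as np

def histogram(img, levels=256):
    # h(r_k) = n_k: count of pixels with gray level k, for k in [0, L-1]
    h = np.zeros(levels, dtype=np.int64)
    for v in img.ravel():
        h[v] += 1
    return h

def normalized_histogram(img, levels=256):
    # p(r_k) = n_k / n: empirical probability of gray level k
    return histogram(img, levels) / img.size

img = np.array([[0, 1, 1],
                [2, 1, 0]], dtype=np.uint8)
print(histogram(img, levels=4))   # counts for levels 0..3
```

The normalized histogram always sums to 1, which is what lets the slides treat it as a probability distribution.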

4 Histogram Example x-axis: intensity value, with bins in [0, 255]; y-axis: count of pixels. Dark image: concentration in lower values. Bright image: concentration in higher values. Low-contrast image: narrow band of values. High-contrast image: intensity values spread over a wide band.

5 Histogram Equalization Assume continuous functions (rather than discrete images). Define a transformation of the intensity values to equalize each pixel in the image: s = T(r), 0 ≤ r ≤ 1. Notice: intensity values are normalized between 0 and 1. The inverse transformation is given as r = T^(−1)(s), 0 ≤ s ≤ 1. Viewing the gray level of an image as a random variable, p_s(s) = p_r(r) |dr/ds|. Let s be the cumulative distribution function (CDF): s = T(r) = ∫_0^r p_r(w) dw. Then ds/dr = p_r(r), which results in a uniform PDF for the output intensity: p_s(s) = 1. Hence, using the CDF of a histogram will equalize an image, making the resulting histogram flat across all intensity levels.

6 Discrete Histogram Equalization The probability density is approximated by the normalized histogram: p_r(r_k) = n_k / n, k = 0, …, L−1. The discrete CDF transformation is s_k = T(r_k) = Σ_{j=0}^{k} p_r(r_j) = Σ_{j=0}^{k} n_j / n. This transformation does not guarantee a uniform histogram in the discrete case; it has the tendency to spread the intensity values to span a larger range.
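The discrete CDF transform s_k above maps directly to a lookup table. A minimal sketch, assuming 8-bit images and scaling s_k by L−1 to get back to [0, 255] (the function name is my own):

```python
import numpy as np

def equalize(img, levels=256):
    # Normalized histogram p_r(r_k) = n_k / n, then the discrete CDF
    h = np.bincount(img.ravel(), minlength=levels)
    cdf = np.cumsum(h) / img.size
    # s_k = (L-1) * sum_{j<=k} n_j / n, used as a lookup table
    lut = np.round((levels - 1) * cdf).astype(np.uint8)
    return lut[img]

img = np.array([[50, 50],
                [100, 150]], dtype=np.uint8)
out = equalize(img)
```

Note the output histogram is generally not flat (as the slide says), but the values are spread across the full range.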

7 Histogram Equalization Example The equalized histograms have a wider spread of intensity levels. Notice the equalized images all have a similar visual appearance even though their original histograms are different: contrast enhancement. (Figure: original images with their histograms alongside the equalized images and histograms.)

8 Local Histogram Enhancement Global methods (like histogram equalization as presented) may not always make sense. What happens when properties of image regions are different? Block histogram equalization: compute the histogram over smaller windows by breaking the image into blocks and processing each block separately. Notice the blocking artifacts that cause noticeable boundary effects. (Figure: original image vs. block histogram equalization.)

9 Local Enhancement Compute histogram over a block (neighborhood) for every pixel in a moving window Adaptive histogram equalization (AHE) is a computationally efficient method to combine block based computations through interpolation Figure 3.8 Locally adaptive histogram equalization: (a) original image; (b) block histogram equalization; (c) full locally adaptive equalization.

10 Image Processing Motivation Image processing is useful for the reduction of noise: replace a pixel by the average value in a neighborhood. Common types of noise. Salt and pepper: random occurrences of black and white pixels. Impulse: random occurrences of white pixels. Gaussian: variations in intensity drawn from a normal distribution. Adapted from S. Seitz

11 Linear Filtering Most common type of neighborhood operator. The output pixel is determined as a weighted sum of input pixel values: g(x, y) = Σ_{k,l} f(x+k, y+l) w(k, l). w is known as the kernel, mask, filter, template, or window; the entry w(k, l) is known as a kernel weight or filter coefficient. This is also known as the correlation operator: g = f ⊗ w. Linear filtering (correlation) is essentially the same as convolution from signal processing classes: in convolution the kernel is flipped, while in correlation there is no flip.

12 Filtering Operation g(x, y) = Σ_{k,l} f(x+k, y+l) w(k, l). The filter mask is moved from point to point in an image, and the response is computed as the sum of products of the mask coefficients and the image values. Notice the mask is centered at w(0, 0). Usually we use odd-sized masks so that the computation is symmetrically defined. Matlab commands: imfilter.m, filter2.m, conv2.m
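The sliding sum-of-products above can be sketched directly in NumPy (an illustrative sketch rather than the lecture's Matlab commands; zero padding at the borders is my assumption):

```python
import numpy as np

def correlate2d(f, w):
    # g(x,y) = sum_{k,l} f(x+k, y+l) w(k,l), mask centered at w(0,0),
    # with zero padding outside the image
    kh, kw = w.shape
    ph, pw = kh // 2, kw // 2
    fp = np.pad(f.astype(float), ((ph, ph), (pw, pw)))
    g = np.zeros(f.shape, dtype=float)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = np.sum(fp[x:x+kh, y:y+kw] * w)
    return g

# 3x3 mean filter on a constant image: interior stays at the mean,
# borders drop because zero padding leaks into the window
f = np.full((5, 5), 9.0)
g = correlate2d(f, np.ones((3, 3)) / 9.0)
```

Convolution is the same loop with the kernel flipped in both axes (`w[::-1, ::-1]`); for symmetric kernels the two coincide.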

13 Filtering Process g(x, y) = Σ_{k,l} f(x+k, y+l) w(k, l). (Figure: input image f(x, y), a small grid of 0s and 90s, next to the yet-to-be-computed output g(x, y).)

14 Computational Requirements Convolution requires K² operations per pixel for a K×K size filter, so the total for an M×N image is M·N·K² operations. This can be computationally expensive for large K. The cost can be greatly improved if the kernel is separable: first do a 1D horizontal convolution, then follow with a 1D vertical convolution. A separable kernel is defined by an outer product, w = v hᵀ, where v is the vertical kernel and h is the horizontal kernel. Separability can be checked (or a separable approximation found) using the singular value decomposition (SVD): truly separable kernels have only one non-zero singular value.
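Both claims above, the one-singular-value test and the equivalence of one 2D pass with two 1D passes, can be checked numerically. A sketch under the assumption of zero padding (the bilinear-style kernel here is just a convenient symmetric example):

```python
import numpy as np

# A separable kernel is an outer product: w = v h^T
v = np.array([1.0, 2.0, 1.0])   # vertical kernel
h = np.array([1.0, 2.0, 1.0])   # horizontal kernel
w = np.outer(v, h)

# A truly separable kernel has exactly one non-zero singular value
sv = np.linalg.svd(w, compute_uv=False)
n_nonzero = int(np.sum(sv > 1e-12))

# Filtering with w (K^2 ops/pixel) equals filtering rows with h,
# then columns with v (2K ops/pixel). Kernels are symmetric, so
# correlation and convolution coincide; h[::-1] makes this explicit.
f = np.arange(25, dtype=float).reshape(5, 5)
full = np.array([[np.sum(np.pad(f, 1)[x:x+3, y:y+3] * w)
                  for y in range(5)] for x in range(5)])
rows = np.array([np.convolve(np.pad(r, 1), h[::-1], mode='valid')
                 for r in f])
sep = np.array([np.convolve(np.pad(c, 1), v[::-1], mode='valid')
                for c in rows.T]).T
```

For a K×K kernel this drops the per-pixel cost from K² multiplies to 2K, which is where the savings for large K come from.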

15 Smoothing Filters Smoothing filters are used for blurring and noise reduction Blurring is useful for small detail removal (object detection), bridging small gaps in lines, etc. These filters are known as lowpass filters Higher frequencies are attenuated What happens to edges?

16 Linear Smoothing Filter The simplest smoothing filter is the moving average or box filter Computes the average over a constant neighborhood This is a separable filter Weighted average kernel (bilinear) - places more emphasis on closer pixels More local consistency Gaussian kernel - an approximation of a Gaussian function Has variance parameter to control the kernel width

17–20 Mean Filtering g(x, y) = Σ_{k,l} f(x+k, y+l) w(k, l), with w(x, y) the mean-of-3×3-box kernel. (Figures: the 3×3 window slides across the input f(x, y), a grid of 0s and 90s, filling in the output g(x, y) one averaged value at a time.)

21 Mean Filtering g(x, y) = Σ_{k,l} f(x+k, y+l) w(k, l), with w(x, y) the mean-of-3×3-box kernel. Full output of the filter on the example image:
g(x, y) =
 0 10 20 30 30 30 20 10
 0 20 40 60 60 60 40 20
 0 30 60 90 90 90 60 30
 0 30 50 80 80 90 60 30
 0 30 50 80 80 90 60 30
 0 20 30 50 50 60 40 20
10 20 30 30 30 30 20 10
10 10 10  0  0  0  0  0

22 Smoothing Examples Object detection

23 Median Filtering Sometimes linear filtering is not sufficient and non-linear neighborhood operations are required. The median filter replaces the center pixel in a mask by the median of its neighbors. It is a non-linear operation and computationally more expensive, but it provides excellent noise reduction with less blurring than linear smoothing filters of similar size (edge preserving). It is especially effective for impulse and salt-and-pepper noise.
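A minimal sketch of the median filter described above (edge-replicated padding is my assumption; the function name is my own):

```python
import numpy as np

def median_filter(f, size=3):
    # Replace each pixel by the median of its size x size neighborhood;
    # borders use replicated edge values
    p = size // 2
    fp = np.pad(f, p, mode='edge')
    g = np.empty_like(f)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = np.median(fp[x:x+size, y:y+size])
    return g

# Salt-and-pepper corruption of a flat image: outliers are rejected
# outright rather than averaged into their neighbors
img = np.full((5, 5), 50, dtype=np.uint8)
img[2, 2] = 255   # salt
img[1, 3] = 0     # pepper
clean = median_filter(img)
```

Contrast this with a 3×3 mean filter, which would smear each outlier across its whole neighborhood instead of removing it.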

24 Bilateral Filtering Combines the idea of a weighted filter kernel with a better version of outlier rejection (the α-trimmed mean calculates the average in a neighborhood excluding the α fraction of values that are smallest or largest). w(i, j, k, l) = d(i, j, k, l) · r(i, j, k, l), where d(i, j, k, l) is the domain kernel specifying distance (spatial) similarity between pixels (usually Gaussian), and r(i, j, k, l) is the range kernel specifying appearance (intensity) similarity between pixels.

25 Bilateral Filtering Example

26 Sharpening Filters Sharpening filters are used to highlight fine detail or enhance blurred detail. The smoothing we saw was averaging, which is analogous to integration. Since sharpening is the dual operation to smoothing, it can be accomplished through differentiation.

27 Digital Derivatives Derivatives of digital functions are defined in terms of differences; there are various computational approaches. Discrete approximation of a first derivative: ∂f/∂x = f(x+1) − f(x), or the center-symmetric form ∂f/∂x = f(x+1) − f(x−1). Second-order derivative: ∂²f/∂x² = f(x+1) + f(x−1) − 2f(x).
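The difference formulas above are one-liners on a 1D signal. A sketch on a ramp followed by a step, which sets up the properties discussed on the next slide:

```python
import numpy as np

def first_deriv(f):
    # forward difference: f(x+1) - f(x)
    return f[1:] - f[:-1]

def second_deriv(f):
    # f(x+1) + f(x-1) - 2 f(x)
    return f[2:] + f[:-2] - 2 * f[1:-1]

# a ramp (0..3), a constant segment, then a step to 8
f = np.array([0, 1, 2, 3, 3, 3, 8, 8], dtype=float)
d1 = first_deriv(f)    # non-zero along the ramp and at the step
d2 = second_deriv(f)   # zero along the ramp; +/- pair at the step
```

The paired positive and negative values in `d2` at the step are the "double response" the next slide refers to.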

28 Difference Properties 1st derivative: zero in constant segments, non-zero at intensity transitions, non-zero along ramps. 2nd derivative: zero in constant areas, non-zero at intensity transitions, zero along ramps. The 2nd-order filter is more aggressive at enhancing sharp edges, and the outputs differ at ramps: 1st order produces thick edges, 2nd order produces thin edges. Notice: at a step, the 2nd derivative gives both a negative and a positive response, producing a double line.

29 The Laplacian 2nd derivatives are generally better for image enhancement because of their sensitivity to fine detail. The Laplacian is the simplest isotropic derivative operator: ∇²f = ∂²f/∂x² + ∂²f/∂y². Isotropic means rotation invariant. Discrete implementation using the 2nd derivative previously defined: ∂²f/∂x² = f(x+1, y) + f(x−1, y) − 2f(x, y) and ∂²f/∂y² = f(x, y+1) + f(x, y−1) − 2f(x, y), so ∇²f = f(x+1, y) + f(x−1, y) + f(x, y+1) + f(x, y−1) − 4f(x, y).

30 Discrete Laplacian Zeros in the corners give isotropic results for rotations of 90°; non-zero corners give isotropic results for rotations of 45° (this includes the diagonal derivatives in the Laplacian definition). The sign of the center pixel indicates light-to-dark or dark-to-light transitions.

31 Sharpening Images A sharpened image is created by addition of the Laplacian: g(x, y) = f(x, y) − ∇²f(x, y) when the kernel center w(0, 0) < 0, or g(x, y) = f(x, y) + ∇²f(x, y) when w(0, 0) > 0. Notice: the use of diagonal entries creates a much sharper output image. How can we compute g(x, y) in one filter pass without the image addition? Think of a linear system.

32 Unsharp Masking Edges can be obtained by subtracting a blurred version of an image: f_us(x, y) = f(x, y) − f̄(x, y), where the blurred image is f̄(x, y) = h_blur ∗ f(x, y). The sharpened image is f_s(x, y) = f(x, y) + γ f_us(x, y).
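A 1D sketch of the f_s = f + γ(f − f̄) recipe above, using a 3-tap mean filter as h_blur (that choice, and the edge-replicated padding, are my assumptions):

```python
import numpy as np

def unsharp_1d(f, gamma=1.0):
    # blurred version f_bar via a 3-tap mean filter, edges replicated
    fp = np.pad(f.astype(float), 1, mode='edge')
    blur = (fp[:-2] + fp[1:-1] + fp[2:]) / 3.0
    mask = f - blur                 # f_us = f - f_bar
    return f + gamma * mask         # f_s = f + gamma * f_us

# On a step edge the mask adds undershoot before and overshoot after
# the transition, which is what makes the edge look sharper
f = np.array([0, 0, 0, 10, 10, 10], dtype=float)
sharp = unsharp_1d(f)
```

γ controls the strength: γ = 1 is classic unsharp masking, γ > 1 is sometimes called highboost filtering.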

33 The Gradient 1st derivatives can be useful for enhancement of edges, and are useful preprocessing before edge extraction and interest point detection. The gradient is a vector indicating edge direction: ∇f = [G_x, G_y]ᵀ = [∂f/∂x, ∂f/∂y]ᵀ. The gradient magnitude can be approximated as |∇f| ≈ |G_x| + |G_y|; this gives isotropic results for rotations of 90°. Sobel operators have directional sensitivity, and their coefficients sum to zero, giving zero response in constant intensity regions.
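The Sobel masks and the |G_x| + |G_y| magnitude approximation above can be checked on a simple vertical step edge (the helper `apply_at` is my own, evaluating one mask response at one pixel):

```python
import numpy as np

sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
sobel_y = sobel_x.T   # the horizontal-edge mask is the transpose

def apply_at(f, w, x, y):
    # response of the 3x3 mask w centered at pixel (x, y)
    return np.sum(f[x-1:x+2, y-1:y+2] * w)

# vertical step edge: left columns 0, right columns 10
f = np.zeros((5, 5)); f[:, 3:] = 10.0
gx = apply_at(f, sobel_x, 2, 2)   # strong response across the edge
gy = apply_at(f, sobel_y, 2, 2)   # zero response along the edge
mag = abs(gx) + abs(gy)
```

Because the coefficients sum to zero, both responses vanish in any constant region, exactly as the slide states.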

34 Morphological Image Processing Filtering done on binary images: images with two values ([0, 1], [0, 255], [black, white]). Typically this image will be obtained by thresholding: g(x, y) = 1 if f(x, y) > T, and 0 if f(x, y) ≤ T. Morphology is concerned with structure and shape. In morphology, a binary image is filtered with a structuring element s, producing a binary image. See Chapter 9 of Gonzalez and Woods for a more complete treatment.

35 Mathematical Morphology Tool for extracting image components that are useful in the representation and description of region shape Boundaries, skeletons, convex hull, etc. The language of mathematical morphology is set theory A set represents an object in an image This is often useful in video processing because of the simplicity of processing and emphasis on objects Handy tool for clean up of a thresholded image

36 Morphological Operations Threshold operation: θ(f, t) = 1 if f ≥ t, and 0 otherwise. Structuring element s: e.g. a 3×3 box (1s indicate included pixels in the mask); |S| is the number of on pixels (1s) in s. The count of 1s under the mask is c = f ⊛ s, a correlation (filter) computed in a raster-scan procedure. The basic morphological operations can be extended to grayscale images. Dilation: dilate(f, s) = θ(c, 1), grows (thickens) 1 locations. Erosion: erode(f, s) = θ(c, |S|), shrinks (thins) 1 locations. Opening: open(f, s) = dilate(erode(f, s), s); generally smooths the contour of an object, breaks narrow isthmuses, and eliminates thin protrusions. Closing: close(f, s) = erode(dilate(f, s), s); generally smooths the contour of an object, fuses narrow breaks/separations, eliminates small holes, and fills gaps in a contour.
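The counting formulation above (c = f ⊛ s, then threshold at 1 or |S|) translates directly to code. A sketch assuming zero padding outside the image (function names are my own):

```python
import numpy as np

def count_hits(f, s):
    # c(x,y): number of 1-pixels of f under the structuring element s
    sh, sw = s.shape
    ph, pw = sh // 2, sw // 2
    fp = np.pad(f, ((ph, ph), (pw, pw)))
    c = np.zeros_like(f)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            c[x, y] = np.sum(fp[x:x+sh, y:y+sw] * s)
    return c

def dilate(f, s):
    # theta(c, 1): on if any pixel under s is on
    return (count_hits(f, s) >= 1).astype(np.uint8)

def erode(f, s):
    # theta(c, |S|): on only if every pixel under s is on
    return (count_hits(f, s) >= s.sum()).astype(np.uint8)

def opening(f, s):
    return dilate(erode(f, s), s)

def closing(f, s):
    return erode(dilate(f, s), s)

# 3x3 block of 1s: erosion leaves only the center pixel,
# dilation grows the block to fill the image
f = np.zeros((5, 5), dtype=np.uint8); f[1:4, 1:4] = 1
s = np.ones((3, 3), dtype=np.uint8)
```

Note that opening restores the block exactly here: erosion followed by dilation undoes the shrinkage for shapes that fully contain the structuring element.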

37 Morphology Example Dilation - grows (thickens) 1 locations Erosion - shrink (thins) 1 locations Opening - generally smooth the contour of an object, breaks narrow isthmuses, and eliminates thin protrusions Closing - generally smooth the contour of an object, fuses narrow breaks/separations, eliminates small holes, and fills gaps in a contour

38 Connected Components A semi-global image operation to provide consistent labels to similar regions, based on the adjacency concept. The most efficient algorithms compute the labels in two passes. More computational (iterative) formulations exist from morphology: X_k = (X_{k−1} ⊕ B) ∩ A, where A is the set, X_k is the connected component estimate at iteration k, and B is the structuring element.

39 More Connected Components Typically, only the white pixels are considered objects; dark pixels are background and do not get counted. After labeling connected components, statistics that describe each region can be computed, e.g. area, centroid, perimeter, etc. Matlab functions: bwconncomp.m, labelmatrix.m (bwlabel.m) - label image; label2rgb.m - color components for viewing; regionprops.m - calculate region statistics.
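A minimal sketch of connected component labeling, using a BFS flood fill rather than the two-pass algorithm or the Matlab functions named above (4-connectivity is my assumption; region area then falls out of the label image):

```python
import numpy as np
from collections import deque

def label_components(bw):
    # 4-connected labeling of white (1) pixels by BFS flood fill;
    # background (0) pixels keep label 0 and are not counted
    labels = np.zeros(bw.shape, dtype=int)
    count = 0
    for i in range(bw.shape[0]):
        for j in range(bw.shape[1]):
            if bw[i, j] and not labels[i, j]:
                count += 1
                labels[i, j] = count
                q = deque([(i, j)])
                while q:
                    x, y = q.popleft()
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if (0 <= nx < bw.shape[0] and 0 <= ny < bw.shape[1]
                                and bw[nx, ny] and not labels[nx, ny]):
                            labels[nx, ny] = count
                            q.append((nx, ny))
    return labels, count

# two 4-connected blobs; area of a region = count of its label
bw = np.array([[1, 1, 0, 0],
               [1, 0, 0, 1],
               [0, 0, 1, 1]])
labels, n = label_components(bw)
```

Region statistics like area are then simple reductions over the label image, e.g. `np.sum(labels == k)`, analogous to what regionprops.m reports.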

40 Connected Component Example Grayscale image Threshold image Opened Image Labeled image 91 grains of rice