INTENSITY TRANSFORMATION AND SPATIAL FILTERING

Lecture 3

Image Domains
Spatial domain: refers to the image plane itself; processing methods operate directly on the image pixels.
Transform domain: the image is transformed into a transform domain, the processing is done there, and the result is brought back into the spatial domain.

Spatial domain
Two principal categories of spatial processing: intensity transformation and spatial filtering.
Intensity transformations operate on single pixels of an image, e.g., contrast manipulation and image thresholding.

Spatial domain
Spatial filtering deals with operations such as image sharpening that work in a neighbourhood of every pixel in an image.
Besides the classical techniques of intensity transformation and spatial filtering, fuzzy techniques incorporate imprecise, knowledge-based information into the formulation of intensity transformations and spatial filters.

Basics of intensity transformation
Spatial domain techniques are applied directly to the pixels of an image, whereas frequency domain techniques are performed on the Fourier transform of the image. Some applications require spatial domain techniques; others rely on frequency domain approaches.

Transformation
G(x, y) = T[ F(x, y) ]
where F is the input image, G is the output image, and T is an operator on F defined over a neighbourhood of the point (x, y).

Transformation
The operator T can be applied to a single image or to a set of images on a pixel-by-pixel basis.
[Figure: a point (x, y) in the image plane together with its 3x3 neighbourhood.]

Transformation
Processing an image through a small window (neighbourhood) in this way is called spatial filtering; the type of operation performed in the neighbourhood determines the filtering process.
The smallest possible neighbourhood is 1x1, in which case G depends on the value of F at the single point (x, y) and T becomes an intensity transformation function.

Transformation
Writing s for the intensity of G and r for the intensity of F,
s = T(r)
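As an illustration (a minimal NumPy sketch, not taken from the slides), a point transformation s = T(r) can be applied to every pixel through a lookup table; the example T here is a simple threshold, one of the point operations mentioned earlier.

```python
import numpy as np

def apply_intensity_transform(f, T):
    """Apply a point transformation s = T(r) to every pixel of an 8-bit image F."""
    # Build a lookup table for all 256 intensities, then map every pixel through it.
    lut = np.array([T(r) for r in range(256)], dtype=np.uint8)
    return lut[f]

# Example: thresholding at r = 128 (hypothetical test image).
f = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
g = apply_intensity_transform(f, lambda r: 255 if r >= 128 else 0)
```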

Transformation
Intensity transformations are among the simplest of all image processing techniques.

Linear Contrast Enhancement
Identify the minimum and maximum gray levels in the input image.
Specify the minimum and maximum gray levels for the output image.
Compute the gray level mapping based on the straight line y = m.x + c that takes the input range onto the output range.
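A minimal NumPy sketch of this mapping (the function name and default output range are illustrative, not from the slides).

```python
import numpy as np

def linear_stretch(f, out_min=0, out_max=255):
    """Map the input range [f.min(), f.max()] linearly onto [out_min, out_max]."""
    in_min, in_max = float(f.min()), float(f.max())
    m = (out_max - out_min) / (in_max - in_min)   # slope of the line y = m*x + c
    c = out_min - m * in_min                      # intercept
    return np.clip(m * f.astype(np.float64) + c, out_min, out_max).astype(np.uint8)
```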

Logarithmic Contrast Stretch
Identify the minimum and maximum gray levels in the input image.
Apply the transformation y = k.log(1 + x), where k is a user-specified parameter.
The resulting floating point values of y are scaled linearly to the desired output range, as before.
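A possible implementation sketch, assuming the input is an 8-bit image stored as a NumPy array.

```python
import numpy as np

def log_stretch(f, k=1.0, out_max=255):
    """Apply y = k*log(1 + x), then rescale the floating point result to [0, out_max]."""
    y = k * np.log1p(f.astype(np.float64))    # y = k * log(1 + x)
    y = (y - y.min()) / (y.max() - y.min())   # linear rescale to [0, 1]
    return (out_max * y).astype(np.uint8)
```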

Exponential Contrast Stretch
Identify the minimum and maximum gray levels in the input image.
Apply the transformation y = k.x^r; the parameter r controls the rate at which x^r rises.
The resulting values of y are rescaled to the desired output range using linear stretching.
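A corresponding sketch for this stretch, again with illustrative parameter defaults.

```python
import numpy as np

def exponential_stretch(f, k=1.0, r=2.0, out_max=255):
    """Apply y = k * x**r, then rescale linearly to [0, out_max]."""
    y = k * np.power(f.astype(np.float64), r)
    y = (y - y.min()) / (y.max() - y.min())   # linear rescale to [0, 1]
    return (out_max * y).astype(np.uint8)
```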

Contrast
Contrast is the difference between the intensity of the object of interest and that of the background (the rest of the image).
Perceptual contrast is a function of the logarithm of the difference between the object and background intensities. This means that in darker regions small changes in intensity can be noticed, whereas in brighter regions the difference has to be much larger to be noticeable.

Histogram
A 1-D histogram (computed for a black-and-white image or for one band of a multispectral image) conveys information about the quality of the image.
A positively skewed histogram indicates a darkish image; a negatively skewed histogram indicates a lightish image.

Histogram
[Figure: examples of a negatively skewed histogram and a positively skewed histogram.]

Example image

After logarithmic stretch

Example image

After exponential stretch

Example image 2

After logarithmic stretch

Histogram Equalization
When the gray levels in an image are confined to a few closely spaced values, the ability to interpret the image is hampered. It is desirable that the dynamic range of the display device be better utilized. One way to achieve this is to transform the image so that all gray levels have an equal likelihood of occurrence.

Principle of Histogram Equalization
Given an imperfect histogram and an ideal histogram in which all gray levels are equally populated, map the input histogram so that it approximates the equalized histogram.

Translation of Theory into Practice
Essentially, the enhanced image with an equalized histogram has (ideally) an equal number of pixels at each gray level. In practice we can only approximate this. For an equalized histogram, the cumulative histogram is known once the size of the image and the number of gray levels are given.

How to equalize the histogram?
For each gray level, take its actual cumulative frequency and find the output gray level whose ideal cumulative frequency is nearest to it.
Image enhancement by histogram equalization is therefore achieved by a mapping of gray levels based on the actual cumulative histogram of the image and the desired cumulative histogram, as sketched below.
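A minimal sketch of this cumulative-histogram mapping for an 8-bit image stored as a NumPy array; the rounding rule is one reasonable choice and may differ from the one used in the course.

```python
import numpy as np

def equalize_histogram(f, levels=256):
    """Map each gray level to the output level whose ideal cumulative count is nearest."""
    counts = np.bincount(f.ravel(), minlength=levels)
    cum = np.cumsum(counts)                      # actual cumulative histogram
    ideal_per_level = f.size / levels            # ideal (equal) population per level
    mapping = np.clip(np.round(cum / ideal_per_level) - 1, 0, levels - 1).astype(np.uint8)
    return mapping[f]
```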

Local histogram processing
In the global sense, pixels are modified by a transformation function based on the intensity distribution of the entire image.
In the local sense, it is sometimes necessary to enhance the details over small areas of an image.

Histogram statistics
Statistics obtained directly from an image histogram can be used for image enhancement.
The nth moment of r about its mean m is μ_n(r) = Σ_i (r_i - m)^n p(r_i); the second moment (n = 2) is the variance.
The global mean and variance computed over the whole image are useful for gross adjustments in overall intensity; the local mean and variance are computed over a neighbourhood S_xy of each pixel.

Histogram
The normalized version of f(n) may be defined as p(n) = f(n) / (M.N), where p(n) is the probability of occurrence of gray level n in the image (in the relative frequency sense), and Σ_n p(n) = 1.
MIN = min{ n : f(n) ≠ 0 }    MAX = max{ n : f(n) ≠ 0 }
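A small helper sketch (illustrative, not part of the slides) computing p(n) and the occupied MIN and MAX gray levels from an image array.

```python
import numpy as np

def normalized_histogram(f, levels=256):
    """Return p(n) = f(n) / (M*N) together with the MIN and MAX occupied gray levels."""
    counts = np.bincount(f.ravel(), minlength=levels)   # f(n): count of gray level n
    p = counts / f.size                                  # p(n); sums to 1
    occupied = np.nonzero(counts)[0]                     # gray levels with f(n) != 0
    return p, occupied.min(), occupied.max()
```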

Histogram of an image with good contrast

Histogram of a low contrast image

Information from Histogram
The information conveyed by the occurrence of an event whose probability of occurrence is p(n) is given by I(n) = ln{1/p(n)} = -ln{p(n)}.
This implies that if the probability of occurrence of an event is low, its occurrence conveys a significant amount of information; if the probability of an event is high, the information conveyed by its occurrence is low.

Average Information (Entropy)
The average information conveyed by a set of events with probabilities p(n), n = 1, 2, ..., is given by
H = - Σ_n p(n) ln{ p(n) }
H is called entropy and is used extensively in image processing operations.
H is highest when all probabilities are equal: H_max = - Σ_n k.ln(k), where k = p(n) is the same for all n.
H is zero when p(j) = 1 for some j and p(k) = 0 for all k ≠ j.
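A short sketch of the entropy computation from the normalized histogram, using the natural logarithm as in the formulas above.

```python
import numpy as np

def histogram_entropy(f, levels=256):
    """Entropy H = -sum_n p(n) * ln(p(n)) computed from the image histogram."""
    p = np.bincount(f.ravel(), minlength=levels) / f.size
    p = p[p > 0]                # terms with p(n) = 0 contribute nothing
    return float(-np.sum(p * np.log(p)))
```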

Role of Entropy
Entropy indicates whether only a few gray levels are actually present or a wide range of levels is present in sufficient numbers.
Entropy is also used for threshold selection, e.g., for separating an image into the object of interest and the background.

Histogram
[Figure: a bimodal histogram f(n) with Mode 1 at n = m1 and Mode 2 at n = m2.]

Image Statistics from Histogram
MIN gray level: MIN = min{ n : f(n) ≠ 0 }
MAX gray level: MAX = max{ n : f(n) ≠ 0 }
Mean gray level: μ = Σ_n n.f(n) / (M.N)
Variance: σ² = Σ_n f(n).(n - μ)² / (M.N)
Median: Med = smallest k such that Σ_{n=0..k} f(n) ≥ (M.N)/2
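These statistics can be computed directly from the histogram; a minimal NumPy sketch with an illustrative function name follows.

```python
import numpy as np

def histogram_statistics(f, levels=256):
    """MIN, MAX, mean, variance and median gray level computed from the histogram f(n)."""
    counts = np.bincount(f.ravel(), minlength=levels)
    n = np.arange(levels)
    occupied = np.nonzero(counts)[0]
    mean = float(np.sum(n * counts)) / f.size
    var = float(np.sum(counts * (n - mean) ** 2)) / f.size
    median = int(np.searchsorted(np.cumsum(counts), f.size / 2))  # smallest k with cum >= MN/2
    return occupied.min(), occupied.max(), mean, var, median
```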

Image Statistics from Histogram
Skewness: Sk = [ 1 / ((M.N - 1).σ³) ] Σ_n f(n).(n - μ)³
Skewness is positive if the histogram is skewed to the left of the mean, i.e., it has a long tail towards the higher gray levels; skewness is negative if the histogram is skewed to the right of the mean.

Image Statistics from Histogram
Kurtosis: Ku = [ 1 / ((M.N - 1).σ⁴) ] Σ_n f(n).(n - μ)⁴ - 3
For Gaussian distributions the kurtosis is 3, so the excess kurtosis is defined by subtracting 3, as in the equation above. Positive excess kurtosis indicates a sharply peaked distribution, and negative excess kurtosis denotes a flat distribution, with the uniform distribution as the limiting case.
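A sketch computing skewness and excess kurtosis from the histogram, following the two formulas above (illustrative helper, not from the slides).

```python
import numpy as np

def histogram_skewness_kurtosis(f, levels=256):
    """Skewness and excess kurtosis of the gray level distribution, from the histogram."""
    counts = np.bincount(f.ravel(), minlength=levels).astype(np.float64)
    n = np.arange(levels)
    mn = f.size
    mu = np.sum(n * counts) / mn
    sigma = np.sqrt(np.sum(counts * (n - mu) ** 2) / mn)
    skew = np.sum(counts * (n - mu) ** 3) / ((mn - 1) * sigma ** 3)
    kurt = np.sum(counts * (n - mu) ** 4) / ((mn - 1) * sigma ** 4) - 3.0  # excess kurtosis
    return skew, kurt
```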

Pixel and Neighborhood
A  B  C
D  X  E
F  G  H
X is the pixel under consideration; its neighbors are A, B, C, D, E, F, G, H. The size of this neighborhood is 3x3.
Neighborhoods are of size m x n with m and n odd, so that there is a unique pixel at the centre of the neighborhood.

Point Operations vs. Neighborhood Operations
Point operations do not alter the sharpness or resolution of the image: the gray level of a pixel is manipulated independently of the gray levels of its neighbors.
Point operations can neither deal with noise in the image nor highlight local features such as object boundaries.

Neighborhood Operations
Simple averaging over the 3x3 neighborhood
A  B  C
D  X  E
F  G  H
is given by
g(X) = (1/9)[ f(A) + f(B) + f(C) + f(D) + f(X) + f(E) + f(F) + f(G) + f(H) ]
The output gray level is the average of the gray levels of all the pixels in the 3x3 neighborhood.

Example
Case 1:        Case 2:
15 17 16       15 17 16
18 17 15       18 37 15
17 14 16       17 14 16
In case 1, after averaging, the central element 17 is replaced by the local average 16, a negligible change. In case 2, after averaging, the central element 37 is replaced by 18, a significant change. Averaging is a powerful tool for dealing with random noise.
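As a quick numerical check (a small NumPy snippet, not part of the original example), averaging the two neighborhoods above reproduces the quoted results.

```python
import numpy as np

# The two 3x3 neighborhoods from the example above.
case1 = np.array([[15, 17, 16],
                  [18, 17, 15],
                  [17, 14, 16]])
case2 = np.array([[15, 17, 16],
                  [18, 37, 15],
                  [17, 14, 16]])

# Simple averaging replaces the central element by the mean of all 9 pixels.
print(round(case1.mean()))   # 16  (17 -> 16, negligible change)
print(round(case2.mean()))   # 18  (37 -> 18, the noisy spike is suppressed)
```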

Neighborhood Operations - Procedure
The procedure applies the computational step at every pixel, considering its value and the values of the neighboring pixels. The neighborhood is then shifted one pixel to the right, bringing the centre pixel of the new neighborhood into focus. This process continues from left to right and from top to bottom.

Mathematical form for averaging
In general, we can write
g(X) = (1/K) Σ_{i=1..K} f(A_i),   A_i ∈ N(X)
where K is the number of neighbors A_i in the neighborhood N(X). For a 3x3 neighborhood, A_5 refers to X, the central pixel. Clearly, all neighbors are given equal weight during the averaging process.

General form for averaging
If different weights are preferred for different neighbors, we can write
g(X) = Σ_{i=1..K} w_i.f(A_i) / Σ_{i=1..K} w_i
For simple averaging over a 3x3 neighborhood, w_i = 1/9 for i = 1, 2, ..., 9.
We can, for example, alter the weights for the 4-neighbors and the 8-neighbors; in such a case w_i is not the same for all values of i, as in the sketch below.
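A small sketch of weighted neighborhood averaging; the example mask anticipates the "reduced neighborhood influence" weights shown two slides further on.

```python
import numpy as np

def weighted_average(nbd, weights):
    """Weighted neighborhood average: g(X) = sum(w_i * f(A_i)) / sum(w_i)."""
    nbd = np.asarray(nbd, dtype=np.float64)
    weights = np.asarray(weights, dtype=np.float64)
    return float(np.sum(weights * nbd) / np.sum(weights))

# 20% weight for the centre, 15% for the 4-neighbors, 5% for the diagonals.
w = np.array([[0.05, 0.15, 0.05],
              [0.15, 0.20, 0.15],
              [0.05, 0.15, 0.05]])
nbd = np.array([[15, 17, 16],
                [18, 37, 15],
                [17, 14, 16]])
print(weighted_average(nbd, w))   # about 20.2: the heavier centre weight keeps more of the spike
```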

Concept of Convolution
Convolution is a weighted summation of inputs to produce an output; the weights do not change at any time during the processing of the data.
If the input shifts in time or position, the output also shifts in time or position; the character of the processing operation does not change.
The set of weights with which the pixels in the image are modified is referred to as the filter.

Filter Mask
The filter can be compactly represented by its weights (multiplying coefficients), e.g., the 3x3 averaging filter:
0.111  0.111  0.111
0.111  0.111  0.111
0.111  0.111  0.111
or, equivalently, (1/9) times
1  1  1
1  1  1
1  1  1
This means the pixels in the image are multiplied by the corresponding filter coefficients and the products are added.

Reduced neighborhood influence
0.05  0.15  0.05
0.15  0.20  0.15
0.05  0.15  0.05
The central pixel is given 20% weight, the 4-neighbors 15% weight each, and the diagonal neighbors 5% weight each. Note that the weights are all positive and sum to unity.

Discrete Convolution 49 g i,j = w w k w l w h f k, l i k, j l The filter coefficients are mirror-reflected around the central element, and then the filter is slid on the input image The filter moves from top left to bottom right, moving one position at a time For each position of the filter, an output value is computed