Lecture 4: Spatial Domain Transformations


Saad J Bedros, sbedros@umn.edu

Reminder: The 2nd quiz, on the manipulator part, is this Fri, April 7 205, :5 AM to :0 PM. Open book, open notes; focus on the material covered since the 1st quiz. All the videos of the lectures are available. The students will be split across 2 locations: students with last names starting A-K should come to ME 02; L-Z come to ME 8. UNITE students take the quiz at their own location. If you have a conflict with the Friday slot, email Prof. Donath ASAP.

Last Vision Lecture: digital image representation (sampling, quantization), color fundamentals, color spaces, and digital cameras (2 types of sensors; single sensor for 3 color planes).

Any questions for HW#? Office hours (:5 to :50) today only.

Outline of this Lecture: Spatial Domain Transformations. Point Processing Transformations: pixel mapping, histogram processing. Area/Mask Processing Transformations: image filtering. Frame Processing Transformations. Geometric Transformations.

Digital Image Processing. Pixel processing changes the range of an image: g(x) = h(f(x)). A geometric transform changes the domain of an image: g(x) = f(h(x)).

Spatial Domain Methods Point Processing Area/Mask Processing

Point Processing. The simplest kind of range transformations are independent of the position (x, y): g = t(f). This is called point processing. What can they do? What's the form of t? Important: every pixel is mapped independently, so spatial information is completely lost!

Point Processing Methods. The most primitive, yet essential, image processing operations: intensity transformations that convert an old pixel into a new pixel based on some predefined transformation function. They operate on a pixel based solely on that pixel's value, and are used primarily for image enhancement.

Identity Transformation

Negative Image: O(r, c) = 255 - I(r, c)
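A point operation like the negative can be sketched in a few lines of pure Python; the helper name `point_op` is illustrative, and the image is a plain list of rows.

```python
# Point processing: every output pixel depends only on the corresponding
# input pixel. Here the transformation t is the negative, O(r, c) = 255 - I(r, c).

def point_op(image, t):
    """Apply a point transformation t independently to every pixel."""
    return [[t(p) for p in row] for row in image]

img = [[0, 100, 200],
       [50, 128, 255]]

negative = point_op(img, lambda p: 255 - p)
print(negative)  # [[255, 155, 55], [205, 127, 0]]
```

Because every pixel is mapped independently, the same `point_op` helper works for any of the transformations in this section (thresholding, log, gamma, ...) by swapping the lambda.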

Negative Image

Contrast Stretching or Compression. Stretch gray-level ranges where we desire more information (slope > 1). Compress gray-level ranges that are of little interest (0 < slope < 1).

Point Processing Methods. Thresholding: a special case of contrast compression.

Point Processing Methods. Intensity-level slicing: highlight a specific range of gray levels only; acts as a double thresholding.

Bit-plane Slicing. Highlighting the contribution made by a specific bit. For 8-bit (e.g., PGM) images, each pixel is represented by 8 bits, and each bit plane is a binary image.
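Extracting one bit plane is a shift and a mask per pixel; a minimal sketch (the name `bit_plane` is illustrative):

```python
# Bit-plane slicing: the binary image formed by one bit of each 8-bit pixel.

def bit_plane(image, bit):
    """Return the binary image holding bit `bit` (0 = LSB, 7 = MSB)."""
    return [[(p >> bit) & 1 for p in row] for row in image]

img = [[0b10110100, 0b00001111]]
print(bit_plane(img, 7))  # [[1, 0]]  most significant bit plane
print(bit_plane(img, 2))  # [[1, 1]]
```

The high-order planes carry most of the visually significant information; the low-order planes look like noise.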

Logarithmic Transformation. Non-linear transformations: we may use any function, provided that it gives a one-to-one or many-to-one (i.e., single-valued) mapping. The log transform s = T(r) = c log(1 + r) enhances details in the darker regions of an image (stretch) at the expense of detail in the brighter regions (compress).

Log Stretching

Exponential Transformation. The reverse effect of that obtained using logarithmic mapping: dark regions are compressed and bright regions are stretched.

Basic Point Processing

Power-law transformations
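The log and power-law (gamma) transforms can be sketched as point operations; the scaling constants below are illustrative choices that keep an 8-bit range mapped onto [0, 255], not values prescribed by the lecture.

```python
import math

# s = c log(1 + r): stretches dark values; c chosen so 255 maps to 255.
def log_transform(p, c=255 / math.log(256)):
    return round(c * math.log(1 + p))

# s = c r^gamma on the normalized value r in [0, 1]:
# gamma < 1 brightens dark regions, gamma > 1 darkens them.
def gamma_transform(p, gamma, c=255.0):
    return round(c * (p / 255.0) ** gamma)

print(log_transform(0), log_transform(255))  # 0 255: endpoints preserved
print(gamma_transform(64, 0.5))              # dark value brightened
print(gamma_transform(64, 2.0))              # dark value darkened further
```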

Image Enhancement

Contrast Stretching

Image Histogram

Image Intensity Histogram. The histogram of a digital image with gray levels from 0 to L-1 is a discrete function h(rk) = nk, where rk is the kth gray level, nk is the number of pixels in the image with that gray level, n is the total number of pixels in the image, and k = 0, 1, 2, ..., L-1. Normalized histogram: p(rk) = nk/n; the sum of all components = 1.
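The definition above translates directly into code; a minimal sketch on a tiny 4-level image:

```python
# h(rk) = nk: count how many pixels take each of the L gray levels,
# then normalize by n so the components sum to 1.

def histogram(image, L=256):
    h = [0] * L
    for row in image:
        for p in row:
            h[p] += 1
    return h

img = [[0, 1, 1],
       [2, 1, 0]]
h = histogram(img, L=4)
n = sum(h)
p = [nk / n for nk in h]
print(h)  # [2, 3, 1, 0]
print(sum(p))  # 1.0
```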

Building Distributions. [Figure: histogram of green-channel intensity over 320 pixels; e.g., P(G = 100) = 80/320 = 0.25.]

Example: Image Histograms An image histogram is a plot of the gray-level frequencies (i.e., the number of pixels in the image that have that gray level).

Example: Image Histograms. Divide frequencies by the total number of pixels to represent them as probabilities: pk = nk / N.

Image Histograms

Properties of Image Histograms Histograms clustered at the low end correspond to dark images. Histograms clustered at the high end correspond to bright images.

Properties of Image Histograms Histograms with small spread correspond to low contrast images (i.e., mostly dark, mostly bright, or mostly gray). Histograms with wide spread correspond to high contrast images.

Properties of Image Histograms Low contrast High contrast

Histogram Processing. The shape of the histogram of an image provides useful information about the possibility for contrast enhancement. Types of processing: histogram equalization, histogram matching (specification), and local enhancement.

Point Processing Methods. Histogram equalization. Low contrast images are usually mostly dark, mostly light, or mostly gray. High contrast images have large regions of dark and large regions of white. Good contrast images exhibit a wide range of pixel values (i.e., no single gray level dominates the image).

Point Processing Methods. Histogram equalization is a transformation that stretches the contrast by redistributing the gray-level values uniformly; it is fully automatic, unlike other contrast stretching techniques.

Histogram Equalization. As mentioned above, for gray levels that take on discrete values, we deal with probabilities: pr(rk) = nk/n, k = 0, 1, ..., L-1, where nk is the number of pixels with gray level rk and n is the total number of pixels. The plot of pr(rk) versus rk is called a histogram, and the technique used for obtaining a uniform histogram is known as histogram equalization (or histogram linearization).

How do we determine this gray-scale transformation function? Assume our gray levels are continuous and have been normalized to lie between 0 (black) and 1 (white). We must find a transformation T that maps gray values r in the input image F to gray values s = T(r) in the transformed image. It is assumed that T is single-valued and monotonically increasing, with 0 <= T(r) <= 1 for 0 <= r <= 1. The inverse transformation from s to r is given by r = T^-1(s).

An example of such a transfer function is illustrated in the Figure

Histogram Equalization. sk = T(rk) = Σ(j=0..k) nj/n = Σ(j=0..k) pr(rj). Histogram equalization (HE) results are similar to contrast stretching but offer the advantage of full automation, since HE automatically determines a transformation function to produce a new image with a uniform histogram. http://homepages.inf.ed.ac.uk/rbf/hipr2/histeq.htm

Histogram Equalization A fully automatic gray-level stretching technique.

Point Processing Methods. Histogram equalization: in practice, the histogram might not become totally flat!
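The discrete mapping sk = T(rk) can be sketched directly from the cumulative histogram; a minimal pure-Python version (the name `equalize` is illustrative). As noted above, the result is only approximately uniform.

```python
# Histogram equalization: sk = round((L-1) * sum_{j<=k} nj / n).

def equalize(image, L=256):
    flat = [p for row in image for p in row]
    h = [0] * L
    for p in flat:
        h[p] += 1
    n = len(flat)
    # Cumulative distribution of gray levels.
    cdf, total = [], 0
    for nk in h:
        total += nk
        cdf.append(total / n)
    # Lookup table mapping each old level to its equalized level.
    lut = [round((L - 1) * c) for c in cdf]
    return [[lut[p] for p in row] for row in image]

img = [[52, 55, 61], [59, 79, 61]]
print(equalize(img))  # gray levels spread out toward the full [0, 255] range
```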

Histogram Specification. Histogram equalization does not allow interactive image enhancement and generates only one result: an approximation to a uniform histogram. Sometimes, though, we need to be able to specify particular histogram shapes capable of highlighting certain gray-level ranges.

Histogram Specification. The procedure for histogram-specification based enhancement is: 1) Equalize the levels of the original image using sk = T(rk) = Σ(j=0..k) nj/n, where n is the total number of pixels, nj is the number of pixels with gray level rj, and L is the number of discrete gray levels.

Histogram Specification. 2) Specify the desired density function and obtain the transformation function G(z): vk = G(zk) = Σ(i=0..k) pz(zi) = Σ(i=0..k) ni/n, where pz is the specified desirable PDF for the output. 3) Apply the inverse transformation z = G^-1(s) to the levels obtained in step 1.

Histogram Specification. The new, processed version of the original image consists of gray levels characterized by the specified density pz(z). In essence: z = G^-1(s) = G^-1[T(r)].

Histogram Specification. The principal difficulty in applying the histogram specification method to image enhancement lies in being able to construct a meaningful histogram. So:

Histogram Specification. Either a particular probability density function (such as a Gaussian density) is specified and a histogram is then formed by digitizing the given function, or a histogram shape is specified on a graphic device and then fed into the processor executing the histogram specification algorithm.

Local Enhancement. When we desire to enhance a local region, local enhancement is achieved by applying histogram processing (equalization) on a local region instead of the whole image.

Histogram Color Processing. We can apply histogram equalization to color images, but we don't want to apply it using the RGB color model: equalizing the R, G, and B bands independently causes color shifts. We must first convert to a color model that separates intensity information from color information (e.g., HSI), and can then apply histogram equalization on the intensity band.

Area/Mask Processing, or Image Filtering

Spatial Domain Methods Point Processing Area/Mask Processing

Area/Mask Processing Methods. A pixel value is computed from its old value and the values of pixels in its vicinity. These are more costly operations than simple point processes, but more powerful. What is a mask? A mask is a small matrix whose values are called weights. Each mask has an origin, which is usually one of its positions; the origins of symmetric masks are usually their center pixel position.

Area/Mask Processing Methods. Applying masks to images (filtering): the application of a mask to an input image produces an output image of the same size as the input. For linear filters, the operation is a linear operation on the image pixels.

Linear Mask Processing Methods. Two main linear mask operations (spatial filtering methods): correlation and convolution.

Area/Mask Processing Methods. Cross-correlation: correlation applies the mask directly to the image without flipping it. It is often used in applications where it is necessary to measure the similarity between images or parts of images. If the mask is symmetric (i.e., the flipped mask is the same as the original one), then the results of convolution and correlation are the same.

Correlation. With mask w, input image f, and output image g: g(x, y) = w(x, y) ∘ f(x, y) = Σ(s=-K/2..K/2) Σ(t=-K/2..K/2) w(s, t) f(x+s, y+t).

Convolution. Similar to correlation, except that the mask is first flipped both horizontally and vertically: g(x, y) = w(x, y) * f(x, y) = Σ(s=-K/2..K/2) Σ(t=-K/2..K/2) w(s, t) f(x-s, y-t). Note: if w(x, y) is symmetric, that is w(x, y) = w(-x, -y), then convolution is equivalent to correlation!

Area/Mask Processing Methods. Convolution: 1) For each pixel in the input image, the mask is conceptually placed on top of the image with its origin lying on that pixel. 2) The values of each input image pixel under the mask are multiplied by the values of the corresponding mask weights. 3) The results are summed together to yield a single output value that is placed in the output image at the location of the pixel being processed in the input.

Example: correlation vs. convolution of the same mask with the same image.
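The difference between the two operations shows up clearly on an impulse image: correlating with an asymmetric mask yields the flipped mask, while convolving reproduces the mask itself. A minimal pure-Python sketch with zero padding at the borders (the function names are illustrative):

```python
# Correlation applies the mask as-is; convolution flips it in both directions.

def correlate(f, w):
    K = len(w) // 2                      # odd-sized square mask assumed
    rows, cols = len(f), len(f[0])
    def pix(x, y):                       # zero padding outside the image
        return f[x][y] if 0 <= x < rows and 0 <= y < cols else 0
    return [[sum(w[s + K][t + K] * pix(x + s, y + t)
                 for s in range(-K, K + 1) for t in range(-K, K + 1))
             for y in range(cols)] for x in range(rows)]

def convolve(f, w):
    flipped = [row[::-1] for row in w[::-1]]
    return correlate(f, flipped)

f = [[0, 0, 0],
     [0, 1, 0],    # unit impulse
     [0, 0, 0]]
w = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]
print(correlate(f, w))  # [[9, 8, 7], [6, 5, 4], [3, 2, 1]]: the flipped mask
print(convolve(f, w))   # [[1, 2, 3], [4, 5, 6], [7, 8, 9]]: the mask itself
```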

Area/Mask Processing Methods. Normalization of mask weights: the sum of the weights in the convolution mask affects the overall intensity of the resulting image. Many convolution masks have coefficients that sum to 1 (the convolved image will then have the same average intensity as the original one). Some masks have negative weights and sum to 0; pixels with negative values may be generated using such masks, and negative values are mapped back to the positive range through appropriate normalization. Practical problems: how to treat the image borders, and computation time, which grows quadratically with mask width (a K×K mask requires K² multiplications per pixel).

Image filtering (worked example; credit: S. Seitz). A 3×3 averaging mask g[·,·] (each weight 1/9) is slid across an image f[·,·] containing a block of 90s on a background of 0s. At each position the output is h[m, n] = Σ(k,l) g[k, l] f[m+k, n+l]. As the mask crosses the edge of the block, the output ramps through intermediate values (0, 10, 20, 30, ..., 90), so the sharp boundaries of the block are smoothed into gradual transitions.

Box Filter. What does it do? Replaces each pixel with an average of its neighborhood, achieving a smoothing effect (removes sharp features). Slide credit: David Lowe (UBC).
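The box-filter walkthrough above can be sketched in a few lines; here a 3×3 average with zero padding, using integer division purely for readable output:

```python
# 3x3 box filter: each output pixel is the average of its 3x3 neighborhood.

def box_filter(f):
    rows, cols = len(f), len(f[0])
    def pix(x, y):                      # zero padding at the borders
        return f[x][y] if 0 <= x < rows and 0 <= y < cols else 0
    return [[sum(pix(x + s, y + t)
                 for s in (-1, 0, 1) for t in (-1, 0, 1)) // 9
             for y in range(cols)] for x in range(rows)]

f = [[0,  0,  0, 0],
     [0, 90, 90, 0],
     [0, 90, 90, 0],
     [0,  0,  0, 0]]
print(box_filter(f))  # the sharp 0/90 edges become gradual ramps
```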

Smoothing with box filter

Practice with linear filters (source: D. Lowe). Filtering with the impulse mask [0 0 0; 0 1 0; 0 0 0] leaves the original image unchanged. Filtering with an off-center impulse mask such as [0 0 0; 0 0 1; 0 0 0] shifts the image left by 1 pixel.

Area/Mask Processing Methods. Smoothing (or low-pass) filters: useful for noise reduction and image blurring; they remove the finer details of an image. Averaging or mean filter: all elements of the mask must be positive, and the size of the mask determines the degree of smoothing.

Practice with linear filters (source: D. Lowe). The mask [0 0 0; 0 2 0; 0 0 0] minus a 3×3 box filter (1/9 everywhere) is a sharpening filter; note that the combined filter sums to 1. It accentuates differences with the local average.

Sharpening Source: D. Lowe

Other filters. The Sobel mask [1 0 -1; 2 0 -2; 1 0 -1] responds to vertical edges, and [1 2 1; 0 0 0; -1 -2 -1] to horizontal edges (take the absolute value of the response in both cases).

Basic gradient filters. Horizontal gradient: [0 0 0; -1 0 1; 0 0 0] or [0 0 0; -1 1 0; 0 0 0]. Vertical gradient: [0 -1 0; 0 0 0; 0 1 0] or [0 -1 0; 0 1 0; 0 0 0].

Area/Mask Processing Methods. Averaging or mean filter, cont.: not always the best choice for smoothing.

Area/Mask Processing Methods. Gaussian filter properties: the most common natural model (there are cells in the eye that perform Gaussian filtering); a smooth function with an infinite number of derivatives; the Fourier transform of a Gaussian is a Gaussian; the convolution of a Gaussian with itself is a Gaussian; and Gaussian smoothing can be implemented efficiently because the kernel is separable.

Gaussian filter. Input image f, convolved with filter h, gives output image g: f * h = g.

Gaussian vs. mean filters What does real blur look like?

Important filter: Gaussian. A spatially-weighted average; the 5×5 kernel with σ = 1 is:
0.003 0.013 0.022 0.013 0.003
0.013 0.059 0.097 0.059 0.013
0.022 0.097 0.159 0.097 0.022
0.013 0.059 0.097 0.059 0.013
0.003 0.013 0.022 0.013 0.003
Slide credit: Christopher Rasmussen.

Gaussian filters What parameters matter here? Variance of Gaussian: determines extent of smoothing Source: K. Grauman

Area/Mask Processing Methods. The value of σ determines the degree of smoothing. As σ increases, the size of the mask must also increase if we are to sample the Gaussian satisfactorily. Rule of thumb: choose height = width = 5σ (this subtends 98.76% of the area).

Smoothing with box filter

Smoothing with Gaussian filter

Smoothing with a Gaussian Parameter σ is the scale / width / spread of the Gaussian kernel, and controls the amount of smoothing. Source: K. Grauman
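A Gaussian mask can be built by sampling exp(-(x² + y²)/(2σ²)) on a grid and normalizing so the weights sum to 1, following the sizing rule of thumb above. A minimal sketch (the helper name is illustrative):

```python
import math

def gaussian_kernel(size, sigma):
    """Sample a 2D Gaussian on a size x size grid and normalize it."""
    k = size // 2
    g = [[math.exp(-(x * x + y * y) / (2 * sigma * sigma))
          for y in range(-k, k + 1)] for x in range(-k, k + 1)]
    total = sum(sum(row) for row in g)
    return [[v / total for v in row] for row in g]  # weights sum to 1

kern = gaussian_kernel(5, 1.0)
print(round(kern[2][2], 3))  # 0.162: the center weight dominates
```

Convolving an image with this kernel gives the weighted-average smoothing shown in the slides; larger σ (with a correspondingly larger mask) gives stronger blur.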

Area/Mask Processing Methods. Sharpening (or high-pass) filters are used to emphasize the fine details of an image (the opposite effect of smoothing). Points of high contrast can be detected by computing intensity differences in local image regions. The weights of the mask are both positive and negative. When the mask is over an area of constant or slowly varying gray level, the result of convolution will be close to zero; when the gray level is varying rapidly within the neighborhood, the result will be a large number. Typically, such points form the border between different objects or scene parts.

Area/Mask Processing Methods. Sharpening (or high-pass) filters, cont.

Area/Mask Processing Methods. Sharpening using derivatives: computing the derivative of an image results in a sharpening of the image. The most common way to differentiate an image is by using the gradient.

Area/Mask Processing Methods. Sharpening using derivatives, cont.: the gradient can be approximated by finite differences, which can be implemented efficiently as masks.

How do we choose the elements of a mask? Typically, by sampling certain functions: the Gaussian, the 1st derivative of a Gaussian, or the 2nd derivative of a Gaussian.

Filters. Smoothing (i.e., low-pass) filters reduce noise and eliminate small details. The elements of the mask must be positive, and the sum of the mask elements is 1 (after normalization). Example: the Gaussian.

Filters. Sharpening (i.e., high-pass) filters highlight fine detail or enhance detail that has been blurred. The elements of the mask contain both positive and negative weights, and the sum of the mask weights is 0 (after normalization). Examples: the 1st and 2nd derivatives of a Gaussian.

Sharpening Filters (High Pass Filtering) Useful for emphasizing transitions in image intensity (e.g., edges).

Sharpening Filters. Note that the response of high-pass filtering might be negative; values must be re-mapped to [0, 255]. [Comparison: original image vs. sharpened images.]

Sharpening Filters: Unsharp Masking. Obtain a sharp image by subtracting a low-pass filtered (i.e., smoothed) image from the original image.
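Unsharp masking can be sketched on a 1D signal: the detail is the original minus its blurred version, and adding scaled detail back overshoots at edges, which is what makes them look sharper. The amount k and the 3-point blur below are illustrative choices.

```python
def blur3(f):
    """Simple 3-point moving average, replicating the border samples."""
    g = []
    for i in range(len(f)):
        a = f[max(i - 1, 0)]
        b = f[i]
        c = f[min(i + 1, len(f) - 1)]
        g.append((a + b + c) / 3)
    return g

def unsharp(f, k=1.0):
    """sharpened = original + k * (original - blurred)."""
    low = blur3(f)
    return [fi + k * (fi - li) for fi, li in zip(f, low)]

signal = [10, 10, 10, 90, 90, 90]  # a step edge
print(unsharp(signal))             # undershoot before and overshoot after the edge
```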

Sharpening Filters: Derivatives Taking the derivative of an image results in sharpening the image. The derivative of an image can be computed using the gradient.

Sharpening Filters: Derivatives. The gradient ∇f = (∂f/∂x, ∂f/∂y) is a vector which has magnitude and direction: |∇f| = sqrt((∂f/∂x)² + (∂f/∂y)²), or approximately |∂f/∂x| + |∂f/∂y|.

Sharpening Filters: Derivatives Magnitude: provides information about edge strength. Direction: perpendicular to the direction of the edge.

Sharpening Filters: Gradient Computation. Approximate the gradient using finite differences: ∂f/∂x ≈ f(x+Δx, y) - f(x, y) with Δx = 1 (sensitive to vertical edges), and ∂f/∂y ≈ f(x, y+1) - f(x, y) (sensitive to horizontal edges).

Sharpening Filters: Gradient Computation (cont'd)

Example: ∂f/∂x and ∂f/∂y responses.

Sharpening Filters: Gradient Computation (cont'd). We can implement ∂f/∂x and ∂f/∂y using masks; the finite differences give good approximations at (x+1/2, y) and (x, y+1/2) respectively. Example: approximate the gradient at z5.

Sharpening Filters: Gradient Computation (cont'd). A different approximation of the gradient (the Roberts cross-gradient masks) gives a good approximation at (x+1/2, y+1/2); we can implement ∂f/∂x and ∂f/∂y using the corresponding masks.

Sharpening Filters: Gradient Computation (cont'd). Example: approximate the gradient at z5. Other approximations: the Sobel masks.
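A minimal sketch of the Sobel approximation, combining the two responses with the |fx| + |fy| magnitude approximation from above (pure Python, interior pixels only; the helper name `apply_mask` is illustrative):

```python
# Standard 3x3 Sobel masks for the two partial derivatives.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # strong response on vertical edges
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # strong response on horizontal edges

def apply_mask(f, w, x, y):
    """Correlate a 3x3 mask with f at interior pixel (x, y)."""
    return sum(w[s + 1][t + 1] * f[x + s][y + t]
               for s in (-1, 0, 1) for t in (-1, 0, 1))

f = [[0, 0, 100, 100],
     [0, 0, 100, 100],    # a vertical edge between columns 1 and 2
     [0, 0, 100, 100],
     [0, 0, 100, 100]]

fx = apply_mask(f, SOBEL_X, 1, 1)
fy = apply_mask(f, SOBEL_Y, 1, 1)
print(abs(fx) + abs(fy))  # 400: strong response on the vertical edge
```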

Example f x f y

Sharpening Filters: Laplacian. The Laplacian (2nd derivative) is defined as ∇²f = ∂²f/∂x² + ∂²f/∂y². Approximating the derivatives with finite differences: ∂²f/∂x² ≈ f(x+1, y) + f(x-1, y) - 2f(x, y) and ∂²f/∂y² ≈ f(x, y+1) + f(x, y-1) - 2f(x, y).

Sharpening Filters: Laplacian. Adding the two finite-difference terms gives the Laplacian mask [0 1 0; 1 -4 1; 0 1 0].
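The Laplacian mask's behavior can be checked in a couple of lines: zero over constant regions, a large response at an intensity discontinuity (a sketch for interior pixels only):

```python
# The 4-neighbor Laplacian mask built from the finite-difference approximation.
LAPLACIAN = [[0, 1, 0], [1, -4, 1], [0, 1, 0]]

def laplacian_at(f, x, y):
    """Correlate the Laplacian mask with f at interior pixel (x, y)."""
    return sum(LAPLACIAN[s + 1][t + 1] * f[x + s][y + t]
               for s in (-1, 0, 1) for t in (-1, 0, 1))

f = [[10, 10, 10],
     [10, 50, 10],   # a bright spot on a flat background
     [10, 10, 10]]
print(laplacian_at(f, 1, 1))  # -160: strong response at the discontinuity
```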

Handling Pixels Close to Boundaries: e.g., pad with zeroes.

Handling Pixels Close to Boundaries. What to do about image borders: black (zero padding), fixed, periodic, or reflected.

Linear vs. Non-Linear Spatial Filtering. A filtering method is linear when the output is a weighted sum of the input pixels. Methods that do not satisfy this property are called non-linear (e.g., the median filter).

Area/Mask Processing Methods. Median filter (a non-linear filter): replace each pixel value by the median of the gray levels in the neighborhood of the pixel.

Smoothing Filters: Median Filtering (non-linear). Very effective for removing salt-and-pepper noise (i.e., random occurrences of black and white pixels). [Comparison: averaging vs. median filtering.]
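Why the median beats averaging on salt-and-pepper noise can be seen on a single pixel: the outlier is rejected outright rather than smeared into its neighbors. A sketch for interior pixels:

```python
import statistics

def median_at(f, x, y):
    """Median of the 3x3 neighborhood of interior pixel (x, y)."""
    neigh = [f[x + s][y + t] for s in (-1, 0, 1) for t in (-1, 0, 1)]
    return statistics.median(neigh)

f = [[10, 10, 10],
     [10, 255, 10],   # salt noise at the center
     [10, 10, 10]]
print(median_at(f, 1, 1))  # 10: the outlier is removed entirely
```

A 3×3 mean filter at the same pixel would return (8*10 + 255)/9 ≈ 37, leaving a visible smudge.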

Remember: intensity operations can yield pixel values outside of the range [0, 255]. You should convert values back to the range [0, 255] to ensure that the image is displayed properly. Consider the mapping [fmin, fmax] → [0, 255].

Frame Processing

Frame Processing Methods. Generate a pixel value based on an operation involving two or more images or frames; each output pixel is usually computed from the same position in the input images. Attractive in video processing, and useful for combining information in two images.

Frame Processing Methods. Subtraction: useful for change detection. O(r, c) = I1(r, c) - I2(r, c).

Frame Processing Methods. Averaging: image quality can be improved by averaging a number of images together.

Frame Processing Methods. Logical operators.


Geometric Transformations

Geometric Transformations. Modify the arrangement of pixels based on some geometric transformation.

What are geometric transformations?

Translation

Translation and rotation

Scale

Similarity transformations Similarity transform (4 DoF) = translation + rotation + scale

Aspect ratio

Shear

Affine transformations Affine transform (6 DoF) = translation + rotation + scale + aspect ratio + shear

Geometric Transformations: Affine Transformation. [x y 1] = [v w 1] T = [v w 1] [t11 t12 0; t21 t22 0; t31 t32 1]. For a rotation by θ, for example, x = v cosθ - w sinθ and y = v sinθ + w cosθ.

Geometric Transformations. Some practical problems: 1) Transformed pixel coordinates might not lie within the bounds of the image. 2) Transformed pixel coordinates can be non-integer. 3) There might be no pixels in the input image that map to certain pixel locations in the transformed image (one-to-one correspondence can be lost).

Geometric Transformations. Problem (3): forward vs. inverse mapping. To guarantee that a value is generated for every pixel in the output image, we must consider each output pixel in turn and use the inverse mapping to determine the position in the input image.

Geometric Transformations. Problem (2): image interpolation. Interpolation estimates the value of a transformed pixel at a non-integer position by examining its surrounding pixels. Zero-order interpolation (or nearest-neighbor) simply takes the value of the closest pixel.

Geometric Transformations. First-order (bilinear) interpolation. Higher-order interpolation schemes are more sophisticated but also more time consuming.
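First-order interpolation can be sketched as a weighted average of the four pixels surrounding a non-integer source position, which is exactly what inverse mapping produces for each output pixel. A minimal sketch (the helper name `bilinear` is illustrative; border pixels are clamped):

```python
# Bilinear interpolation: blend f's four neighbors of the non-integer
# position (x, y) by the fractional distances (dx, dy).

def bilinear(f, x, y):
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    x1 = min(x0 + 1, len(f) - 1)      # clamp at the image border
    y1 = min(y0 + 1, len(f[0]) - 1)
    return (f[x0][y0] * (1 - dx) * (1 - dy) + f[x1][y0] * dx * (1 - dy)
            + f[x0][y1] * (1 - dx) * dy + f[x1][y1] * dx * dy)

f = [[0, 0],
     [100, 100]]
print(bilinear(f, 0.5, 0.0))  # 50.0: halfway between rows 0 and 1
```

With dx = dy = 0 this reduces to the pixel value itself, so nearest-neighbor is the special case where the source position is simply rounded first.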

Review. Spatial domain transformations: point processing for enhancement; area/mask processing transformations (image filtering for blurring and enhancement); frame processing transformations (operations on multiple images); geometric transformations (image warping, i.e., operations on the image axes).