Wilhelm Burger, Mark J. Burge: Digital Image Processing. An Algorithmic Introduction Using Java. Second Edition. ERRATA. Springer



12.1 RGB Color Images

Fig. 12.2 A color image and its corresponding RGB channels. The fruits depicted are mainly yellow and red and therefore have high values in the R and G channels. In these regions, the B content is correspondingly lower (represented here by darker gray values), except for the bright highlights on the apple, where the color changes gradually to white. The tabletop in the foreground is purple and therefore displays correspondingly higher values in its B channel.

In the next sections we will examine the difference between true color images, which utilize colors uniformly selected from the entire color space, and so-called paletted or indexed images, in which only a select set of distinct colors is used. Deciding which type of image to use depends on the requirements of the application.

True color images

A pixel in a true color image can represent any color in its color space, as long as it falls within the (discrete) range of its individual color components. True color images are appropriate when the image contains many colors with subtle differences, as occurs in digital photography and photo-realistic computer graphics. Next we look at two methods of ordering the color components in true color images: component ordering and packed ordering.
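With packed ordering, all color components of a pixel are stored together in a single array element, typically one `int`. A minimal sketch for 3 × 8-bit RGB values (the bit positions follow the common layout with red in the highest of the three bytes, as used by Java's `BufferedImage.TYPE_INT_RGB`; the class and method names are our own, not from the book's code):

```java
public class PackedRgb {
    // Pack three 8-bit components into one int (packed ordering).
    static int pack(int r, int g, int b) {
        return ((r & 0xFF) << 16) | ((g & 0xFF) << 8) | (b & 0xFF);
    }
    // Extract the individual components again by shifting and masking.
    static int red(int c)   { return (c >> 16) & 0xFF; }
    static int green(int c) { return (c >> 8) & 0xFF; }
    static int blue(int c)  { return c & 0xFF; }

    public static void main(String[] args) {
        int c = pack(200, 120, 40);
        System.out.println(red(c) + "," + green(c) + "," + blue(c)); // prints 200,120,40
    }
}
```

Component ordering, by contrast, would keep three separate arrays (or planes), one per channel, so that each component is addressed individually.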

17.2 Bilateral Filter

\[
I'(u,v) = \sum_{m=-\infty}^{\infty} \sum_{n=-\infty}^{\infty} I(u+m,\, v+n) \cdot H_d(m,n) \tag{17.15}
\]
\[
\hphantom{I'(u,v)} = \sum_{i=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} I(i,j) \cdot H_d(i-u,\, j-v), \tag{17.16}
\]

every new pixel value I'(u,v) is the weighted average of the original image pixels I in a certain neighborhood, with the weights specified by the elements of the filter kernel H_d.⁵ The weight assigned to each pixel depends only on its spatial position relative to the current center coordinate (u,v). In particular, H_d(0,0) specifies the weight of the center pixel I(u,v), and H_d(m,n) is the weight assigned to a pixel displaced by (m,n) from the center. Since only the spatial image coordinates are relevant, such a filter is called a domain filter. Obviously, ordinary filters as we know them are all domain filters.

17.2.2 Range Filter

Although the idea may appear strange at first, one could also apply a linear filter to the pixel values or range of an image, in the form

\[
I_r(u,v) = \sum_{i=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} I(i,j) \cdot H_r\bigl(I(i,j) - I(u,v)\bigr). \tag{17.17}
\]

The contribution of each pixel is specified by the function H_r and depends on the difference between its own value I(i,j) and the value at the current center pixel I(u,v). The operation in Eqn. (17.17) is called a range filter; the spatial position of a contributing pixel is irrelevant and only the difference in values is considered. For a given position (u,v), all surrounding image pixels I(i,j) with the same value contribute equally to the result I_r(u,v). Consequently, the application of a range filter has no spatial effect upon the image: in contrast to a domain filter, no blurring or sharpening will occur. Instead, a range filter effectively performs a global point operation by remapping the intensity or color values. However, a global range filter by itself is of little use, since it combines pixels from the entire image and only changes the intensity or color map of the image, equivalent to a nonlinear, image-dependent point operation.

17.2.3 Bilateral Filter

General Idea

The key idea behind the bilateral filter is to combine domain filtering (Eqn. (17.16)) and range filtering (Eqn. (17.17)) in the form

\[
I'(u,v) = \frac{1}{W_{u,v}} \sum_{i=-\infty}^{\infty} \sum_{j=-\infty}^{\infty} I(i,j) \cdot \underbrace{H_d(i-u,\, j-v) \cdot H_r\bigl(I(i,j) - I(u,v)\bigr)}_{w_{i,j}}, \tag{17.18}
\]

⁵ In Eqn. (17.16), the functions I and H_d are assumed to be zero outside their domains of definition.
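The combined weighting of Eqn. (17.18) can be sketched directly in code. Below is a minimal illustration for a grayscale image stored as a `double` array, using Gaussian kernels for both H_d and H_r; the class name, the 3σ kernel support, and the simple border clipping are our own choices for the sketch, not the book's implementation:

```java
public class BilateralSketch {
    /** Bilateral filter as in Eqn. (17.18), with Gaussian H_d and H_r. */
    static double[][] filter(double[][] I, double sigmaD, double sigmaR) {
        int h = I.length, w = I[0].length;
        int radius = (int) Math.ceil(3 * sigmaD);   // support of H_d, approx. 3 sigma
        double[][] out = new double[h][w];
        for (int u = 0; u < h; u++) {
            for (int v = 0; v < w; v++) {
                double sum = 0, W = 0;              // weighted sum and normalizer W_{u,v}
                for (int i = Math.max(0, u - radius); i <= Math.min(h - 1, u + radius); i++) {
                    for (int j = Math.max(0, v - radius); j <= Math.min(w - 1, v + radius); j++) {
                        double dd = (i - u) * (i - u) + (j - v) * (j - v); // spatial distance^2
                        double dr = I[i][j] - I[u][v];                     // range difference
                        double wij = Math.exp(-dd / (2 * sigmaD * sigmaD)) // H_d(i-u, j-v)
                                   * Math.exp(-dr * dr / (2 * sigmaR * sigmaR)); // H_r(...)
                        sum += I[i][j] * wij;
                        W += wij;
                    }
                }
                out[u][v] = sum / W;
            }
        }
        return out;
    }
}
```

Pixels that are spatially close and similar in value receive large weights w_{i,j}, while pixels across a strong intensity edge are suppressed by H_r, which is why the filter smooths within regions but preserves edges.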

17.3 Anisotropic Diffusion Filters

Fig. 17.15 Discrete isotropic diffusion. Blurred images and impulse responses obtained after n = 0, 5, 10, 20, 40, 80 iterations, with α = 0.20 (see Eqn. (17.45)); the widths of the equivalent Gaussian kernels are σ_n ≈ 1.411, 1.996, 2.823, 3.992, and 5.646, respectively. The size of the images is 50 × 50. The width of the equivalent Gaussian kernel (σ_n) grows with the square root of n (the number of iterations). Impulse response plots are normalized to identical peak values.

\[
I^{(n)}(\boldsymbol{u}) \leftarrow
\begin{cases}
I(\boldsymbol{u}) & \text{for } n = 0,\\[2pt]
I^{(n-1)}(\boldsymbol{u}) + \alpha \cdot \bigl[\nabla^2 I^{(n-1)}(\boldsymbol{u})\bigr] & \text{for } n > 0,
\end{cases} \tag{17.45}
\]

for each image position u = (u, v), with n denoting the iteration number. This is called the direct solution method (there are other methods, but this is the simplest). The constant α in Eqn. (17.45) is the time increment, which controls the speed of the diffusion process. Its value should be in the range (0, 0.25] for the numerical scheme to be stable. At each iteration n, the variations in the image function are reduced and (depending on the boundary conditions) the image function should eventually flatten out to a constant plane as n approaches infinity. For a discrete image I, the Laplacian ∇²I in Eqn. (17.45) can be approximated by a linear 2D filter,

\[
\nabla^2 I \approx I * H_L, \quad \text{with} \quad
H_L = \begin{pmatrix} 0 & 1 & 0\\ 1 & -4 & 1\\ 0 & 1 & 0 \end{pmatrix}, \tag{17.46}
\]

as described earlier.¹¹ An essential property of isotropic diffusion is that it has the same effect as a Gaussian filter whose width grows with the elapsed time. For a discrete 2D image in particular, the result obtained after n diffusion steps (Eqn. (17.45)) is the same as applying a linear filter to the original image I,

\[
I^{(n)} \approx I * H^{G,\sigma_n}, \tag{17.47}
\]

with the normalized Gaussian kernel

\[
H^{G,\sigma_n}(x,y) = \frac{1}{2\pi\sigma_n^2}\, e^{-\frac{x^2+y^2}{2\sigma_n^2}} \tag{17.48}
\]

of width σ_n = √(2t) = √(2nα). The example in Fig. 17.15 illustrates this Gaussian smoothing behavior obtained with discrete isotropic diffusion.

¹¹ See also Chapter 6, Sec. 6.6.1 and Sec. C.3.1 in the Appendix.
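A single iteration of the scheme in Eqn. (17.45) is easy to sketch; the version below applies the Laplacian approximation of Eqn. (17.46) to a grayscale `double` array. The class name and the choice to leave the one-pixel border unchanged are our own simplifications for the sketch (the actual boundary conditions affect the limit behavior, as noted above):

```java
public class DiffusionSketch {
    /** One step of discrete isotropic diffusion (Eqn. 17.45):
     *  I <- I + alpha * Laplacian(I), with the Laplacian approximated
     *  by the kernel H_L of Eqn. (17.46). Border pixels are copied unchanged. */
    static double[][] diffuse(double[][] I, double alpha) {
        int h = I.length, w = I[0].length;
        double[][] out = new double[h][w];
        for (int u = 0; u < h; u++)
            out[u] = I[u].clone();                  // keep borders as-is
        for (int u = 1; u < h - 1; u++) {
            for (int v = 1; v < w - 1; v++) {
                // 4-neighbor Laplacian: sum of neighbors minus 4x center
                double lap = I[u - 1][v] + I[u + 1][v]
                           + I[u][v - 1] + I[u][v + 1] - 4 * I[u][v];
                out[u][v] = I[u][v] + alpha * lap;
            }
        }
        return out;
    }
}
```

Calling `diffuse` n times with α = 0.20 should approximate a Gaussian blur of width σ_n = √(2nα), matching the behavior shown in Fig. 17.15; note that α must stay in (0, 0.25] for stability.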

21 Geometric Operations

Fig. 21.2 Affine mapping. An affine 2D transformation is uniquely specified by three pairs of corresponding points; for example, (x_0, x'_0), (x_1, x'_1), and (x_2, x'_2).

The six transformation parameters a_00, ..., a_12 are derived by solving the system of linear equations

\[
\begin{aligned}
x'_0 &= a_{00}\, x_0 + a_{01}\, y_0 + a_{02}, & y'_0 &= a_{10}\, x_0 + a_{11}\, y_0 + a_{12},\\
x'_1 &= a_{00}\, x_1 + a_{01}\, y_1 + a_{02}, & y'_1 &= a_{10}\, x_1 + a_{11}\, y_1 + a_{12},\\
x'_2 &= a_{00}\, x_2 + a_{01}\, y_2 + a_{02}, & y'_2 &= a_{10}\, x_2 + a_{11}\, y_2 + a_{12},
\end{aligned} \tag{21.25}
\]

provided that the points (vectors) x_0, x_1, x_2 are linearly independent (i.e., that they do not lie on a common straight line). Since Eqn. (21.25) consists of two independent sets of linear 3 × 3 equations for x'_i and y'_i, the solution can be written in closed form as

\[
\begin{aligned}
a_{00} &= \tfrac{1}{d}\bigl[y_0(x'_1 - x'_2) + y_1(x'_2 - x'_0) + y_2(x'_0 - x'_1)\bigr],\\
a_{01} &= \tfrac{1}{d}\bigl[x_0(x'_2 - x'_1) + x_1(x'_0 - x'_2) + x_2(x'_1 - x'_0)\bigr],\\
a_{10} &= \tfrac{1}{d}\bigl[y_0(y'_1 - y'_2) + y_1(y'_2 - y'_0) + y_2(y'_0 - y'_1)\bigr],\\
a_{11} &= \tfrac{1}{d}\bigl[x_0(y'_2 - y'_1) + x_1(y'_0 - y'_2) + x_2(y'_1 - y'_0)\bigr],\\
a_{02} &= \tfrac{1}{d}\bigl[x_0(y_2 x'_1 - y_1 x'_2) + x_1(y_0 x'_2 - y_2 x'_0) + x_2(y_1 x'_0 - y_0 x'_1)\bigr],\\
a_{12} &= \tfrac{1}{d}\bigl[x_0(y_2 y'_1 - y_1 y'_2) + x_1(y_0 y'_2 - y_2 y'_0) + x_2(y_1 y'_0 - y_0 y'_1)\bigr],
\end{aligned} \tag{21.26}
\]

with d = x_0(y_2 - y_1) + x_1(y_0 - y_2) + x_2(y_1 - y_0).

Inverse affine mapping

The inverse of the affine transformation, which is often required in practice (see Sec. 21.2.2), can be calculated by simply applying the inverse of the transformation matrix A_affine (Eqn. (21.20)) in homogeneous coordinate space, that is,

\[
\hat{\boldsymbol{x}} = \boldsymbol{A}^{-1}_{\text{affine}} \cdot \hat{\boldsymbol{x}}' \tag{21.27}
\]

or x = hom⁻¹[A⁻¹_affine · hom(x')] in Cartesian coordinates.
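The closed-form solution of Eqn. (21.26) translates directly into code. A minimal sketch (the class and variable names are our own; the uppercase X/Y variables hold the primed target coordinates):

```java
public class AffineFromPoints {
    /** Computes the parameters a_00..a_12 of the affine map from three
     *  corresponding point pairs, using the closed-form solution of Eqn. (21.26).
     *  p[i] = {x_i, y_i} are the source points, q[i] = {x'_i, y'_i} the targets.
     *  Returns {a00, a01, a02, a10, a11, a12}. */
    static double[] solve(double[][] p, double[][] q) {
        double x0 = p[0][0], y0 = p[0][1], x1 = p[1][0], y1 = p[1][1],
               x2 = p[2][0], y2 = p[2][1];
        double X0 = q[0][0], Y0 = q[0][1], X1 = q[1][0], Y1 = q[1][1],
               X2 = q[2][0], Y2 = q[2][1];
        double d = x0 * (y2 - y1) + x1 * (y0 - y2) + x2 * (y1 - y0);
        if (d == 0)   // the three source points lie on a common straight line
            throw new IllegalArgumentException("source points are collinear");
        double a00 = (y0 * (X1 - X2) + y1 * (X2 - X0) + y2 * (X0 - X1)) / d;
        double a01 = (x0 * (X2 - X1) + x1 * (X0 - X2) + x2 * (X1 - X0)) / d;
        double a10 = (y0 * (Y1 - Y2) + y1 * (Y2 - Y0) + y2 * (Y0 - Y1)) / d;
        double a11 = (x0 * (Y2 - Y1) + x1 * (Y0 - Y2) + x2 * (Y1 - Y0)) / d;
        double a02 = (x0 * (y2 * X1 - y1 * X2) + x1 * (y0 * X2 - y2 * X0)
                    + x2 * (y1 * X0 - y0 * X1)) / d;
        double a12 = (x0 * (y2 * Y1 - y1 * Y2) + x1 * (y0 * Y2 - y2 * Y0)
                    + x2 * (y1 * Y0 - y0 * Y1)) / d;
        return new double[] {a00, a01, a02, a10, a11, a12};
    }
}
```

As a sanity check, mapping three non-collinear points through a known affine transform and feeding the pairs to `solve` recovers exactly the original six parameters.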