Image Processing
Brian Curless, CSE 457, Spring 2017


Reading
Jain, Kasturi, Schunck, Machine Vision. McGraw-Hill, 1995. Sections 4.2-4.4, 4.5 (intro), 4.5.5, 4.5.6, 5.1-5.4. [online handout]

What is an image?
We can think of an image as a function, f, from R^2 to R: f(x, y) gives the intensity of a channel at position (x, y). Realistically, we expect the image only to be defined over a rectangle, with a finite range:

$f : [a, b] \times [c, d] \to [0, 1]$

A color image is just three functions pasted together. We can write this as a vector-valued function:

$f(x, y) = \begin{bmatrix} r(x, y) \\ g(x, y) \\ b(x, y) \end{bmatrix}$

Images as functions / What is a digital image?
In computer graphics, we usually operate on digital (discrete) images:
- Sample the space on a regular grid
- Quantize each sample (round to nearest integer)

If our samples are $\Delta$ apart, we can write this as:

$f[i, j] = \text{Quantize}\{\, f(i\Delta, j\Delta) \,\}$

Image processing
An image processing operation typically defines a new image g in terms of an existing image f. The simplest operations are those that transform each pixel in isolation. These pixel-to-pixel operations can be written:

$g(x, y) = t(f(x, y))$

Examples: threshold, RGB to grayscale.

Note: a typical choice for mapping to grayscale is to apply the YIQ television matrix and keep the Y:

$\begin{bmatrix} Y \\ I \\ Q \end{bmatrix} = \begin{bmatrix} 0.299 & 0.587 & 0.114 \\ 0.596 & -0.275 & -0.321 \\ 0.212 & -0.523 & 0.311 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$

Noise
Image processing is also useful for noise reduction and edge enhancement. We will focus on these applications for the remainder of the lecture.

Common types of noise:
- Salt and pepper noise: contains random occurrences of black and white pixels
- Impulse noise: contains random occurrences of white pixels
- Gaussian noise: variations in intensity drawn from a Gaussian normal distribution
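The grayscale mapping above keeps only the Y (luma) row of the YIQ matrix. A minimal sketch (the function name and plain-float interface are my own):

```python
def rgb_to_gray(r, g, b):
    # Apply the Y (luma) row of the YIQ matrix and discard I and Q.
    return 0.299 * r + 0.587 * g + 0.114 * b
```

Because the three weights sum to 1, an input with equal R, G, and B maps to that same gray level.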

Ideal noise reduction / Practical noise reduction
How can we smooth away noise in a single image? Is there a more abstract way to represent this sort of operation? Of course there is!

Discrete convolution
One of the most common methods for filtering an image is called discrete convolution. (We will just call this "convolution" from here on.) In 1D, convolution is defined as:

$g[i] = f[i] * h[i] = \sum_k f[k]\, h[i-k] = \sum_k f[k]\, \tilde{h}[k-i]$

where $\tilde{h}[i] = h[-i]$.

Flipping the kernel (i.e., working with $h[-i]$) is mathematically important. In practice, though, you can assume kernels are pre-flipped unless I say otherwise.
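The 1D definition can be sketched directly as a nested loop (a hypothetical helper of my own; it returns the "full" convolution of length len(f) + len(h) - 1):

```python
def convolve1d(f, h):
    # Discrete 1D convolution: g[i] = sum_k f[k] * h[i - k].
    n = len(f) + len(h) - 1
    g = [0.0] * n
    for i in range(n):
        for k in range(len(f)):
            if 0 <= i - k < len(h):      # h is zero outside its support
                g[i] += f[k] * h[i - k]
    return g
```

For example, convolving the signal [1, 2, 3] with the box kernel [1, 1, 1] gives [1, 3, 6, 5, 3].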

Convolution in 2D
In two dimensions, convolution becomes:

$g[i,j] = f[i,j] * h[i,j] = \sum_{k,\ell} f[k,\ell]\, h[i-k,\, j-\ell] = \sum_{k,\ell} f[k,\ell]\, \tilde{h}[k-i,\, \ell-j]$

where $\tilde{h}[i,j] = h[-i,-j]$. Again, flipping the kernel (i.e., working with $h[-i,-j]$) is mathematically important. In practice, though, you can assume kernels are pre-flipped unless I say otherwise.

Convolving in 2D
Since f and h are defined over finite regions, we can write them out in two-dimensional arrays. [Slide example: a 5 x 6 array of pixel values for the image f, convolved with a small filter h with entries 0.2.]

This is not matrix multiplication. For color images, filter each color channel separately. The filter is assumed to be zero outside its boundary.

Q: What happens at the boundary of the image?

Normalization
Suppose f is a flat / constant image, with all pixel values equal to some value C. Overlaying the filter on the image and summing the products $C \cdot h[k,\ell]$ gives $C \sum_{k,\ell} h[k,\ell]$ at every pixel.

Q: What will be the value of each pixel after filtering?
Q: How do we avoid getting a value brighter or darker than the original image?
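A minimal 2D sketch, assuming (as one common answer to the boundary question) that the image is treated as zero outside its borders, and that the kernel is already pre-flipped as the slides suggest:

```python
def convolve2d(f, h):
    # 2D filtering with a pre-flipped kernel h, zero-padding the image
    # at its boundary. f and h are lists of lists (row-major).
    rows, cols = len(f), len(f[0])
    kr, kc = len(h), len(h[0])
    cr, cc = kr // 2, kc // 2                    # kernel center
    g = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for u in range(kr):
                for v in range(kc):
                    ii, jj = i + u - cr, j + v - cc
                    if 0 <= ii < rows and 0 <= jj < cols:
                        g[i][j] += h[u][v] * f[ii][jj]
    return g
```

With an identity kernel the image passes through unchanged; with a normalized mean kernel, a flat image stays flat in the interior but darkens at the zero-padded border, which is exactly the boundary issue the slide asks about.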

Mean filters / Effect of mean filters
How can we represent our noise-reducing averaging as a convolution filter (known as a mean filter)?

Gaussian filters
Gaussian filters weigh pixels based on their distance from the center of the convolution filter. In particular:

$h[i, j] = \frac{1}{C}\, e^{-(i^2 + j^2)/(2\sigma^2)}$

Effect of Gaussian filters
This does a decent job of blurring noise while preserving features of the image.
- What parameter controls the width of the Gaussian?
- What happens to the image as the Gaussian filter kernel gets wider?
- What is the constant C? What should we set it to?
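One common answer to the normalization question is to set C to the sum of the unnormalized weights, so the kernel sums to 1 and a flat image stays unchanged. A sketch under that assumption (the helper name is my own):

```python
import math

def gaussian_kernel(size, sigma):
    # Build a size x size kernel h[i,j] = exp(-(i^2 + j^2) / (2 sigma^2)) / C,
    # with C = sum of the raw weights so the kernel sums to 1.
    c = size // 2
    h = [[math.exp(-((i - c) ** 2 + (j - c) ** 2) / (2 * sigma ** 2))
          for j in range(size)] for i in range(size)]
    C = sum(sum(row) for row in h)
    return [[v / C for v in row] for row in h]
```

The parameter sigma controls the width: increasing it spreads weight away from the center entry, which blurs the image more.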

Median filters / Effect of median filters
A median filter operates over an N x N region by selecting the median intensity in the region.
- What advantage does a median filter have over a mean filter?
- Is a median filter a kind of convolution?

Comparison: Gaussian noise
Comparison: salt and pepper noise
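A sketch of the median filter (my own helper; I assume the window simply shrinks at the image boundary, which is one of several reasonable conventions):

```python
def median_filter(f, n):
    # Median filter over an n x n neighborhood; pixels outside the
    # image are ignored, so the window shrinks at the boundary.
    rows, cols = len(f), len(f[0])
    r = n // 2
    g = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            window = [f[ii][jj]
                      for ii in range(max(0, i - r), min(rows, i + r + 1))
                      for jj in range(max(0, j - r), min(cols, j + r + 1))]
            window.sort()
            g[i][j] = window[len(window) // 2]
    return g
```

Note the advantage over a mean filter: a single impulse (e.g., one 255 in a field of 0s) is removed entirely instead of being smeared into its neighbors, and because the median is not a weighted sum, this is not a convolution.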

Mean bilateral filtering
Bilateral filtering is a method to average together nearby samples only if they are similar in value (the "range") as well as nearby in position (the "domain"). A mean bilateral filter takes the average of everything that is both within the domain footprint (w x w in 2D) and within the range height (h): you sum up all pixels you find in that box and then divide by the number of pixels.

2D mean bilateral filtering
Now consider filtering an image with a bilateral filter with a 3 x 3 domain and a total range height of 20 (i.e., range of [-10, 10] from the center pixel):

    205 198 190 203 210 192
    191 203 191 194 206  98
    210 197 204 101  98 103
    205 199 104  97  94 107
    190  92 106 106 100 108
    110  91 101 100  96  99

Q: What happens as the range size becomes large?
Q: Will bilateral filtering take care of impulse noise?

Color bilateral filtering
Finally, for color, we simply compute range distance in R,G,B space as the length of the vector between the two color vectors. Consider colors at different pixels:

$C_1 = \begin{bmatrix} R_1 \\ G_1 \\ B_1 \end{bmatrix} \qquad C_2 = \begin{bmatrix} R_2 \\ G_2 \\ B_2 \end{bmatrix}$

The range distance between them is then:

$\| C_1 - C_2 \| = \sqrt{(R_1 - R_2)^2 + (G_1 - G_2)^2 + (B_1 - B_2)^2}$

After selecting pixels that are within the color distance range, you then separately average each color channel of those pixels to compute the final color.

Gaussian bilateral filtering
We can also change the filter to something nicer, like Gaussians. Let's go back to 1D. Note that we can write a 2D Gaussian as a product of two Gaussians (i.e., it is a separable function):

$h[i, r] \sim e^{-(i^2 + r^2)/(2\sigma^2)} = e^{-i^2/(2\sigma^2)}\, e^{-r^2/(2\sigma^2)}$

where i indexes the spatial domain and r is the range difference. This would make a round Gaussian. We can make it elliptical by having different $\sigma$'s for domain and range:

$h[i, r] \sim e^{-i^2/(2\sigma_d^2)}\, e^{-r^2/(2\sigma_r^2)} \sim h_d(i)\, h_r(r)$

where $\sigma_d$ is the width of the domain Gaussian and $\sigma_r$ is the width of the range Gaussian.
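A sketch of the mean bilateral filter described above (my own helper; I interpret "total range height h" as accepting values within h/2 of the center pixel, matching the [-10, 10] example for h = 20):

```python
def mean_bilateral(f, w, h):
    # Mean bilateral filter: average the pixels in a w x w window whose
    # value is within h/2 of the center pixel (total range height h).
    rows, cols = len(f), len(f[0])
    r = w // 2
    g = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            vals = [f[ii][jj]
                    for ii in range(max(0, i - r), min(rows, i + r + 1))
                    for jj in range(max(0, j - r), min(cols, j + r + 1))
                    if abs(f[ii][jj] - f[i][j]) <= h / 2]
            g[i][j] = sum(vals) / len(vals)   # center always qualifies
    return g
```

On a two-level test image the filter smooths within each region but leaves the step between them intact, which is the point of range filtering.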

The math: 1D bilateral filtering
Recall that convolution looked like this:

$g[i] = \frac{1}{C} \sum_k f[k]\, h_d[i-k]$

with normalization (sum of filter values): $C = \sum_k h_d[i-k]$.

This was just domain filtering. The bilateral filter is similar, but includes both domain and range filtering:

$g[i] = \frac{1}{C} \sum_k f[k]\, h_d[i-k]\, h_r(f[i] - f[k])$

with normalization (sum of filter values): $C = \sum_k h_d[i-k]\, h_r(f[i] - f[k])$.

Note that with regular convolution, we pre-compute C once, but for bilateral filtering, we must compute it at each pixel location where it's applied.

The math: 2D bilateral filtering
In 2D, bilateral filtering generalizes to having a 2D domain, but still a 1D range:

$g[i,j] = \frac{1}{C} \sum_{k,\ell} f[k,\ell]\, h_d[i-k,\, j-\ell]\, h_r(f[i,j] - f[k,\ell])$

And the normalization becomes (sum of filter values):

$C = \sum_{k,\ell} h_d[i-k,\, j-\ell]\, h_r(f[i,j] - f[k,\ell])$

For Gaussian filtering, the new form looks like this:

$h[i, j, r] \sim e^{-(i^2 + j^2)/(2\sigma_d^2)}\, e^{-r^2/(2\sigma_r^2)} \sim h_d(i, j)\, h_r(r)$

Note that Gaussian bilateral filtering is slow without some optimization, and some optimizations can be fairly complicated (if worthwhile). Fortunately, simple mean bilateral filtering is fairly fast and works well in practice.

[Figure: an input image bilaterally filtered with varying range width ($\sigma_r$, e.g. 0.25) and domain width ($\sigma_d$ = 2, 6). Paris, et al., SIGGRAPH course notes 2007.]

Edge detection
One of the most important uses of image processing is edge detection:
- Really easy for humans
- Really difficult for computers
- Fundamental in computer vision
- Important in many graphics applications
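The 1D Gaussian bilateral formula can be sketched directly, with C recomputed at every pixel as the math requires (helper name and truncation radius are my own choices):

```python
import math

def bilateral1d(f, sigma_d, sigma_r, radius):
    # g[i] = (1/C) * sum_k f[k] * h_d[i-k] * h_r(f[i] - f[k]),
    # with C = sum of the combined weights, recomputed at each i.
    g = []
    for i in range(len(f)):
        num, C = 0.0, 0.0
        for k in range(max(0, i - radius), min(len(f), i + radius + 1)):
            w = (math.exp(-((i - k) ** 2) / (2 * sigma_d ** 2)) *
                 math.exp(-((f[i] - f[k]) ** 2) / (2 * sigma_r ** 2)))
            num += f[k] * w
            C += w
        g.append(num / C)
    return g
```

On a flat signal the output is unchanged (the weights normalize out), and across a large step the range Gaussian suppresses the far side, so the edge survives.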

What is an edge? / Gradients
Q: How might you detect an edge in 1D?

The gradient is the 2D equivalent of the derivative:

$\nabla f(x, y) = \left( \frac{\partial f}{\partial x},\ \frac{\partial f}{\partial y} \right)$

Properties of the gradient:
- It's a vector
- Points in the direction of maximum increase of f
- Magnitude is rate of increase

Note: use atan2(y, x) to compute the angle of the gradient (or any 2D vector).

How can we approximate the gradient in a discrete image?

Less than ideal edges / Steps in edge detection
Edge detection algorithms typically proceed in three or four steps:
- Filtering: cut down on noise
- Enhancement: amplify the difference between edges and non-edges
- Detection: use a threshold operation
- Localization (optional): estimate geometry of edges as 1D contours that can pass between pixels
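One simple way to approximate the gradient in a discrete image is central differences; a sketch (my own helper, evaluated at one interior pixel), using atan2 for the angle as the slide recommends:

```python
import math

def gradient(f, i, j):
    # Central-difference approximation of the gradient at pixel (i, j):
    # df/dx ~ (f[i][j+1] - f[i][j-1]) / 2, df/dy ~ (f[i+1][j] - f[i-1][j]) / 2.
    gx = (f[i][j + 1] - f[i][j - 1]) / 2.0
    gy = (f[i + 1][j] - f[i - 1][j]) / 2.0
    magnitude = math.hypot(gx, gy)
    angle = math.atan2(gy, gx)        # atan2 handles all quadrants
    return gx, gy, magnitude, angle
```

On a vertical step edge the gradient points horizontally, across the edge, with magnitude equal to half the intensity jump.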

Edge enhancement
A popular gradient filter is the Sobel operator:

$s_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad s_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix}$

We can then compute the magnitude of the vector $(s_x, s_y)$: $\|s\| = \sqrt{s_x^2 + s_y^2}$. Note that these operators are conveniently pre-flipped for convolution, so you can directly slide them across an image without flipping first.

Results of Sobel edge detection

Second derivative operators
The Sobel operator can produce thick edges. Ideally, we're looking for infinitely thin boundaries. An alternative approach is to look for local extrema in the first derivative: places where the change in the gradient is highest.

Q: A peak in the first derivative corresponds to what in the second derivative?
Q: How might we write this as a convolution filter?

Constructing a second derivative filter
We can construct a second derivative filter from the first derivative. First, one can show that convolution has some convenient properties. Given functions a, b, c:
- Commutative: $a * b = b * a$
- Associative: $a * (b * c) = (a * b) * c$
- Distributive: $a * (b + c) = a * b + a * c$

The flipping of the kernel is needed for associativity. Now let's use associativity to construct our second derivative filter.
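A sketch of applying the two Sobel operators at a single interior pixel and combining them into the gradient magnitude (helper name is my own; the kernels are used directly, pre-flipped, as the slide notes):

```python
def sobel(f, i, j):
    # Apply the pre-flipped Sobel operators at pixel (i, j) and
    # return the gradient magnitude sqrt(sx^2 + sy^2).
    SX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    SY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    sx = sy = 0
    for u in range(3):
        for v in range(3):
            sx += SX[u][v] * f[i + u - 1][j + v - 1]
            sy += SY[u][v] * f[i + u - 1][j + v - 1]
    return (sx ** 2 + sy ** 2) ** 0.5
```

On a vertical unit step edge, s_x sums to 4 and s_y to 0, so the magnitude is 4; on a flat region both responses are 0.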

Localization with the Laplacian
An equivalent measure of the second derivative in 2D is the Laplacian:

$\nabla^2 f(x, y) = \frac{\partial^2 f}{\partial x^2} + \frac{\partial^2 f}{\partial y^2}$

Using the same arguments we used to compute the gradient filters, we can derive a Laplacian filter to be:

$\nabla^2 = \begin{bmatrix} 0 & 1 & 0 \\ 1 & -4 & 1 \\ 0 & 1 & 0 \end{bmatrix}$

(The symbol $\nabla^2$ is often used to refer to the discrete Laplacian filter.) Zero crossings in a Laplacian-filtered image can be used to localize edges.

[Figures: Original | Laplacian (+128) | Smoothed]

Sharpening with the Laplacian
[Figures: Original | Laplacian (+128) | Original + Laplacian | Original - Laplacian]

Q: Why does the sign make a difference?
Q: How can you write the filter that makes the sharpened image?

Summary
What you should take away from this lecture:
- The meanings of all the boldfaced terms
- How noise reduction is done
- How discrete convolution filtering works
- The effect of mean, Gaussian, and median filters
- What an image gradient is and how it can be computed
- How edge detection is done
- What the Laplacian image is and how it is used in either edge detection or image sharpening
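The Laplacian filter and Laplacian sharpening can be sketched at a single interior pixel (helper names are my own; "original - Laplacian" is the sign convention for this kernel, whose center weight is negative):

```python
def laplacian(f, i, j):
    # Discrete Laplacian filter [[0,1,0],[1,-4,1],[0,1,0]] at pixel (i, j).
    return (f[i - 1][j] + f[i + 1][j] + f[i][j - 1] + f[i][j + 1]
            - 4 * f[i][j])

def sharpen(f, i, j):
    # Original minus Laplacian: boosts the center pixel relative to
    # its neighborhood, exaggerating local contrast.
    return f[i][j] - laplacian(f, i, j)
```

On a flat region the Laplacian is 0 and sharpening leaves the pixel alone; a pixel brighter than its neighbors gets a negative Laplacian, so subtracting it makes that pixel brighter still.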