Ulrik Söderström 17 Jan Image Processing. Introduction

Ulrik Söderström (ulrik.soderstrom@tfe.umu.se), 17 Jan 2017. Image Processing: Introduction

Image processing. Typical goals: improve images for human interpretation (image processing); process images for machine perception (image analysis); deal with images for storage and communication, i.e. compression (image handling).

Image processing steps. Low-level: primitive operations (filtering, noise reduction); both input and output are images. Mid-level: segmentation, description, recognition; input is images, output is image attributes. High-level: making sense of the recognized objects (like vision).

Course outline. Lectures: spatial and frequency domain, restoration, compression, morphological image processing, representation, description. Practical work: 3 labs and a project (an extensive lab). Written exam.

Grading (3, 4, 5): lab exercises ~30%, project ~30%, written exam ~40%.

Images. An image is a 2-D function f(x, y), where x and y are spatial coordinates and f is the amplitude (intensity, gray level) at the point with coordinates (x, y). When all values are finite and discrete we have a digital image; digital image processing means that computers are involved. Each value is a pixel (picture element).
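As a minimal sketch of this definition (not part of the original slides; the array shape and values are made up for illustration), a grayscale digital image can be held as a 2-D array, so that f(x, y) is simply an array lookup:

```python
import numpy as np

# A tiny 4x4 8-bit grayscale image: f(x, y) is the intensity at row x, column y.
f = np.array([
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 16,  80, 144, 208],
    [  8,  72, 136, 200],
], dtype=np.uint8)

x, y = 1, 2
print(f"f({x}, {y}) = {f[x, y]}")  # amplitude (gray level) of one pixel
print("image size:", f.shape)      # number of samples in each direction
```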

Image creation. Images come from observation of energy; electromagnetic (EM) radiation is the most common source. Human vision is limited to a narrow band, but sensors have higher capacities: the whole EM spectrum plus other kinds of energy.

Photons. A photon is an amount of energy with no mass, travelling at the speed of light. Photons have different frequencies ν and wavelengths λ, related by c = λν, where c is the speed of light, about 3x10^8 m/s. The energy of a photon is E = hν, where h is Planck's constant.
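As an illustrative calculation (not from the slides; the 550 nm wavelength is an arbitrary green-light example), the two relations above give the energy of a single photon directly:

```python
# Energy of a single photon from its wavelength, using c = lambda * nu and E = h * nu.
h = 6.626e-34   # Planck's constant [J s]
c = 3.0e8       # speed of light [m/s]

wavelength = 550e-9         # green light, 550 nm (arbitrary example)
frequency = c / wavelength  # nu = c / lambda
energy = h * frequency      # E = h * nu
print(f"nu = {frequency:.3e} Hz, E = {energy:.3e} J")
```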

The EM spectrum

Ultraviolet (UV) light. Fluorescence images of corn; UV light from the same star as previously.

Visible light. By far the most common.

Visible light. Microscopic images of a CD, cholesterol, and a microprocessor.

Infrared light. America (north and south).

Combined spectra. Different bands give totally different images of the same object; astronomical images of the same region in different bands.

Other energies. Sound: high-frequency ultrasound (1-5 MHz), used for medical images.

Other energies. Electron microscopy.

Computer-generated images. No need for a physical energy source: fractals, 3-D computer models.

Foundations. All imaging systems replicate the human visual system.

Unknown functionalities

Image acquisition. A sensor measures the energy; in digital cameras these are CCD arrays. The values are integrated over the sensor and are proportional to the number of photons hitting the surfaces.

Image acquisition

Image formation. An image f(x, y) generated from a physical process has positive and finite values: 0 < f(x, y) < ∞. For a monochromatic image, L_min ≤ f(x, y) ≤ L_max, and the interval [L_min, L_max] is the grayscale of the image.

Image formation. f(x, y) has two components, illumination and reflectance: f(x, y) = i(x, y) r(x, y), where 0 < i(x, y) < ∞ is the illumination component, determined by the illumination source, and 0 < r(x, y) < 1 is the reflectance component, determined by the characteristics of the object. Transmissivity is used instead of reflectivity when the illumination passes through objects, e.g. X-rays.
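A small sketch of the illumination-reflectance model above (illustrative values only, not from the slides): the image is formed as the pixel-wise product of an illumination field and a reflectance map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illumination i(x, y): positive, possibly large (here a smooth left-to-right gradient).
i = np.linspace(50.0, 200.0, 64)[None, :] * np.ones((64, 1))

# Reflectance r(x, y): fraction of light reflected by the scene, in (0, 1).
r = rng.uniform(0.05, 0.95, size=(64, 64))

# Image formation: f(x, y) = i(x, y) * r(x, y), an element-wise product.
f = i * r
print("f range:", f.min(), "to", f.max())
```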

Sampling and quantization. The output of a sensor is in most cases a continuous voltage waveform that needs to be digitized. Sampling digitizes the coordinate values, usually with M = 2^m steps in the x-direction and N = 2^n steps in the y-direction. Quantization digitizes the amplitude values into L = 2^k gray values. The image (storage) size is then M x N x k/8 bytes.
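As a quick illustrative calculation of the storage formula above (the image dimensions are chosen arbitrarily, not taken from the slides):

```python
# Storage size of an uncompressed digital image: M * N * k / 8 bytes.
M, N = 1024, 1024   # spatial samples (example values)
k = 8               # bits per pixel -> L = 2**k = 256 gray levels

L = 2 ** k
size_bytes = M * N * k // 8
print(f"{L} gray levels, {size_bytes} bytes ({size_bytes / 2**20:.1f} MiB)")
```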

Sampling and quantization

Sampling and quantization. A square grid is the most common (the only one used in the book).

Image representation. The most common convention is to write the digital image as an M x N matrix of its values:
f(x, y) =
[ f(0,0)      f(0,1)      ...  f(0,N-1)
  f(1,0)      f(1,1)      ...  f(1,N-1)
  ...
  f(M-1,0)    f(M-1,1)    ...  f(M-1,N-1) ]

Image representation. Surface, color, intensities; gray-level and color information.

Resolution. Spatial resolution is determined by the sampling: the tightness of the pixels, i.e. the sampling distance. The human eye cannot detect resolution above or below a certain threshold, but a computer might see more information.

Spatial resolution

Acceptable resolution? Isopreference curves show the subjectively perceived quality of the images: with many details, few gray levels are needed.

Sampling. Sampling theorem: if the distance between sampling points is larger than the smallest objects we want to capture, we get problems with aliasing, because sampling introduces new frequencies. The sampling frequency must be at least twice the highest frequency in the image, so blur the image before sampling.
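A minimal sketch of the last point (blur before subsampling), assuming SciPy is available; the sigma and subsampling factor below are arbitrary illustration values, not prescribed by the slides:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def downsample_with_antialias(image: np.ndarray, factor: int, sigma: float = 1.0) -> np.ndarray:
    """Low-pass filter (blur) the image, then keep every `factor`-th sample.

    Blurring removes frequencies above the new, lower sampling rate so that
    subsampling does not alias them into visible artifacts such as moire patterns.
    """
    blurred = gaussian_filter(image.astype(float), sigma=sigma)
    return blurred[::factor, ::factor]

# Example: a high-frequency pattern that would alias badly without the blur.
x = np.arange(256)
pattern = np.sin(0.9 * np.pi * x)[None, :] * np.ones((256, 1))
small = downsample_with_antialias(pattern, factor=4, sigma=2.0)
print(small.shape)  # (64, 64)
```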

Moiré patterns. Moiré effects occur when periodic patterns interfere with each other, e.g. in scanned images from printed pages with periodic dots.

Zooming and shrinking. Resampling an already digital image: resize the image grid. The simplest way to enlarge an image to twice its size is to duplicate all pixels (nearest-neighbor interpolation). Better results are obtained if more neighbors are taken into account, e.g. bilinear interpolation, which uses the four nearest neighbors.
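A small sketch of the nearest-neighbor case described above (2x enlargement by pixel duplication, with a made-up 2x2 input); a bilinear variant would instead weight the four surrounding pixels:

```python
import numpy as np

def zoom_nearest_2x(image: np.ndarray) -> np.ndarray:
    """Enlarge an image to twice its size by duplicating every pixel (nearest neighbor)."""
    return np.repeat(np.repeat(image, 2, axis=0), 2, axis=1)

img = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)
print(zoom_nearest_2x(img))
# [[10 10 20 20]
#  [10 10 20 20]
#  [30 30 40 40]
#  [30 30 40 40]]
```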

Zooming and shrinking

Zooming

Pixel relationships. Neighbors of a pixel p with coordinates (x, y): the four horizontal and vertical neighbors (x+1, y), (x-1, y), (x, y+1), and (x, y-1) form N_4(p), the 4-neighbors of p. The four diagonal neighbors (x+1, y+1), (x+1, y-1), (x-1, y+1), and (x-1, y-1) form N_D(p). Combined, N_4(p) ∪ N_D(p) = N_8(p), the 8-neighbors of p.
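A small sketch of these neighbor sets (illustrative helper functions, not from the slides); checking against the image boundaries is omitted for brevity:

```python
def n4(x: int, y: int):
    """4-neighbors of (x, y): horizontal and vertical neighbors."""
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(x: int, y: int):
    """Diagonal neighbors of (x, y)."""
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(x: int, y: int):
    """8-neighbors: union of the 4-neighbors and the diagonal neighbors."""
    return n4(x, y) | nd(x, y)

print(sorted(n8(2, 2)))  # the eight pixels surrounding (2, 2)
```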

Adjacency. Two pixels that are neighbors can be adjacent: with V a set of gray values, two pixels p and q with values from V are 4-adjacent if q is in the set N_4(p), and 8-adjacent if q is in the set N_8(p).

Distance measures. Let p have coordinates (x, y) and q have coordinates (s, t). Distances between p and q:
Euclidean distance (the most natural in R^2): D_e(p, q) = [(x-s)^2 + (y-t)^2]^(1/2)
D_4 distance (city-block distance): D_4(p, q) = |x-s| + |y-t|. The 4-neighbors of (x, y) have D_4 = 1:
2 1 2
1 x 1
2 1 2

Distance measures. D_8 distance (chessboard distance): D_8(p, q) = max(|x-s|, |y-t|). The 8-neighbors of (x, y) have D_8 = 1:
2 2 2 2 2
2 1 1 1 2
2 1 x 1 2
2 1 1 1 2
2 2 2 2 2
D_m distance: the number of jumps between p and q along the path that connects them, depending on the values of the pixels on the path and their neighbors.
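A small sketch of the three distance measures above (illustrative functions, not from the slides):

```python
from math import hypot

def d_e(p, q):
    """Euclidean distance between pixels p = (x, y) and q = (s, t)."""
    return hypot(p[0] - q[0], p[1] - q[1])

def d_4(p, q):
    """City-block distance: |x - s| + |y - t|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

def d_8(p, q):
    """Chessboard distance: max(|x - s|, |y - t|)."""
    return max(abs(p[0] - q[0]), abs(p[1] - q[1]))

p, q = (0, 0), (3, 4)
print(d_e(p, q), d_4(p, q), d_8(p, q))  # 5.0 7 4
```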

Operations on a pixel basis. It is common to carry out arithmetic operations on images, e.g. dividing one image by another, which is not a defined matrix operation. These are pixel-wise operations, and the images must be of equal size.
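A small sketch of a pixel-wise operation (the arrays are arbitrary examples): the division is performed element by element, not as a matrix operation.

```python
import numpy as np

g = np.array([[10.0, 20.0], [30.0, 40.0]])
h = np.array([[ 2.0,  4.0], [ 5.0,  8.0]])

# Pixel-wise (element-wise) division; the two images must be of equal size.
ratio = g / h
print(ratio)  # [[5. 5.] [6. 5.]]
```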

Linear and nonlinear operations. An operator H whose input and output are images is linear if H(af + bg) = aH(f) + bH(g) for any images f and g and any scalars a and b, e.g. summing K images. Computing the absolute value of a function is an example of a nonlinear operation.
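A small numerical check of this definition (illustrative only; here H is taken as the sum of pixel values, a linear operator, while the absolute value violates the defining equation):

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.normal(size=(4, 4))
g = rng.normal(size=(4, 4))
a, b = 2.0, -3.0

H_sum = lambda img: img.sum()    # linear operator
H_abs = lambda img: np.abs(img)  # nonlinear operator

print(np.isclose(H_sum(a * f + b * g), a * H_sum(f) + b * H_sum(g)))   # True
print(np.allclose(H_abs(a * f + b * g), a * H_abs(f) + b * H_abs(g)))  # False (in general)
```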