Being edited by Prof. Sumana Gupta
Introduction

Digital image processing refers to the processing of a 2-D picture by a digital computer; in a broader context it refers to digital processing of any 2-D data. A digital image is an array of real or complex numbers represented by a finite number of bits.

A typical digital image processing sequence is: observer → imaging system → digitization → digital storage → digital computer → on-line buffer store → process → refresh/store → display → record output.

The image, given in the form of a transparency, slide or photograph, is first digitized and stored in computer memory. The digitized image is then processed and/or displayed on a high-resolution TV monitor. For display, the image is stored in a rapid-access buffer memory which refreshes the monitor at 30 frames/sec to produce a continuous display.

Digital image processing has a broad spectrum of applications, such as remote sensing via satellites or other spacecraft, image transmission, medical imaging, robotics, radar, sonar, and automated inspection of industrial parts. We will consider the following classes of problems:

1. Image representation and modeling: characterization of the quantity that each picture element (pixel) represents. It could represent luminance, the absorption characteristics of a body, or the gravitational field in an area, depending on the context.
2. Image enhancement: accentuating certain features of an image.
3. Image restoration: removal of known degradations in an image, by linear or non-linear methods.
4. Image analysis: making quantitative measurements from an image to produce a description of it.
5. Image reconstruction: relates to computed tomography.
6. Image data compression: transforms, quantization and coding.
In general, any 2-D function that bears information can be considered an image. Image models give a quantitative or logical description of the properties of this function. An important consideration in image representation is the fidelity or intelligibility criterion for measuring the quality of an image or the performance of a processing technique. Specification of such measures requires models of the perception of contrast, spatial frequencies, colors, and so on. Knowledge of fidelity criteria helps in designing the imaging sensors, because it tells us which variables can be measured most accurately.

An image model

We define the term image as a 2-D light intensity function denoted by f(x, y). The value or amplitude of f at spatial coordinates (x, y) gives the intensity (brightness) of the image at that point. Since light is a form of energy, f(x, y) must be non-zero and finite, i.e.,

0 < f(x, y) < ∞

The images we perceive normally consist of light reflected from objects, so the basic nature of f(x, y) can be characterized by two components:

1. The illumination component i(x, y): the amount of source light incident on the scene being viewed.
2. The reflectance component r(x, y): the amount of light reflected by the objects in the scene.

These combine as a product to form f(x, y):

f(x, y) = i(x, y) r(x, y)

where 0 < i(x, y) < ∞, the nature of i(x, y) being determined by the light source, and 0 < r(x, y) < 1, determined by the characteristics of the objects in the scene (0 means total absorption, 1 total reflectance).
These ranges are theoretical bounds; some typical values are, for i(x, y):

9000 ft-candles on a clear day on the surface of the earth,
1000 ft-candles on a cloudy day,
0.01 ft-candles on a clear evening with a full moon;

and for r(x, y):

0.01 for black velvet,
0.65 for steel,
0.90 for silver,
0.93 for snow.

The intensity of a monochrome image f at (x, y) is referred to as the gray level l of the image at that point. l lies in the range

L_min ≤ l ≤ L_max

where L_min is positive and L_max is finite. In practice,

L_min = i_min r_min,  L_max = i_max r_max.

For indoor image processing applications, L_max is of the order of 100. The interval [L_min, L_max] is called the gray scale. It is common practice to shift this interval numerically to [0, L], where l = 0 is considered black and l = L is considered white; all intermediate values are shades of gray varying continuously from black to white.

Uniform sampling and quantization

To put the data in a form suitable for computer processing, an image function f(x, y) must be digitized both spatially and in amplitude. Digitization of the spatial coordinates (x, y) is referred to as image sampling, while amplitude digitization is called gray-level quantization. Suppose that a continuous image f(x, y) is approximated by equally spaced
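As a small numerical check of the product model f(x, y) = i(x, y) r(x, y), using the typical illumination and reflectance values quoted above (a sketch; the pairings of illumination with surface, and the helper name, are my own illustrative choices):

```python
def image_intensity(i, r):
    """Product model f = i * r: illumination i (ft-candles, 0 < i)
    times reflectance r (0 < r < 1)."""
    assert i > 0 and 0 < r < 1
    return i * r

# Snow (r = 0.93) on a clear day (i = 9000 ft-candles):
f_bright = image_intensity(9000, 0.93)   # 8370.0
# Black velvet (r = 0.01) under a full moon (i = 0.01 ft-candles):
f_dark = image_intensity(0.01, 0.01)     # 0.0001
```

The enormous ratio between the two results illustrates why the practical gray scale [L_min, L_max] spans several orders of magnitude.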
samples arranged in the form of an N × N array:

f(x, y) ≈
[ f(0, 0)      f(0, 1)      …   f(0, N−1)
  f(1, 0)      f(1, 1)      …   f(1, N−1)
  ⋮                              ⋮
  f(N−1, 0)    f(N−1, 1)    …   f(N−1, N−1) ]

where each element of the array, referred to as a pixel, is a discrete quantity. The array represents a digital image.

This digitization requires that decisions be made on a value for N as well as on the number of discrete gray levels allowed for each pixel. It is common practice in digital image processing to let N = 2^n and G = number of gray levels = 2^m, and to assume that the discrete levels are equally spaced between 0 and L on the gray scale. The number of bits required to store a digitized image is therefore

b = N × N × m

For example, an image with 256 gray levels (i.e. 8 bits/pixel) requires a storage of about 17,000 bytes.

The array above is an approximation to a continuous image. A reasonable question to ask at this point is: how many samples and gray levels are required for a good approximation? This brings up the question of resolution. The resolution (i.e. the degree of discernible detail) of an image depends strongly on both N and m. The more these parameters are increased, the more closely the digitized array approximates the original image. However, the relation b = N × N × m points out the unfortunate fact that storage, and consequently processing requirements, increase rapidly as a function of N and m.

In view of these comments, it is of interest to consider the effect that variations in N and m have on image quality. A good image is difficult to define, because quality requirements vary according to the application. The number of samples and gray levels required to produce a faithful reproduction of an original image depends on the image itself. As a basis for comparison, the requirements to obtain a quality comparable to that of monochrome TV pictures over a wide range of image types are of the order of pixels with 128 gray levels. As a rule, a minimum system for general image processing work should be able to display pixels with 64 gray levels.
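The storage relation b = N × N × m can be checked in a few lines (a sketch; the helper name is my own):

```python
def storage_bits(N, m):
    """Bits needed to store an N x N image with 2**m gray levels: b = N*N*m."""
    return N * N * m

# A 256 x 256 image with 256 gray levels (m = 8, i.e. 8 bits/pixel):
bits = storage_bits(256, 8)      # 524288 bits
bytes_needed = bits // 8         # 65536 bytes
```

Doubling N quadruples the storage, while increasing m adds only linearly, which is why spatial resolution dominates the cost in the formula above.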
Huang (1965) considered the problem of the effects produced on image quality by varying N and m independently, and tried to quantify these effects experimentally. The experiment consisted of a set of subjective tests on three images:

1. an image with relatively little detail,
2. an image with an intermediate amount of detail,
3. an image with a large amount of detail (a picture of a crowd).

Sets of these three images were generated by varying N and m, and observers were then asked to rank them according to their subjective quality. The results were summarized in the form of curves in the N-m plane, called isopreference curves. Each point in this plane represents an image with values of N and m equal to the coordinates of that point; an isopreference curve is one on which all points represent images of equal subjective quality. Huang drew several empirical conclusions:

1. As expected, the quality of the images increases as N and m increase.
2. There were a few cases where, for fixed N, the quality improved by decreasing m: decreasing m increases the apparent contrast of an image.
3. As the image detail increases, the curves tend to become vertical. This suggests that for images with a large amount of detail only a few gray levels are needed; this is not true for images with less detail.

Non-uniform sampling

For a fixed value of N, it is possible in many cases to improve the appearance of an image by using an adaptive scheme in which the sampling process depends on the characteristics of the image. In general, fine sampling is required in the neighborhood of sharp gray-level transitions, while coarse sampling suffices in relatively smooth regions.

Example: consider a simple image consisting of a face superimposed on a uniform background. The background contains little detail and can be quite adequately
represented by coarse sampling. The face contains more detail; using in this region the additional samples not spent on the background will tend to improve the overall result, particularly if N is small. In general, in distributing the samples, greater sample concentration should be used at gray-level transition boundaries, such as the boundary between the face and the background in this example.

Drawbacks of non-uniform sampling: the necessity of identifying the boundaries, even if only on a rough basis, is a definite drawback of this method. The method is also not practical for images containing many small uniform regions, for example an image depicting a crowd.

Non-uniform quantization

When the number of gray levels must be kept small, it is usually desirable to use unequally spaced levels in the quantization process. The approach is similar to the previous case. However, since the eye is relatively poor at estimating shades of gray near abrupt level changes, fewer gray levels are needed near the boundaries. The remaining levels can be used in regions where the gray-level variations are smooth, thus avoiding or reducing the false contours that often appear in these regions when they are coarsely quantized. The drawbacks are the same as for the previous method.

Sampling and replication using a comb function

The digitization process can be understood by modeling images as bandlimited signals. Consider a function f(x, y) that is bandlimited, i.e.

F(ξ1, ξ2) = 0 for |ξ1| > ξx0, |ξ2| > ξy0
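A minimal sketch of the idea behind unequally spaced quantization levels (the particular level placement below is my own illustrative assumption, not a scheme from the notes): with the same number of levels, crowding them where the gray level varies smoothly reduces the quantization error, and hence the false contours, in those regions.

```python
def quantize(value, levels):
    """Map value to the nearest of the given (possibly unequally spaced) levels."""
    return min(levels, key=lambda q: abs(q - value))

uniform = [0.0, 0.25, 0.5, 0.75, 1.0]
# Non-uniform: levels crowded in the smooth low range, sparse near the top.
nonuniform = [0.0, 0.1, 0.2, 0.3, 1.0]

# A smoothly varying gray level of 0.15:
quantize(0.15, uniform)     # -> 0.25 (error 0.10)
quantize(0.15, nonuniform)  # -> 0.1  (error 0.05)
```

The non-uniform scheme halves the error in the smooth region at the price of a large error near 1.0, where the eye tolerates it better.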
[Figure: region of support (bandwidth) of the image spectrum in the (ξ1, ξ2) plane, bounded by ±ξx0 and ±ξy0.]

Consider an ideal sampling function which is a 2-D infinite array of Dirac delta functions situated on a rectangular grid with spacing Δx, Δy:

comb(x, y; Δx, Δy) = Σ_{m,n=−∞}^{∞} δ(x − mΔx, y − nΔy)

The sampled image is

fs(x, y) = f(x, y) · comb(x, y; Δx, Δy)

The Fourier transform of comb(x, y; Δx, Δy) is another comb function, with spacing 1/Δx, 1/Δy, namely

COMB(ξ1, ξ2) = FT[comb(x, y; Δx, Δy)] = (1/(Δx Δy)) Σ_{k,l=−∞}^{∞} δ(ξ1 − k/Δx, ξ2 − l/Δy)

Define the sampling frequencies ξxs = 1/Δx and ξys = 1/Δy.
The Fourier transform of the sampled image fs(x, y) is given by the convolution

Fs(ξ1, ξ2) = F(ξ1, ξ2) ⊛ COMB(ξ1, ξ2)
           = ξxs ξys F(ξ1, ξ2) ⊛ Σ_{k,l=−∞}^{∞} δ(ξ1 − kξxs, ξ2 − lξys)
           = ξxs ξys Σ_{k,l=−∞}^{∞} F(ξ1 − kξxs, ξ2 − lξys)

If the x, y sampling frequencies are greater than twice the bandwidths, i.e.

ξxs > 2ξx0, ξys > 2ξy0

or, equivalently, if

Δx < 1/(2ξx0) and Δy < 1/(2ξy0)

then F(ξ1, ξ2) can be recovered by a low-pass filter with frequency response

H(ξ1, ξ2) = 1/(ξxs ξys) for (ξ1, ξ2) ∈ R, and 0 otherwise

where R is the region of support of the ideal low-pass filter.

[Figure: rectangles R1 and R2 in the frequency plane, with the region R between them; the axis is marked at 2ξx0 and ξxs.]

That is, R is any region whose boundary ∂R lies in the annular region between the two rectangles R1 and R2. Then

F̂(ξ1, ξ2) = H(ξ1, ξ2) Fs(ξ1, ξ2) = F(ξ1, ξ2)
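The recovery condition Δx < 1/(2ξx0), Δy < 1/(2ξy0) can be phrased as a tiny predicate (a sketch; the function name is my own):

```python
def nyquist_ok(dx, dy, xi_x0, xi_y0):
    """True if grid spacings dx, dy satisfy dx < 1/(2*xi_x0) and dy < 1/(2*xi_y0),
    i.e. the sampling frequencies exceed twice the bandwidths."""
    return dx < 1 / (2 * xi_x0) and dy < 1 / (2 * xi_y0)

# The worked example that follows has xi_x0 = 3, xi_y0 = 4 and dx = dy = 0.2:
nyquist_ok(0.2, 0.2, 3, 4)   # -> False (0.2 is not below 1/8 = 0.125)
nyquist_ok(0.1, 0.1, 3, 4)   # -> True
```

The False case is exactly the aliasing situation analyzed in the example below.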
Example. The image described by f(x, y) = 2 cos 2π(3x + 4y) is sampled such that Δx = Δy = 0.2, and the reconstruction filter has a rectangular region of support with cut-off frequency at half the sampling frequency. Find the reconstructed image.

f(x, y) = 2 cos 2π(3x + 4y) is bandlimited, since

F(ξ1, ξ2) = δ(ξ1 − 3, ξ2 − 4) + δ(ξ1 + 3, ξ2 + 4)

is zero for |ξ1| > 3, |ξ2| > 4. Hence ξx0 = 3 and ξy0 = 4. Also ξxs = ξys = 1/0.2 = 5, which is less than the Nyquist frequencies 2ξx0 = 6 and 2ξy0 = 8. The sampled image spectrum is

Fs(ξ1, ξ2) = 25 Σ_{k,l=−∞}^{∞} [δ(ξ1 − 3 − 5k, ξ2 − 4 − 5l) + δ(ξ1 + 3 − 5k, ξ2 + 4 − 5l)]

With

H(ξ1, ξ2) = 1/25 for |ξ1| ≤ 2.5, |ξ2| ≤ 2.5, and 0 otherwise

the filtered spectrum is

F̂(ξ1, ξ2) = δ(ξ1 − 2, ξ2 − 1) + δ(ξ1 + 2, ξ2 + 1)

which gives the reconstructed image

f̂(x, y) = 2 cos 2π(2x + y)

This shows that any frequency component of the input image that lies above (ξxs/2, ξys/2) by an amount (Δξx, Δξy) is reproduced (aliased) as a frequency component at (ξxs/2 − Δξx, ξys/2 − Δξy).

It can be shown that the power spectral density Ss(ξ1, ξ2) of the sampled image fs(x, y) is a periodic extension of S(ξ1, ξ2) and is given by

Ss(ξ1, ξ2) = ξxs ξys Σ_{k,l=−∞}^{∞} S(ξ1 − kξxs, ξ2 − lξys)

When the image is reconstructed by an ideal low-pass filter with gain 1/(ξxs ξys), the reconstructed image PSD is

Ŝ(ξ1, ξ2) = Σ_{k,l=−∞}^{∞} S(ξ1 − kξxs, ξ2 − lξys) W(ξ1, ξ2)
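The aliased frequencies in this example (3 → 2 and 4 → 1 with ξs = 5) follow from folding each frequency into the baseband; a sketch, with a helper name of my own:

```python
def aliased_frequency(f0, fs):
    """Fold frequency f0 into the baseband [-fs/2, fs/2) and return its magnitude."""
    f = f0 % fs          # reduce modulo the sampling frequency
    if f >= fs / 2:      # fold the upper half back down
        f -= fs
    return abs(f)

aliased_frequency(3, 5)   # -> 2 (the x-frequency of the example)
aliased_frequency(4, 5)   # -> 1 (the y-frequency of the example)
aliased_frequency(2, 5)   # -> 2 (below Nyquist: unchanged)
```

This reproduces the shifted deltas above: the replica of δ(ξ1 + 3, ξ2 + 4) at k = l = 1 lands at (2, 1), inside the filter's passband.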
where

W(ξ1, ξ2) = 1 for (ξ1, ξ2) ∈ R, and 0 otherwise.

The aliasing power σa² is the power in the tails of the power spectrum outside R:

σa² = ∫∫_{(ξ1,ξ2)∉R} S(ξ1, ξ2) dξ1 dξ2 = ∫∫ [1 − W(ξ1, ξ2)] S(ξ1, ξ2) dξ1 dξ2

which is zero if f(x, y) is bandlimited with ξx0 ≤ ξxs/2 and ξy0 ≤ ξys/2. This analysis is useful when a bandlimited image containing wide-band noise is sampled: the signal-to-noise ratio of the sampled image can deteriorate significantly unless the image is low-pass filtered before sampling.

Sampling random fields

In physical sampling environments random noise is always present in the image, so it is important to consider sampling theory for random fields (a family of 2-D functions which is itself a random variable). A continuous stationary random field f(x, y) is called bandlimited if its PSD S(ξ1, ξ2) is bandlimited, i.e. if

S(ξ1, ξ2) = 0 for |ξ1| > ξx0, |ξ2| > ξy0

Sampling theorem for random fields: if f(x, y) is a stationary bandlimited random field, then

f̂(x, y) = Σ_{m,n=−∞}^{∞} f(mΔx, nΔy) sinc(xξxs − m) sinc(yξys − n)

converges to f(x, y) in the mean square sense, i.e. E(|f − f̂|²) = 0, where ξxs = 1/Δx, ξys = 1/Δy, and ξxs > 2ξx0, ξys > 2ξy0.
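The tail power σa² can be illustrated numerically in 1-D (my own choice of spectrum for illustration: S(ξ) = e^{−2|ξ|}, for which the fraction of power outside |ξ| ≤ ξc is analytically e^{−2ξc}):

```python
import math

def tail_fraction(xc, span=40.0, n=400000):
    """Fraction of the power of S(xi) = exp(-2|xi|) lying outside |xi| <= xc,
    estimated by midpoint-rule integration over [0, span] (symmetry covers
    the negative axis)."""
    d = span / n
    total = tail = 0.0
    for i in range(n):
        x = (i + 0.5) * d
        s = math.exp(-2 * x)
        total += s
        if x > xc:
            tail += s
    return tail / total

# Cutoff at xc = 1 (e.g. half a sampling frequency of 2): the aliased
# power fraction is about exp(-2), roughly 13.5 percent.
round(tail_fraction(1.0), 3)   # -> 0.135
```

A non-negligible tail like this is exactly the case where pre-sampling low-pass filtering pays off: it removes the tail power before it can fold into the baseband.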
Nyquist rate

The lower bounds on the sampling rates given by ξxs > 2ξx0 and ξys > 2ξy0 are referred to as the Nyquist rates or Nyquist frequencies. Their reciprocals are called the Nyquist intervals. The sampling theorem states that a bandlimited image sampled above its x and y Nyquist rates can be recovered without error by low-pass filtering the sampled image. If, however, the sampling frequencies are below the Nyquist frequencies, i.e. ξxs < 2ξx0 or ξys < 2ξy0, then the periodic replications of F(ξ1, ξ2) overlap, resulting in a distorted spectrum Fs(ξ1, ξ2) from which F(ξ1, ξ2) cannot be recovered.

[Fig. 5: overlapping spectral replicas in the (ξ1, ξ2) plane when ξxs < 2ξx0 and ξys < 2ξy0; the overlap region is the aliased part of the spectrum.]

The frequencies above half the sampling frequencies, i.e. above ξxs/2 and ξys/2, are called foldover frequencies. The overlapping of the successive periods of the spectrum causes the foldover frequencies of the original image to appear below ξxs/2 and ξys/2 in the sampled image. This phenomenon is called aliasing. Aliasing errors cannot be removed by subsequent filtering; aliasing can be avoided only by low-pass filtering the image before sampling, so that its bandwidth is reduced until the sampling criterion above is satisfied.

In other words, if the region of support of the ideal low-pass filter is the rectangle

R = [−ξxs/2, ξxs/2] × [−ξys/2, ξys/2]

centered at the origin, then its impulse response is

h(x, y) = sinc(xξxs) sinc(yξys)

and f(x, y) can be reconstructed as

f̂(x, y) = IFT[F̂(ξ1, ξ2)] = IFT[H(ξ1, ξ2) Fs(ξ1, ξ2)]
f̂(x, y) = Σ_{m,n=−∞}^{∞} f(mΔx, nΔy) sinc(xξxs − m) sinc(yξys − n)   (interpolation formula)

= f(x, y) if Δx and Δy satisfy the Nyquist criterion.
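A 1-D sketch of the interpolation formula (my own finite truncation of the infinite sum), reconstructing a bandlimited cosine from its samples. At the sample points the formula is exact even when truncated, since sinc(m − n) = 1 for m = n and 0 otherwise:

```python
import math

def sinc(t):
    """Normalized sinc: sin(pi t)/(pi t), with sinc(0) = 1."""
    return 1.0 if t == 0 else math.sin(math.pi * t) / (math.pi * t)

def reconstruct(samples, dx, x):
    """Finite version of f(x) = sum_m f(m dx) sinc(x/dx - m)."""
    return sum(s * sinc(x / dx - m) for m, s in enumerate(samples))

f = lambda x: math.cos(2 * math.pi * x)   # 1-cycle-per-unit cosine, xi_0 = 1
dx = 0.25                                 # xi_s = 4 > 2 * xi_0: Nyquist satisfied
samples = [f(m * dx) for m in range(200)]

# Exact at a sample point, regardless of truncation:
err = abs(reconstruct(samples, dx, 10 * dx) - f(10 * dx))
```

Between sample points the truncated sum only approximates f(x), with the error shrinking as more terms are kept; the notes' infinite sum removes it entirely.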
Computer Vision 2 SS 18 Dr. Benjamin Guthier Professur für Bildverarbeitung Computer Vision 2 Dr. Benjamin Guthier 1. IMAGE PROCESSING Computer Vision 2 Dr. Benjamin Guthier Content of this Chapter Non-linear
More informationInternational ejournals
ISSN 2249 5460 Available online at www.internationalejournals.com International ejournals International Journal of Mathematical Sciences, Technology and Humanities 96 (2013) 1063 1069 Image Interpolation
More informationCoE4TN4 Image Processing. Chapter 5 Image Restoration and Reconstruction
CoE4TN4 Image Processing Chapter 5 Image Restoration and Reconstruction Image Restoration Similar to image enhancement, the ultimate goal of restoration techniques is to improve an image Restoration: a
More informationIMAGE DE-NOISING IN WAVELET DOMAIN
IMAGE DE-NOISING IN WAVELET DOMAIN Aaditya Verma a, Shrey Agarwal a a Department of Civil Engineering, Indian Institute of Technology, Kanpur, India - (aaditya, ashrey)@iitk.ac.in KEY WORDS: Wavelets,
More informationDigital Image Processing. Introduction
Digital Image Processing Introduction Digital Image Definition An image can be defined as a twodimensional function f(x,y) x,y: Spatial coordinate F: the amplitude of any pair of coordinate x,y, which
More informationLast update: May 4, Vision. CMSC 421: Chapter 24. CMSC 421: Chapter 24 1
Last update: May 4, 200 Vision CMSC 42: Chapter 24 CMSC 42: Chapter 24 Outline Perception generally Image formation Early vision 2D D Object recognition CMSC 42: Chapter 24 2 Perception generally Stimulus
More informationVisual Distortions in Macular Degeneration: Quantitative Diagnosis and Correction
Visual Distortions in Macular Degeneration: Quantitative Diagnosis and Correction Walter Kohn, Professor Emeritus of Physics & Chemistry, UC Santa Barbara Jim Klingshirn, Consulting Engineer, Santa Barbara
More informationIntensity Transformations and Spatial Filtering
77 Chapter 3 Intensity Transformations and Spatial Filtering Spatial domain refers to the image plane itself, and image processing methods in this category are based on direct manipulation of pixels in
More informationCentral Slice Theorem
Central Slice Theorem Incident X-rays y f(x,y) R x r x Detected p(, x ) The thick line is described by xcos +ysin =R Properties of Fourier Transform F [ f ( x a)] F [ f ( x)] e j 2 a Spatial Domain Spatial
More informationCHAPTER 3 IMAGE ENHANCEMENT IN THE SPATIAL DOMAIN
CHAPTER 3 IMAGE ENHANCEMENT IN THE SPATIAL DOMAIN CHAPTER 3: IMAGE ENHANCEMENT IN THE SPATIAL DOMAIN Principal objective: to process an image so that the result is more suitable than the original image
More informationDIGITAL IMAGE PROCESSING
AS SR INSTITUTE OF TECHNOLOGY PRATHIPADU, TADEPALLIGUDEM DEPARTMENT OF ECE DIGITAL IMAGE PROCESSING S.NO. 1 2 3 4 5 CONTENT UNIT UNIT-1 UNIT-3 UNIT-4 UNIT-6 UNIT-7 PAGE NO. 2-30 31-56 57-69 70-97 98-111
More informationCS4442/9542b Artificial Intelligence II prof. Olga Veksler
CS4442/9542b Artificial Intelligence II prof. Olga Veksler Lecture 8 Computer Vision Introduction, Filtering Some slides from: D. Jacobs, D. Lowe, S. Seitz, A.Efros, X. Li, R. Fergus, J. Hayes, S. Lazebnik,
More informationCS4442/9542b Artificial Intelligence II prof. Olga Veksler
CS4442/9542b Artificial Intelligence II prof. Olga Veksler Lecture 2 Computer Vision Introduction, Filtering Some slides from: D. Jacobs, D. Lowe, S. Seitz, A.Efros, X. Li, R. Fergus, J. Hayes, S. Lazebnik,
More informationDigital Image Fundamentals
Digital Image Fundamentals Image Quality Objective/ subjective Machine/human beings Mathematical and Probabilistic/ human intuition and perception 6 Structure of the Human Eye photoreceptor cells 75~50
More informationComputer Assisted Image Analysis TF 3p and MN1 5p Lecture 1, (GW 1, )
Centre for Image Analysis Computer Assisted Image Analysis TF p and MN 5p Lecture, 422 (GW, 2.-2.4) 2.4) 2 Why put the image into a computer? A digital image of a rat. A magnification of the rat s nose.
More informationEEM 463 Introduction to Image Processing. Week 3: Intensity Transformations
EEM 463 Introduction to Image Processing Week 3: Intensity Transformations Fall 2013 Instructor: Hatice Çınar Akakın, Ph.D. haticecinarakakin@anadolu.edu.tr Anadolu University Enhancement Domains Spatial
More informationCHAPTER 3 SHOT DETECTION AND KEY FRAME EXTRACTION
33 CHAPTER 3 SHOT DETECTION AND KEY FRAME EXTRACTION 3.1 INTRODUCTION The twenty-first century is an age of information explosion. We are witnessing a huge growth in digital data. The trend of increasing
More information(0, 1, 1) (0, 1, 1) (0, 1, 0) What is light? What is color? Terminology
lecture 23 (0, 1, 1) (0, 0, 0) (0, 0, 1) (0, 1, 1) (1, 1, 1) (1, 1, 0) (0, 1, 0) hue - which ''? saturation - how pure? luminance (value) - intensity What is light? What is? Light consists of electromagnetic
More informationEECS 556 Image Processing W 09. Interpolation. Interpolation techniques B splines
EECS 556 Image Processing W 09 Interpolation Interpolation techniques B splines What is image processing? Image processing is the application of 2D signal processing methods to images Image representation
More informationELEC Dr Reji Mathew Electrical Engineering UNSW
ELEC 4622 Dr Reji Mathew Electrical Engineering UNSW Dynamic Range and Weber s Law HVS is capable of operating over an enormous dynamic range, However, sensitivity is far from uniform over this range Example:
More informationDigital Image Analysis and Processing
Digital Image Analysis and Processing CPE 0907544 Image Enhancement Part I Intensity Transformation Chapter 3 Sections: 3.1 3.3 Dr. Iyad Jafar Outline What is Image Enhancement? Background Intensity Transformation
More informationEECS 556 Image Processing W 09. Image enhancement. Smoothing and noise removal Sharpening filters
EECS 556 Image Processing W 09 Image enhancement Smoothing and noise removal Sharpening filters What is image processing? Image processing is the application of 2D signal processing methods to images Image
More informationLecture 2: 2D Fourier transforms and applications
Lecture 2: 2D Fourier transforms and applications B14 Image Analysis Michaelmas 2017 Dr. M. Fallon Fourier transforms and spatial frequencies in 2D Definition and meaning The Convolution Theorem Applications
More informationChapter 2: Digital Image Fundamentals
Chapter : Digital Image Fundamentals Lecturer: Wanasanan Thongsongkrit Email : wanasana@eng.cmu.ac.th Office room : 4 Human and Computer Vision We can t think of image processing without considering the
More informationEdge Detection (with a sidelight introduction to linear, associative operators). Images
Images (we will, eventually, come back to imaging geometry. But, now that we know how images come from the world, we will examine operations on images). Edge Detection (with a sidelight introduction to
More informationINTRODUCTION TO IMAGE PROCESSING (COMPUTER VISION)
INTRODUCTION TO IMAGE PROCESSING (COMPUTER VISION) Revision: 1.4, dated: November 10, 2005 Tomáš Svoboda Czech Technical University, Faculty of Electrical Engineering Center for Machine Perception, Prague,
More informationThe main goal of Computer Graphics is to generate 2D images 2D images are continuous 2D functions (or signals)
Motivation The main goal of Computer Graphics is to generate 2D images 2D images are continuous 2D functions (or signals) monochrome f(x,y) or color r(x,y), g(x,y), b(x,y) These functions are represented
More informationAn Intuitive Explanation of Fourier Theory
An Intuitive Explanation of Fourier Theory Steven Lehar slehar@cns.bu.edu Fourier theory is pretty complicated mathematically. But there are some beautifully simple holistic concepts behind Fourier theory
More informationLecture 1 Introduction & Fundamentals
Digital Image Processing Lecture 1 Introduction & Fundamentals Presented By: Diwakar Yagyasen Sr. Lecturer CS&E, BBDNITM, Lucknow What is an image? a representation, likeness, or imitation of an object
More informationUNIT - 5 IMAGE ENHANCEMENT IN SPATIAL DOMAIN
UNIT - 5 IMAGE ENHANCEMENT IN SPATIAL DOMAIN Spatial domain methods Spatial domain refers to the image plane itself, and approaches in this category are based on direct manipulation of pixels in an image.
More informationReview and Implementation of DWT based Scalable Video Coding with Scalable Motion Coding.
Project Title: Review and Implementation of DWT based Scalable Video Coding with Scalable Motion Coding. Midterm Report CS 584 Multimedia Communications Submitted by: Syed Jawwad Bukhari 2004-03-0028 About
More informationSampling, Aliasing, & Mipmaps
Sampling, Aliasing, & Mipmaps Last Time? Monte-Carlo Integration Importance Sampling Ray Tracing vs. Path Tracing source hemisphere What is a Pixel? Sampling & Reconstruction Filters in Computer Graphics
More informationFiltering and Enhancing Images
KECE471 Computer Vision Filtering and Enhancing Images Chang-Su Kim Chapter 5, Computer Vision by Shapiro and Stockman Note: Some figures and contents in the lecture notes of Dr. Stockman are used partly.
More information