IT6005 - Digital Image Processing, VII Semester - Question Bank

UNIT I - DIGITAL IMAGE FUNDAMENTALS

PART A

Elements of Digital Image Processing (DIP) systems
1. What is a pixel?
2. Define Digital Image.
3. What are the steps involved in DIP?
4. List the categories of digital storage.
5. What is dynamic range?
6. Define Digital Image Processing.
7. What are the types of connectivity?
8. Write the formula for calculating D4 and D8 distance.
9. What is geometric transformation?

Elements of visual perception, brightness, contrast, hue, saturation, Mach band effect
1. Define Brightness. (M/J 12)
2. Define Luminance.
3. What are the types of light receptors?
4. How are cones and rods distributed in the retina?
5. Define Subjective Brightness and Brightness Adaptation.
6. What is meant by the Mach band effect?
7. What are hue and saturation?
8. Define the 4- and 8-Neighbors of a Pixel. (N/D 09)

Color Image Fundamentals - RGB, HSI models
1. What is meant by a colour model?
2. List the hardware-oriented colour models.

Image Sampling, Quantization, dither
1. Define Sampling and Quantization. (N/D 08)
2. Write the expression for finding the number of bits required to store a digital image.
3. Define Tapered Quantization.
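
A brief worked reference for two of the formula questions above (illustrative only, using the usual textbook notation: p = (x, y) and q = (s, t) are pixel coordinates, M x N is the image size and k the number of bits per pixel):

```latex
\[ D_4(p,q) = |x - s| + |y - t| \qquad \text{(city-block distance)} \]
\[ D_8(p,q) = \max\bigl(|x - s|,\, |y - t|\bigr) \qquad \text{(chessboard distance)} \]
\[ b = M \times N \times k \quad \text{bits to store an $M \times N$ image with $k$ bits/pixel; } b = N^2 k \text{ when } M = N \]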

2-D mathematical preliminaries, 2-D Transforms - DFT, DCT, KLT, SVD
1. List the properties of the 2D Fourier transform. (N/D 08)
2. List the properties of the forward transformation kernel. (M/J 12)
3. KLT is an optimum transform - justify. (N/D 12)
4. What is a separable image transform? (N/D 09)
5. List the properties of Singular Value Decomposition (SVD). (N/D 11)
6. What is the need for a transform?
7. What are the applications of transforms?
8. List the properties of the two-dimensional DFT.
9. What is image translation and scaling?

PART B

Elements of Digital Image Processing (DIP) systems
1. What are the elements of an image processing system? Describe its working. (M/J 12), (N/D 12), (N/D 08)
2. What is a frame buffer? Write the categories of digital storage for image processing applications. (8) (N/D 12)

Elements of visual perception, brightness, contrast, hue, saturation, Mach band effect
1. Explain, with a neat diagram, the elements of visual perception. (N/D 08)
2. Explain any four basic relationships between pixels. (8) (N/D 12)

Color Image Fundamentals - RGB, HSI models
1. How is an RGB model represented in HSI format? Describe the transformation. (M/J 12)
2. Write in detail about the RGB colour model. (N/D 11)

Image Sampling, Quantization, dither
1. Explain the principle of sampling and quantization. Discuss the effect on the image of increasing the
   a) Sampling frequency
   b) Quantization levels (M/J 12), (N/D 09)

2-D mathematical preliminaries, 2-D Transforms - DFT, DCT, KLT, SVD
2. Explain the computation of the DFT for a given 2D image. (M/J 12)
3. Explain the different transforms in DIP and explain any one in detail.

4. Explain the following separable transforms:
   a) Hadamard transform
   b) DCT
   c) Karhunen-Loeve transform

UNIT II - IMAGE ENHANCEMENT

PART A

Histogram Equalization and Specification
1. List the categories of image enhancement. (N/D 12)
2. What is meant by bit-plane slicing? (N/D 12)
3. Define Histogram. (N/D 09)
4. What is a multimodal histogram? (M/J 12)
5. List the types of image enhancement. (N/D 12)
6. Write the objectives of image enhancement techniques.

Noise Distributions
1. Why is noise always considered to be additive in images? (M/J 12)

Spatial Averaging, Directional Smoothing, Median, Geometric mean, Harmonic mean, Contraharmonic mean filters, Homomorphic filtering
1. List the different types of derivative filters. (N/D 09)
2. State the principle of directional smoothing. (N/D 11)
3. Define Geometric Mean Filtering. (N/D 11)
4. Compare spatial and frequency domain methods. (N/D 08)
5. What are the effects of applying a Butterworth low-pass filter to a noisy image? (N/D 08)
6. Define Contrast Stretching.
7. What is gray-level slicing?
8. What is the purpose of image averaging?
9. What is meant by masking?
10. Write the steps involved in frequency domain filtering.
11. What is an image negative?

12. Define Spatial Filtering.
13. What is meant by a median filter?
14. What are the maximum filter and minimum filter?
15. Write the applications of sharpening filters.

PART B

Histogram Equalization and Specification
1. How is a monochrome image enhanced by histogram equalization? (M/J 12), (N/D 12)
2. Explain histogram processing. (N/D 11)

Spatial Averaging, Directional Smoothing, Median, Geometric mean, Harmonic mean, Contraharmonic mean filters, Homomorphic filtering
3. Write an algorithm for obtaining the average of four images of the same size.
4. Explain homomorphic filtering. (M/J 12)
5. How are image subtraction and image averaging used to enhance an image?
6. Explain the various sharpening filters used in the spatial domain.
7. Explain the spatial domain methods for image enhancement.
8. Explain image enhancement in the frequency domain using
   a) a Low-Pass Filter
   b) a High-Pass Filter (N/D 09)

Color Image Enhancement
9. Explain colour image enhancement. (N/D 11)
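
As a study aid for question 1 of PART B above, a minimal histogram equalization sketch; it assumes an 8-bit grayscale image supplied as a NumPy array, and the function name is illustrative rather than part of the syllabus:

```python
import numpy as np

def histogram_equalize(img):
    """Histogram-equalize an 8-bit grayscale image (2-D uint8 NumPy array)."""
    hist = np.bincount(img.ravel(), minlength=256)   # gray-level histogram h(r_k)
    cdf = hist.cumsum() / img.size                   # cumulative distribution of gray levels
    lut = np.round(255 * cdf).astype(np.uint8)       # mapping s_k = (L - 1) * CDF(r_k), L = 256
    return lut[img]                                  # apply the mapping as a lookup table
```

The lookup table spreads the cumulative histogram over the full gray-level range, which is the transfer function derived in the PART B answer.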

UNIT III - IMAGE RESTORATION AND SEGMENTATION

PART A

Image Restoration - degradation model
1. What is meant by image restoration? (M/J 12)
2. Differentiate enhancement from restoration. (N/D 09)
3. How is a degradation process modeled?
4. What are the types of noise models?
5. Write the expression for gamma noise.
6. Write the expression for uniform noise.
7. Write the expression for impulse noise.

Unconstrained restoration - Lagrange multiplier and constrained restoration, Inverse Filtering
8. Define Geometric Transformation. (M/J 12)
9. Define Averaging Filters. (N/D 09)
10. Write the condition to be met by the partitions in region-based segmentation. (N/D 11)
11. What is inverse filtering? (N/D 11)
12. Why is the restoration called unconstrained restoration?
13. What are the three methods of estimating the degradation function?
14. What is a pseudo-inverse filter?
15. What is a least mean square filter?
16. What is blind image restoration?

Removal of blur caused by linear motion, Wiener filtering
17. What are the two approaches for blind image restoration?

Geometric Transformations - spatial transformations
18. Define Gray Level Interpolation.
19. What is rubber sheet transformation?

Edge Detection
20. Define Texture. (N/D 08)
21. How is edge detection used for detecting discontinuities in a digital image? (N/D 08)
22. What is a directional derivative? Where is it used? (M/J 12)
23. Define the Sobel Operator. (N/D 12)
24. What are the three types of discontinuity in a digital image?
25. How are the derivatives obtained in edge detection during formulation?
26. What are the two properties used for establishing similarity of edge pixels?
27. What is an edge?
28. List out the properties of the second derivative around an edge.
29. Define Gradient Operator.

Edge Linking via Hough Transform
30. List out the steps involved in splitting and merging.

Thresholding
31. What is a global, local and dynamic or adaptive threshold?
32. Define Chain Code Derivative in 4- and 8-connectivity. (N/D 09)

Region based growing, region splitting and merging
33. How is an image identified as over-segmented? (M/J 12)
34. What is the principle of region-growing-based image segmentation? (N/D 11)

Segmentation by Morphological Watersheds, watershed algorithm
35. What is segmentation?
36. List the applications of segmentation.
37. What are the uses of markers?
38. What is the condition to be met by the partitions in region-based segmentation?
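
For the edge-detection questions above (e.g. the Sobel operator in question 23), a minimal gradient-based edge detector sketch; it assumes a grayscale NumPy array, uses SciPy's convolve, and the threshold value is purely illustrative:

```python
import numpy as np
from scipy.ndimage import convolve

# Standard Sobel kernels: Gx responds to vertical edges, Gy to horizontal ones.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_edges(img, threshold=100.0):
    """Return the gradient magnitude and a binary edge map of a grayscale image."""
    img = img.astype(float)
    gx = convolve(img, SOBEL_X)            # horizontal derivative estimate
    gy = convolve(img, SOBEL_Y)            # vertical derivative estimate
    magnitude = np.hypot(gx, gy)           # |grad f| = sqrt(gx^2 + gy^2)
    return magnitude, magnitude > threshold
```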

PART B

Image Restoration - degradation model, unconstrained restoration - Lagrange multiplier and constrained restoration, Inverse Filtering
1. Explain the following:
   a) Inverse filtering
   b) Least square error filtering (N/D 12)
2. What is image restoration? Explain the degradation model for a continuous function. (N/D 08)
3. Explain mean filters. (N/D 09)
4. Explain constrained least squares restoration.
5. Explain the digital image restoration system and the image observation models.

Removal of blur caused by linear motion, Wiener filtering, Geometric Transformations - spatial transformations
6. Explain the Wiener filtering approach for image restoration. (M/J 12)
7. What is gray-level interpolation? Explain the schemes involved in it. (N/D 12)
8. What is rubber sheet transformation? Explain the basic operations involved in it. (N/D 09)
9. Explain blind image restoration.

Edge Detection
10. How is edge detection performed? Write a suitable algorithm and explain edge point linking. (M/J 12)
11. What is edge detection? Describe the types of edge detection operations. (N/D 11)

Edge Linking via Hough Transform
12. Explain global processing using the Hough Transform. (N/D 12)

Thresholding
13. Explain the concept of thresholding in image segmentation and write its merits and demerits.
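
For the thresholding question above (13), a minimal sketch of the classic iterative global threshold selection; it assumes a grayscale NumPy array that contains both darker and brighter pixel populations, and the convergence tolerance eps is an illustrative choice:

```python
import numpy as np

def basic_global_threshold(img, eps=0.5):
    """Iteratively estimate a single global threshold for a grayscale image."""
    t = img.mean()                                   # initial estimate: overall mean
    while True:
        low, high = img[img <= t], img[img > t]      # split pixels into two groups
        new_t = 0.5 * (low.mean() + high.mean())     # midpoint of the two group means
        if abs(new_t - t) < eps:                     # stop when the threshold settles
            return new_t
        t = new_t
```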

Region based growing, region splitting and merging
14. How are region growing, region splitting and merging approaches used for image segmentation?

Segmentation by Morphological Watersheds, watershed algorithm
15. Explain segmentation by morphological watersheds.
16. Explain the watershed segmentation algorithm.

UNIT IV - WAVELETS AND IMAGE COMPRESSION

PART A

Need for data compression
1. What is the need for compression? (N/D 11)
2. Define Compression Ratio.
3. What is image compression?
4. What is data compression?
5. What are the types of data compression? (N/D 09)

Codings
6. What are the coding systems in JPEG? (N/D 12)
7. How are shift codes generated? (N/D 12)
8. Write the Hadamard transform matrix H_n for n = 3. (N/D 12)
9. What is interpixel redundancy? (N/D 08)
10. Define Coding Redundancy.
11. Define Interpixel Redundancy.
12. What is run-length coding?
13. Define Psychovisual Redundancy.
14. Define Encoder.
15. Define Source Encoder.
16. Define Channel Encoder.
17. What are the types of decoders?
18. What are the operations performed by error-free compression?
19. What is Variable Length Coding?
20. Define Huffman Coding.

21. Define I-frame.
22. Define P-frame.

JPEG and MPEG standards
23. What is JPEG? (N/D 11)
24. What are the basic steps used in JPEG? (N/D 09)
25. What is MPEG? (N/D 11)

PART B

Need for data compression
1. Explain the image compression model with a neat diagram.
2. Explain the need for image compression. How is the run-length encoding approach used for compression? (N/D 12)
3. Differentiate lossless compression from lossy compression and explain the transform coding system.
4. Explain in detail the Huffman coding procedure with an example.

Codings
5. Explain the wavelet coding of images.
6. Explain in detail the method of zonal and threshold coding.
7. Explain the following lossless compression codings:
   i. LZW coding
   ii. Predictive coding
8. Explain lossy compression using wavelet coding.
9. Explain two-dimensional transform coding. (N/D 11)
10. Explain lossless predictive coding. (N/D 09)
11. Explain the block diagram of lossy predictive coding with the delta modulation technique. (N/D 09)

JPEG and MPEG standards
1. Explain the MPEG encoder. (N/D 12)
2. Explain the methods of constructing the masking function based on maximum variance and maximum magnitude. (N/D 12)
3. Explain the image compression standards.
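
For the Huffman coding question in PART B (question 4), a minimal sketch of the codeword-construction procedure; the function name, symbol source, and example string are illustrative only:

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code table {symbol: bitstring} for a sequence of symbols."""
    freq = Counter(symbols)
    if len(freq) == 1:                          # degenerate source: a single symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (total weight, tie-breaker, partial code table)
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)         # merge the two least probable groups
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# Frequent symbols receive shorter codewords, e.g. for the source "aaaabbbccd":
print(huffman_code("aaaabbbccd"))   # {'a': '0', 'b': '10', 'c': '111', 'd': '110'} or similar
```

Codeword assignments can differ between runs of the construction when symbol probabilities tie, but the average code length is the same.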

UNIT V - IMAGE REPRESENTATION AND RECOGNITION

PART A

1. What is a pattern?
2. What is a pattern class?
3. What is pattern recognition?
4. What are the three principal pattern arrangements?
5. Define Chain Code.
6. What are the demerits of chain codes?
7. What is the polygonal approximation method?
8. Specify the various polygonal approximation methods.
9. Name a few boundary descriptors.
10. Define the length of a boundary.
11. Define shape numbers.
12. Name a few measures used as simple descriptors in region descriptors.
13. Define texture.
14. Define compactness.
15. List the approaches to describe the texture of a region.
16. What is a global, local and dynamic or adaptive threshold?

PART B

1. Explain the boundary descriptors in image representation.
2. Explain the regional descriptors in image representation.
3. Explain pattern and pattern classes in object recognition.
4. Explain the different object recognition methods.
5. Explain the structural methods in object recognition.
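
For the chain code questions in PART A (5 and 6), a minimal sketch of an 8-directional Freeman chain code; it assumes the boundary is already available as an ordered list of (x, y) points of a closed 8-connected contour, with directions numbered counter-clockwise from East and y increasing upward:

```python
# 8-directional Freeman chain code: direction index for each step (dx, dy)
# between consecutive boundary points, numbered counter-clockwise from East.
DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
              (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(boundary):
    """boundary: ordered list of (x, y) points of a closed 8-connected contour."""
    closed = boundary + boundary[:1]                 # wrap around to close the loop
    return [DIRECTIONS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(closed, closed[1:])]

# A unit square traversed counter-clockwise gives one step in each cardinal direction:
print(chain_code([(0, 0), (1, 0), (1, 1), (0, 1)]))  # [0, 2, 4, 6]
```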