Chapter 3  The Principle and Improvement for AWB in DSC

3.1 Introduction

Estimating the wavelength composition of scene illumination from image data is an important topic in color engineering. Solutions to this problem have applications in image understanding, image processing, and computer graphics. Several methods for estimating the full spectral-power distribution have been proposed [17-18], but all depend on strong physical constraints on the illuminant and often encounter the mathematical complexity and robustness difficulties associated with inferring a continuous illuminant spectrum from a small number of color sensor responses. In recent years, theorists have attempted a useful and simpler form of the estimation problem [19-20]. Rather than numerically estimating the full spectral-power distribution, one classifies the wavelength composition of the scene illumination into one of a restricted number of groups. An example of illumination classification is to restrict the estimation to a set of blackbody radiators, say spaced every 500 kelvins (K). Classification of illuminants by color temperature is useful in many applications, including photography, color imaging, printing, and room lighting. Color temperature classification provides a simple specification of many common light sources [21]. Two related issues are analyzed in this chapter. First, we consider the estimation of the color temperature of scene illumination from a single image.

Second, we consider how a color image acquired under one illumination can be rendered for viewing under an illumination with a different color temperature. We use a modification of the correlation method suggested by Finlayson et al. [19] In that method, each illuminant is associated with a reference gamut in the chromaticity plane. To estimate the illuminant for a given image, the image pixel chromaticities are compared with the reference gamuts of several different illuminants, and the color temperature of the image illumination is estimated by finding the best match between the image chromaticities and the reference gamuts. Using a set of calibrated images, we found that this method involves a complex estimation process and requires a large amount of hardware memory. The difficulty rests in the reliance on chromaticity coordinates. By using a scaled version of the red and blue sensor responses, we obtain a better estimate of the illuminant color temperature. We also model the chromaticity distributions with a Gaussian model, which greatly reduces the memory required in a digital still camera. Having estimated the color temperature of the acquired image, it is often desirable to render that image under a simulated illuminant of a different color temperature. When the potential illuminants are restricted to vary only in color temperature, this color correction can be performed in a simple way. The method is described and applied to an image database that includes simulated images and a variety of real scenes.

3.2 Illuminant Estimation Method (Modified Method)

3.2.1 Color Temperature and Reciprocal Color Temperature

To determine the basic parameters of the illuminant classification model, we perform calculations using a set of blackbody radiators, a moderately large set of surface reflectance functions, and the properties of our camera. Figure 3-1(a) shows the spectral-power distributions of a blackbody radiator at absolute color temperatures ranging from 1000 K to 10,000 K in 500 K steps. The spectral radiant power at temperature T (in kelvin, K) is described by an equation of the following form [21]:

M(λ) = c1 λ^-5 {exp(c2/(λT)) - 1}^-1,   (3-1)

where c1 and c2 are constants and λ is the wavelength (m). Differences in color temperature do not correspond to equal perceptual color differences. Judd's experimental report [23] suggested that visually equal differences of color temperature correspond more closely to equal differences of reciprocal color temperature. The unit on the scale of micro-reciprocal degrees (10^-6 K^-1) is called the mired. This unit is also called the remek, the contraction for a unit of the International System of Units (SI), the reciprocal megakelvin (MK^-1). Judd determined the color-temperature difference corresponding to a just-noticeable difference (JND) in chromaticity over the range 1800-11,000 K. Figure 3-1(b) shows the Planckian locus (the chromaticity locus of blackbody radiators) in the (u', v') plane of the CIE 1976 UCS chromaticity diagram, where the locus is segmented in two ways: equal color-temperature steps and equal reciprocal-color-temperature steps. Note that small intervals in reciprocal color temperature are more nearly perceptually equal than small intervals in color temperature.
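As an illustrative sketch (not code from this thesis), the following Python fragment evaluates the Planck equation (3-1) and converts a color temperature to mired. The radiation constants c1 and c2 are the standard values; the function names are our own.

```python
import numpy as np

# Standard radiation constants for Eq. (3-1), in SI units.
C1 = 3.7418e-16  # W*m^2, first radiation constant
C2 = 1.4388e-2   # m*K, second radiation constant

def blackbody_spectrum(wavelength_m, temperature_k):
    """Spectral radiant power M(lambda) of a blackbody at temperature T (K)."""
    lam = np.asarray(wavelength_m, dtype=float)
    return C1 * lam**-5 / (np.exp(C2 / (lam * temperature_k)) - 1.0)

def kelvin_to_mired(temperature_k):
    """Reciprocal color temperature in mired: 10^6 / T."""
    return 1e6 / temperature_k

# Equal 500 K steps are far from equal mired steps at the low end:
# 1000 K -> 1500 K spans about 333 mired, but 9500 K -> 10,000 K only ~5 mired.
```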

Fig. 3-1 (a) Spectra of a blackbody radiator; (b) Planckian locus with the scales of color temperature and reciprocal color temperature in the (u', v') chromaticity plane. Sources with lower color temperatures tend to be reddish, while those with higher color temperatures are bluish.

3.2.2 Reference Gamut Properties

Consider some desirable properties of a set of reference gamuts. First, one reference gamut must not include another; otherwise, choosing a unique temperature fails. Second, it is preferable that the gamuts be separated and have minimal overlap. These two properties improve the ability to discriminate between illuminant color temperatures. S. Tominaga [22] has examined several coordinate systems with these criteria in mind. First consider the sensor chromaticity coordinates (r, b), which are obtained by normalizing the camera outputs RGB for each surface as

r = R/(R+G+B),   b = B/(R+G+B).   (3-2)

Figure 3-2(a) shows the reference gamuts with respect to our camera. The gamut for 8500 K includes most of the area of the other gamuts. In the presence of even modest amounts of sensor noise, these chromaticity coordinates are therefore a poor choice for illuminant classification. Next consider the CIE xy chromaticity coordinate system. We construct a 3×3 matrix for transforming the camera outputs RGB into the tristimulus values XYZ. This matrix is determined by fitting the sensor spectral-sensitivity functions to the CIE color-matching functions. The chromaticity coordinates (x, y) are then obtained by normalizing the tristimulus values XYZ as

x = X/(X+Y+Z),   y = Y/(X+Y+Z).   (3-3)

Figure 3-2(b) shows the reference gamuts in the xy chromaticity plane. Again the biggest gamut, for 8500 K, includes most of the area of the other gamuts. One difficulty in using the xy chromaticity representation is that the chromaticity projection removes intensity differences. High-intensity regions of the image contain more information about the illuminant than dark, shadowed regions. Hence, basing illuminant classification on data that have been normalized by the chromaticity mapping removes an important source of information.
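The two normalizations of Eqs. (3-2) and (3-3) can be sketched in Python as follows. Note that the 3×3 RGB-to-XYZ matrix shown here is only a placeholder: the real matrix must be fitted from the camera's spectral sensitivities, as described above.

```python
import numpy as np

def rb_chromaticity(rgb):
    """Sensor chromaticities (r, b) of Eq. (3-2); rgb has shape (..., 3)."""
    rgb = np.asarray(rgb, dtype=float)
    s = rgb.sum(axis=-1)
    return np.stack([rgb[..., 0] / s, rgb[..., 2] / s], axis=-1)

# Placeholder RGB -> XYZ matrix; in practice it is fitted by regressing the
# camera's spectral sensitivities onto the CIE color-matching functions.
RGB2XYZ = np.array([[0.41, 0.36, 0.18],
                    [0.21, 0.72, 0.07],
                    [0.02, 0.12, 0.95]])

def xy_chromaticity(rgb, m=RGB2XYZ):
    """CIE chromaticities (x, y) of Eq. (3-3) after the linear transform."""
    xyz = np.asarray(rgb, dtype=float) @ m.T
    s = xyz.sum(axis=-1, keepdims=True)
    return xyz[..., :2] / s
```

Both mappings discard overall intensity, which is exactly the information loss discussed above.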

Fig. 3-2 Color gamuts have different arrangements in different chromaticity planes: (a) the r-b chromaticity plane and (b) the x-y chromaticity plane.

3.2.3 Finlayson's Method (Color by Correlation)

Finlayson's work [19] begins by determining which image colors can occur (and how these colors are distributed) under each of a set of possible lights. The paper discusses how, for a given camera, this knowledge can be obtained. This information is then correlated with the colors in a particular image to obtain a measure of the likelihood that each of the possible lights was the scene illuminant. Finally, the likelihood information is used to choose a single light as an estimate of the scene illuminant. Computation is expressed and performed in a generic correlation framework. Color constancy is solved in three stages. First, build a correlation matrix that correlates possible image colors with each of the set of N possible scene illuminants (see Fig. 3-3). For each illuminant, characterize the range of possible image colors

(chromaticities) that can be observed under that light (Fig. 3-3(a)). This information is used to build a probability distribution (Fig. 3-3(b)) which represents the likelihood of observing an image color under a given light. The probability distributions for each light form the columns of a correlation matrix M (Fig. 3-3(c)); each row of the matrix corresponds to one of the N × N discrete cells of the partitioned chromaticity space.

Fig. 3-3 Three steps in building a correlation matrix. (a) First, characterize which image colors (chromaticities) are possible under each of our reference illuminants. (b) Then, use this information to build a probability distribution for each light. (c) Finally, encode these distributions in the columns of our matrix.

Given a correlation matrix and an image whose illuminant we wish to estimate, two steps are performed (illustrated in Fig. 3-4). First, determine which image colors are present in the image (Fig. 3-4(a)). This information is coded in a vector v of ones and zeros corresponding to whether or not a given chromaticity is present in the image. Then, determine a measure of the correlation between this image data v and each of the possible illuminants. The usual expression of a correlation is as a vector

dot product: if a and b are vectors, they are strongly correlated if a·b is large. A similar dot-product definition of correlation is used here. Each column of the correlation matrix M corresponds to a possible illuminant, so the elements of the vector returned by the product v^t M measure how strongly the image data correlate with each of the possible illuminants. Fig. 3-4 is a graphical representation of this process. The highlighted rows of the correlation matrix correspond to chromaticities present in the image (entries of v which are one). To obtain a correlation measure for an illuminant, we simply sum the highlighted elements of the corresponding column. The result of these sums is a vector (Fig. 3-4(b)) whose elements express the degree of correlation of each illuminant.

Fig. 3-4 Solving for color constancy in three stages. (a) Histogram the chromaticities in the image. (b) Correlate this image vector v with each column of the correlation matrix. (c) Use this information to find an estimate of the unknown illuminant, for example, the illuminant which is most correlated with the image data.
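The procedure above can be sketched in a few lines of Python. This is our own minimal illustration, not Finlayson's implementation: each column of M is a normalized chromaticity histogram over an n×n partition of chromaticity space, the image contributes a binary occupancy vector v, and the log-likelihood refinements of the full color-by-correlation framework are omitted.

```python
import numpy as np

def build_correlation_matrix(samples_per_illuminant, n_bins=32):
    """One column per illuminant: a normalized chromaticity histogram over
    an n_bins x n_bins partition of the [0,1) x [0,1) chromaticity square."""
    cols = []
    for samples in samples_per_illuminant:     # samples: (k, 2) chromaticities
        hist, _, _ = np.histogram2d(samples[:, 0], samples[:, 1],
                                    bins=n_bins, range=[[0, 1], [0, 1]])
        h = hist.ravel()
        cols.append(h / h.sum())
    return np.stack(cols, axis=1)              # shape (n_bins^2, n_illuminants)

def estimate_illuminant(image_chroma, M, n_bins=32):
    """Binary occupancy vector v, then v^t M; return the best illuminant index."""
    hist, _, _ = np.histogram2d(image_chroma[:, 0], image_chroma[:, 1],
                                bins=n_bins, range=[[0, 1], [0, 1]])
    v = (hist.ravel() > 0).astype(float)       # which chromaticity cells occur
    scores = v @ M                             # correlation with each illuminant
    return int(np.argmax(scores)), scores
```

The memory cost is the full (n_bins² × N) matrix M, which motivates the modified model of the next section.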

3.2.4 Modified Model

In the previous section, we described Finlayson's color-by-correlation method for illuminant estimation. Although this method yields good illuminant estimates, for application in a digital still camera the algorithm is too complex and needs a large amount of memory. Here, we modify the method to simplify the illuminant estimation parameters by assuming that the chromaticity probability distribution converges to a Gaussian distribution.

Gaussian model

Many patterns, from fish to handwritten characters to some speech sounds, can be viewed as an ideal prototype corrupted by a large number of random processes. Therefore, the Gaussian is often a good model for the actual probability distribution. A univariate normal distribution has roughly 95% of its area in the range |x - μ| ≤ 2σ, as shown in Figure 3-5 [24]. The general multivariate normal density in d dimensions is written as

P(x) = (2π)^(-d/2) |Σ|^(-1/2) exp{-(1/2) (x - μ)^t Σ^-1 (x - μ)},   (3-4)

where x = (x1, x2, ..., xd)^t (t denotes the transpose), μ = (μ1, μ2, ..., μd)^t is the mean vector, Σ is the d×d covariance matrix, and |Σ| and Σ^-1 are its determinant and inverse, respectively. For n samples,

μ = (1/n) sum_{k=1..n} x_k;   σ^2 = (1/n) sum_{k=1..n} (x_k - μ)^2.   (3-5)

Therefore, the parameters for illuminant estimation are reduced to a mean vector and a covariance matrix.

Fig. 3-5 A Gaussian distribution shown in (a) two dimensions and (b) three dimensions.

Classifier

We model the chromaticity probability distribution with a Gaussian model, so the parameters are now a mean and a covariance matrix. Given an image whose illuminant we wish to estimate, we perform the following two steps. First, we determine which image colors are present in the image; this information is coded in a mean value. Then, we determine a measure of the correlation between this image mean value and each of the possible illuminants. The usual expression of such a correlation is a vector distance, defined as follows [24]:

1. Euclidean distance:   d^2 = (x - μ)^t (x - μ)   (3-6)
2. Mahalanobis distance:   r^2 = (x - μ)^t Σ^-1 (x - μ)   (3-7)

The illuminant at minimum distance is the most strongly correlated with the image.
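The modified classifier can be sketched as follows (illustrative Python, with our own function names): fit a mean and covariance per reference illuminant offline, then at capture time compare the image's mean chromaticity to each model using the Mahalanobis distance of Eq. (3-7).

```python
import numpy as np

def fit_gaussian(samples):
    """Per-illuminant parameters of Eq. (3-5): mean vector and covariance."""
    samples = np.asarray(samples, dtype=float)
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

def mahalanobis_sq(x, mu, sigma):
    """Squared Mahalanobis distance r^2 of Eq. (3-7)."""
    d = np.asarray(x, dtype=float) - mu
    return float(d @ np.linalg.inv(sigma) @ d)

def classify_illuminant(image_mean, models):
    """Return the index of the reference illuminant at minimum distance."""
    return int(np.argmin([mahalanobis_sq(image_mean, mu, s)
                          for mu, s in models]))
```

Only a mean and a covariance matrix per reference color temperature must be stored, which is the memory saving over a full correlation matrix.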

3.3 Color Correction Method

The color correction method is based on summarizing the ratios of the R, G, and B sensor responses under different illuminants. The camera responses to all gray patches of the Macbeth ColorChecker are calculated for an illuminant at each reference color temperature. These values are used to define two functions of color temperature:

k1(T) = R(T)/G(T),   k2(T) = B(T)/G(T).   (3-8)

In this example, the functions are computed using responses from all the gray surfaces. Shoji Tominaga [22] has experimented with variations of this formula, including methods that emphasize the white surfaces; several different methods produce similar results. A color image acquired at one color temperature can be rendered as an image at another temperature by using k1 and k2. Let [R(T0), G(T0), B(T0)] be the RGB values of a pixel at the estimated color temperature T0. The RGB values at any temperature T can be estimated as

(R, G, B) = (R(T0)·k1(T)/k1(T0), G(T0), B(T0)·k2(T)/k2(T0)).   (3-9)
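Eqs. (3-8) and (3-9) amount to a simple per-channel gain lookup, which can be sketched in Python as follows. The gain-table values used in the comments below are made-up placeholders, not measured camera data.

```python
import numpy as np

def gain_functions(gray_rgb_by_temp):
    """Build k1(T) = R/G and k2(T) = B/G from gray-patch responses, Eq. (3-8).
    gray_rgb_by_temp: {color temperature -> mean (R, G, B) over gray patches}."""
    k1 = {T: rgb[0] / rgb[1] for T, rgb in gray_rgb_by_temp.items()}
    k2 = {T: rgb[2] / rgb[1] for T, rgb in gray_rgb_by_temp.items()}
    return k1, k2

def render_at_temperature(rgb, t0, t, k1, k2):
    """Re-render pixels from estimated temperature t0 to target t, Eq. (3-9)."""
    out = np.asarray(rgb, dtype=float).copy()
    out[..., 0] *= k1[t] / k1[t0]   # scale the R channel
    out[..., 2] *= k2[t] / k2[t0]   # scale the B channel; G is left unchanged
    return out
```

Because k1 and k2 depend only on color temperature, they can be precomputed at the reference temperatures and stored as a small lookup table in the camera.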

3.4 Summary

This chapter has analyzed two related issues concerning scene illumination. First, we considered the classification of color temperature from a single image. We introduced a modification of the correlation method for illuminant estimation. The original correlation method (Finlayson's method) used reference gamuts defined in the chromaticity plane. The estimation performance is improved by assuming that the chromaticity probability distribution converges to a Gaussian distribution; the improvement occurs because the number of illuminant estimation parameters is reduced, which saves hardware memory in the DSC. The color correction method is based on summarizing the ratios of the R, G, and B sensor responses under different illuminants. Both the R and B gains are defined as functions of color temperature, and the color correction can be performed in a simple way using a lookup table. The proposed method can be applied to a variety of real scenes. The precision of the algorithm was evaluated using experimental data obtained with a calibrated camera and real images of outdoor scenes, which will be discussed in the next chapter.