12th International Conference on Information Fusion, Seattle, WA, USA, July 6-9, 2009

Multi-scale Fusion Algorithm Comparisons: Pyramid, DWT and Iterative DWT

Yufeng Zheng
Dept. of Advanced Technologies, Alcorn State University, Alcorn State, MS 39096, USA
yzheng@alcorn.edu

Abstract - An advanced discrete wavelet transform (aDWT) for image fusion was recently proposed, and the aDWT algorithm was further optimized with an iterative procedure utilizing a quantitative fusion metric, the image quality index (IQI) or the ratio of spatial frequency error (rSFe), respectively. In this paper, a modified rSFe metric that considers both first-order and second-order gradients (denoted rSFe_2nd) is presented. Using rSFe_2nd and normalized mutual information (NMI) as optimization metrics, two new iterative aDWT (aDWTi) algorithms are developed. For comparison purposes, the Laplacian pyramid and the regular DWT are also implemented. Spatial frequency and entropy are used as quantitative evaluation metrics. For visual examination, three frequently used image pairs were analyzed and presented. The experimental results show that (1) the aDWTi-rSFe and aDWTi-rSFe_2nd algorithms are the best; (2) aDWTi-IQI provides a smoothed fusion; (3) the aDWTi-NMI algorithm is unsatisfactory; and (4) the Laplacian pyramid is better than the regular DWT but poorer than the iterative DWT algorithms.

Keywords: Advanced discrete wavelet transform (aDWT); Iterative image fusion; Image quality index; Laplacian pyramid; Normalized mutual information; Ratio of spatial frequency error.

1 Introduction

Image fusion has wide applications in medical imaging, remote sensing, nighttime operations and multi-spectral imaging. Together with image registration techniques, increasing attention has been paid to image fusion, especially in the medical image processing domain, for example in computer-aided diagnosis (CAD) with multi-modality images and in image-guided surgery and therapy [1]. Image fusion combines complementary imagery from multiple sources in order to enhance the information apparent in the respective source images, as well as to increase the reliability of interpretation and classification [2-4]. This paper discusses the pixel-level fusion process, where a composite image is built from two (or more) input images. A general framework of image fusion can be found elsewhere [5]. In image fusion, some general requirements [6], for instance pattern conservation and distortion minimization, need to be followed. To measure the image quality, quantitative evaluation of the fused imagery is considered quite important, so that performance comparisons of the respective fusion algorithms can be carried out objectively and automatically. In addition, a quantitative metric may potentially be used as feedback to the fusion algorithm to further improve the fused image quality.

Two common fusion methods are the discrete wavelet transform (DWT) [6-10] and various pyramids (such as the Laplacian, gradient and morphological pyramids) [11-15]. Like any pyramid method, the wavelet-based fusion method is a multi-scale analysis method. The studies in [16-18] showed that the DWT methods have more complete theoretical support as well as further development potential. For example, Ref. [16] showed that an advanced DWT (aDWT) fusion algorithm is superior to pyramid-based methods, and Refs. [17-18] presented an iterative aDWT (aDWTi) algorithm optimized by a specified fusion metric.
This paper discusses four iterative aDWT algorithms and compares their results with those of two common algorithms, the Laplacian pyramid and the regular DWT. The Laplacian pyramid is chosen as a representative of pyramid methods according to Ref. [16]. In the aDWT method [16,19], principal component analysis (PCA) and morphological processing are incorporated into a regular DWT fusion algorithm. Furthermore, the aDWT has two adjustable parameters, the level of DWT decomposition (L_d) and the length of the selected wavelet (L_w), that decisively affect the fusion result. The fused image quality can be quantitatively measured with one of four metrics: the image quality index (IQI) [17], the ratio of spatial frequency error (rSFe) [18], a modified rSFe metric that considers both first-order and second-order gradients (termed rSFe_2nd), and normalized mutual information (NMI). By varying the control parameters (L_d and L_w), an iterative fusion procedure can be implemented and run until an optimized fusion with respect to a particular metric (say, the IQI) is achieved. Specifically, the two common algorithms (Laplacian pyramid and regular DWT) are compared with the four iterative aDWT algorithms that are optimized utilizing the four fusion metrics IQI, rSFe, rSFe_2nd and NMI, respectively. The spatial frequency (SF) and entropy are also used to measure the image quality.

The subsequent sections of this paper are organized as follows. First, the image fusion metrics are fully described. The regular DWT and iterative aDWT image fusion methods are then introduced. Next, the experimental results and discussions are presented. Lastly, conclusions are drawn based upon the experimental analyses.

2 Image fusion metrics

As mentioned in the introduction, an ideal image fusion process should preserve all useful patterns from the source images and meanwhile minimize artifacts that could interfere with subsequent analyses or distract human observers [20]. Given that it is nearly impossible to fuse images without introducing some form of distortion, some measurements are necessary to assess the fused image quality. In this section, we discuss quantitative measures (requiring no ground-truth images) that can be carried out automatically by computers. There are a total of seven metrics (four optimization metrics plus three general evaluation metrics) used in our image fusion experiments.

2.1 Entropy

Entropy (denoted H) is given by the following equation:

H = -\sum_{l=0}^{L-1} p(l) \log_2 p(l),   (1)

where p(l) is the probability of gray level l, and the dynamic range of the analyzed image is [0, L-1] (typically L = 256).

2.2 Normalized mutual information

Mutual information is a similarity measure used for multimodal image registration; it actually measures the relative independence of two images [21]. High values indicate high dependence. The mutual information of two images is given as

MI(A,B) = H(A) + H(B) - H(A,B),   (2)

where H denotes entropy as defined in Eq. (1), and H(A,B) is the joint entropy computed with an estimate of the joint density of the two input images A and B. Normalized mutual information (NMI) is given as

NMI(A,B) = [H(A) + H(B)] / [2 H(A,B)],   (3)

which is less sensitive to the image size. To measure the fused image quality with the NMI defined in Eq. (3), a weighted and normalized NMI metric with entropies is defined by the following equation:

NMI(A,B,F) = [H(A) NMI(A,F) + H(B) NMI(B,F)] / [H(A) + H(B)],   (4)

where F represents the fused image. NMI(A,F) is a measurement of the dependence (similarity) between A and F, and NMI(A,B,F) can be interpreted as a measurement of the similarity between the fused image (F) and the input images (A and B). The larger the NMI(A,B,F), the better the fusion. In the case A = B = F, NMI(A,B,F) = 1 (the highest dependence).
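As a concrete illustration of Eqs. (1)-(4), the following is a minimal NumPy sketch (ours, not the paper's code) that computes the entropy, joint entropy and weighted NMI metric for 8-bit grayscale images; the function names are illustrative, and the factor of 2 in nmi() follows the normalization of Eq. (3) as reconstructed above.

```python
# Illustrative sketch of Eqs. (1)-(4); assumes 8-bit grayscale NumPy arrays.
import numpy as np

def entropy(img, L=256):
    """Shannon entropy H, Eq. (1), for gray levels 0..L-1."""
    p, _ = np.histogram(img, bins=L, range=(0, L))
    p = p / p.sum()
    p = p[p > 0]                        # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

def joint_entropy(a, b, L=256):
    """Joint entropy H(A,B) from the joint gray-level histogram."""
    p, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=L,
                             range=[(0, L), (0, L)])
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def nmi(a, b, L=256):
    """Normalized mutual information, Eq. (3); equals 1 when a == b."""
    return (entropy(a, L) + entropy(b, L)) / (2 * joint_entropy(a, b, L))

def nmi_fusion_metric(a, b, f, L=256):
    """Weighted NMI fusion metric, Eq. (4); larger is better (ideal = 1)."""
    ha, hb = entropy(a, L), entropy(b, L)
    return (ha * nmi(a, f, L) + hb * nmi(b, f, L)) / (ha + hb)
```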
2.3 Image quality index

The image quality index was introduced by Wang and Bovik [22]. Given two images x and y, let \bar{x} denote the mean of x, and let \sigma_x^2 and \sigma_{xy} denote the variance of x and the covariance of x and y, respectively. The global quality index of the two images is defined as

Q_0(x,y) = \frac{4\sigma_{xy}\,\bar{x}\bar{y}}{(\bar{x}^2 + \bar{y}^2)(\sigma_x^2 + \sigma_y^2)},   (5)

which can be decomposed as

Q_0(x,y) = \frac{\sigma_{xy}}{\sigma_x \sigma_y} \cdot \frac{2\bar{x}\bar{y}}{\bar{x}^2 + \bar{y}^2} \cdot \frac{2\sigma_x \sigma_y}{\sigma_x^2 + \sigma_y^2}.   (6)

Note that the first component in Eq. (6) is the correlation coefficient between x and y. This value is a measure of the similarity of the images x and y, and takes values in [-1, 1]. The second component in Eq. (6) corresponds to the luminance distortion and has a dynamic range of [0, 1]. The third factor in Eq. (6) measures the contrast distortion, and its range is also [0, 1]. In summary, Q_0 lies in [-1, 1], and the maximum value Q_0 = 1 is achieved when x and y are identical.

Piella and Heijmans [23] introduced a weighting procedure into the Q_0 calculation. The weight should reflect the local relevance of an input image, which may depend on its local variance, contrast, sharpness, or entropy. Given the local salience (e.g., local variance) of the two input images A and B, we compute a local weight λ indicating the relative importance of image A compared to image B: the larger λ, the more weight is given to image A. A typical choice for λ is

λ = S(I_A) / [S(I_A) + S(I_B)],   (7)

where S(I_A) and S(I_B) denote the salience of input images A and B, respectively. Then, the weighted image quality index (IQI) can be defined as

Q_w = λ Q_0(I_A, I_F) + (1 - λ) Q_0(I_B, I_F).   (8)

Since image signals are generally non-stationary, it is more appropriate to measure the weighted image quality index Q_w over local regions (e.g., by splitting the entire image into a set of blocks) and then combine the local results into a single measure for the entire image. Piella also suggested using the local variance as the salience of an image, i.e., S(I_A) = \sigma_A^2. In fact, the IQI metric measures the similarity between the fused image (I_F) and both of the input images (I_A and I_B) by assuming that an ideally fused image should resemble both original input images. For example, in our experiments each image (I_A, I_B or I_F) is divided into blocks of 16 × 16 pixels; Q_0 and λ are computed on each small block (window), which yields the block-wise Q_w by Eq. (8); finally, the Q_w values are summed over all blocks and averaged.
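The block-wise IQI computation described above can be sketched as follows; this is an illustrative NumPy implementation (not the authors' code), assuming float-valued grayscale arrays and the 16 × 16 non-overlapping blocks used in the experiments.

```python
# Illustrative sketch of Eqs. (5), (7) and (8); assumes float NumPy arrays.
import numpy as np

def q0(x, y, eps=1e-12):
    """Universal image quality index Q0 of Wang and Bovik, Eq. (5)."""
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()     # covariance sigma_xy
    return 4 * cxy * mx * my / ((mx**2 + my**2) * (vx + vy) + eps)

def iqi(a, b, f, block=16):
    """Weighted IQI, Eq. (8), averaged over non-overlapping blocks;
    local variance serves as the salience S, as Piella suggested."""
    vals = []
    M, N = a.shape
    for i in range(0, M - block + 1, block):
        for j in range(0, N - block + 1, block):
            wa = a[i:i+block, j:j+block]
            wb = b[i:i+block, j:j+block]
            wf = f[i:i+block, j:j+block]
            sa, sb = wa.var(), wb.var()
            lam = sa / (sa + sb) if (sa + sb) > 0 else 0.5   # Eq. (7)
            vals.append(lam * q0(wa, wf) + (1 - lam) * q0(wb, wf))
    return float(np.mean(vals))
```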

2.4 Spatial frequency and its modification

The spatial frequency (SF) metric [24-25] is used to measure the overall activity level of an image. The spatial frequency of an image is defined as

SF = \sqrt{[(RF)^2 + (CF)^2 + (MDF)^2 + (SDF)^2] / (4-1)},   (9)

where RF and CF are the row frequency and column frequency, respectively, and MDF and SDF represent the main-diagonal SF and secondary-diagonal SF. Eq. (9) is a revision of the original definition of spatial frequency (refer to [16,18,24-25]) that introduces the two diagonal SFs and normalizes by the degrees of freedom. The four directional spatial frequencies are defined as follows:

RF = \sqrt{\frac{1}{MN} \sum_{i=1}^{M} \sum_{j=2}^{N} [I(i,j) - I(i,j-1)]^2},   (10a)

CF = \sqrt{\frac{1}{MN} \sum_{j=1}^{N} \sum_{i=2}^{M} [I(i,j) - I(i-1,j)]^2},   (10b)

MDF = \sqrt{w_d \frac{1}{MN} \sum_{i=2}^{M} \sum_{j=2}^{N} [I(i,j) - I(i-1,j-1)]^2},   (10c)

SDF = \sqrt{w_d \frac{1}{MN} \sum_{i=2}^{M} \sum_{j=1}^{N-1} [I(i,j) - I(i-1,j+1)]^2},   (10d)

where w_d = 1/\sqrt{2} is a distance weight for the diagonal directions (similarly, it can be considered that w_d = 1 in Eqs. (10a-b)), and M and N are the image dimensions (in pixels). Notice that the term spatial frequency as computed in the spatial domain by Eqs. (9-10) does not correspond to the Fourier-transform sense of spatial frequency, which is measured in the frequency domain in units of cycles per degree or cycles per millimeter.

The SF definition given above can be rewritten to consider the first-order and second-order gradients simultaneously (termed SF_2nd):

SF_2nd = \sqrt{[(RF)^2 + (CF)^2 + (MDF)^2 + (SDF)^2 + (RF_2)^2 + (CF_2)^2 + (MDF_2)^2 + (SDF_2)^2] / (8-1)},   (11)

where RF_2 is the row frequency of the second-order gradients, defined as

RF_2 = \sqrt{\frac{1}{MN} \sum_{i=1}^{M} \sum_{j=3}^{N} [I(i,j) - 2I(i,j-1) + I(i,j-2)]^2}.   (12)

The other three directional frequencies of the second-order gradients (CF_2, MDF_2 and SDF_2) are defined analogously to Eq. (12).
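For illustration, the SF and SF_2nd metrics of Eqs. (9)-(12) can be computed as in the sketch below. This is a NumPy sketch under the normalization reconstructed above (division by MN, degrees of freedom 4-1 and 8-1, and diagonal weight w_d = 1/sqrt(2)); the function names are illustrative.

```python
# Illustrative sketch of Eqs. (9)-(12); assumes a float NumPy image I.
import numpy as np

def directional_sf(I):
    """Return (RF, CF, MDF, SDF) of Eqs. (10a-d); w_d = 1/sqrt(2) weights
    the diagonal differences by the inverse of the pixel distance."""
    n = I.size                              # MN normalization
    wd = 1.0 / np.sqrt(2.0)
    RF  = np.sqrt(np.sum((I[:, 1:] - I[:, :-1])**2) / n)
    CF  = np.sqrt(np.sum((I[1:, :] - I[:-1, :])**2) / n)
    MDF = np.sqrt(wd * np.sum((I[1:, 1:] - I[:-1, :-1])**2) / n)
    SDF = np.sqrt(wd * np.sum((I[1:, :-1] - I[:-1, 1:])**2) / n)
    return RF, CF, MDF, SDF

def sf(I):
    """Overall spatial frequency, Eq. (9)."""
    return np.sqrt(sum(d**2 for d in directional_sf(I)) / (4 - 1))

def sf_2nd(I):
    """SF combining first- and second-order gradients, Eq. (11); the row
    term of Eq. (12) is I(i,j) - 2I(i,j-1) + I(i,j-2), and likewise for
    the column and the two diagonal directions."""
    n = I.size
    wd = 1.0 / np.sqrt(2.0)
    RF2  = np.sqrt(np.sum((I[:, 2:] - 2*I[:, 1:-1] + I[:, :-2])**2) / n)
    CF2  = np.sqrt(np.sum((I[2:, :] - 2*I[1:-1, :] + I[:-2, :])**2) / n)
    MDF2 = np.sqrt(wd * np.sum((I[2:, 2:] - 2*I[1:-1, 1:-1] + I[:-2, :-2])**2) / n)
    SDF2 = np.sqrt(wd * np.sum((I[2:, :-2] - 2*I[1:-1, 1:-1] + I[:-2, 2:])**2) / n)
    first = sum(d**2 for d in directional_sf(I))
    return np.sqrt((first + RF2**2 + CF2**2 + MDF2**2 + SDF2**2) / (8 - 1))
```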
2.5 The ratio of spatial frequency error and its modification

With Eq. (9) we can calculate the SFs of the input images (SF_A and SF_B) and of the fused image (SF_F). We now determine how to calculate a reference SF (SF_R) against which SF_F can be compared. The four differences (inside the square brackets) defined in Eqs. (10a-d) are actually the four first-order gradients along four directions at each pixel, denoted Grad[I(i,j)]. The four reference gradients are obtained by taking the maximum of the absolute gradient values between input images A and B along each direction:

Grad^D[I_R(i,j)] = \max\{ |Grad^D[I_A(i,j)]|, |Grad^D[I_B(i,j)]| \},  for each of the four directions D = {H, V, MD, SD},   (13)

where D denotes one of the four directions (Horizontal, Vertical, Main Diagonal and Secondary Diagonal). Substituting the differences (defined inside the square brackets) in Eqs. (10a-d) with Grad^D[I_R(i,j)], the four directional reference SFs (i.e., RF_R, CF_R, MDF_R and SDF_R) can be calculated. For example, the reference row frequency can be calculated as

RF_R = \sqrt{\frac{1}{MN} \sum_{i=1}^{M} \sum_{j=2}^{N} [Grad^H(I_R(i,j))]^2}.   (14)

Similar to Eq. (9), SF_R can be computed by combining the four directional reference SFs. Note that the notation Grad^H[I_R(i,j)] is interpreted as the horizontal reference gradient at point (i,j); no reference image is needed to compute the SF_R value. Finally, the ratio of spatial frequency error (rSFe) is defined as

rSFe = (SF_F - SF_R) / SF_R.   (15)

Clearly, an ideal fusion has rSFe = 0; that is, the smaller the absolute value of rSFe, the better the fused image. Furthermore, rSFe > 0 means that an over-fused image, with some distortion or noise introduced, has resulted; rSFe < 0 denotes that an under-fused image, with some meaningful information lost, has been produced. With the definition of SF_2nd in Eq. (11) and the description above, it is straightforward to define and compute SF_F_2nd and SF_R_2nd by considering the first-order and second-order gradients simultaneously. Similar to Eq. (15), a modified rSFe metric can be constructed, termed rSFe_2nd. Specifically, to compute SF_R_2nd, both Grad^D[I_R(i,j)] and Grad2nd^D[I_R(i,j)] (refer to Eqs. (13-14)) have to be calculated first.
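A sketch of the rSFe computation follows; it reuses the directional first-order gradients above and is an illustrative implementation consistent with Eqs. (13)-(15), not the authors' code.

```python
# Illustrative sketch of the reference SF and the rSFe metric, Eqs. (13)-(15).
import numpy as np

def rsfe(a, b, f):
    """Ratio of spatial frequency error, Eq. (15): (SF_F - SF_R) / SF_R.
    SF_R combines, per direction, the element-wise maximum of the absolute
    gradients of the two inputs (Eq. 13); no reference image is needed."""
    n = a.size
    wd = 1.0 / np.sqrt(2.0)
    # (row, column, main-diagonal, secondary-diagonal) difference operators
    grads = [
        (lambda I: I[:, 1:] - I[:, :-1], 1.0),
        (lambda I: I[1:, :] - I[:-1, :], 1.0),
        (lambda I: I[1:, 1:] - I[:-1, :-1], wd),
        (lambda I: I[1:, :-1] - I[:-1, 1:], wd),
    ]
    sf_f2, sf_r2 = 0.0, 0.0
    for g, w in grads:
        gf = g(f)
        gr = np.maximum(np.abs(g(a)), np.abs(g(b)))   # Eq. (13)
        sf_f2 += w * np.sum(gf**2) / n                # squared directional SF
        sf_r2 += w * np.sum(gr**2) / n                # squared reference SF, Eq. (14)
    sf_f = np.sqrt(sf_f2 / (4 - 1))                   # combine as in Eq. (9)
    sf_r = np.sqrt(sf_r2 / (4 - 1))
    return (sf_f - sf_r) / sf_r
```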

3 Image fusion methods

In this section, the Laplacian pyramid and regular DWT fusion methods are briefly reviewed. An advanced discrete wavelet transform (aDWT) [16] is then introduced. Based on the aDWT, four iterative fusion algorithms can be formed and optimized with the four established metrics IQI, rSFe, rSFe_2nd and NMI, respectively.

3.1 Laplacian pyramid

The Laplacian pyramid was first introduced as a model for binocular fusion in human stereo vision [20], where the implementation used a Laplacian pyramid and a maximum-selection rule at each point of the pyramid transform. Essentially, the procedure generates a set of band-pass copies of an image, referred to as a Laplacian pyramid due to its similarity to the output of a Laplacian operator. Each level of the Laplacian pyramid is recursively constructed from its lower level by applying four basic steps: blurring (low-pass filtering), sub-sampling (size reduction), interpolation (expansion), and differencing (subtracting two images pixel by pixel) [12]. In the Laplacian pyramid, the lowest level of the pyramid is constructed from the original image.
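A minimal sketch of Laplacian-pyramid fusion is given below, using OpenCV's pyrDown/pyrUp for the blur/sub-sample and expand steps. The maximum-selection rule on the band-pass levels follows the description above; averaging the low-pass residual is our assumption, since the base-level rule is not specified here.

```python
# Illustrative sketch of Laplacian-pyramid fusion (max-selection rule).
import cv2
import numpy as np

def laplacian_pyramid(img, levels=4):
    """Band-pass pyramid: each level is the image minus its expanded
    coarser version; the final entry is the low-pass residual."""
    pyr, cur = [], img.astype(np.float32)
    for _ in range(levels):
        down = cv2.pyrDown(cur)                            # blur + sub-sample
        up = cv2.pyrUp(down, dstsize=(cur.shape[1], cur.shape[0]))
        pyr.append(cur - up)                               # differencing
        cur = down
    pyr.append(cur)
    return pyr

def fuse_laplacian(a, b, levels=4):
    pa, pb = laplacian_pyramid(a, levels), laplacian_pyramid(b, levels)
    fused = [np.where(np.abs(la) >= np.abs(lb), la, lb)    # max selection
             for la, lb in zip(pa[:-1], pb[:-1])]
    fused.append((pa[-1] + pb[-1]) / 2)                    # residual: average (our assumption)
    out = fused[-1]
    for lap in reversed(fused[:-1]):                       # reconstruct coarse-to-fine
        out = cv2.pyrUp(out, dstsize=(lap.shape[1], lap.shape[0])) + lap
    return out
```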
3.2 DWT fusion and the advanced DWT

The DWT fusion method is a multi-scale analysis method. At each DWT scale of a particular image, the DWT coefficients of a 2D image consist of four parts: approximation, horizontal detail, vertical detail and diagonal detail. In a regular DWT fusion process, the DWT coefficients from the two input images are fused pixel by pixel by choosing the average of the approximation coefficients at the highest transform scale, and the larger absolute value of the detail coefficients at each transform scale. An inverse DWT is then performed to obtain the fused image.

In the advanced DWT (aDWT) method, we apply principal component analysis (PCA) to the two input images' approximation coefficients at the highest transform scale; that is, we fuse them using the principal eigenvector (corresponding to the larger eigenvalue) derived from the two original images, as described in Eq. (16):

C_F = (a_1 C_A + a_2 C_B) / (a_1 + a_2),   (16)

where C_A and C_B are the approximation coefficients transformed from input images A and B, C_F represents the fused coefficients, and a_1 and a_2 are the elements of the principal eigenvector, which are computed by analyzing the original input images (not C_A and C_B alone, because their sizes at the highest transform scale are too small to produce an accurate result). Note that the denominator in Eq. (16) is used for normalization, so that the fused image has the same energy distribution as the original input images.

For the detail coefficients (the other three quarters of the coefficients) at each transform scale, the larger absolute values are selected, followed by neighborhood (e.g., a 3 × 3 window) morphological processing, which serves to verify the selected pixels by using a filling and cleaning operation (i.e., the operation fills or removes isolated pixels locally). For example, in a 3 × 3 processing window (sliding pixel by pixel over the whole image), if the central coefficient was selected from image A but all of its 8 surrounding coefficients were selected from image B, then the central one would be replaced with the detail coefficient from image B. Such an operation (similar to smoothing) can increase the consistency of coefficient selection, thereby reducing intensity distortion in the fused image.
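The aDWT fusion rule can be sketched with PyWavelets as follows. Eq. (16) and the max-absolute detail selection follow the text; the default Symlet and decomposition level are illustrative choices, and the 3 × 3 majority filter is our approximation of the morphological filling/cleaning operation described above.

```python
# Illustrative sketch of the aDWT fusion rule: PCA-weighted approximation
# band (Eq. 16) plus max-absolute detail selection with a 3x3 consistency check.
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def pca_weights(a, b):
    """Principal eigenvector of the 2x2 covariance of the input images,
    computed from the full images as the text requires."""
    cov = np.cov(np.vstack([a.ravel(), b.ravel()]))
    w, v = np.linalg.eigh(cov)              # eigenvalues in ascending order
    a1, a2 = np.abs(v[:, np.argmax(w)])     # eigenvector of larger eigenvalue
    return a1, a2

def fuse_adwt(a, b, wavelet='sym2', level=4):
    a1, a2 = pca_weights(a, b)
    ca = pywt.wavedec2(a, wavelet, level=level)
    cb = pywt.wavedec2(b, wavelet, level=level)
    fused = [(a1 * ca[0] + a2 * cb[0]) / (a1 + a2)]        # Eq. (16)
    for da, db in zip(ca[1:], cb[1:]):                     # detail bands per scale
        bands = []
        for wa, wb in zip(da, db):
            pick_a = np.abs(wa) >= np.abs(wb)              # max-absolute rule
            # consistency check: follow the 3x3 neighborhood majority
            # (our approximation of the filling/cleaning operation)
            pick_a = uniform_filter(pick_a.astype(float), size=3) > 0.5
            bands.append(np.where(pick_a, wa, wb))
        fused.append(tuple(bands))
    return pywt.waverec2(fused, wavelet)
```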

3.3 The iterative aDWT algorithm optimized by the IQI

The IQI value (discussed in Section 2.3) is calculated to measure the quality of the image fused by the aDWT and is then fed back to the fusion algorithm in order to achieve a better fusion by adjusting its parameters. Previous experiments [16] have pointed to an important relationship between the fused image quality and the wavelet properties: a higher-level DWT decomposition (with smaller image resolution at the higher scale) or a lower order of wavelets (with shorter length) usually results in a more sharpened fused image. This means that we can use the level of DWT decomposition (denoted L_d) and the length of the wavelet (denoted L_w) as the adjustable parameters of an iterative aDWT (aDWTi) algorithm. When measured with the IQI, the IQI value usually tends to be large for a smaller L_d or a larger L_w. The combination of the aDWTi algorithm and the IQI metric yields a new algorithm, termed aDWTi-IQI. From the definition of the IQI, we know that it has an ideal value, 1, but it cannot give the error direction (i.e., positive or negative; refer to Section 3.4) because 0 < IQI < 1 (i.e., always IQI - 1 < 0).

Of course, termination conditions are needed in order to stop the fusion iteration. The fusion iteration stops when (1) it converges at the ideal point, i.e., the absolute value of (IQI - 1) is smaller than a designated tolerance error, |IQI - 1| < ε; (2) there is no significant change of the IQI value between two adjacent iterations; (3) the IQI value is generally decreasing for subsequent iterations; or (4) the parameter boundaries are reached. In implementing the iteration of a fusion procedure, appropriate boundaries for the varying parameters should be designated based on the definition of the parameters and the context. The details of implementation are given in Ref. [17].

3.4 The iterative aDWT algorithms optimized by the rSFe or rSFe_2nd

A fusion procedure similar to that depicted in Section 3.3 is applicable here, except that a different optimization metric is used. Here we use the rSFe (or rSFe_2nd) to direct the iterative fusion process, which results in two aDWTi algorithms, aDWTi-rSFe and aDWTi-rSFe_2nd, respectively. Notice that the error of the rSFe (or rSFe_2nd) value (subtracting the current rSFe or rSFe_2nd value from the ideal value, 0) can indicate the error direction (positive or negative), which is useful for deciding the adjustment of the parameters (L_d and L_w) (i.e., increment or decrement). For example, if the current rSFe(L_d, L_w) < 0 (under-fused), then increase L_d or decrease L_w for the next aDWTi fusion iteration. Termination condition (3) is replaced with "the absolute value of rSFe (or rSFe_2nd) is generally increasing for subsequent iterations". One more termination condition is applicable to the aDWTi-rSFe (or aDWTi-rSFe_2nd) algorithm: (5) stop the iteration when the rSFe (or rSFe_2nd) value crosses over zero (the ideal value). Using the sign of the rSFe (or rSFe_2nd), the iteration procedure can be expedited.

3.5 The iterative aDWT algorithm optimized by the NMI

Similar to aDWTi-IQI, the aDWTi-NMI algorithm can be formed by using the NMI as the optimization metric (instead of the IQI). The NMI has an ideal value, 1.0, but it cannot provide the error direction.
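To make the iterative procedure concrete, the sketch below shows an rSFe-driven aDWTi loop in the spirit of Sections 3.3-3.4, reusing the fuse_adwt and rsfe sketches above. The greedy parameter schedule, the bounds on L_d and the Symlet order, and the tolerance are illustrative assumptions rather than the paper's exact rules; only the convergence and parameter-boundary stopping conditions are implemented here.

```python
# Illustrative sketch of an rSFe-driven iterative aDWT (aDWTi) search.
def adwti_rsfe(a, b, ld=6, order=3, max_iter=20, tol=1e-3):
    """Greedy search over (L_d, wavelet order): rSFe < 0 means under-fused,
    so sharpen (raise L_d or shorten the wavelet); rSFe > 0 means over-fused,
    so do the opposite. Returns the best (rSFe, fused image, L_d, order)."""
    best = None
    for _ in range(max_iter):
        fused = fuse_adwt(a, b, wavelet='sym{}'.format(order), level=ld)
        err = rsfe(a, b, fused)
        if best is None or abs(err) < abs(best[0]):
            best = (err, fused, ld, order)
        if abs(err) < tol:
            break                           # converged near the ideal value 0
        if err < 0:                         # under-fused: sharpen
            if ld < 8: ld += 1
            elif order > 2: order -= 1
            else: break                     # parameter boundaries reached
        else:                               # over-fused: smooth
            if order < 8: order += 1
            elif ld > 1: ld -= 1
            else: break
    return best
```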
4 Experiments and discussions

For visual examination, quantitative analysis and comparison, three frequently used image pairs were fused and are presented in Figs. 1-3. The quantitative evaluation values are shown in Tables 1-2.

4.1 Experimental design

In our experiments, six fusion algorithms were implemented and applied to the three image pairs: the Laplacian pyramid, the regular DWT, and four iterative DWT procedures (i.e., aDWTi-IQI, aDWTi-rSFe, aDWTi-rSFe_2nd and aDWTi-NMI, with this fixed order used by default whenever the four iterative algorithms are mentioned hereafter). As recommended in the literature, four-level fusion procedures were implemented in the Laplacian pyramid and regular DWT methods. For the regular DWT, the length of the wavelet was four (L_w = 4). Symlets were chosen for all DWT-based algorithms because Symlets are modifications of Daubechies wavelets with increased symmetry while retaining great simplicity. The four iterative DWT procedures were implemented by separately varying the two parameters, i.e., first changing L_d (because this parameter dominates the fused image quality [17-18]) and then changing L_w, to optimize the respective metric. The convergence speed of the aDWTi-rSFe (or -rSFe_2nd) is faster than that of the other two algorithms. The optimized parameters (L_d, L_w) are given in Table 2. For reference, the SF and entropy values of the original input images are listed in Table 1.

4.2 Result analyses and discussion

As shown in Figs. 1-3, three frequently used image pairs were fused: night-vision (NV II/IR) images, medical images (CT/MRI) and off-focal images (clocks). From the three examples shown in Figs. 1-3, casual inspection shows that the images fused by aDWTi-NMI (shown in Figs. 1-3(h)) are the poorest (Fig. 2(h) is especially unsatisfactory), and that the images (Figs. 1-3(e-g)) fused by the other three iterative algorithms (aDWTi-rSFe, aDWTi-rSFe_2nd and aDWTi-IQI) are better than the images (Figs. 1-3(c-d)) fused by the two common algorithms (Laplacian pyramid and regular DWT).

For quantitative evaluation, seven metrics are shown in Table 2, but we need to select one metric to measure the image quality across the six different algorithms in order to make a proper and fair comparison. The IQI, rSFe, rSFe_2nd and NMI were used as optimization metrics in the four iterative algorithms and thus cannot be selected for comparison. The entropy metric (H) is not suitable because it cannot reflect the fusion relationship; for instance, we expect H(T1+T2) ≥ any of {H(T1), H(T2)}, and for most fusion cases the entropy values do not obey this inequality whereas the SF values do. Although the SF metric itself cannot absolutely measure the fused image quality, because it cannot distinguish useful information from artifacts or noise, it is still suitable for comparing the quality of fused images (i.e., as a relative measurement). SF_2nd is not used as a comparison metric because it is a newly proposed metric and has not been verified yet. We therefore assume that a larger SF value of a fused image means more information and thus a better fusion. In the SF evaluations, the largest SF value among the six values for each image pair is marked (with an asterisk in Table 2). In fact, the best algorithm (aDWTi-rSFe) evaluated with the SF metric is consistent with the visual examinations (refer to Figs. 1-3).

In sum, by examining all the fused images (shown in Figs. 1-3) both visually and quantitatively, the quality of the fused images produced by the six algorithms is ordered from best to poorest as follows: aDWTi-rSFe (aDWTi-rSFe_2nd), aDWTi-IQI, Laplacian pyramid, regular DWT, and aDWTi-NMI. In addition, aDWTi-IQI generates smoothed images whereas aDWTi-rSFe (or aDWTi-rSFe_2nd) produces sharpened images. An interesting question is what the optimal values of the parameters (L_d, L_w) are. Although they depend on the particular application, the experimental values are shown in Table 2. To obtain an optimized fusion using aDWTi-rSFe, we may initially set (L_d, L_w) = (6, 3), which can speed up the convergence.

5 Conclusions

The Laplacian pyramid, the regular DWT and four iterative DWT fusion procedures (aDWTi-IQI, aDWTi-rSFe, aDWTi-rSFe_2nd and aDWTi-NMI) were developed and compared in application to three image pairs. The fused images were evaluated quantitatively and qualitatively (visually). From the experimental results, the following points can be concluded: (1) aDWTi-rSFe and aDWTi-rSFe_2nd give a better and sharpened fusion; (2) the aDWTi-IQI algorithm produces a reasonably good and smoothed image; (3) aDWTi-NMI cannot provide a satisfactory fusion; in other words, the NMI metric does not seem suitable as an optimization metric; and (4) the Laplacian pyramid is sometimes better than the regular DWT but not as good as the iterative DWT methods. The fused images are greatly useful for further image processing such as segmentation, computer-aided diagnosis (CAD) and automatic target recognition.

Acknowledgements

This research is supported by the U.S. Army Research Office under grant number W911NF-08-1-0404.

Table 1: Three general measurements (spatial frequency, revised spatial frequency and entropy) of the input images.

Image    | SF_A   | SF_B   | SF_2nd_A | SF_2nd_B | Epy_A  | Epy_B
Clocks   | 8.139  | 6.4494 | 7.1047   | 6.155    | 6.9784 | 6.94
CT/MRI   | 1.3174 | 1.043  | 10.343   | 9.6346   | 1.716  | 5.6561
NV II/IR | 9.5474 | 8.5130 | 9.875    | 9.7944   | 7.1484 | 6.7589

Table 2: Seven measurements of the six versions of fused images (refer to Figs. 1-3) for each of the three frequently used image pairs. The three parameter values (number of iterations, level of decomposition and length of wavelet) of each iterative fusion algorithm are given in Columns 3-5. For each image pair, the SF value of the best algorithm evaluated with the selected metric (SF) is marked with an asterisk.

Image    | Algorithm         | #Iter | L_d | L_w | IQI    | rSFe    | rSFe_2nd | NMI    | SF      | SF_2nd  | Entropy
Clocks   | Laplacian pyramid | -     | 4   | -   | 0.8887 | 0.88    | 0.186    | 0.6941 | 7.6415  | 7.0407  | 7.1156
Clocks   | Regular DWT       | -     | 4   | 4   | 0.8978 | 0.018   | 0.1904   | 0.6877 | 7.9095  | 7.954   | 7.187
Clocks   | aDWTi-IQI         | 15    | 6   | 7   | 0.97   | -0.0363 | -0.003   | 0.6847 | 9.5491  | 8.880   | 7.4398
Clocks   | aDWTi-rSFe        | 13    | 7   | 3   | 0.9    | -0.036  | -0.0149  | 0.654  | 9.586*  | 8.8761  | 7.4181
Clocks   | aDWTi-rSFe_2nd    | 15    | 6   | 3   | 0.93   | -0.037  | -0.0147  | 0.6670 | 9.5845  | 8.8777  | 7.4385
Clocks   | aDWTi-NMI         | 16    | 1   | 5   | 0.9068 | -0.3169 | -0.718   | 0.7014 | 6.7686  | 6.561   | 7.374
CT/MRI   | Laplacian pyramid | -     | 4   | -   | 0.6975 | 0.397   | 0.3083   | 0.5499 | 11.308  | 9.458   | 6.4705
CT/MRI   | Regular DWT       | -     | 4   | 4   | 0.5107 | 0.5146  | 0.4984   | 0.5606 | 8.1336  | 6.8554  | 5.9146
CT/MRI   | aDWTi-IQI         | 15    | 5   | 10  | 0.7797 | -0.0381 | -0.0144  | 0.5589 | 16.118  | 13.4700 | 6.8934
CT/MRI   | aDWTi-rSFe        | 15    | 4   | 2   | 0.7696 | 0.0093  | 0.0753   | 0.610  | 16.910* | 14.6957 | 6.7199
CT/MRI   | aDWTi-rSFe_2nd    | 5     | 4   | 3   | 0.748  | -0.0454 | 0.0009   | 0.5967 | 15.9950 | 13.6791 | 6.8056
CT/MRI   | aDWTi-NMI         | 15    | 1   | 9   | 0.6156 | -0.307  | -0.303   | 0.7676 | 11.6090 | 9.5353  | 6.1858
NV II/IR | Laplacian pyramid | -     | 4   | -   | 0.7335 | -0.17   | -0.1047  | 0.5630 | 10.494  | 11.604  | 7.0794
NV II/IR | Regular DWT       | -     | 4   | 4   | 0.6494 | -0.966  | -0.845   | 0.5300 | 8.4576  | 9.78    | 6.1714
NV II/IR | aDWTi-IQI         | 15    | 6   | 7   | 0.7089 | -0.0089 | 0.001    | 0.540  | 11.917  | 1.9868  | 6.9556
NV II/IR | aDWTi-rSFe        | 14    | 2   | 2   | 0.6107 | 0.00    | 0.0999   | 0.5369 | 1.0474* | 14.537  | 6.5885
NV II/IR | aDWTi-rSFe_2nd    | 1     | 3   | 3   | 0.6388 | -0.0405 | 0.0085   | 0.539  | 11.5366 | 13.0693 | 6.576
NV II/IR | aDWTi-NMI         | 3     | 1   | 2   | 0.6199 | -0.0398 | 0.098    | 0.541  | 11.5448 | 14.35   | 6.7017

Fig. 1: Night-vision images: (a) image intensified (II), (b) infrared (IR), (c) fused by Laplacian pyramid, (d) fused by regular DWT, (e) fused by aDWTi-IQI, (f) fused by aDWTi-rSFe, (g) fused by aDWTi-rSFe_2nd, (h) fused by aDWTi-NMI.

Fig. 2: Medical images: (a) CT image, (b) MRI image, (c) fused by Laplacian pyramid, (d) fused by regular DWT, (e) fused by aDWTi-IQI, (f) fused by aDWTi-rSFe, (g) fused by aDWTi-rSFe_2nd, (h) fused by aDWTi-NMI.

Fig. 3: Image Clocks: (a) off-focal image A, (b) off-focal image B, (c) fused by Laplacian pyramid, (d) fused by regular DWT, (e) fused by aDWTi-IQI, (f) fused by aDWTi-rSFe, (g) fused by aDWTi-rSFe_2nd, (h) fused by aDWTi-NMI.

References

[1] J. V. Hajnal, D. L. G. Hill, and D. J. Hawkes, "View of the future," in Medical Image Registration, J. V. Hajnal, D. L. G. Hill, and D. J. Hawkes, Eds., Boca Raton, FL: CRC, 2001.

[2] R. H. Rogers and L. Wood, "The history and status of merging multiple sensor data: an overview," in Technical Papers 1990, ACSM-ASPRS Annual Conv. Image Processing and Remote Sensing 4, pp. 352-360 (1990).

[3] E. A. Essock, M. J. Sinai, J. S. McCarley, W. K. Krebs, and J. K. DeFord, "Perceptual ability with real-world nighttime scenes: image-intensified, infrared, and fused-color imagery," Hum. Factors 41(3), 438-452 (1999).

[4] E. A. Essock, J. S. McCarley, M. J. Sinai, and J. K. DeFord, "Human perception of sensor-fused imagery," in Interpreting Remote Sensing Imagery: Human Factors, R. R. Hoffman and A. B. Markman, Eds., Lewis Publishers, Boca Raton, Florida (Feb. 2001).

[5] C. Pohl and J. L. Van Genderen, "Review article: multisensor image fusion in remote sensing: concepts, methods and applications," Int. J. Remote Sens. 19(5), 823-854 (1998).

[6] O. Rockinger, "Pixel level fusion of image sequences using wavelet frames," in Proc. 16th Leeds Applied Shape Research Workshop, pp. 149-154, Leeds University Press (1996).

[7] H. Li, B. S. Manjunath, and S. K. Mitra, "Multisensor image fusion using the wavelet transform," Graph. Models Image Process. 57(3), 235-245 (1995).

[8] T. Pu and G. Ni, "Contrast-based image fusion using the discrete wavelet transform," Opt. Eng. 39(8), 2075-2082 (2000).

[9] D. A. Yocky, "Image merging and data fusion by means of the discrete two-dimensional wavelet transform," J. Opt. Soc. Am. A 12(9), 1834-1841 (1995).

[10] J. Nunez, X. Otazu, O. Fors, A. Prades, V. Pala, and R. Arbiol, "Image fusion with additive multiresolution wavelet decomposition; applications to SPOT+Landsat images," J. Opt. Soc. Am. A 16, 467-474 (1999).

[11] F. Jahard, D. A. Fish, A. A. Rio, and C. P. Thompson, "Far/near infrared adapted pyramid-based fusion for automotive night vision," in IEEE Proc. 6th Int. Conf. on Image Processing and its Applications (IPA97), pp. 886-890 (1997).

[12] P. J. Burt and E. Adelson, "The Laplacian pyramid as a compact image code," IEEE Trans. Commun. COM-31(4), 532-540 (1983).

[13] A. Toet, L. J. Van Ruyven, and J. M. Valeton, "Merging thermal and visual images by a contrast pyramid," Opt. Eng. 28(7), 789-792 (1989).

[14] P. J. Burt, "A gradient pyramid basis for pattern-selective image fusion," SID Int. Symp. Digest Tech. Papers 16, 467-470 (1985).

[15] A. Toet, "A morphological pyramid image decomposition," Pattern Recogn. Lett. 9(4), 255-261 (1989).

[16] Y. Zheng, E. A. Essock, and B. C. Hansen, "An advanced image fusion algorithm based on wavelet transform incorporation with PCA and morphological processing," Proc. SPIE 5298, 177-187 (2004).

[17] Y. Zheng, E. A. Essock, and B. C. Hansen, "An advanced DWT fusion algorithm and its optimization by using the metric of image quality index," Opt. Eng. 44(3), 037003 (2005).

[18] Y. Zheng, E. A. Essock, B. C. Hansen, and A. M. Haun, "A new metric based on extended spatial frequency and its application to DWT based fusion algorithms," Information Fusion 8(2), 177-192 (2007).

[19] Y. Zheng, A. S. Elmaghraby, and H. Frigui, "Three-band MRI image fusion utilizing the wavelet-based method optimized with two quantitative fusion metrics," Proc. SPIE 6144, 61440R (2006).

[20] P. J. Burt and E. H. Adelson, "Merging images through pattern decomposition," Proc. SPIE 575, 173-181 (1985).

[21] D. L. G. Hill and P. Batchelor, "Registration methodology: concepts and algorithms," in Medical Image Registration, J. V. Hajnal, D. L. G. Hill, and D. J. Hawkes, Eds., Boca Raton, FL: CRC, 2001.

[22] Z. Wang and A. C. Bovik, "A universal image quality index," IEEE Signal Process. Lett. 9(3), 81-84 (2002).

[23] G. Piella and H. Heijmans, "A new quality metric for image fusion," in Proc. 2003 Int. Conf. on Image Processing, Barcelona, Spain (2003).

[24] M. Eskicioglu and P. S. Fisher, "Image quality measures and their performance," IEEE Trans. Commun. 43(12), 2959-2965 (1995).

[25] S. Li, J. T. Kwok, and Y. Wang, "Combination of images with diverse focuses using the spatial frequency," Information Fusion 2(3), 169-176 (2001).