IMAGE CODING USING WAVELET TRANSFORM, VECTOR QUANTIZATION, AND ZEROTREES

Juan Claudio Regidor Barrientos *, Maria Angeles Losada Binue **, Antonio Artes Rodriguez **, Francisco D Alvano *, Luis Urbano *

(*) Grupo de Procesamiento de Señales, Departamento de Electronica y Circuitos, Universidad Simon Bolivar, Aptdo. 89000, Caracas 1080-A, Venezuela. Tel.: (58)(2) 906-36-30, Fax: (58)(2) 906-36-31, e-mail: jregidor@skynet.usb.ve, fdalvano@skynet.usb.ve, lurbano@skynet.usb.ve

(**) Departamento de Señales, Sistemas y Radiocomunicaciones, Escuela Tecnica Superior de Ingenieros de Telecomunicacion, Universidad Politécnica de Madrid, Ciudad Universitaria, s/n, Madrid 28040, España. Tel.: (34)(1) 549-5700, ext. 231, Fax: (34)(1) 336-7350, e-mail: angeles@gtts.ssr.upm.es, antonio@gtts.ssr.upm.es

Abstract. This paper proposes an image coding procedure that combines the vector quantization of a wavelet-transformed image with a modified form of Shapiro's zerotree elimination algorithm. With this new approach, applied to radiographs, we have obtained compression rates better than with other methods tested at the ETSIT-UPM, with subjective quality of the reconstructed images ranging from very good to acceptable. We are continuing this research to refine the many parameters that affect the results.

I. INTRODUCTION

Image storage and transmission pose an important problem for the development of intelligent communication systems because of memory and bandwidth requirements. Consequently, many different image compression techniques have been devised over the last few decades. Although lossless (reversible) schemes are very desirable, the compression ratios they achieve are relatively low, which makes it necessary to use lossy (irreversible) schemes that allow some distortion in the reconstructed images. The efficiency of a coder can be defined as the image quality obtained for a given bit rate and, for a lossy method, is generally increased at the cost of computational complexity [1].

The Wavelet Transform, or hierarchical sub-band decomposition [2], separates the information of the image into different scales and orientations without changing the image size. Vector quantization (VQ) of these transformed images [3] gives good compression without the blocking effects usually found in images coded using VQ in the spatial domain. The embedded zerotree wavelet (EZW) algorithm introduced by Shapiro [4] is a relatively simple technique based on the wavelet transform, followed by the prediction of the absence of significant information across scales due to the self-similarity inherent in transformed images [5]. It addresses the two-fold problem of obtaining the best image quality for a given rate and accomplishing this task in an embedded fashion, i.e., in such a way that all encodings of the same image at lower bit rates are contained in the beginning of the bit stream for the target bit rate.

We have combined the information reduction achieved with vector quantization with a modified version of the basic EZW algorithm, in a compression technique that produces reconstructed images with very good subjective quality. The bases of both algorithms are explained in the next section, and our method and the conclusions based on the comparison of the results for a thorax radiograph are presented afterwards.
II. ALGORITHMS

2.1 - Wavelet Transform

The Wavelet Transform (WT), or multiresolution analysis, represents a function as a superposition of a family of dilated and translated versions of two functions, the mother wavelet and its related scaling function. This can be viewed as the decomposition of the function with a band-pass filter bank, each filter separating a range of resolutions or scales. For the two-dimensional case, the filters are applied in the horizontal and vertical directions, producing a series of sub-images as shown in Fig. 1. The WT does not change the number of pixels required to represent the image, but separates the information in a way that resembles the human visual system [5].
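As a concrete illustration of this decomposition (not part of the original paper), the short Python sketch below uses the PyWavelets package to split an image into a low-pass band plus three detail sub-bands per scale. The 'db4' wavelet (an 8-tap Daubechies filter) and the three decomposition levels mirror the settings reported in the experiments, but the package and the helper name are assumptions of this sketch.

```python
# Minimal sketch of a 3-level 2-D wavelet decomposition (assumes the PyWavelets
# package).  'db4' is an 8-tap Daubechies filter, matching the "length 8
# Daubechies wavelet" used in the experiments reported below.
import numpy as np
import pywt

def wavelet_subbands(image, wavelet="db4", levels=3):
    """Return the low-pass band and the (horizontal, vertical, diagonal)
    detail sub-bands for each scale, coarsest first."""
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=levels)
    return coeffs[0], coeffs[1:]

if __name__ == "__main__":
    img = np.random.rand(512, 512)        # stand-in for a 512x512 radiograph
    low, details = wavelet_subbands(img)
    print("low-pass band:", low.shape)
    for level, (h, v, d) in enumerate(details, start=1):
        print("scale", level, ":", h.shape, v.shape, d.shape)
```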

2.2 - Vector Quantization

Vector Quantization (VQ) has been applied to image compression, either by coding the image itself or some transformation of it. In VQ, a group of pixels, called a vector, is approximated by another one taken from a table of admissible vectors, the codebook, and coded simply by the index of this vector in the table. The codebook is created by some optimization algorithm applied over a series of images similar to the ones to be coded, the training set. As discussed in [3], application of VQ to the WT of an image requires the creation of one codebook for each scale and orientation in the transformed image.

2.3 - Embedded zerotree wavelet

The EZW algorithm is based on the construction of two lists for a given image that has previously been decorrelated with a wavelet transform. In the first list, called the dominant list, the information about the significance of each coefficient is coded. In the second, or significant list, only the values of the significant coefficients are kept, up to a given degree of precision. In Shapiro's scheme, the significance of a coefficient at a given iteration is determined by comparing it with a threshold (T). If the value of the coefficient is greater than T, the coefficient is significant; if it is smaller than T, it is considered insignificant. When the coefficient is significant, it can be positive (P) or negative (N). When the coefficient value is below the threshold, the values of its descendants, which are the corresponding coefficients at the finer scales, are analyzed. The parent-child relationships are described in Figure 1. If all the descendants are insignificant, the coefficient is coded as a zerotree root (ZTR) and all its children are discarded from further processing. When some of the descendants are significant, however, we have an isolated zero (IZ), and the descendants have to be coded individually. In summary, 4 symbols (2 bits) are necessary to code the dominant list completely. The same procedure is performed in all scales in a prefixed order (given in Figure 2) until the dominant list is completed. Then the same scheme is repeated iteratively, reducing the threshold at each iteration, and in this way the values of the coefficients are successively approximated.

Figure 1: Parent-child dependencies of sub-bands. The arrows point from parents to children. This scheme can be extended to a larger number of sub-bands.

III. METHODOLOGY

The proposed procedure has four steps:

i) Obtaining the Wavelet Transform of the image.

ii) Each sub-band in the transform is independently coded using vector quantization, except for the lowest frequency band, on which scalar quantization to eight bits is used. In this preliminary work, all sub-bands are coded using a 256-vector codebook, which gives the same compression for the three sub-bands of each scale.

Figure 2: Scanning order of the sub-bands for encoding a significance map. The lower frequency sub-band is at the top left and the higher frequency sub-band at the bottom right. This scheme can be extended to a larger number of sub-bands.
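The per-sub-band vector quantization described in step ii) above can be sketched as follows. This is a minimal illustration, assuming SciPy's k-means routines as a stand-in for whatever codebook training the authors actually used; the 2x2 blocks and 256-entry codebook (8-bit indexes) follow the description above, and the function names are hypothetical.

```python
# Sketch of the per-sub-band vector quantization of step ii).  One codebook is
# trained per scale and orientation, from the corresponding sub-bands of a set
# of similar (training) images, as discussed in [3].
import numpy as np
from scipy.cluster.vq import kmeans, vq

def blocks_2x2(band):
    """Split a sub-band into non-overlapping 2x2 blocks, one row vector per block."""
    h, w = band.shape
    b = band[: h - h % 2, : w - w % 2].reshape(h // 2, 2, w // 2, 2)
    return b.transpose(0, 2, 1, 3).reshape(-1, 4)

def train_codebook(training_bands, size=256):
    """Train a codebook for one sub-band position from several training images."""
    vectors = np.vstack([blocks_2x2(b) for b in training_bands]).astype(float)
    codebook, _ = kmeans(vectors, size)       # 256 codewords -> 8-bit indexes
    return codebook

def encode_band(band, codebook):
    """Replace each 2x2 block by the index of its nearest codeword."""
    indexes, _ = vq(blocks_2x2(band).astype(float), codebook)
    return indexes.reshape(band.shape[0] // 2, band.shape[1] // 2)
```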

iii) The VQ-coded sub-bands are subjected to an information elimination procedure based on the EZW. We define a significant vector as one whose distance to the origin is greater than a selectable threshold, T. Sub-bands are scanned as in Fig. 2, maintaining two lists: the significance map (SM) and the significant vectors (SV). The SM holds positional information for each sub-band. For the lowest frequency sub-bands, the SM has a one-to-one correspondence with the sub-band vectors, with three possible values: significant vector, isolated zero (less than T but with significant descendants), or zerotree root (this vector and all its descendants are less than T). For higher frequency sub-bands, the SM includes only those vectors that do not belong to a zerotree. Vectors in the highest frequency sub-bands have no children, and can only be significant or non-significant, requiring only two symbols. When a vector at any scale is significant, its index is added to the significant vector list. Only one pass is made through the image, as the vector indexes saved represent the full precision of the significant vectors. In this procedure, T is fixed and controls the quality of the reconstructed image. A file is generated which contains both the SM and SV lists, and the scalar-coded low frequency image.

iv) Finally, this file is further compressed using arithmetic coding, using the standard application "zip".

Although our procedure sacrifices the embedded encoding properties of Shapiro's algorithm, it still maintains the good properties of the WT for a progressive transmission scheme.

IV. RESULTS

Two sets of experiments have been carried out, using the length-8 Daubechies wavelet. In the first set, an image (a thorax radiograph) of size 512x512 was transformed to three scales, and each sub-band was vector quantized using codebooks trained over ten similarly transformed radiographs; the quantization for all sub-bands uses 8-bit indexes to represent vectors of four elements (2x2 blocks). The threshold is a fraction of the greatest magnitude vector found in all sub-bands, and varies between 1/64 and 1/16 of this value. Figure 3 shows the original image and the one reconstructed for a threshold of 1/64. Figure 4 shows an amplified view of a section of the original image and of the reconstructed images for thresholds 1/64 and 1/16. The results are summarized in Table I. The compression rates range from 30:1, with very good subjective quality, to 53:1, with acceptable quality. As is frequently found, the SNR is a poor guide to the visual quality of the reconstructed image.

Table I - Experiment 1

Threshold | Compression rate | SNR (dB)
1/64      | 30:1             | 45.8
1/32      | 42:1             | 44.1
1/16      | 53:1             | 42.0

Figure 3.- Experiment 1. Results for the 512x512 image: (a) Original image; (b) Reconstructed image, threshold 1/64, compression 30:1.
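The information elimination of step iii) above can be sketched, for a single orientation pyramid of VQ-coded sub-bands, as follows. This is a sketch, not the authors' implementation: codeword magnitudes stand in for the distance of each quantized vector to the origin, the threshold is taken over the given pyramid only, and the symbol and function names are illustrative.

```python
# Sketch of the significance-map / zerotree elimination of step iii), applied
# to one orientation pyramid of VQ index maps, ordered coarse to fine.
import numpy as np

SIG, IZ, ZTR = "S", "I", "Z"     # significant vector, isolated zero, zerotree root

def descendants_significant(mags, scale, r, c, T):
    """True if any descendant of (r, c) at a finer scale exceeds the threshold."""
    if scale + 1 >= len(mags):
        return False
    for dr in (0, 1):
        for dc in (0, 1):
            rr, cc = 2 * r + dr, 2 * c + dc
            if mags[scale + 1][rr, cc] > T or \
               descendants_significant(mags, scale + 1, rr, cc, T):
                return True
    return False

def _prune(pruned, scale, r, c):
    """Mark all descendants of (r, c) as belonging to a zerotree."""
    if scale + 1 >= len(pruned):
        return
    for dr in (0, 1):
        for dc in (0, 1):
            pruned[scale + 1][2 * r + dr, 2 * c + dc] = True
            _prune(pruned, scale + 1, 2 * r + dr, 2 * c + dc)

def code_orientation(index_maps, codebook_mags, frac=1/64):
    """Build the significance map (SM) and significant-vector (SV) lists for one
    orientation pyramid, scanned coarse to fine as in Fig. 2."""
    # Magnitude of the quantized vector represented by each VQ index.
    mags = [codebook_mags[s][idx] for s, idx in enumerate(index_maps)]
    # Threshold as a fraction of the greatest magnitude (the paper takes the
    # maximum over all sub-bands; here only this pyramid is considered).
    T = frac * max(m.max() for m in mags)
    pruned = [np.zeros(m.shape, dtype=bool) for m in mags]
    sm, sv = [], []
    for s, mag in enumerate(mags):
        for (r, c), m in np.ndenumerate(mag):
            if pruned[s][r, c]:
                continue                             # inside a zerotree: not coded
            if m > T:
                sm.append(SIG)
                sv.append(int(index_maps[s][r, c]))  # VQ index kept at full precision
            elif descendants_significant(mags, s, r, c, T):
                sm.append(IZ)
            else:
                sm.append(ZTR)                       # for simplicity also used at the
                _prune(pruned, s, r, c)              # finest scale (no children there)
    return sm, sv, T
```

In step iv) the resulting SM and SV lists, together with the scalar-quantized low frequency band, would then be handed to a general-purpose entropy coder (the paper uses the standard "zip" application).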

The second set of experiments was similar, but used a bigger image, 864x864 pixels, transformed to four scales. We expected better compression rates, as the low frequency band (which is compressed only arithmetically) now represents a lower percentage of the total data. However, the magnitude of the coefficients in the WT rises with each new scale added; this introduces greater errors in the quantization of the low frequency band, and more high frequency information is discarded as non-significant, with an overall loss in visual quality. The threshold varies between 1/256 and 1/64 of the maximum magnitude vector. Figure 5 shows about 80% of the images, both the original and the one reconstructed with threshold 1/90. Figure 6 shows an amplified section of the original and of three reconstructed images, compressed with different thresholds. Numerical results are summarized in Table II. The compression rate goes from 37:1 (good quality) to 100:1 (acceptable).

Table II - Experiment 2

Threshold | Compression rate | SNR (dB)
1/256     | 37:1             |
1/90      | 78:1             | 43.9
1/64      | 100:1            | 43.2

V. CONCLUSIONS

The results presented here compare very favorably against those obtained at the ETSIT-UPM in another project, which used VQ over the Discrete Cosine Transform of the image. In that work, a compression rate of 14:1 gave acceptable quality of the reconstructed images, with SNR around 40.8 dB and notable "block" artifacts.

Figure 4.- Experiment 1. Amplified 256x256 section: (a) Original image; (b) Reconstructed image, threshold 1/64, compression 30:1; (c) Reconstructed image, threshold 1/16, compression 53:1.
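The SNR figures quoted in Tables I and II and in the comparison above are not accompanied by an explicit definition in the paper; the following sketch assumes the conventional signal-power to error-power ratio, with a peak-based PSNR variant shown for comparison.

```python
# Sketch of SNR/PSNR computations in dB (assumed definitions, not taken from the paper).
import numpy as np

def snr_db(original, reconstructed):
    original = original.astype(float)
    error = original - reconstructed.astype(float)
    return 10.0 * np.log10(np.sum(original ** 2) / np.sum(error ** 2))

def psnr_db(original, reconstructed, peak=255.0):
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```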

A test run using Shapiro's algorithm gave slightly better results at the lower compression rates (for the 512x512 image at 30:1, 45.7 dB, and for the 864x864 one at 37:1, 46.3 dB, although subjectively the differences are almost unnoticeable), but it did not perform as well as the proposed method at higher rates (for the 512x512 image at 53:1, 43.4 dB, and for the 864x864 one at 100:1, 42.1 dB, with notably lower visual quality). This is probably due to the SNR of the reconstructed image being limited by the quality of the VQ step when few vectors are discarded. This suggests that fine tuning of the VQ step can give better performance for the proposed method in every case.

Many parameters in the proposed method can be changed, for example the wavelet type, the filter length, the threshold magnitude for each sub-band, and the definition of a significant vector. Especially critical, as discussed in [3], is the number of bits assigned to each sub-band in the VQ step. Finding an optimal combination of these parameters is the subject of our present research, as is the medical validation of the results. Also of interest is the application of this method to ordinary black-and-white and color photographs.

Figure 5.- Experiment 2. 768x768 pixel section of the images: (a) Original image; (b) Reconstructed image, threshold 1/90, compression 78:1.

ACKNOWLEDGMENT

This work was supported in part by the project BID-CONICIT E-18 (New Technologies Program), and by Universidad Simon Bolivar.

REFERENCES

[1] Jayant, N., Speech and Image Coding, special issue of IEEE J. Select. Areas Comm., SAC-10(5), 1992.
[2] Vetterli, M., Herley, C., "Wavelets and filter banks: Theory and design", IEEE Trans. Signal Processing, vol. 40, no. 9, pp. 2207-2232, Sep. 1992.
[3] Antonini, M., Barlaud, M., Mathieu, P., Daubechies, I., "Image coding using wavelet transform", IEEE Trans. Image Processing, vol. 1, pp. 205-220, Apr. 1992.
[4] Shapiro, J. M., "Embedded image coding using zerotrees of wavelet coefficients", IEEE Trans. Signal Processing, vol. 41, no. 12, pp. 3445-3462, Dec. 1993.
[5] Field, D. J., "Scale invariance and self-similar wavelet transform: an analysis of natural images and mammalian visual systems", in Wavelets, Fractals and Fourier Transforms, M. Farge, J. C. R. Hunt and J. C. Vassilicos (eds.), Clarendon Press, Oxford, 1993.

Figure 6.- Experiment 2. Amplified 256x256 section: (a) Original image; (b) Reconstructed image, threshold 1/256, compression 37:1; (c) Reconstructed image, threshold 1/90, compression 78:1; (d) Reconstructed image, threshold 1/64, compression 100:1.