Image Inpainting by Hyperbolic Selection of Pixels for Two Dimensional Bicubic Interpolations


Mehran Motmaen (motmaen73@gmail.com), Majid Mohrekesh (mmohrekesh@yahoo.com), Mojtaba Akbari (mojtaba.akbari@ec.iut.ac.ir), Nader Karimi (nader.karimi@cc.iut.ac.ir), Shadrokh Samavi (samavi96@cc.iut.ac.ir)

Abstract

Image inpainting is a restoration process with numerous applications, such as restoring scanned old images that contain scratches or removing objects from images. Different approaches have been used to implement inpainting algorithms; interpolation-based approaches typically consider only one direction. In this paper we present a new perspective on image inpainting: we consider multiple directions and apply both one-dimensional and two-dimensional bicubic interpolations. Neighboring pixels are selected in a hyperbolic formation to better preserve corner pixels. We compare our work with recent inpainting approaches to show its superior results.

Keywords: Image Inpainting; Bicubic Inpainting; Image Restoration

I. INTRODUCTION

Image inpainting refers to filling missing regions of an image based on their neighboring regions. Inpainting algorithms work based on human perception of the image and on image saliency, and they locate degraded regions using a degradation mask. One of the important goals of any inpainting algorithm is to create artificial textures similar to those of the neighboring pixels. There are different methods in this field, based on different requirements and image features; these methods can be categorized by whether they rely on one-dimensional or two-dimensional image features.

Different libraries are available for image inpainting. For example, [1] is an open-source library for image restoration that can be used as an inpainting tool; it automatically finds degraded pixels and fills them with an approximation of their neighbors. The application in [2] is available online for image enhancement and is especially appropriate for old images. Some algorithms use linear-structure methods for image restoration and inpainting; these use partial differential equations to remove objects from an image. The methods proposed in [3], [4], [5] and [6] are leading methods of this type. Several researchers have also considered texture synthesis as a way to fill large image regions with pure textures, i.e., repetitive two-dimensional textural patterns with moderate stochasticity [7]. The work in [8] is used for inpainting in order to satisfy the neighborhood coherency of missing patches. The methods proposed in [6], [9] and [10] use exemplar-based inpainting, which generates new texture by sampling and copying colors from the source region; such exemplar-based methods have difficulty filling holes in real-world scenes.

In this paper we circumvent the deficiencies of other methods by using an aggregation mechanism on the results of 1-D and 2-D bicubic interpolations. The structure of the paper is as follows: Section 2 presents the proposed method, Section 3 presents experimental results, and Section 4 gives concluding remarks.

Figure 1. One-dimensional interpolation using four known points.
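To make the interpolation of Fig. 1 concrete, the following minimal Python sketch fits a cubic curve through four known samples around a missing position and reads the predicted intensity at the center. It is an illustration only, not the authors' implementation; NumPy's polynomial fitting stands in for the 1-D cubic interpolation, and the function name is ours.

```python
import numpy as np

def cubic_interp_center(samples):
    """Fit a cubic curve through four known samples located at
    x = -2, -1, +1, +2 around a missing pixel and evaluate it at
    x = 0, as sketched in Fig. 1."""
    x = np.array([-2.0, -1.0, 1.0, 2.0])
    coeffs = np.polyfit(x, np.asarray(samples, dtype=float), deg=3)
    return float(np.polyval(coeffs, 0.0))

# Example: for a linear ramp the missing center is recovered exactly.
print(cubic_interp_center([10, 20, 40, 50]))  # approximately 30.0
```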

Figure 2. Two-dimensional interpolation by fitting a surface.

II. PROPOSED METHOD

Our proposed method takes advantage of bicubic interpolation and aggregates the results of several such interpolations to produce a final value for each missing pixel. Bicubic interpolation is a graphical method for increasing or decreasing the number of pixels in a digital image; digital cameras use it to zoom in or out when the desired resolution differs from that of their sensor. The quality of the final image depends on the precision and efficiency of the interpolation method, and bicubic interpolation produces better edges and less blockiness than simpler interpolation methods. Fig. 1 demonstrates a one-dimensional interpolation of four samples on a curve: the four samples define a curve, and a single predicted intensity is read from that curve. Fig. 2 shows the two-dimensional output of bicubic interpolation, which produces a denser mesh from a small number of samples; the resulting 21×21 mesh is generated from an 11×11 original grid, with the extra samples created by interpolation.

Fig. 3 is an example of a degraded image and its mask of degradation; the mask is a binary map of the degraded pixels. Fig. 4 shows the block diagram of our proposed method. The input of the algorithm consists of a degraded image and the mask of degradation, which gives the coordinates of the degraded pixels. Block (1) produces a patch of 16 pixels around the missing target; the 16 pixels considered for a missing pixel are shown in Fig. 5. Block (1) sends these 16 pixels to the two following blocks. Block (2) produces four lines, each consisting of four pixels: one horizontal, one vertical, and two diagonals. The output of Block (2) is sent to Block (3), which treats each line as a five-pixel line with a missing center and predicts an intensity value for that center using 1-D bicubic interpolation. Hence there are four predictions for the center pixel, which is the intersection point of all four lines.

Figure 3. (a) Degraded image and (b) the mask.
Figure 4. Block diagram of the algorithm.
Figure 5. Blue target and the 16 neighboring red pixels.

In the right path of the block diagram of Fig. 4, below Block (1), there is Block (4), which forms two 12-element matrices as demonstrated in Fig. 6. In Fig. 6 a 5×5 patch is shown around a single missing pixel, with each pixel of the patch addressed by an integer between 1 and 25. Fig. 6(a) and Fig. 6(b) show the selection policy used by Block (4): two 12-pixel formations out of the 25 pixels are chosen using a hyperbolic approach, where Fig. 6(a) and Fig. 6(b) show the horizontal and vertical hyperbolas respectively. The center of symmetry of both selections is placed on the missing pixel.
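A possible sketch of Blocks (2) and (3) is given below: the four directional lines are extracted from a 5×5 patch centered on the missing pixel, and each yields one 1-D prediction. The code is illustrative and assumes a NumPy array as input; the helper names and the use of a cubic polynomial fit in place of 1-D bicubic interpolation are our own choices, not taken from the paper.

```python
import numpy as np

def _cubic_center(values):
    # Cubic fit through samples at x = -2, -1, 1, 2, evaluated at x = 0
    # (stands in for the 1-D bicubic interpolation of Block (3)).
    return float(np.polyval(np.polyfit([-2, -1, 1, 2], values, 3), 0.0))

def directional_predictions(patch5x5):
    """From a 5x5 patch centered on the missing pixel, take the
    horizontal, vertical and two diagonal four-pixel lines (Block (2))
    and return one 1-D prediction per direction (Block (3))."""
    c = 2  # index of the missing center pixel in the patch
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]  # H, V, two diagonals
    return [
        _cubic_center([patch5x5[c + k * dr, c + k * dc] for k in (-2, -1, 1, 2)])
        for dr, dc in directions
    ]

# Example on a smooth gradient patch (hypothetical data):
patch = np.arange(25, dtype=float).reshape(5, 5)
print(directional_predictions(patch))  # all four predict ~12.0 (the center)
```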

Figure 6. Matrix formation for 2-D bicubic interpolation by hyperbolic selection of pixels; a patch of 25 neighboring pixels is shown: (a) 4×3 selection, (b) 3×4 selection.

This hyperbolic selection enhances corners and prevents the interpolation algorithm from weakening them. Block (5) performs 2-D bicubic interpolation on both 12-element matrices and produces two new matrices: a 7×5 matrix as the interpolation of the 4×3 output of Block (4), and a 5×7 matrix as the interpolation of the 3×4 output. The centers of the two resulting matrices are two different predictions for the missing target.

Predictions are therefore performed in two distinct prediction modules, and their results are aggregated to produce the final intensity value. One module predicts along one-dimensional directions and the other interpolates two-dimensional patches of intensities; both use bicubic interpolation. In the one-dimensional module, Block (2) groups the 16 pixels into four lines of four pixels, one per direction, and Block (3) produces one intensity value for each direction, yielding four values for the target. The most different of these four predicted values is then replaced with the average of the other three, and the resulting four-member set of intensities is the one-dimensional prediction output of Block (3). Blocks (4) and (5) are the building blocks of the two-dimensional prediction: Block (4) takes the 16 pixels from Block (1) and produces a pair of matrices, one 3×4 and one 4×3, which are delivered to Block (5) for 2-D interpolation. There, 5×7 and 7×5 matrices are created from the 3×4 and 4×3 inputs respectively, and the central points of these matrices are two further predicted values for the target pixel. These two intensities, together with the four values from the one-dimensional predictions, give six values from which Block (6) produces the final result by averaging.

TABLE I. COMPARISON OF THE PROPOSED METHOD WITH THE INPAINTING METHODS OF [7] AND [8]

Image     Criterion    Korman [8]   Criminisi [7]   Proposed Method
Image 1   PSNR (dB)    29.62        41.01           46.54
          SSIM         0.995        0.999           0.999
Image 2   PSNR (dB)    31.04        32.73           37.17
          SSIM         0.987        0.991           0.995
Image 3   PSNR (dB)    26.78        27.68           30.82
          SSIM         0.968        0.972           0.981
Image 4   PSNR (dB)    27.29        29.14           30.01
          SSIM         0.966        0.973           0.972
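The outlier replacement of Block (3) and the final averaging of Block (6) can be sketched as follows. This is an assumption-laden illustration rather than the paper's code: in particular, "most different" is interpreted here as the value farthest from the mean of the four directional predictions, and all names are hypothetical.

```python
import numpy as np

def aggregate_predictions(directional, surface):
    """Replace the most deviant of the four 1-D directional predictions
    with the mean of the other three (Block (3)), then average the four
    1-D values and the two 2-D hyperbolic-patch values (Block (6))."""
    d = np.asarray(directional, dtype=float)       # four directional predictions
    worst = int(np.argmax(np.abs(d - d.mean())))   # "most different" value
    d[worst] = (d.sum() - d[worst]) / 3.0          # mean of the other three
    return float(np.mean(np.concatenate([d, np.asarray(surface, dtype=float)])))

# Example with hypothetical predictions: four 1-D values and two 2-D values.
print(aggregate_predictions([120, 118, 121, 150], [119, 122]))  # ~119.9
```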

III. EXPERIMENTAL RESULTS

Fig. 7 shows a typical output of our proposed method in comparison with the methods proposed in [7] and [8]. Fig. 8 shows the change in PSNR with respect to the number of defective lines present in the image. Increasing the number of lines increases the total number of missing pixels, and when a missing pixel is restored by interpolation the estimated value is close to, but different from, the original value; this causes a degradation of the PSNR. It can be seen that our algorithm results in less degradation of the restored image, which means that it predicts the missing pixels better than the other two methods.

Fig. 9 shows the change in PSNR with respect to the width of the defective lines for the proposed method compared to the methods of [7] and [8]. In this experiment two defective lines are initially introduced with a width of one pixel; the widths of these lines are then increased by one pixel at each iteration, up to a width of 15 pixels. At each iteration the missing pixels are restored using our proposed method and the methods of [7] and [8], and the PSNR is calculated for each method. The plot of Fig. 9 shows that as the widths of the lines increase, the overall PSNR decreases, but our results have higher PSNR values at every line width.

Furthermore, we compared the results of our proposed method with those of similar works under equal amounts of degradation. Table I shows the comparison of the proposed method with the methods of [7] and [8]; our proposed method has better quality assessment results in terms of both PSNR and SSIM.

Figure 7. Comparison of the proposed method with the methods of [7] and [8]. PSNR (dB) / SSIM: degraded image 19.21 / 0.948; proposed method 34.03 / 0.990; Criminisi [7] 31.49 / 0.987; Korman [8] 30.40 / 0.986.
Figure 8. Variation of PSNR for different numbers of defective lines.
Figure 9. Variation of PSNR for different widths of defective lines.
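For reference, the PSNR values used throughout Figs. 7-9 and Table I follow the standard definition; a minimal sketch (not from the paper, assuming 8-bit images with a peak value of 255) is:

```python
import numpy as np

def psnr(reference, restored, peak=255.0):
    """Peak signal-to-noise ratio (dB) between the original image and
    the inpainted result, assuming 8-bit intensities."""
    ref = np.asarray(reference, dtype=float)
    res = np.asarray(restored, dtype=float)
    mse = np.mean((ref - res) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```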

IV. CONCLUSION

In this paper we proposed a novel inpainting method for degradations in the form of lines. These types of deformations are very common, especially when scanning old photographs. Our method is based on one-dimensional and two-dimensional bicubic interpolations. When selecting neighboring pixels for interpolation, we consider pixels that form hyperbolic paths; this selective interpolation preserves corners in four directions in most cases. The experimental results, as shown by the graphs of Fig. 8 and Fig. 9, demonstrate the better performance of our method in most cases.

REFERENCES

[1] Image Restoration. [Online]. Available: http://restoreinpaint.sourceforge.net/
[2] AKVIS Retoucher. [Online]. Available: http://akvis.com/en/retoucher/index.php
[3] C. Ballester, V. Caselles, J. Verdera, M. Bertalmio, and G. Sapiro, "A variational model for filling-in gray level and color images," in Proc. Eighth IEEE International Conference on Computer Vision (ICCV), 2001, vol. 1, pp. 10-16.
[4] M. Bertalmio, A. L. Bertozzi, and G. Sapiro, "Navier-Stokes, fluid dynamics, and image and video inpainting," in Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2001, vol. 1, pp. I-I.
[5] M. Bertalmio, G. Sapiro, V. Caselles, and C. Ballester, "Image inpainting," in Proc. 27th Annual Conference on Computer Graphics and Interactive Techniques, 2000, pp. 417-424.
[6] T. F. Chan and J. Shen, "Nontexture inpainting by curvature-driven diffusions," J. Vis. Commun. Image Represent., vol. 12, no. 4, pp. 436-449, 2001.
[7] A. Criminisi, P. Pérez, and K. Toyama, "Region filling and object removal by exemplar-based image inpainting," IEEE Trans. Image Process., vol. 13, no. 9, pp. 1200-1212, 2004.
[8] S. Korman and S. Avidan, "Coherency sensitive hashing," in Proc. IEEE International Conference on Computer Vision (ICCV), 2011, pp. 1607-1614.
[9] M. Ashikhmin, "Synthesizing natural textures," in Proc. 2001 Symposium on Interactive 3D Graphics, 2001, pp. 217-226.
[10] W. T. Freeman, E. C. Pasztor, and O. T. Carmichael, "Learning low-level vision," Int. J. Comput. Vis., vol. 40, no. 1, pp. 25-47, 2000.