High Quality Image Deblurring with Panchromatic Pixels (1/36)
ACM Transactions on Graphics, Vol. 31, No. 5, 2012
Sen Wang, Tingbo Hou, John Border, Hong Qin, and Rodney Miller
Presented by Bong-Seok Choi
School of Electrical Engineering and Computer Science, Kyungpook National Univ.
Abstract (2/36)
- Aim of the proposed method: high-quality image deblurring
- Proposed method
  - Introducing a new imaging system
    » Designing a new sensor pattern: modifying the Bayer pattern by adding panchromatic pixels
    » Introducing a new demosaicing algorithm
  - Estimating the deblurred image
    » Using a two-step maximum-a-posteriori (MAP) method
    » Solving a joint minimization problem with statistical and spatial prior parts
Introduction (3/36)
- Problem of image deblurring
  - An inverse problem with unknown variables
  - Inadequate information for deblurring produces unwanted solutions
- Recent research in image deblurring
  - Including image priors for regularization
  - Designing camera systems that acquire more information
Previous methods of image deblurring (4/36)
- Using regularization based on natural-image priors
  - Using the maximum-a-posteriori (MAP) method
  - Using a gradient prior based on the heavy-tailed gradient distribution of natural images
- Problem of this approach
  - Producing ringing artifacts and small errors
Designing a new imaging system (5/36)
- Acquiring more information than a single blurry scene
  - Using different shutter speeds: a slow shutter and a fast shutter
- Imaging-system factors that affect image deblurring
  - Alignment
    » Spatial and temporal displacement
  - Shutter speed
    » A fast shutter speed produces a sharp image
    » A fast shutter speed also introduces noise
  - ISO setting
    » A high ISO setting introduces noise on the same image sensor
Proposed method (6/36)
- Blind deblurring with a novel image prior, based on a new imaging system
- New imaging system
  - Using a new sensor pattern with added panchromatic pixels
  - Producing multiple images with different shutter speeds
    » A sharp gray-scale image from the fast shutter speed
    » A blurred color image from the slow shutter speed
- Deblurring algorithm
  - Regularization by a combined image prior
    » Statistical part for sharpness
    » Spatial part for ringing control
  - Two-step MAP method
    » Estimating the blur kernel and the deblurred image under a joint minimization
Previous deblurring methods (7/36)
- Categorization of deblurring methods
  - Blind methods: the blur kernel is unknown
  - Non-blind methods: the blur kernel is known
Non-blind and blind deblurring methods (8/36)
- Non-blind deblurring
  - Conventional technique: Richardson-Lucy deconvolution
    » Computing the latent image under a Poisson noise model
  - Methods to reduce ringing artifacts
    » Sparse priors for non-blind deconvolution using inter-scale and intra-scale information
    » Color priors in non-blind deconvolution
- Blind deblurring: recovering the blur kernel
  - Using the statistical distribution of natural-image gradients
  - Modeling the image prior as a concatenation of two piecewise continuous functions (linear and quadratic)
  - Using a curvelet system for the kernel and a framelet system for the image
    » The framelet system reduces the ill-posedness by exploiting sparsity of the kernel and the sharp image
Improving computation time (9/36)
- Using the Graphics Processing Unit (GPU)
- Improving the MAP method in blind deconvolution
  » Evaluation of single-image deconvolution algorithms using collected blur data with ground truth
- Collecting more information
  » Using multiple images
  » Problem of capturing multiple images: divergence between the images
Proposed method (10/36)
- A new type of deblurring method
  - Capturing additional information with the new imaging system
  - A quasi-blind deblurring method
- Related: deblurring with blurred/noisy image pairs
  - Capturing the images in separate shots
    » One short exposure and one long exposure
Image sensors (11/36)
- Digital imaging systems use electronic image sensors
  - Charge-Coupled Device (CCD) sensors
  - Active Pixel Sensors (APS): Complementary Metal Oxide Semiconductor (CMOS)
- Color filters for capturing color images
  - Filtering portions of the visible-light spectrum: red, green, and blue
  - Cost of the color filter: reducing the amount of light reaching the sensor
New color image sensor with panchromatic pixels (12/36)
- Using a four-pixel array as the filter unit
  - Red, green, or blue color filters, plus pixels with no color filter (pan pixels)
- Images captured by the new sensor
  - A monochrome (panchromatic) image: high resolution and high light sensitivity
  - Three low-resolution color images: lower light sensitivity
New color imaging system (13/36)
- Color filter and sensor
Fig. 1. A new image sensor and pixel pattern. R, G, B, and P stand for Red, Green, Blue, and Panchromatic pixels, respectively.
Image demosaicing (14/36)
- New demosaicing algorithm for the new image sensor: reducing visual artifacts
  - Bayer-pattern demosaicing reduces artifacts via the color differences between green/red and green/blue
  - The proposed demosaicing reduces artifacts via color-difference interpolation between pan pixels and RGB pixels
- Difference from Bayer CFA demosaicing
  - Bayer CFA pattern
    » Luminance information comes from the green pixels
    » Chrominance information comes from the red and blue pixels
  - New pattern
    » Luminance information comes from the panchromatic pixels
    » Chrominance information comes from the red, green, and blue pixels
Procedure of the demosaicing algorithm (15/36)
- Noise removal
  - Removing high-frequency and impulse noise by applying a sigma filter and a median filter
  - Removing low-frequency chromatic noise by forcing the average gradient of the R, G, B images to equal the average gradient of the pan image
- Pan image reconstruction
  - Interpolating the pan pixels to generate a high-resolution gray image
- Bayer pattern interpolation
  - Interpolating the RGB pixels to generate a low-resolution color image
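To make the noise-removal step concrete, here is a minimal sketch rather than the authors' implementation: impulse noise is suppressed with a median filter, and high-frequency noise with a simple Lee-style sigma filter that averages only neighbors whose values are close to the center pixel. The window sizes, the threshold rule, and the helper names (`sigma_filter`, `denoise_raw_plane`) are illustrative assumptions; the low-frequency chromatic-noise step is omitted here.

```python
import numpy as np
from scipy.ndimage import median_filter

def sigma_filter(img, radius=2, sigma=0.02):
    """Lee-style sigma filter: average only the neighbors whose value lies
    within +/- 2*sigma of the center pixel (illustrative, not the paper's exact filter)."""
    out = np.empty_like(img)
    padded = np.pad(img, radius, mode='reflect')
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            win = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            mask = np.abs(win - img[y, x]) <= 2 * sigma
            out[y, x] = win[mask].mean()
    return out

def denoise_raw_plane(plane):
    """Impulse noise via a 3x3 median filter, then high-frequency noise via the sigma filter."""
    plane = median_filter(plane, size=3)
    return sigma_filter(plane)
```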
Color-difference demosaicing (16/36)
- Color-difference image computation
  - Downsampling the pan pixels to get a low-resolution pan image
  - Extracting R, G, B from the raw sensor data to get a Bayer pattern
  - Computing the color-difference pattern by subtracting the low-resolution pan image from the Bayer pattern
  - Applying Bayer-pattern demosaicing to generate full color-difference images (R-P, G-P, and B-P)
- Color-difference image upsampling
- Image demosaicing
  - Combining the panchromatic image and the color-difference images
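The combination step above can be summarized in a short sketch. It assumes the pan interpolation and Bayer demosaicing have already produced `pan_full`, `bayer_lowres`, and `pan_lowres`, substitutes plain bilinear upsampling (`scipy.ndimage.zoom`) for the upsampling stage, and uses illustrative function and variable names; it is not the paper's implementation.

```python
import numpy as np
from scipy.ndimage import zoom

def demosaic_with_pan(pan_full, bayer_lowres, pan_lowres):
    """Illustrative color-difference demosaicing.
    pan_full:     (H, W) high-resolution interpolated panchromatic image
    bayer_lowres: (h, w, 3) low-resolution demosaiced R, G, B image
    pan_lowres:   (h, w) pan image downsampled to the Bayer resolution
    Returns an (H, W, 3) full-resolution color image.
    Assumes (H, W) are exact multiples of (h, w)."""
    # Color differences at low resolution: R-P, G-P, B-P
    diff = bayer_lowres - pan_lowres[..., None]
    # Upsample the (smooth) color differences to full resolution (bilinear here)
    sy = pan_full.shape[0] / diff.shape[0]
    sx = pan_full.shape[1] / diff.shape[1]
    diff_full = zoom(diff, (sy, sx, 1), order=1)
    # Add the high-resolution pan image back to recover full-resolution color
    return pan_full[..., None] + diff_full
```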
Procedure of the demosaicing algorithm (17/36)
Fig. 2. Full-color image demosaicing with panchromatic pixels: (a) is the original CFA pattern from the new image sensor; (b) is the interpolated high-resolution panchromatic image; (c) is the computing process of the low-resolution color difference; (d) is the upsampled high-resolution color difference; (e) is the high-resolution demosaiced image. In (c), there are the low-resolution panchromatic image at top-left, the low-resolution Bayer pattern (red, green, and blue) at top-right, the color-difference pattern between the Bayer pattern and the panchromatic image at bottom-left, and the demosaiced color-difference image at bottom-right. Zoom-in allows us to see details of the CFA pattern.
Deblurring with pan pixels: problem formulation (18/36)
- Goal of deblurring: recovering the latent image from the blurred image
- Formulation of the blurred image: convolution of the latent image with a blur kernel, plus noise

  B = L \otimes K + N    (1)

  where B is the blurred image, L is the latent image, K is the blur kernel, N is noise, and \otimes is the convolution operator.
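For reference, a tiny sketch of the forward model in Eq. (1) for a single-channel image, using SciPy's 2D convolution; the symmetric boundary handling and the noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import convolve2d

def blur_image(latent, kernel, noise_sigma=0.01, seed=0):
    """Forward model of Eq. (1): B = L (*) K + N, with zero-mean Gaussian noise."""
    rng = np.random.default_rng(seed)
    blurred = convolve2d(latent, kernel, mode='same', boundary='symm')
    return blurred + rng.normal(0.0, noise_sigma, size=latent.shape)
```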
Acquiring an image pair for deblurring (19/36)
- Producing two images in one shot under dim-light conditions
  - Using two exposures: one short and one longer
  - Property of the short-exposure image: sharp but noisy
- Capturing twice with different ISO settings
  - At time t1
    » Read out the pan pixels at a high ISO setting
    » Generating a sharp gray-scale image P1
  - After time t1
    » Resetting the pan pixels' sensitivity to a low ISO
    » Starting the exposure again
  - At time t2
    » Read out all pixels (red, green, blue, and pan)
    » Generating a blurred color image B and a blurred gray-scale image P2
Comparison between the proposed method and the image-pair method (20/36)
Fig. 3. Comparison between our imaging system and the image-pair systems introduced by Lim and Silverstein [2006] and Yuan et al. [2007]: (a) is a sharp gray-scale image P1 with short exposure; (b) is a blurred gray-scale image P2 with long exposure; (c) is a blurred color image B; (d) is a noisy image used for the image-pair system. The comparison between these two capture strategies in terms of timeline is shown in the bottom row.
Bayesian deconvolution (21/36)
- Image deconvolution model using Bayes' rule
  - The posterior is proportional to the likelihood times the priors:

    p(L, K | B) \propto p(B | L, K)\, p(L)\, p(K)    (2)

- Definition of the likelihood as a Gaussian-type function
  - The noise is assumed to follow a zero-mean Gaussian distribution:

    p(B | L, K) = e^{-\|L \otimes K - B\|^2}    (3)
Image prior (22/36)
- Conventional image priors consider the statistical distribution of intensity or gradient magnitude
- Designing a new image prior

  p(L) = p_t(L)\, p_p(L)    (4)

  where p_t(L) is the statistical prior describing the statistical distribution of the gradient magnitude of the image, and p_p(L) is the spatial prior describing the spatial distribution of the image.
- Representation of the statistical prior

  p_t(L) = e^{-\lambda_1 \sum_i \left( |\partial_x L_i|^{\alpha} + |\partial_y L_i|^{\alpha} \right)}    (5)

  where \lambda_1 is a coefficient, i denotes the pixel index, and \alpha is a positive exponent set in the range [0.5, 0.8].
Image prior (continued) (23/36)
- Representation of the spatial prior

  p_p(L) = e^{-\lambda_2 \sum_i \left( (\partial_x L_i - \partial_x P_i)^2 + (\partial_y L_i - \partial_y P_i)^2 \right)}    (6)

  where \lambda_2 is a coefficient and P is the sharp panchromatic image.
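As a concrete reading of Eqs. (5) and (6), the following sketch evaluates the negative log (the energy) of both priors for a latent image L and a pan image P. The forward-difference gradient, the default lambda values, and alpha = 0.6 are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def grad(img):
    """Forward-difference gradients with zeros on the last row/column."""
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    return gx, gy

def statistical_prior_energy(L, lam1=0.002, alpha=0.6):
    """Negative log of Eq. (5): lam1 * sum_i (|dx L_i|^a + |dy L_i|^a), a in [0.5, 0.8]."""
    gx, gy = grad(L)
    return lam1 * np.sum(np.abs(gx) ** alpha + np.abs(gy) ** alpha)

def spatial_prior_energy(L, P, lam2=0.5):
    """Negative log of Eq. (6): lam2 * sum_i ((dx L_i - dx P_i)^2 + (dy L_i - dy P_i)^2)."""
    gxL, gyL = grad(L)
    gxP, gyP = grad(P)
    return lam2 * np.sum((gxL - gxP) ** 2 + (gyL - gyP) ** 2)
```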
Deringing effect of the proposed spatial prior (24/36)
Fig. 4. The deringing effect of our spatial prior: (a) is a blurred image with the ground-truth kernel (shown in the red box) and a noisy kernel (shown in the blue box) used for all three methods; (b) is our P image; (c) is the x-derivative map of the pan image P; (d) is the deblurring result with the local prior in Shan et al. [2008]; (e) is the result with the sparse prior in Levin et al. [2007]; (f) is the result by our spatial prior only.
Kernel prior (25/36)
- Definition of the kernel prior: regularization as the 1-norm of K

  p(K) = e^{-\lambda_3 \|K\|_1}, \quad \text{with } k_j \geq 0    (7)

  where \lambda_3 is a coefficient, \|\cdot\|_1 denotes the l_1-norm operator, and k_j denotes the entry of K with index j.
Estimation (26/36)
- Using the MAP solution: solving a regularized minimization problem
- Definition of the objective function

  J(L, K) = \|L \otimes K - B\|^2 + \lambda_1 \sum_i \left( |\partial_x L_i|^{\alpha} + |\partial_y L_i|^{\alpha} \right) + \lambda_2 \sum_i \left( (\partial_x L_i - \partial_x P_i)^2 + (\partial_y L_i - \partial_y P_i)^2 \right) + \lambda_3 \|K\|_1    (8)

- Solving the joint minimization problem by a two-step estimation
  » Estimation of the blur kernel
  » Estimation of the latent image
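A minimal sketch of evaluating the joint objective in Eq. (8). It uses central-difference gradients via `np.gradient` and 'same'-size convolution with symmetric boundaries, and the default coefficient values are placeholders rather than the paper's parameters.

```python
import numpy as np
from scipy.signal import convolve2d

def objective(L, K, B, P, lam1=0.002, lam2=0.5, lam3=1.0, alpha=0.6):
    """Eq. (8): data term + statistical prior + spatial prior + kernel prior."""
    gyL, gxL = np.gradient(L)
    gyP, gxP = np.gradient(P)
    data = np.sum((convolve2d(L, K, mode='same', boundary='symm') - B) ** 2)
    statistical = lam1 * np.sum(np.abs(gxL) ** alpha + np.abs(gyL) ** alpha)
    spatial = lam2 * np.sum((gxL - gxP) ** 2 + (gyL - gyP) ** 2)
    kernel = lam3 * np.sum(np.abs(K))   # l1 norm; K is assumed non-negative
    return data + statistical + spatial + kernel
```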
Kernel and latent image estimation (27/36)
- Kernel estimation using the two gray-scale images (short- and long-exposure pan images)

  J(K) = \|P_1 \otimes K - P_2\|^2 + \lambda_3 \|K\|_1    (9)

- Latent image estimation using iteratively reweighted least squares (IRLS)

  J(L) = \|L \otimes K - B\|^2 + \lambda_1 \sum_i \left( |\partial_x L_i|^{\alpha} + |\partial_y L_i|^{\alpha} \right) + \lambda_2 \sum_i \left( (\partial_x L_i - \partial_x P_i)^2 + (\partial_y L_i - \partial_y P_i)^2 \right)    (10)
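Below is a sketch of the kernel-estimation step, Eq. (9), solved here by projected subgradient descent with a non-negativity constraint; it stands in for the paper's own solver, and the IRLS latent-image step for Eq. (10) is not reproduced. The delta-kernel initialization, the conservative step-size bound, and the iteration count are illustrative assumptions.

```python
import numpy as np
from scipy.signal import convolve2d, correlate2d

def estimate_kernel(P1, P2, ksize=15, lam3=1.0, n_iter=500):
    """Projected (sub)gradient descent on Eq. (9):
    J(K) = ||P1 (*) K - P2||^2 + lam3 * ||K||_1, with K >= 0.
    P1: sharp short-exposure pan image, P2: blurred long-exposure pan image.
    ksize must be odd in this sketch."""
    kh = kw = ksize
    # Crop P2 to the region where the 'valid' convolution is defined.
    P2c = P2[kh // 2: P2.shape[0] - (kh // 2), kw // 2: P2.shape[1] - (kw // 2)]
    # Start from a delta kernel (identity blur).
    K = np.zeros((kh, kw))
    K[kh // 2, kw // 2] = 1.0
    # Conservative step size from a crude bound on the data-term curvature.
    step = 1.0 / (2.0 * kh * kw * np.sum(P1 ** 2) + 1e-12)
    for _ in range(n_iter):
        r = convolve2d(P1, K, mode='valid') - P2c
        # Gradient of the data term; the flip accounts for convolution vs. correlation.
        g = 2.0 * correlate2d(P1, r, mode='valid')[::-1, ::-1]
        # Subgradient of lam3*||K||_1 on the non-negative orthant, then project onto K >= 0.
        K = np.maximum(K - step * (g + lam3), 0.0)
    return K
```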
Experimental results (28/36)
- Camera settings and software parameter settings
Table 1. Shutter speed and ISO settings in Figures 5-8.
Experimental results (continued) (29/36)
Table 2. Software parameters in experiments.
Comparison of non-blind methods (30/36)
Fig. 5. Comparison of non-blind methods. From left to right: blurred/pan images, Lucy [1974], Shan et al. [2008], Levin et al. [2007], and our method. We apply our estimated kernel to all other methods.
Comparison with four blind methods (31/36)
Fig. 6. Comparison with four blind methods. From left to right: Fergus et al. [2006], Shan et al. [2008], Cho and Lee [2009], Xu and Jia [2010], and our method. The blur kernel, indicated by the smears of characters in the keyboard, is accurately estimated by our method.
Experiments on large kernels (32/36)
Fig. 7. Experiments on large kernels. From left to right: noisy images, Yuan et al. [2007], Cho and Lee [2009], Xu and Jia [2010], and our method.
Experiments on large noise (33/36)
Fig. 8. Experiments on large noise. From left to right: noisy images, Yuan et al. [2007], Cho and Lee [2009], Xu and Jia [2010], and our method.
Experiments on large kernels (continued) (34/36)
Fig. 9. Experiments on large kernels. From left to right: noisy images, Yuan et al. [2007], Cho and Lee [2009], Xu and Jia [2010], and our method.
Discussion (35/36)
- Proposed deblurring method
  - New imaging system
  - Novel image prior
- Evaluation of the proposed method
Fig. 11. Examples of spatially varying kernels. We rotate the ground-truth kernel in the third dimension from 0 at the top to d at the bottom of a sharp image (a) to synthesize blurred images (c) and (e) with nonuniform blurs.
Conclusion (36/36)
- Proposed method
  - Introducing a new imaging system
    » Designing a new sensor pattern: modifying the Bayer pattern by adding panchromatic pixels
    » Introducing a new demosaicing algorithm
  - Estimating the deblurred image
    » Using a two-step maximum-a-posteriori method
    » Solving a joint minimization with statistical and spatial prior parts