Practical Image Deblurring with Synthetic Boundary Conditions, with GPUs, and with Multiple Frames

Open Access

Fan, Ying Wai (2010)

Permanent URL: https://etd.library.emory.edu/concern/etds/vx021g18h
Published

Abstract

Researchers usually make several simplifying assumptions when tackling the image deblurring problem. In particular, it is typically assumed that the blur is known exactly and that the true scene outside the field of view is well approximated by periodic boundary conditions. Neither assumption holds in most realistic situations.

In this thesis we develop a new method that derives adaptive synthetic boundary conditions directly from the blurred image. Compared with classical boundary conditions, our approach gives better deblurring results, especially for motion-blurred images. To speed up the deblurring algorithms, we also develop a new regularized DCT preconditioner.
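The role the DCT plays here can be illustrated with a classical special case: for a symmetric PSF under reflexive boundary conditions, the blur matrix is diagonalized by the DCT, so a Tikhonov-regularized deblur costs only a few DCT calls. The 1-D sketch below is our own illustration of that background result, not the thesis's synthetic-BC method or its code; `alpha` is an assumed name for the regularization parameter.

```python
import numpy as np
from scipy.fft import dct, idct
from scipy.ndimage import correlate1d

def dct_deblur_1d(b, psf, alpha):
    """Tikhonov deblurring of a 1-D signal b, assuming a symmetric PSF and
    reflexive boundary conditions, so the blur matrix A is diagonalized by
    the orthonormal DCT-II: A = C^T diag(lam) C."""
    n = b.size
    # Apply the blur to the first unit vector to obtain the first column of A;
    # mode='reflect' realizes the reflexive (half-sample symmetric) BC.
    e1 = np.zeros(n)
    e1[0] = 1.0
    a1 = correlate1d(e1, psf, mode='reflect')
    # Eigenvalues of A follow from C a1 = diag(lam) C e1 (elementwise ratio).
    lam = dct(a1, norm='ortho') / dct(e1, norm='ortho')
    # Tikhonov filter in the DCT domain: x = C^T (lam^2 + alpha)^{-1} lam C b.
    bt = dct(b, norm='ortho')
    return idct(lam * bt / (lam**2 + alpha), norm='ortho')
```

For noise-free data a tiny `alpha` recovers the signal almost exactly; with noise, `alpha` trades off fidelity against noise amplification, which is the usual Tikhonov behavior.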

We have written two new software packages to facilitate research in image deblurring. The first, PYRET, is a serial CPU implementation in Python. Following the object-oriented paradigm, we implement numerical algorithms for the general linear problem and then specialize them to deblurring problems through a new matrix class. A web user interface for PYRET is also provided.
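The matrix-class idea can be sketched as a matrix-free operator whose `matvec` applies the blur through FFTs instead of forming the huge blur matrix explicitly. This is a hypothetical sketch of the design pattern, assuming periodic boundary conditions; the class and method names are ours, not PYRET's actual API.

```python
import numpy as np

class PeriodicBlurOperator:
    """Matrix-free blur operator under periodic BCs. The blur matrix is
    circulant, so its eigenvalues are the 2-D FFT of the (shifted) PSF and
    A x reduces to elementwise multiplication in the Fourier domain."""

    def __init__(self, psf, shape):
        # Embed the PSF in a zero image of the target shape, then roll it so
        # its center sits at index (0, 0); FFT products then implement
        # circular convolution with a centered kernel.
        big = np.zeros(shape)
        m, k = psf.shape
        big[:m, :k] = psf
        big = np.roll(big, (-(m // 2), -(k // 2)), axis=(0, 1))
        self.shape = shape
        self.eigs = np.fft.fft2(big)  # eigenvalues of the circulant A

    def matvec(self, x):
        """Compute A x (blur the image x)."""
        return np.real(np.fft.ifft2(self.eigs * np.fft.fft2(x)))

    def rmatvec(self, x):
        """Compute A^T x; the transpose circulant has conjugate eigenvalues."""
        return np.real(np.fft.ifft2(np.conj(self.eigs) * np.fft.fft2(x)))
```

A generic iterative solver written against `matvec`/`rmatvec` then works unchanged for any operator with this interface, which is the specialization-by-matrix-class idea the abstract describes.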

The second software package, PARRET, is a parallel implementation on the NVIDIA CUDA GPU architecture. GPUs provide an economical source of parallel processing power: on a consumer laptop equipped with a GPU, PARRET attains an order-of-magnitude speedup.

Finally, we consider a blind deconvolution problem in which the atmospheric blurs involved are not known in advance. We first reduce the number of unknowns with a variable projection technique, then solve the reduced problem by the Gauss-Newton algorithm. Through careful mathematical manipulation, the Jacobian matrix decomposes into a product of diagonal and Fourier matrices, making matrix-vector multiplication inexpensive. To further improve deblurring quality, we use more than one blurred image of the same object, exploiting the sparsity of the Jacobian in this multi-frame case through a new decoupling approach. Experiments show that the deblurring result improves as more images are used.
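The variable projection idea can be illustrated on a toy separable problem: fitting y ≈ a·exp(−b·t), where the linear coefficient a is eliminated by linear least squares and Gauss-Newton then updates only the nonlinear parameter b. This is our own toy example, not the pupil-phase problem of the thesis, and the Jacobian below uses the common simplification of holding a fixed at its projected value when differentiating the reduced residual.

```python
import numpy as np

def varpro_gauss_newton(t, y, b0, iters=50):
    """Fit y ~ a * exp(-b * t) by variable projection: eliminate the linear
    coefficient a analytically, then run Gauss-Newton on the reduced residual
    in the single nonlinear variable b."""
    b = b0
    for _ in range(iters):
        phi = np.exp(-b * t)              # model basis for the current b
        a = (phi @ y) / (phi @ phi)       # projected linear coefficient
        r = y - a * phi                   # reduced residual
        J = a * t * phi                   # dr/db, with a held fixed
        b -= (J @ r) / (J @ J)            # Gauss-Newton step (J^T J)^{-1} J^T r
    return a, b
```

Eliminating a shrinks the search space from two variables to one, and the same mechanism scales to problems where the image pixels are the (many) linear unknowns and only the blur parameters remain nonlinear, which is how the abstract uses it.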

Table of Contents

1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Convolution Model of Image Formation . . . . . . . . . . 1
1.2 Literature Survey . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.3 Overview of this Dissertation . . . . . . . . . . . . . . . . . 3
1.4 New Contributions . . . . . . . . . . . . . . . . . . . . . . . . 4

2 Synthetic Boundary Conditions . . . . . . . . . . . . . . . . 6
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.2 Image Deblurring and Boundary Conditions . . . . . . 9
2.2.1 One Dimensional Problems . . . . . . . . . . . . . . . . 9
2.2.2 Two Dimensional Problems . . . . . . . . . . . . . . . 14
2.2.3 Synthetic Boundary Conditions . . . . . . . . . . . . . 17
2.3 Preconditioners for Synthetic Boundary Conditions . . . . 22
2.4 Numerical Experiments . . . . . . . . . . . . . . . . . . . . 25
2.4.1 Gaussian Blur . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.4.2 Diagonal Motion Blur . . . . . . . . . . . . . . . . . . . . . 30
2.4.3 Gaussian Blur with Additive Gaussian Noise . . . . . . 31
2.4.4 Diagonal Motion Blur with Additive Gaussian Noise . 36
2.4.5 Preconditioning . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.4.6 Other Images and Additional Experiments . . . . . . . 40
2.5 Conclusions for this Chapter . . . . . . . . . . . . . . . . . . . 44

3 Python and GPU Implementation of Deblurring Algorithms . . . . 45
3.1 Efficient Algorithms for Convolution Matrix Operations . . . . 46
3.2 Deblurring Algorithms . . . . . . . . . . . . . . . . . . . . . . . 50
3.2.1 Direct Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
3.2.2 Iterative Methods . . . . . . . . . . . . . . . . . . . . . . . . . 51
3.3 PYRET: The Implementation in Python . . . . . . . . . . . . 53
3.3.1 Why Python? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
3.3.2 Implementation Details of PYRET . . . . . . . . . . . . . . 54
3.3.3 Web GUI Interface . . . . . . . . . . . . . . . . . . . . . . . . 56
3.4 PARRET: Parallel Implementation on GPUs . . . . . . . . . 58
3.4.1 Parallelizability of Vector and Matrix Operations in Deblurring Algorithms . . . . 58
3.4.2 Why GPUs? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
3.4.3 Compute Unified Device Architecture (CUDA) . . . . . . . 59
3.4.4 Python Wrapper of CUDA (PyCUDA) . . . . . . . . . . . . . 61
3.4.5 Interfacing CUBLAS and CUFFT with PyCUDA . . . . . . 63
3.4.6 Complex Branch of PyCUDA . . . . . . . . . . . . . . . . . . 65
3.4.7 Features of PARRET . . . . . . . . . . . . . . . . . . . . . . . . 66
3.4.8 Speedup of PARRET . . . . . . . . . . . . . . . . . . . . . . . . 67
3.5 Conclusions for this Chapter . . . . . . . . . . . . . . . . . . . 74

4 Multi-frame Pupil Phase Blind Deconvolution Problem . . . . 76
4.1 Overview of Blind Deconvolution . . . . . . . . . . . . . . . . 76
4.2 Variable Projection Method . . . . . . . . . . . . . . . . . . . 77
4.3 Applying Variable Projection Method to Blind Deconvolution Problems . . . . 78
4.4 Deblurring Using More than One Image . . . . . . . . . . . . 87
4.5 Pupil Phase Parametrization of Atmospheric Blurs in Astronomical Imaging . . . . 98
4.5.1 Efficient Computations with $\nabla\,\mathrm{diag}(\Lambda)$ . . . . 99
4.6 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . 102
4.6.1 Removing Mild Blurs . . . . . . . . . . . . . . . . . . . . . . 102
4.6.2 Removing Severe Blurs . . . . . . . . . . . . . . . . . . . . . 108
4.7 Conclusions for this Chapter . . . . . . . . . . . . . . . . . . . 113

5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115

Appendix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
6.1 Proof of Lemma 4.2 in Section 4.4 . . . . . . . . . . . . . . . 120
6.2 Proof of Lemma 4.3 in Section 4.4 . . . . . . . . . . . . . . . 120
6.3 Derivation of (4.20) in Section 4.3 . . . . . . . . . . . . . . . 121
6.4 Derivation of (4.74) in Section 4.4 . . . . . . . . . . . . . . . 123
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124

About this Dissertation

Rights statement
  • Permission granted by the author to include this thesis or dissertation in this repository. All rights reserved by the author. Please contact the author for information regarding the reproduction and use of this thesis or dissertation.
Language
  • English