
Misha Kilmer
An adaptive inner-outer iterative regularization method for edge recovery

Tufts University
Department of Mathematics
503 Boston Ave
Medford
MA 02155
misha.kilmer@tufts.edu
Oguz Semerci
Eric Miller

In this talk we consider a new inner-outer iterative algorithm for edge recovery in image restoration and reconstruction problems. Specifically, we seek regularized solutions $x_{reg}$ to

$\displaystyle Ax = b_{true} + \eta = b, \qquad \text{with} \quad b_{true} := A x_{true},$

where $A \in \mathbb{R}^{m \times n}$ is a known forward operator, $b$ is the known measured data, $\eta$ is unknown noise, and $x_{true}$ is the unknown, vectorized true image that we wish to approximate.
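
To fix ideas, the following is a minimal sketch of this setup for a hypothetical 1-D deblurring problem; the Gaussian blur operator, the piecewise-constant $x_{true}$, the sizes, and the 1% noise level are illustrative assumptions, not taken from the talk.

import numpy as np

n = 256
# Piecewise-constant "true image" x_true (its jumps are the edges to recover).
x_true = np.zeros(n)
x_true[60:120] = 1.0
x_true[150:200] = 0.5

# Forward operator A: dense Toeplitz-like matrix for a 1-D Gaussian blur.
t = np.arange(n)
sigma = 2.0
A = np.exp(-((t[:, None] - t[None, :]) ** 2) / (2 * sigma**2))
A /= A.sum(axis=1, keepdims=True)          # normalize rows

# Noisy data b = b_true + eta with roughly 1% white noise.
rng = np.random.default_rng(0)
b_true = A @ x_true
eta = 0.01 * np.linalg.norm(b_true) / np.sqrt(n) * rng.standard_normal(n)
b = b_true + eta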

One of the best-known regularization methods is Tikhonov regularization,

$\displaystyle \min_{x} \Vert A x - b \Vert_2^2 + \lambda^2 R(x),$

where $R(x)$ is referred to as the regularization term in the cost functional. The regularization parameter $\lambda$ determines the trade-off between fidelity to the model and damping of the noise through enforcement of a priori information via $R$. Often $R(x) := \Vert L x \Vert_2^2$, where $L$ is either the identity or a discrete gradient or discrete Laplacian operator. Due to the use of the 2-norm on the penalty term $\Vert L x \Vert_2^2$, solutions tend to be smooth, assuming an appropriate value of $\lambda$ is known. On the other hand, solving this regularized least squares problem with a suitable Krylov method is straightforward when $\lambda$ is known, since $A$ and $L$ are usually structured. Even if $\lambda$ is not known, hybrid methods on the so-called standard-form version of the problem, or joint-bidiagonalization approaches that determine $\lambda$ by operating on the projected problem, are possible alternatives.
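
For concreteness, here is a minimal sketch of the general-form Tikhonov solve via the equivalent stacked least-squares problem with coefficient matrix $[A; \lambda L]$ and right-hand side $[b; 0]$, assuming $\lambda$ is known and $A$ is small enough for a dense solve; for large structured problems one would use a Krylov method such as LSQR on the same stacked system instead.

import numpy as np

def tikhonov(A, b, L, lam):
    """Solve min_x ||A x - b||_2^2 + lam^2 ||L x||_2^2 as one least-squares problem."""
    K = np.vstack([A, lam * L])                      # stacked matrix [A; lam*L]
    rhs = np.concatenate([b, np.zeros(L.shape[0])])  # stacked right-hand side [b; 0]
    x, *_ = np.linalg.lstsq(K, rhs, rcond=None)
    return x

def grad_operator(n):
    """First-order finite-difference (discrete gradient) operator for a 1-D signal."""
    L = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    L[idx, idx] = -1.0
    L[idx, idx + 1] = 1.0
    return L

With the test problem above, x_reg = tikhonov(A, b, grad_operator(n), 0.1) illustrates the behavior described in the text: the reconstruction is stable, but the jumps in $x_{true}$ come out smoothed.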

Two well-known alternative choices for $R(x)$ when one desires regularized solutions with sharp edges are $TV(x)$ (total variation) and $\Vert Lx \Vert_p^p$ with $L$ the discrete gradient operator and $p$ close to 1, as recalled below. However, approaches such as these require more overall computational effort than the 2-norm penalty, a difficulty compounded by the fact that a good value of $\lambda$ is typically not known a priori.
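
For reference, the isotropic discrete total variation and the $p$-norm gradient penalty are commonly written as

$\displaystyle TV(x) = \sum_i \sqrt{ \left[ (D_h x)_i \right]^2 + \left[ (D_v x)_i \right]^2 }, \qquad \Vert L x \Vert_p^p = \sum_i \vert (L x)_i \vert^p, \quad p \approx 1,$

where $D_h$ and $D_v$ denote horizontal and vertical first-difference operators (this notation is ours, not the abstract's). Both penalties grow only linearly with the size of a jump, so, unlike the squared 2-norm, they do not heavily penalize sharp edges; the price is a non-quadratic (for $TV$, non-smooth) optimization problem.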

We propose a novel two-step approach in which we solve a sequence of regularized least squares problems

$\displaystyle \min_x \Vert A x - b \Vert_2^2 + \lambda_k^2 \Vert D_k x \Vert_2^2,$

where a near-optimal value of $\lambda_k$ is determined on the fly for each regularization operator $D_k$ through the hybrid regularization approach of (Kilmer, Hansen, Español; SISC, 2007). We give a method for designing $D_k$ adaptively in the outer iteration in such a way that edges are enhanced as $k$ increases. We will discuss the expected behavior on a class of images, and we present results for applications in X-ray CT and image deblurring showing that our algorithm has the potential to produce high-quality images in a computationally efficient manner.
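
The abstract does not spell out the construction of $D_k$ or the inner hybrid solver, so the following is only a schematic sketch under stated assumptions: an IRLS-style diagonal re-weighting of the discrete gradient stands in for the adaptive $D_k$ design, and a crude grid-search discrepancy principle stands in for the hybrid $\lambda_k$ selection of Kilmer, Hansen, and Español.

import numpy as np

def inner_outer(A, b, L, noise_norm, n_outer=5, eps=1e-3):
    """Schematic inner-outer loop: re-weight the operator D_k between solves."""
    n = A.shape[1]
    D = L.copy()                                  # D_0: plain discrete gradient
    x = np.zeros(n)
    for k in range(n_outer):
        # Inner step: choose lambda_k by a crude discrepancy-principle grid
        # search (stand-in for the hybrid projected-problem approach).
        best = None
        for lam in np.logspace(-4, 2, 25):
            K = np.vstack([A, lam * D])
            rhs = np.concatenate([b, np.zeros(D.shape[0])])
            xc, *_ = np.linalg.lstsq(K, rhs, rcond=None)
            gap = abs(np.linalg.norm(A @ xc - b) - noise_norm)
            if best is None or gap < best[0]:
                best = (gap, xc)
        x = best[1]
        # Outer step (hypothetical D_k design): down-weight rows of L where the
        # current gradient is large, so edges are penalized less next time.
        g = L @ x
        w = 1.0 / np.sqrt(g**2 + eps**2)
        D = w[:, None] * L
        D /= np.abs(D).max()                      # keep D on a fixed scale
    return x

In a hybrid Krylov implementation the $\lambda_k$ search is carried out on a small projected problem, so it costs far less than the dense grid search above suggests; the loop is purely illustrative.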




Copper Mountain 2014-02-23