Sparse recovery and compressed sensing in inverse problems
Gerd Teschke (joint work with Evelyn Herrholz)
Institute for Computational Mathematics in Science and Technology, Neubrandenburg University of Applied Sciences, Germany
Konrad-Zuse-Institute Berlin (ZIB), Germany
June 7, 2010
Outline
1. Motivation
2. Sparse recovery for inverse problems
3. Compressively sensed data and recovery
4. Ill-posed sensing model and sparse recovery
5. Numerical experiment
6. Short summary
Motivation
References
- E. Candès, The restricted isometry property and its implications for compressed sensing, C. R. Acad. Sci. Paris, 2008.
- E. J. Candès and T. Tao, Decoding by linear programming, IEEE Trans. Inform. Theory, 51(12), 4203-4215, 2005.
- Y. Eldar, Compressed sensing of analog signals in shift-invariant spaces, IEEE Trans. Signal Processing, 57(8), 2009.
- I. Daubechies, M. Fornasier and I. Loris, Accelerated projected gradient method for linear inverse problems with sparsity constraints, J. Fourier Anal. Appl., 2008.
- G. Teschke and C. Borries, Accelerated projected steepest descent method for nonlinear inverse problems with sparsity constraints, Inverse Problems, 26, 025007, 2010.
- E. Herrholz and G. Teschke, A compressed sensing approach for inverse problems, preprint, 2010.
Motivation
- Solve $Ax = y$ or $F(x) = y$ ($A, F : X \to Y$; $X, Y$ Hilbert spaces)
- Often only noisy data $y^\delta$ with $\|y - y^\delta\| \le \delta$
- The inverse problem is ill-posed
- Often: incomplete data, or data not sampled at an adequate rate
- Some a-priori knowledge required: the solution has a sparse representation / is compressible
Sparse recovery for inverse problems
Tikhonov approach for linear/nonlinear problems:
  $\min_x \|F(x) - y^\delta\|^2 + 2\alpha \|x\|_p^p$

Linear case [Daubechies, Defrise, De Mol 2004]:
  $x^{n+1} = S_\alpha(x^n + \gamma A^*(y - A x^n))$

Nonlinear case [Ramlau, T. 2005, 2007]:
  $x^{n+1} = S_\alpha(x^n + \gamma F'(x^{n+1})^*(y - F(x^n)))$

Many alternatives and improvements (incomplete list):
- steepest descent, domain decomposition, semi-smooth Newton methods, projection methods [Bredies, Daubechies, Fornasier, Lorenz, Loris, ...]
- adaptive iteration [Ramlau, T., Zhariy and Dahlke, Fornasier, Raasch, ...]
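The linear-case iteration above can be sketched in a few lines of NumPy. This is an illustrative reimplementation (not the authors' code), written for the functional $\tfrac12\|Ax-y\|^2 + \mu\|x\|_1$, so the threshold carries the step size $\gamma$:

```python
import numpy as np

def soft_threshold(x, mu):
    """Component-wise soft-thresholding S_mu, the proximal map of mu*||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - mu, 0.0)

def ista(A, y, mu, n_iter=500):
    """Thresholded Landweber / ISTA sketch for min 0.5*||Ax - y||^2 + mu*||x||_1:
    x^{n+1} = S_{gamma*mu}(x^n + gamma * A^T (y - A x^n))."""
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1/||A||^2 gives convergence
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + gamma * A.T @ (y - A @ x), gamma * mu)
    return x
```

For $A = I$ the iteration reaches its fixed point $S_\mu(y)$ after one step, which makes the shrinkage effect easy to inspect.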
Sparsity through projection + acceleration

Problem: the iteration $x^{n+1} = S_\alpha(x^n + \gamma A^*(y - A x^n))$ suffers from overshooting (dynamics discussed by Daubechies, Fornasier, Loris 2008).

Alternative (Daubechies et al. 2008; Borries, T. 2009 for the nonlinear case):
  $\min_{x \in B_R} \|Ax - y\|^2$, where $B_R := \{x \in \ell_2 : \|x\|_1 \le R\}$,
  $x^{n+1} = P_R(x^n + \gamma_n A^*(y - A x^n))$.
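The projection $P_R$ onto the $\ell_1$-ball admits a closed form via sorting; a sketch of the standard sort-and-shrink construction (textbook material, not taken from the slides):

```python
import numpy as np

def project_l1_ball(x, R):
    """Euclidean projection of x onto the l1-ball {z : ||z||_1 <= R}."""
    if np.abs(x).sum() <= R:
        return x.copy()                        # already feasible
    u = np.sort(np.abs(x))[::-1]               # sorted magnitudes, descending
    css = np.cumsum(u)
    # largest k with u_k > (css_k - R)/(k+1): determines the active support
    k = np.nonzero(u * np.arange(1, len(u) + 1) > css - R)[0][-1]
    tau = (css[k] - R) / (k + 1.0)             # uniform shrinkage amount
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```

The accelerated iteration then simply replaces $S_\alpha$ by this map, with a step length $\gamma_n$ chosen adaptively.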
Vector-valued setup and joint sparsity formulation:
  $\min_{x \in C_R^{p,q}} \|y^\delta - Ax\|^2$
where
  $C_R^{p,q} = \big\{ x \in (\ell_2(\Lambda))^m : \Psi_{p,q}(x) := \sum_{l=1}^m \big( \sum_{\lambda \in \Lambda} |x_{l,\lambda}|^p \big)^{q/p} \le R \big\}$

Special case $p = 2$, $q = 1$:
  $P_R(x) = \big( S_\mu(\{x_{1,\lambda}\}_{\lambda \in \Lambda}), \ldots, S_\mu(\{x_{m,\lambda}\}_{\lambda \in \Lambda}) \big)$
with
  $S_\mu(\{x_{l,\lambda}\}_{\lambda \in \Lambda}) = \max\big( \|\{x_{l,\lambda}\}_{\lambda \in \Lambda}\|_{\ell_2(\Lambda)} - \mu, 0 \big) \, \frac{\{x_{l,\lambda}\}_{\lambda \in \Lambda}}{\|\{x_{l,\lambda}\}_{\lambda \in \Lambda}\|_{\ell_2(\Lambda)}}$
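For $p = 2$, $q = 1$ the shrinkage acts channel-wise: each sequence $\{x_{l,\lambda}\}_\lambda$ is kept or shrunk as a whole. A minimal sketch (illustrative; arrays of shape $m \times |\Lambda|$ with one row per channel $l$):

```python
import numpy as np

def group_soft_threshold(X, mu):
    """S_mu applied to each row {x_{l,lambda}}_lambda as a group:
    max(||row||_2 - mu, 0) * row / ||row||_2; rows with norm <= mu are zeroed."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(norms - mu, 0.0) / np.where(norms > 0.0, norms, 1.0)
    return scale * X
```

This promotes joint sparsity: a channel is either active for all $\lambda$ or dropped entirely.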
Compressively sensed data and recovery
Compressed Sensing Idea:
- Suppose there is a signal $x$.
- Suppose we measure only a few linear samples $y = \tilde{A}x$. Can we reconstruct $x$?
- Yes! If $x$ is sparse in a given incoherent basis or frame, i.e. $x = B d$.
- $\tilde{A}$ is a $p \times m$ matrix with $p \ll m$.
- If $d$ is $k$-sparse, then $p \ge 2k$ data points are required (stability: $p \ge c\,k \log(m/p)$).
- $A = \tilde{A}B$ has to fulfill a restricted isometry property.
Definition (restricted isometry property, RIP)
For each integer $k = 1, 2, \ldots$, define the isometry constant $\delta_k$ of a sensing matrix $A$ as the smallest number such that
  $(1 - \delta_k)\|d\|_{\ell_2}^2 \le \|Ad\|_{\ell_2}^2 \le (1 + \delta_k)\|d\|_{\ell_2}^2$
holds for all $k$-sparse vectors $d$. A vector is said to be $k$-sparse if it has at most $k$ non-vanishing entries.
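Computing $\delta_k$ exactly requires a search over all supports, but it can be probed empirically. A Monte-Carlo sketch (illustrative only: it samples random supports, so it yields a lower estimate of $\delta_k$, not the exact constant):

```python
import numpy as np

def estimate_delta_k(A, k, n_trials=500, rng=None):
    """Monte-Carlo lower estimate of the RIP constant delta_k of A:
    draws random k-sparse unit vectors d and records the worst deviation
    of ||A d||^2 from 1."""
    rng = np.random.default_rng(rng)
    m = A.shape[1]
    worst = 0.0
    for _ in range(n_trials):
        support = rng.choice(m, size=k, replace=False)
        d = np.zeros(m)
        d[support] = rng.standard_normal(k)
        d /= np.linalg.norm(d)
        worst = max(worst, abs(np.linalg.norm(A @ d) ** 2 - 1.0))
    return worst

# A column-normalized Gaussian matrix concentrates ||A d||^2 near ||d||^2:
A_demo = np.random.default_rng(0).standard_normal((200, 400)) / np.sqrt(200)
est = estimate_delta_k(A_demo, k=5, n_trials=300, rng=1)
print(est)
```

For an orthonormal matrix the estimate is (numerically) zero, since $\|Ad\| = \|d\|$ holds exactly.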
Optimization problem:
  $\min_d \|d\|_{\ell_1}$ subject to $y = Ad$ or $\|y^\delta - Ad\| \le \delta$   (*)

Theorem (Candès 2008). Let $A$ satisfy the RIP with $\delta_{2k} < \sqrt{2} - 1$ and let $d_k$ denote the best $k$-term approximation of $d$. Then the solution $d^*$ to (*) obeys
  $\|d^* - d\|_{\ell_2} \le C_0 k^{-1/2} \|d_k - d\|_{\ell_1}$
or, in case of noisy data $y^\delta$ with $\|y^\delta - y\|_{\ell_2} \le \delta$,
  $\|d^* - d\|_{\ell_2} \le C_0 k^{-1/2} \|d_k - d\|_{\ell_1} + C_1 \delta$.
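With exact data, (*) is a linear program after the standard split $d = u - v$ with $u, v \ge 0$: minimize $\mathbf{1}^T(u+v)$ subject to $A(u-v) = y$. A sketch of this textbook reduction (assuming SciPy is available; not code from the talk):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||d||_1 subject to A d = y via the LP split d = u - v, u, v >= 0."""
    p, m = A.shape
    c = np.ones(2 * m)                 # objective: sum(u) + sum(v) = ||d||_1
    A_eq = np.hstack([A, -A])          # equality constraint A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    uv = res.x
    return uv[:m] - uv[m:]
```

For the noisy variant $\|y^\delta - Ad\| \le \delta$ one would switch to a second-order cone solver; the LP form covers the equality-constrained case only.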
Ill-posed sensing model and sparse recovery
Indirect measurements $Kx$: $y = \tilde{A} K B d$

Consequences:
- $K$ may be ill-conditioned
- the RIP is usually lost

Question: How to overcome this problem, and how to adjust the finite-dimensional CS setting to inverse problems?
(Setup by Y. Eldar, 2009.) Let $X$ be a Hilbert space and $X_m \subset X$ the reconstruction space
  $X_m = \big\{ x \in X : x = \sum_{l=1}^m \sum_{\lambda \in \Lambda} d_{l,\lambda} a_{l,\lambda}, \ d \in (\ell_2(\Lambda))^m \big\}$

Sparse structure: for a support set $I \subset \{1, \ldots, m\}$,
  $X_k = \big\{ x \in X : x = \sum_{l \in I,\, |I| = k} \sum_{\lambda \in \Lambda} d_{l,\lambda} a_{l,\lambda}, \ d \in (\ell_2(\Lambda))^m \big\}$
Analysis and synthesis operators:
  $T_\phi^* : X \to \ell_2$ via $x \mapsto \bar{x} = \{\langle x, \phi_\lambda \rangle\}_{\lambda \in \Lambda}$,
  $T_\phi : \ell_2 \to X$ via $\bar{x} \mapsto \sum_{\lambda \in \Lambda} \bar{x}_\lambda \phi_\lambda$.
- Representation of the solution: $x = T_a d$
- Data model: $y = T_v^* K T_a d$
- Compressed sampling: $(s_{1,\lambda}, \ldots, s_{p,\lambda})^T = A\, (v_{1,\lambda}, \ldots, v_{m,\lambda})^T$ for all $\lambda \in \Lambda$
- Compressed data model: $y = T_s^* K T_a d$
Compressed data model: $y = T_s^* K T_a d = A D d$

Assume $\langle v_{l,\lambda}, K a_{l',\lambda'} \rangle = \kappa_{l,\lambda}\, \delta_{l,l'}\, \delta_{\lambda,\lambda'}$; then for all $\lambda \in \Lambda$:
  $y_\lambda = A D_\lambda d_\lambda$ with $D_\lambda = \mathrm{diag}(\kappa_{1,\lambda}, \ldots, \kappa_{m,\lambda})$

Problem: $AD$ does not satisfy the RIP.
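Why the RIP is lost can be seen already on $1$-sparse vectors: $\|A D e_l\| = \kappa_l \|A e_l\|$, so a decaying spectrum of $K$ stretches the isometry constants apart. A small numeric illustration (synthetic numbers, not the talk's operator):

```python
import numpy as np

rng = np.random.default_rng(42)
p, m = 100, 200
A = rng.standard_normal((p, m)) / np.sqrt(p)   # near-isometry on sparse vectors
kappa = 2.0 ** (-np.arange(m) / 20.0)          # decaying diagonal, mimicking an ill-conditioned K
AD = A * kappa                                 # equals A @ diag(kappa)

# ||AD e_l|| = kappa_l * ||A e_l||: already 1-sparse vectors break any uniform bound
r_good = np.linalg.norm(AD[:, 0])
r_bad = np.linalg.norm(AD[:, -1])
print(r_good / r_bad)                          # several orders of magnitude apart
```

No single pair of constants $(1 \pm \delta_k)$ can bracket both columns, so the uniform RIP bound fails for $AD$.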
For all $\lambda \in \Lambda$ consider the optimization problem
  $\min_{d_\lambda \in B_R} \|y_\lambda^\delta - A D_\lambda d_\lambda\|_{\ell_2}^2 + \alpha \|d_\lambda\|_{\ell_2}^2$,
where $B_R = \{d_\lambda \in \ell_2 : \|d_\lambda\|_{\ell_1} \le R\}$.

The corresponding model operator $L^2 := D_\lambda A^* A D_\lambda + \alpha I$ leads to a stabilized RIP:
  $(\kappa_{\min}^2 (1 - \delta_k) + \alpha)\|x\|^2 \le \|Lx\|^2 \le (\kappa_{\max}^2 (1 + \delta_k) + \alpha)\|x\|^2$
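The stabilized bounds rest on the identity $\|Lx\|^2 = \langle L^2 x, x\rangle = \|A D_\lambda x\|^2 + \alpha\|x\|^2$: the penalty adds a floor of $\alpha\|x\|^2$ below the possibly vanishing fidelity term. A quick numeric check of this identity (synthetic matrices standing in for $A$ and $D_\lambda$):

```python
import numpy as np

rng = np.random.default_rng(3)
p, m, alpha = 40, 60, 0.1
A = rng.standard_normal((p, m)) / np.sqrt(p)
D = np.diag(rng.uniform(0.5, 1.5, m))          # stands in for D_lambda = diag(kappa_{l,lambda})

L2 = D @ A.T @ A @ D + alpha * np.eye(m)       # the model operator L^2
x = rng.standard_normal(m)

lhs = x @ L2 @ x                               # <L^2 x, x> = ||L x||^2
rhs = np.linalg.norm(A @ D @ x) ** 2 + alpha * np.linalg.norm(x) ** 2
print(abs(lhs - rhs))                          # agrees up to round-off
```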
Theorem (Herrholz, T. 2010)
Define $d_\lambda$ as the $B_R$-best approximation to the true solution. Assume $R$ was chosen such that the solution satisfies $d_\lambda \in B_R$ and that
  $0 \le \delta_{2k} < \frac{(1 + \sqrt{2})\kappa_{\min}^2 - \kappa_{\max}^2 + 2\alpha}{(1 + \sqrt{2})\kappa_{\min}^2 + \kappa_{\max}^2}$.
Then the minimizer $d_\lambda^*$ satisfies
  $\|d_\lambda^* - d_\lambda\|_{\ell_2} \le C_0 k^{-1/2} \|d_\lambda^k - d_\lambda\|_{\ell_1} + C_1 \delta + C_2 \|L(d_\lambda^* - d_\lambda)\|_{\ell_2} + C_3 \alpha R$.
Treatment of all $\lambda$ simultaneously by a joint sparsity formulation:
  $\min_{d \in C_R^{2,1}} \|y^\delta - A D d\|_{(\ell_2(\Lambda))^p}^2 + \alpha \|d\|_{(\ell_2(\Lambda))^m}^2$
where $C_R^{2,1} = \{d \in (\ell_2(\Lambda))^m : \Psi_{2,1}(d) \le R\}$.
Theorem (Herrholz, T. 2010)
Let $d$ be the $B_R$-best approximation to the true solution, $R$ as before, and
  $0 \le \delta_{2k} < \frac{(1 + \sqrt{2})\kappa_{\min}^2 - \kappa_{\max}^2 + 2\alpha}{(1 + \sqrt{2})\kappa_{\min}^2 + \kappa_{\max}^2}$.
Then the minimizer $d^*$ satisfies
  $\|d^* - d\|_{(\ell_2(\Lambda))^m} \le C_0 k^{-1/2} \Psi_{2,1}(d^k - d) + C_1 \delta + C_2 \|L(d^* - d)\|_{(\ell_2(\Lambda))^m} + C_3 \alpha R$.
Discussion of the balancing between $\delta_{2k}$ and $\alpha$:

Given $K$ and sensing matrix $A$:
  $\frac{(1 + \delta_{2k})\kappa_{\max}^2 - (1 + \sqrt{2})(1 - \delta_{2k})\kappa_{\min}^2}{2} < \alpha$

- If $\delta_{2k} < 1$, stabilization becomes necessary if
  $\frac{1 + \delta_{2k}}{1 - \delta_{2k}} \cdot \frac{\kappa_{\max}^2}{\kappa_{\min}^2} > 1 + \sqrt{2}$.
- If $\delta_{2k} \ge 1$, the lower bound is always positive.
- Specializing the case $\delta_{2k} < 1$ to $\kappa_{\max}^2 = \kappa_{\min}^2 = 1$: stabilization is needed for $\delta_{2k} > \sqrt{2} - 1$.
Given $K$, stabilize first and suppose $A$ can be chosen accordingly:
  $\frac{\kappa_{\max}^2 - (1 + \sqrt{2})\kappa_{\min}^2}{2} < \alpha$.

- No stabilization necessary ($\alpha = 0$): if $1 \le \kappa_{\max}^2/\kappa_{\min}^2 < 1 + \sqrt{2}$, then
  $\delta_{2k} < \frac{1 + \sqrt{2} - \kappa_{\max}^2/\kappa_{\min}^2}{1 + \sqrt{2} + \kappa_{\max}^2/\kappa_{\min}^2}$
- Stabilization necessary ($\alpha > 0$): if $1 + \sqrt{2} \le \kappa_{\max}^2/\kappa_{\min}^2$, then
  $\delta_{2k} < \frac{1 + \sqrt{2} - \kappa_{\max}^2/\kappa_{\min}^2 + 2\alpha/\kappa_{\min}^2}{1 + \sqrt{2} + \kappa_{\max}^2/\kappa_{\min}^2}$
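Both regimes share one formula for the admissible RIP level. A small helper (variable names are mine) in terms of $r = \kappa_{\max}^2/\kappa_{\min}^2$ and $a = \alpha/\kappa_{\min}^2$:

```python
import numpy as np

def delta_bound(r, a=0.0):
    """Admissible RIP level from the condition
    delta_2k < (1 + sqrt(2) - r + 2a) / (1 + sqrt(2) + r),
    with r = kappa_max^2 / kappa_min^2 and a = alpha / kappa_min^2."""
    s = 1.0 + np.sqrt(2.0)
    return (s - r + 2.0 * a) / (s + r)
```

For $r = 1$, $a = 0$ this recovers the classical condition $\delta_{2k} < \sqrt{2} - 1$; for $r \ge 1 + \sqrt{2}$ the bound is non-positive until $\alpha > 0$ lifts it above zero.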
Numerical experiment (MATLAB)
- $a_{l,\lambda}$: sinc wavelets, $l \in \{-3, -2, \ldots, 4, 5\}$, i.e. $m = 9$
- $K$: convolution operator
- $x$ is 2-sparse (two active channels): $d_{1,21} = 3$, $d_{4,20} = 1$, $d_{4,25} = 2$
- $y_\lambda^\delta = A D_\lambda d_\lambda + \delta$
$K$ corresponds to diagonal entries $\kappa_{l,\lambda}$ (figure not reproduced here).
Minimization of
  $\min_{d \in C_R^{2,1}} \|y^\delta - A D d\|_{(\ell_2(\Lambda))^p}^2 + \alpha \|d\|_{(\ell_2(\Lambda))^m}^2$
yields the reconstructions shown in the figures.
Algorithmic aspects:
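One way to sketch the algorithmic side is a thresholded Landweber iteration for the joint-sparsity functional, with row-wise (channel) shrinkage standing in for the exact projection onto $C_R^{2,1}$. This is my own NumPy reimplementation with illustrative parameters, not the talk's MATLAB code; the threshold $\mu$ plays the role of a Lagrange multiplier for the constraint $\Psi_{2,1}(d) \le R$:

```python
import numpy as np

def group_soft_threshold(X, mu):
    """Row-wise shrinkage: each row (channel l over all lambda) is treated as a group."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return np.maximum(norms - mu, 0.0) / np.where(norms > 0.0, norms, 1.0) * X

def joint_sparse_recover(A, kappa, Y, mu, alpha=0.0, n_iter=300):
    """Thresholded Landweber sketch for
    min sum_lambda ||y_lambda - A D_lambda d_lambda||^2 + alpha ||d||^2
    with joint (row-wise) soft-thresholding. kappa and the returned d are
    (m, n_lambda) arrays; Y is (p, n_lambda) with columns y_lambda."""
    gamma = 1.0 / (np.linalg.norm(A, 2) ** 2 * kappa.max() ** 2 + alpha)
    d = np.zeros_like(kappa)
    for _ in range(n_iter):
        residual = Y - A @ (kappa * d)             # y_lambda - A D_lambda d_lambda, all lambda
        grad = kappa * (A.T @ residual) - alpha * d
        d = group_soft_threshold(d + gamma * grad, gamma * mu)
    return d
```

For $A = I$, $\kappa \equiv 1$, $\alpha = 0$ the problem decouples and the iteration reproduces plain group thresholding of the data, which makes the sketch easy to sanity-check.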
Short summary

Summary:
- Algorithms for sparse recovery: projected steepest descent, step-length control, vector-valued setting, joint sparsity
- Suggestion for a setup for incomplete/compressed data: semi-infinite dimensional
- In the compressed data model: balance between the sensing properties of $A$ and stabilization by $\alpha$; the choice of $\alpha$ is restricted (typically a small $\alpha$ to ensure CS recovery)

Future work:
- apply the infinite CS model (A. Hansen, 2010)
- combine the compressed data model with adaptive operator evaluation
Workshop on CS, Sparsity and Inverse Problems
September 6-7, 2010 at TU Braunschweig, Germany
http://www.dfg-spp1324.de/nuhagtools/event/make.php?event=cssip10