Numerical Methods
Rafał Zdunek
Underdetermined problems (2h.)
(FOCUSS, M-FOCUSS, Applications)
Introduction
• Solutions to underdetermined linear systems,
• Morphological constraints,
• FOCUSS algorithm,
• M-FOCUSS algorithm.
Bibliography
[1] A. Björck, Numerical Methods for Least Squares Problems, SIAM, Philadelphia, 1996.
[2] G. Golub and C. F. Van Loan, Matrix Computations, The Johns Hopkins University Press, Third Edition, 1996.
[3] I. F. Gorodnitsky and B. D. Rao, “Sparse signal reconstruction from limited data using FOCUSS: A re-weighted minimum norm algorithm,” IEEE Trans. Signal Process., vol. 45, no. 3, pp. 600–616, Mar. 1997.
[4] B. D. Rao and K. Kreutz-Delgado, “Deriving algorithms for computing sparse solutions to linear inverse problems,” in Proc. 31st Asilomar Conf. Signals Syst. Comput., CA, Nov. 2–5, 1997, vol. 1, pp. 955–959.
[5] B. D. Rao, K. Engan, S. F. Cotter, J. Palmer, and K. Kreutz-Delgado, “Subset selection in noise based on diversity measure minimization,” IEEE Trans. Signal Process., vol. 51, no. 3, pp. 760–770, Mar. 2003.
[6] S. F. Cotter, B. D. Rao, K. Engan, and K. Kreutz-Delgado, “Sparse solutions to linear inverse problems with multiple measurement vectors,” IEEE Trans. Signal Process., vol. 53, no. 7, pp. 2477–2488, Jul. 2005.
Solutions to linear systems

A system of linear equations can be expressed in the following matrix form:

$$\mathbf{A}\mathbf{x} = \mathbf{b}, \qquad (1)$$

where $\mathbf{A} = [a_{ij}] \in \mathbb{R}^{M \times N}$ is a coefficient matrix, $\mathbf{b} = [b_i] \in \mathbb{R}^{M}$ is a data vector, and $\mathbf{x} = [x_j] \in \mathbb{R}^{N}$ is the solution to be estimated. Let $\mathbf{B} = [\mathbf{A}\ \mathbf{b}] \in \mathbb{R}^{M \times (N+1)}$ be the augmented matrix of the system (1). The system of linear equations may behave in any one of three possible ways:

A. The system has no solution if $\mathrm{rank}(\mathbf{A}) < \mathrm{rank}(\mathbf{B})$.
B. The system has a single unique solution if $\mathrm{rank}(\mathbf{A}) = \mathrm{rank}(\mathbf{B}) = N$.
C. The system has infinitely many solutions if $\mathrm{rank}(\mathbf{A}) = \mathrm{rank}(\mathbf{B}) < N$.
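These three cases can be checked numerically by comparing matrix ranks. Below is a minimal NumPy sketch; the helper name `classify_system` is illustrative, and the test system is the under-determined example discussed later in this section.

```python
import numpy as np

def classify_system(A, b):
    """Classify A x = b by comparing rank(A) with rank of the augmented [A | b]."""
    rank_A = np.linalg.matrix_rank(A)
    rank_B = np.linalg.matrix_rank(np.column_stack([A, b]))
    N = A.shape[1]
    if rank_A < rank_B:
        return "no solution"          # case A
    if rank_A == N:
        return "unique solution"      # case B
    return "infinitely many solutions"  # case C

# Under-determined system from the second example below (rank 2 < N = 4):
A = np.array([[1., 2., 2., 3.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])
b = np.array([4., 5., 7.])
print(classify_system(A, b))  # -> infinitely many solutions
```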
Solutions to linear systems

Case C occurs for rank-deficient or under-determined problems. Despite the infinitely many solutions, a good approximation to the true solution can be obtained if some a priori knowledge about the nature of the true solution is available. The additional constraints usually concern the degree of sparsity or smoothness of the true solution.
RREF

In the reduced row echelon form (RREF), pivot columns correspond to basic variables and non-pivot columns to free variables.
Example

Let:
$$\begin{cases} x_1 + 3x_2 + x_3 = a \\ -x_1 - 2x_2 + x_3 = b \\ 3x_1 + 7x_2 - x_3 = c \end{cases}$$

$$[\mathbf{A}\,|\,\mathbf{b}] = \begin{bmatrix} 1 & 3 & 1 & | & a \\ -1 & -2 & 1 & | & b \\ 3 & 7 & -1 & | & c \end{bmatrix} \xrightarrow{\text{Gauss-Jordan}} \begin{bmatrix} 1 & 0 & -5 & | & -2a - 3b \\ 0 & 1 & 2 & | & a + b \\ 0 & 0 & 0 & | & c - a + 2b \end{bmatrix}$$

(Basic variables: $x_1, x_2$) (Free variable: $x_3$)

The system is consistent (it has solutions) if $c - a + 2b = 0 \Rightarrow c = a - 2b$.

Solution: $x_1 = -2a - 3b + 5x_3$, $x_2 = a + b - 2x_3$, $x_3$ = free variable.
Example

Let:
$$\begin{cases} x_1 + 2x_2 + 2x_3 + 3x_4 = 4 \\ 2x_1 + 4x_2 + x_3 + 3x_4 = 5 \\ 3x_1 + 6x_2 + x_3 + 4x_4 = 7 \end{cases}$$

$$[\mathbf{A}\,|\,\mathbf{b}] = \begin{bmatrix} 1 & 2 & 2 & 3 & | & 4 \\ 2 & 4 & 1 & 3 & | & 5 \\ 3 & 6 & 1 & 4 & | & 7 \end{bmatrix} \xrightarrow{\text{Gauss-Jordan}} \begin{bmatrix} 1 & 2 & 0 & 1 & | & 2 \\ 0 & 0 & 1 & 1 & | & 1 \\ 0 & 0 & 0 & 0 & | & 0 \end{bmatrix}$$

Consistent.

Solution: $x_1 = 2 - 2x_2 - x_4$, $x_2$ = free variable, $x_3 = 1 - x_4$, $x_4$ = free variable.
Example

Thus:
$$\mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = \begin{bmatrix} 2 - 2x_2 - x_4 \\ x_2 \\ 1 - x_4 \\ x_4 \end{bmatrix} = \begin{bmatrix} 2 \\ 0 \\ 1 \\ 0 \end{bmatrix} + x_2 \begin{bmatrix} -2 \\ 1 \\ 0 \\ 0 \end{bmatrix} + x_4 \begin{bmatrix} -1 \\ 0 \\ -1 \\ 1 \end{bmatrix}$$

The first term is a particular solution; the remaining terms form the homogeneous solution, which belongs to $N(\mathbf{A})$.

General solution:
$$\mathbf{x}_{\text{general}} = \mathbf{x}_{\text{particular}} + \mathbf{x}_{\text{homogeneous}}$$

The homogeneous solution is a solution to the system $\mathbf{A}\mathbf{x} = \mathbf{0}$.
Problem: How should the free variables be chosen?
Remark: Setting the free variables to zero is not always a good solution!
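To see why, one can compare the particular solution with zero free variables against the minimum $\ell_2$-norm solution. Below is a minimal NumPy sketch using the example system above; the variable names are illustrative.

```python
import numpy as np

# System from the example above: 3 equations, 4 unknowns (under-determined).
A = np.array([[1., 2., 2., 3.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])
b = np.array([4., 5., 7.])

# Particular solution obtained by setting the free variables x2 = x4 = 0:
x_zero_free = np.array([2., 0., 1., 0.])

# Minimum l2-norm solution via the Moore-Penrose pseudoinverse:
x_min_norm = np.linalg.pinv(A) @ b

print(np.allclose(A @ x_zero_free, b), np.linalg.norm(x_zero_free))
print(np.allclose(A @ x_min_norm, b), np.linalg.norm(x_min_norm))
# Both solve A x = b, but the pseudoinverse solution has a strictly
# smaller l2 norm, since [2, 0, 1, 0] does not lie in the row space of A.
```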
Sparseness constraints

• Regularized least-squares problem:
$$\min_{\mathbf{x}} \left\{ \|\mathbf{A}\mathbf{x} - \mathbf{b}\|_2^2 + \gamma E^{(p)}(\mathbf{x}) \right\},$$
• $\ell_p$ diversity measure (Gorodnitsky, Rao, 1997):
$$E^{(p)}(\mathbf{x}) = \mathrm{sgn}(p) \sum_{j=1}^{N} |x_j|^p, \qquad p \le 1.$$
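For illustration, the diversity measure is straightforward to evaluate; a minimal NumPy sketch follows (the function name is hypothetical).

```python
import numpy as np

def diversity_measure(x, p):
    """l_p diversity measure E^(p)(x) = sgn(p) * sum_j |x_j|^p, with p <= 1."""
    return np.sign(p) * np.sum(np.abs(x) ** p)
```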
FOCUSS algorithm

Let
$$J(\mathbf{x}) = \|\mathbf{A}\mathbf{x} - \mathbf{b}\|_2^2 + \gamma E^{(p)}(\mathbf{x}).$$

Stationary point:
$$\nabla_{\mathbf{x}} J(\mathbf{x}_*) = 2\mathbf{A}^T\mathbf{A}\mathbf{x}_* - 2\mathbf{A}^T\mathbf{b} + 2\lambda \mathbf{W}^{-2}(\mathbf{x}_*)\,\mathbf{x}_* = 0,$$

where
$$\mathbf{W}(\mathbf{x}_*) = \mathrm{diag}\left\{ |x_j^*|^{1-p/2} \right\}, \qquad \lambda = \frac{p\gamma}{2}.$$

Hence:
$$\mathbf{x}_* = \mathbf{W}^2(\mathbf{x}_*)\,\mathbf{A}^T \left( \mathbf{A}\mathbf{W}^2(\mathbf{x}_*)\,\mathbf{A}^T + \lambda \mathbf{I}_M \right)^{-1} \mathbf{b}.$$

Iterative updates:
$$\mathbf{x}_{k+1} = \mathbf{W}^2(\mathbf{x}_k)\,\mathbf{A}^T \left( \mathbf{A}\mathbf{W}^2(\mathbf{x}_k)\,\mathbf{A}^T + \lambda \mathbf{I}_M \right)^{-1} \mathbf{b}.$$
FOCUSS algorithm

$p \in [0, 1]$, $\lambda > 0$. Randomly initialize $\mathbf{x}^{(0)}$. For $k = 0, 1, 2, \ldots$ until convergence do:
$$\mathbf{W}_k = \mathrm{diag}\left\{ |x_j^{(k)}|^{1-p/2} \right\},$$
$$\mathbf{x}^{(k+1)} = \mathbf{W}_k^2 \mathbf{A}^T \left( \mathbf{A}\mathbf{W}_k^2 \mathbf{A}^T + \lambda \mathbf{I}_M \right)^{-1} \mathbf{b}.$$
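Below is a minimal NumPy sketch of this FOCUSS iteration. The function name, the fixed iteration count, and the seeded random initialization are illustrative assumptions; the default values $p = 1$, $\lambda = 10^{-8}$, $k = 15$ follow the noise-free tomography example later in this section.

```python
import numpy as np

def focuss(A, b, p=1.0, lam=1e-8, n_iter=15, seed=0):
    """FOCUSS: re-weighted minimum-norm iterations for sparse solutions
    of the under-determined system A x = b (Gorodnitsky & Rao, 1997)."""
    M, N = A.shape
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)               # random initialization x^(0)
    for _ in range(n_iter):
        # W_k^2 = diag(|x_j|^(2 - p)); working with W^2 avoids square roots
        W2 = np.abs(x) ** (2.0 - p)
        # x^(k+1) = W_k^2 A^T (A W_k^2 A^T + lam * I_M)^(-1) b
        G = (A * W2) @ A.T + lam * np.eye(M)  # A W^2 A^T + lam * I_M
        x = W2 * (A.T @ np.linalg.solve(G, b))
    return x
```

Note that entries driven to zero stay at zero in later iterations, which is how the re-weighting focuses the solution onto a sparse support.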
Wiener filtered FOCUSS algorithm

$p \in [0, 1]$, $\lambda > 0$. Randomly initialize $\mathbf{x}^{(0)}$. For $k = 0, 1, 2, \ldots$ until convergence do:
$$\mathbf{W}_k = \mathrm{diag}\left\{ |x_j^{(k)}|^{1-p/2} \right\},$$
$$\mathbf{x}^{(k+1)} = \mathbf{W}_k^2 \mathbf{A}^T \left( \mathbf{A}\mathbf{W}_k^2 \mathbf{A}^T + \lambda \mathbf{I}_M \right)^{-1} \mathbf{b},$$
$$\mathbf{x}^{(k+1)} \leftarrow F\left( \mathbf{x}^{(k+1)} \right).$$

R. Zdunek, Z. He, A. Cichocki, Proc. IEEE ISBI, 2008
Wiener filtered FOCUSS algorithm

The filtering step $F(\cdot)$ operates on a Markov Random Field (MRF) with first and second-order interactions. The neighborhood of the $j$-th pixel (where $h$ denotes the offset between adjacent image rows in the vectorized image) is

$$N_j = \left\{\, j-h-1,\; j-h,\; j-h+1,\; j-1,\; j,\; j+1,\; j+h-1,\; j+h,\; j+h+1 \,\right\}, \qquad L = 9,$$

$$\mu_j = \frac{1}{L} \sum_{n \in N_j} x_n^{(k+1)} \quad \text{– local mean around the } j\text{-th pixel},$$

$$\sigma_j^2 = \frac{1}{L-1} \sum_{n \in N_j} \left( x_n^{(k+1)} - \mu_j \right)^2 \quad \text{– local variance}.$$
Wiener filtered FOCUSS algorithm

• Update rule:
$$x_j^{(k+1)} \leftarrow \mu_j + \frac{\sigma_j^2 - \nu^2}{\sigma_j^2} \left( x_j^{(k+1)} - \mu_j \right),$$
• Mean noise variance:
$$\nu^2 = \frac{1}{N} \sum_{j=1}^{N} \sigma_j^2.$$
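The following is a minimal NumPy/SciPy sketch of one possible realization of the filtering step $F(\cdot)$, applied to the current iterate reshaped into an image. The function name, the boundary handling inherited from `uniform_filter`, and the clipping of negative gains are assumptions not specified on the slides.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wiener_step(x_img):
    """Wiener filtering step F(.) over the 3x3 MRF neighborhood (L = 9)."""
    L = 9
    mu = uniform_filter(x_img, size=3)              # local means mu_j
    # Local variances sigma_j^2 with the 1/(L-1) normalization:
    var = (uniform_filter(x_img**2, size=3) - mu**2) * L / (L - 1)
    nu2 = var.mean()                                # mean noise variance nu^2
    gain = (var - nu2) / np.maximum(var, 1e-12)     # (sigma^2 - nu^2) / sigma^2
    gain = np.maximum(gain, 0.0)  # assumption: clip negative gains, as in standard Wiener filtering
    return mu + gain * (x_img - mu)
```

In smooth regions ($\sigma_j^2 \approx \nu^2$) the gain is near zero and the pixel is replaced by its local mean, while around strong features ($\sigma_j^2 \gg \nu^2$) the pixel value is largely preserved.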
Limited-view tomographic imaging

$$\mathbf{A}^{(2\times 2)} = \begin{bmatrix} 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & 1 \\ \alpha & 0 & \alpha & 0 \\ 0 & \alpha & 0 & \alpha \end{bmatrix}, \qquad \alpha = \frac{2}{5}$$
$$N\left(\mathbf{A}^{(2\times 2)}\right) = \mathrm{span}\left\{ [1, -1, -1, 1]^T \right\}, \qquad \mathrm{rank}\left(\mathbf{A}^{(2\times 2)}\right) = 3 \quad \text{(rank-deficient)}$$

$$S_{LS}(\mathbf{A}; \mathbf{b}) = \left\{ \mathbf{x} \in \mathbb{R}^N : \|\mathbf{A}\mathbf{x} - \mathbf{b}\|_2 = \min! \right\}$$

$$S_{LS}(\mathbf{A}; \mathbf{b}) = \mathbf{x}_{LS} + N(\mathbf{A}), \quad \text{where} \quad \mathbf{x}_{LS} = P_{R(\mathbf{A}^T)}\, \mathbf{x}_{\text{exact}}.$$
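A quick numerical check of the rank deficiency and the nullspace follows; since neither depends on the exact value of $\alpha$ (any nonzero value gives the same rank and nullspace), the value used here is only a placeholder.

```python
import numpy as np

# Limited-view system matrix for a 2x2 image (4 pixels, 4 rays);
# alpha is the weight of the second pair of rays.
alpha = 0.4
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [alpha, 0, alpha, 0],
              [0, alpha, 0, alpha]], dtype=float)

print(np.linalg.matrix_rank(A))    # 3: rank-deficient
n = np.array([1., -1., -1., 1.])   # spans N(A)
print(np.allclose(A @ n, 0.0))     # True
```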
Tomographic imaging example
[Figures: phantom image; minimal $\ell_2$-norm least-squares solution (LS algorithms: ART, SIRT)]

Tomographic imaging (noise-free data)
[Figures: FOCUSS algorithm ($p = 1$, $k = 15$, $\lambda = 10^{-8}$); Wiener filtered FOCUSS algorithm ($p = 1$, $k = 15$, $\lambda = 10^{-8}$)]

Tomographic imaging (noisy data, SNR = 30 dB)
[Figures: FOCUSS algorithm ($p = 1$, $k = 15$, $\lambda = 40$); Wiener filtered FOCUSS algorithm ($p = 1$, $k = 15$, $\lambda = 40$)]

Tomographic imaging
[Plots: normalized RMSE, noise-free data; normalized RMSE, $p = 1$, noisy data]
M-FOCUSS

Let
$$\mathbf{A}\mathbf{X} + \mathbf{N} = \mathbf{B},$$
where $\mathbf{A} \in \mathbb{R}^{M \times N}$, $\mathbf{X} = [\mathbf{x}_1, \ldots, \mathbf{x}_T] \in \mathbb{R}^{N \times T}$, $\mathbf{B} \in \mathbb{R}^{M \times T}$, and $\mathbf{N} = [\mathbf{n}_1, \ldots, \mathbf{n}_T] \in \mathbb{R}^{M \times T}$ (additive noise), with $\mathrm{rank}(\mathbf{A}) = M$, $M \le N$ (under-determined), and $T < N$.

If $M < N$, the nullspace of $\mathbf{A}$ is non-trivial, so additional constraints are necessary to select the right solution. The M-FOCUSS algorithm assumes sparse solutions.

Theorem: Let $\mathrm{rank}(\mathbf{B}) = T$ with $T \le M$. The sparse solution to the consistent system $\mathbf{A}\mathbf{X} = \mathbf{B}$ is unique if any $M$ columns of $\mathbf{A}$ are linearly independent (the unique representation property (URP) condition) and, for each $t$, $\mathbf{x}_t$ has at most $\lceil (M + T)/2 \rceil - 1$ nonzero entries, where $\lceil \cdot \rceil$ is the ceiling function.

(Cotter, Rao, Engan, Kreutz-Delgado, 2005)
M-FOCUSS

The M-FOCUSS algorithm iteratively solves the following equality-constrained problem:
$$\min_{\mathbf{X}} J^{(p)}(\mathbf{X}), \qquad \text{s.t.} \quad \mathbf{A}\mathbf{X} = \mathbf{B},$$
where
$$J^{(p)}(\mathbf{X}) = \sum_{j=1}^{N} \left\| \overline{\mathbf{x}}_j \right\|_2^p = \sum_{j=1}^{N} \left( \sum_{t=1}^{T} x_{jt}^2 \right)^{p/2}$$
is the $\ell_p$ diversity measure for sparsity, $0 \le p \le 2$ is the degree of sparsity, and $\overline{\mathbf{x}}_j^T$ is the $j$-th row of $\mathbf{X}$. For $p = 0$ this yields the $\ell_0$-norm solution (an NP-hard problem); for $p = 2$ it yields the LS solution with the minimal $\ell_2$-norm.
M-FOCUSS

For inconsistent data, i.e. $\mathbf{B} \notin R(\mathbf{A})$, Cotter et al. developed the regularized M-FOCUSS algorithm, which solves the Tikhonov regularized least-squares problem in a single iterative step:
$$\mathbf{X}^{(k)} = \arg\min_{\mathbf{X}} \Psi\left(\mathbf{X}\,|\,\mathbf{X}^{(k-1)}\right), \quad \text{where} \quad \Psi\left(\mathbf{X}\,|\,\mathbf{X}^{(k-1)}\right) = \left\| \mathbf{B} - \mathbf{A}\mathbf{X} \right\|_F^2 + \lambda \left\| \mathbf{W}^{-1}\mathbf{X} \right\|_F^2,$$
$$\mathbf{W} = \mathrm{diag}\left( w_j^{1-p/2} \right), \quad \text{with} \quad w_j^{(k-1)} = \left\| \overline{\mathbf{x}}_j^{(k-1)} \right\|_2 = \left( \sum_{t=1}^{T} \left( x_{jt}^{(k-1)} \right)^2 \right)^{1/2}.$$
Regularized M-FOCUSS:

For $k = 1, 2, \ldots$
$$\mathbf{A}^{(k)} = \mathbf{A}\mathbf{W}^{(k)},$$
$$\mathbf{X}^{(k+1)} = \mathbf{W}^{(k)} \left( \mathbf{A}^{(k)} \right)^T \left( \mathbf{A}^{(k)} \left( \mathbf{A}^{(k)} \right)^T + \lambda \mathbf{I}_M \right)^{-1} \mathbf{B},$$

$\lambda > 0$ – regularization parameter.
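Below is a minimal NumPy sketch of this regularized M-FOCUSS iteration. The function name, the seeded random initialization, and the default parameter values are illustrative assumptions, not part of the original algorithm listing.

```python
import numpy as np

def m_focuss(A, B, p=1.0, lam=1e-8, n_iter=15, seed=0):
    """Regularized M-FOCUSS sketch for the MMV problem A X ~ B
    (after Cotter et al., 2005)."""
    M, N = A.shape
    T = B.shape[1]
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((N, T))         # initialization X^(0)
    for _ in range(n_iter):
        w = np.linalg.norm(X, axis=1)       # row norms w_j = ||x_j||_2
        W = w ** (1.0 - p / 2.0)            # diagonal of W^(k)
        Ak = A * W                          # A^(k) = A W^(k) (scales columns)
        G = Ak @ Ak.T + lam * np.eye(M)     # A^(k) (A^(k))^T + lam * I_M
        X = W[:, None] * (Ak.T @ np.linalg.solve(G, B))
    return X
```

The row-norm weighting couples the $T$ measurement vectors, so entire rows of $\mathbf{X}$ are driven to zero jointly, which is exactly the row-sparsity that the $\ell_p$ diversity measure above promotes.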