An Introduction to Compressed Sensing
Student: Shenghan TSAI
Advisor: Hsuan-Jung Su and Pin-Hsun Lin
Date : May 02, 2014
Outline
• Introduction
• Signal: Sparse and Compressible
  - Sparse & Compressible
  - Power law
  - The p-norm in finite dimensions
• Sensing Matrices
  - NSP (Null space property)
  - RIP (Restricted isometry property)
• Sparse Signal Recovery
• Conclusion
Introduction
Compressed Sensing
• Compressed sensing is a signal processing technique for efficiently acquiring and reconstructing a signal by finding solutions to underdetermined linear systems. It takes advantage of the signal's sparsity or compressibility in some domain, allowing the entire signal to be determined from relatively few measurements.
History
• “If we sample a signal at twice its highest frequency, then we can recover it exactly.” (the Whittaker-Nyquist-Kotelnikov-Shannon sampling theorem)
• In 2004, Emmanuel Candès, Terence Tao, and David Donoho proved that, given knowledge of a signal's sparsity, the signal can be reconstructed from far fewer measurements than classical sampling requires.
Motivation
How does it work?
• y = Φx, where Φ ∈ R^{M×N} (M << N) is the sensing matrix.
• If the support Λ0 of x is known, x can be recovered by least squares:
  x̂_{Λ0} = (Φ_{Λ0}^T Φ_{Λ0})^{-1} Φ_{Λ0}^T y
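The measurement model above can be sketched numerically; the dimensions, the Gaussian choice of Φ, and the scaling are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 256, 64, 5          # ambient dimension, measurements, sparsity

# A K-sparse signal: K nonzero entries at random positions.
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# A random Gaussian sensing matrix, scaled so columns have unit
# expected norm (one common construction).
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# Compressive measurement: M numbers describe an N-dimensional signal.
y = Phi @ x
print(y.shape)   # (64,)
```

The point of the sketch is only the shape of the problem: y has M = 64 entries while x has N = 256, so recovering x from y is an underdetermined linear system.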
Deterministic compressive sensing
• Signal: Sparse and Compressible
• Sensing Matrices: NSP and RIP
Sparse & Compressible
• Sparse model: signals of interest are often sparse or compressible, i.e., they have very few large coefficients and many coefficients close to zero.
• Sparse signals have few non-zero coefficients: a K-sparse signal has at most K nonzeros, ||x||_0 ≤ K, and the set of all K-sparse signals is Σ_K = {x : ||x||_0 ≤ K}.
• Compressible signals have few significant coefficients; their sorted coefficient magnitudes decay as a power law.
Power law
• A signal x is compressible if its sorted coefficient magnitudes decay rapidly.
• The signal should obey a power-law decay: |x_s| ≤ C s^{-q}, s = 1, 2, ..., where x_s is the s-th largest coefficient magnitude. The larger q (the faster the decay), the more compressible the signal.
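A small numerical illustration of the power-law bound (the constants C, q and the signal construction are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)

# A synthetic compressible signal with |x_s| <= C * s^(-q).
C, q, N = 1.0, 2.0, 1000
s = np.arange(1, N + 1)
x = C * s**(-q) * rng.choice([-1.0, 1.0], size=N)
rng.shuffle(x)   # the ordering of coefficients does not matter

# Sort magnitudes in decreasing order and verify the power-law bound.
mags = np.sort(np.abs(x))[::-1]
assert np.all(mags <= C * s**(-q) + 1e-12)

# Fast decay concentrates the energy: here the 10 largest entries
# carry over 99% of the signal's l2 energy.
energy_top10 = np.sum(mags[:10]**2) / np.sum(mags**2)
print(energy_top10)
```

This energy concentration is exactly why a compressible signal is well approximated by keeping only its few largest coefficients.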
The p-norm in finite dimensions
• The L_p norm: ||x||_p = (|x_1|^p + |x_2|^p + ... + |x_n|^p)^{1/p}
• Ex: p = 2: ||x||_2 = √⟨x, x⟩
• Ex: p = 0: ||x||_0 = |supp(x)| = |{i : x_i ≠ 0}|
• Ex: p = ∞: ||x||_∞ = max{|x_1|, |x_2|, ..., |x_n|}
The p-norm in finite dimensions
• The grid distance between two points is never shorter than the length of the line segment between them. Formally, this means that the Euclidean norm of any vector is bounded by its 1-norm: ||x||_2 ≤ ||x||_1.
• More generally, for p ≤ a we have ||x||_a ≤ ||x||_p; the slides invoke the Cauchy–Schwarz inequality here.
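The definitions and the norm inequality above can be checked directly; the example vector is an arbitrary choice:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0, 1.0])

# General p-norm: (|x_1|^p + ... + |x_n|^p)^(1/p), for p >= 1.
def p_norm(x, p):
    return np.sum(np.abs(x)**p)**(1.0 / p)

l1 = p_norm(x, 1)             # 3 + 4 + 0 + 1 = 8
l2 = p_norm(x, 2)             # sqrt(9 + 16 + 0 + 1) = sqrt(26)
l0 = np.count_nonzero(x)      # "0-norm": size of the support = 3
linf = np.max(np.abs(x))      # sup norm = 4

# Norms are ordered: p <= a implies ||x||_a <= ||x||_p,
# so in particular ||x||_inf <= ||x||_2 <= ||x||_1.
assert linf <= l2 <= l1
```

Note that ||x||_0 is not a true norm (it is not homogeneous), which is why it is computed separately as a support count.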
Sensing Matrices
• NSP (Null space property)
• RIP (Restricted isometry property)
Sensing Matrices
Sensing Matrices
• y = Φx, where Φ ∈ R^{M×N} is the sensing matrix with M << N.
How to design Sensing Matrices
• If we are sure our data is sparse or compressible, then we want to design Φ with M << N such that x can still be recovered.
• To ensure that Φ allows recovery, two properties, the NSP and the RIP, must hold; from them we also obtain a lower bound on the number of measurements.
The Null space property
• If we want to recover all K-sparse signals, we require Φx1 ≠ Φx2 for all K-sparse x1 ≠ x2.
• Therefore Φ must have at least 2K rows; otherwise there exist K-sparse x1 ≠ x2 such that Φ(x1 − x2) = 0.
• The spark of Φ expresses almost the same requirement: unique K-sparse recovery needs spark(Φ) > 2K.
• Measurement bound: M ≥ 2K.
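The spark condition above can be verified by brute force on a toy matrix (the `spark` helper and the tiny dimensions are illustrative; exhaustive subset checking is only feasible for very small N):

```python
import numpy as np
from itertools import combinations

def spark(Phi, tol=1e-10):
    """Smallest number of columns of Phi that are linearly dependent.
    Brute force over all column subsets -- only viable for tiny N."""
    M, N = Phi.shape
    for size in range(1, N + 1):
        for cols in combinations(range(N), size):
            if np.linalg.matrix_rank(Phi[:, cols], tol=tol) < size:
                return size
    return N + 1  # every subset of columns is independent

rng = np.random.default_rng(2)
M, N, K = 4, 6, 2
Phi = rng.standard_normal((M, N))

# A generic M x N Gaussian matrix has spark M + 1 (any M columns are
# independent, any M + 1 are forced to be dependent). With M = 2K = 4,
# spark(Phi) = 5 > 2K, so all K-sparse signals are distinguishable.
print(spark(Phi))
```

This also shows why M ≥ 2K is the bound: spark(Φ) ≤ M + 1 always, so spark(Φ) > 2K can only hold when M ≥ 2K.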
The Null space property
• Φ : R^N → R^M satisfies the null space property (NSP) of order K if there exists a constant C > 0 such that
  ||h_Λ||_2 ≤ C ||h_{Λ^c}||_1 / √K
  holds for all h ∈ N(Φ) and for all index sets Λ with |Λ| ≤ K.
• The NSP guarantees recovery up to the best K-term approximation error:
  ||Δ(Φx) − x||_2 ≤ C σ_K(x)_1 / √K,
  where σ_K(x)_p = min_{x̂ ∈ Σ_K} ||x − x̂||_p.
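The best K-term approximation error σ_K(x)_p used above has a closed-form minimizer, keeping the K largest-magnitude entries; a small sketch (the example vectors are arbitrary):

```python
import numpy as np

def sigma_K(x, K, p=1):
    """Best K-term approximation error: min over K-sparse xhat of
    ||x - xhat||_p. The minimizer keeps the K largest-magnitude entries."""
    idx = np.argsort(np.abs(x))[::-1][:K]
    xhat = np.zeros_like(x)
    xhat[idx] = x[idx]
    return np.sum(np.abs(x - xhat)**p)**(1.0 / p)

x = np.array([5.0, -0.1, 2.0, 0.05, -0.02])

# Keeping the 2 largest entries (5 and 2) leaves the small residual
# 0.1 + 0.05 + 0.02 in the l1 error.
print(sigma_K(x, K=2, p=1))   # ~0.17

# If x is itself K-sparse, the error is exactly zero.
assert sigma_K(np.array([1.0, 0.0, 3.0]), K=2) == 0.0
```

This is why the NSP error bound vanishes for exactly K-sparse signals: σ_K(x)_1 = 0 forces perfect recovery.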
The restricted isometry property
• A matrix Φ satisfies the restricted isometry property (RIP) of order K if there exists a δ_K ∈ (0, 1) such that
  (1 − δ_K) ||x||_2^2 ≤ ||Φx||_2^2 ≤ (1 + δ_K) ||x||_2^2
  for all x ∈ Σ_K.
• This makes sure the measurement Φx is stable.
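Computing the exact RIP constant δ_K is NP-hard, but it can be probed empirically; this sketch samples random K-sparse unit vectors to get a lower-bound estimate of δ_K (the dimensions, scaling, and sample count are assumptions for the demo):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N, K = 64, 256, 4
Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # E||Phi x||^2 = ||x||^2

# For each random K-sparse unit vector, ||Phi x||^2 should lie in
# [1 - delta_K, 1 + delta_K]; track the worst observed deviation.
worst = 0.0
for _ in range(2000):
    support = rng.choice(N, size=K, replace=False)
    x = np.zeros(N)
    x[support] = rng.standard_normal(K)
    x /= np.linalg.norm(x)
    worst = max(worst, abs(np.linalg.norm(Phi @ x)**2 - 1.0))

print(worst)  # empirical lower bound on delta_K
```

Note that random sampling only bounds δ_K from below; certifying an upper bound would require checking every K-dimensional coordinate subspace.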
The RIP and NSP (relationship between RIP & NSP)
• Suppose that Φ satisfies the RIP of order 2K with δ_{2K} < √2 − 1. Then Φ satisfies the NSP of order 2K with constant
  C = √2 δ_{2K} / (1 − (1 + √2) δ_{2K}).
• Key lemma: suppose that Φ satisfies the RIP of order 2K. Let Λ0 ⊂ {1, 2, ..., N} with |Λ0| ≤ K, let Λ1 be the index set of the K largest-magnitude entries of h_{Λ0^c}, and set Λ = Λ0 ∪ Λ1. Then
  ||h_Λ||_2 ≤ α ||h_{Λ0^c}||_1 / √K + β |⟨Φh_Λ, Φh⟩| / ||h_Λ||_2,
  where α = √2 δ_{2K} / (1 − δ_{2K}) and β = 1 / (1 − δ_{2K}).
Sparse Signal Recovery
• NSP (Null space property)
• RIP (Restricted isometry property)
Sparse Signal Recovery
• Recover via ℓ1 minimization: x̂ = argmin_z ||z||_1 subject to z ∈ B(y).
• Noiseless: B(y) = {z : Φz = y}.
• In noise: B(y) = {z : ||Φz − y||_2 ≤ ε}.
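The noiseless ℓ1 program (basis pursuit) can be cast as a linear program by splitting |z_i| into an auxiliary variable t_i with −t ≤ z ≤ t; a sketch using scipy's `linprog` (dimensions and the Gaussian Φ are illustrative assumptions):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
N, M, K = 40, 20, 3

x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

# min ||z||_1  s.t.  Phi z = y, as an LP over variables (z, t):
# minimize sum(t) subject to z - t <= 0, -z - t <= 0, Phi z = y.
c = np.concatenate([np.zeros(N), np.ones(N)])
A_eq = np.hstack([Phi, np.zeros((M, N))])
I = np.eye(N)
A_ub = np.vstack([np.hstack([I, -I]),     #  z - t <= 0
                  np.hstack([-I, -I])])   # -z - t <= 0
b_ub = np.zeros(2 * N)
bounds = [(None, None)] * N + [(0, None)] * N   # z free, t >= 0

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y, bounds=bounds)
z = res.x[:N]

# z is feasible (Phi z = y up to solver tolerance), and its 1-norm cannot
# exceed ||x||_1, since x itself is feasible for the program.
print(np.linalg.norm(Phi @ z - y), np.sum(np.abs(z)))
```

With enough measurements relative to K, the LP solution typically coincides with the true sparse x, which is what the RIP/NSP theory guarantees.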
Cont.
• Suppose Φ satisfies the RIP of order 2K with δ_{2K} < √2 − 1. When B(y) = {z : Φz = y} and y = Φx, then with h = x̂ − x,
  ||x̂ − x||_2 = ||h||_2 ≤ C0 σ_K(x)_1 / √K,
  where C0 = 2 (1 − (1 − √2) δ_{2K}) / (1 − (1 + √2) δ_{2K}).
• In particular, if x ∈ Σ_K then σ_K(x)_1 = 0 and the recovery is exact: x̂ = x.
Recovery in noise
• Suppose Φ satisfies the RIP of order 2K with δ_{2K} < √2 − 1. Let y = Φx + e with ||e||_2 ≤ ε, and B(y) = {z : ||Φz − y||_2 ≤ ε}. Then with h = x̂ − x,
  ||x̂ − x||_2 ≤ C0 σ_K(x)_1 / √K + C2 ε,
  where
  C0 = 2 (1 − (1 − √2) δ_{2K}) / (1 − (1 + √2) δ_{2K}),
  C2 = 4 √(1 + δ_{2K}) / (1 − (1 + √2) δ_{2K}).
• The proof again uses the lemma: for Λ0 ⊂ {1, 2, ..., N} with |Λ0| ≤ K and Λ = Λ0 ∪ Λ1,
  ||h_Λ||_2 ≤ α ||h_{Λ0^c}||_1 / √K + β |⟨Φh_Λ, Φh⟩| / ||h_Λ||_2.
• If the support Λ0 of x is known (oracle recovery), least squares gives
  x̂_{Λ0} = (Φ_{Λ0}^T Φ_{Λ0})^{-1} Φ_{Λ0}^T y,  x̂_{Λ0^c} = 0.
• With y = Φ_{Λ0} x_{Λ0} + e,
  x̂_{Λ0} = (Φ_{Λ0}^T Φ_{Λ0})^{-1} Φ_{Λ0}^T (Φ_{Λ0} x_{Λ0} + e) = x_{Λ0} + (Φ_{Λ0}^T Φ_{Λ0})^{-1} Φ_{Λ0}^T e.
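A numerical sketch of this oracle least-squares recovery (dimensions and noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
N, M, K = 128, 32, 4

support = np.sort(rng.choice(N, size=K, replace=False))  # the oracle's Lambda_0
x = np.zeros(N)
x[support] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
e = 0.01 * rng.standard_normal(M)          # measurement noise
y = Phi @ x + e

# Solve the least-squares problem restricted to the known support,
# xhat_{Lambda0} = argmin || Phi_{Lambda0} v - y ||_2,
# and set everything off the support to zero.
Phi_sub = Phi[:, support]
sol, _, _, _ = np.linalg.lstsq(Phi_sub, y, rcond=None)
xhat = np.zeros(N)
xhat[support] = sol

# The error x_{Lambda0} - xhat_{Lambda0} is the pseudo-inverse applied
# to the noise e, so it scales with the noise level.
print(np.linalg.norm(xhat - x))
```

The oracle estimator serves as a benchmark: practical recovery algorithms must first find the support, which the oracle is simply handed.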
Conclusion
• If signals are sparse or compressible, then CS can be used to compress them while sensing.
• Signals can be perfectly recovered if Φ satisfies the NSP & RIP.
References
1. E. Candès, “The restricted isometry property and its implications for compressed sensing,” Comptes rendus de l'Académie des Sciences, Série I, vol. 346, no. 9-10, pp. 589–592, 2008.
2. G. Yu and G. Sapiro, “Statistical Compressed Sensing of Gaussian Mixture Models,” IEEE Trans. Signal Processing, vol. 59, no. 12, pp. 5842–5857, Dec. 2011.
3. R. Baraniuk, M. A. Davenport, M. F. Duarte, and C. Hegde, An Introduction to Compressive Sensing, CONNEXIONS, Rice University, Houston, Texas, 2010.
4. R. G. Baraniuk, “Compressive sensing,” IEEE Signal Processing Mag., vol. 24, no. 4, pp. 118–120, 124, 2007.
5. http://en.wikipedia.org/wiki/Lp_space
6. http://en.wikipedia.org/wiki/Compressed_sensing