Richard Baraniuk
Rice University
dsp.rice.edu/cs
Compressive
Signal Processing
Compressive Sensing (CS)
• When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss
• Random projection will work
[figure: y = Φx — measurements y, sparse signal x, sparse in some basis]
[Candes-Romberg-Tao, Donoho, 2004]
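The acquisition step can be sketched numerically (a minimal sketch in NumPy; the sizes N, M, K are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8          # signal length, measurements, sparsity (illustrative)

# K-sparse signal in the canonical basis
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

# Random Gaussian projection Phi: M << N condensed measurements y = Phi x
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                    # 64 numbers acquired instead of 256
```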
CS Signal Recovery
• Reconstruction/decoding: given measurements y = Φx (ill-posed inverse problem), find the sparse signal x (K nonzero entries)
• L2 fast, wrong
Why L2 Doesn’t Work
Least squares: the minimum-L2 solution is almost never sparse
[figure: null space of Φ translated to x (random angle)]
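A quick numerical illustration of this slide's point (a sketch, assuming a Gaussian Φ): the pseudoinverse gives the minimum-L2 solution consistent with the measurements, and it smears energy over essentially every coordinate instead of concentrating on the K true nonzeros.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 128, 32, 4
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = 1.0
Phi = rng.standard_normal((M, N))
y = Phi @ x

# Minimum-L2 solution consistent with y: the pseudoinverse solution
x_l2 = np.linalg.pinv(Phi) @ y

# x has K = 4 nonzeros; x_l2 is dense (nonzero in essentially all N entries)
print(np.count_nonzero(x), int(np.sum(np.abs(x_l2) > 1e-10)))
```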
CS Signal Recovery
• Reconstruction/decoding: given y = Φx (ill-posed inverse problem), find the sparse signal x
• L2 fast, wrong
• L0 correct, slow: minimize the number of nonzero entries, i.e. find the sparsest potential solution; only M = K+1 measurements required to perfectly reconstruct a K-sparse signal [Bresler; Rice]
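The L0 claim can be checked by brute force on a toy problem (a sketch with illustrative sizes): with M = K+1 random measurements, exhaustively testing every size-K support recovers the signal, and the combinatorial search over all (N choose K) supports is exactly why L0 decoding is correct but slow.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
N, K = 16, 2
M = K + 1                      # only K+1 measurements

x = np.zeros(N)
support = tuple(int(i) for i in sorted(rng.choice(N, size=K, replace=False)))
x[list(support)] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N))
y = Phi @ x

# L0 "decoding": try every size-K support, keep the one that best explains y
best = None
for S in itertools.combinations(range(N), K):
    z, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None)
    r = np.linalg.norm(Phi[:, S] @ z - y)
    if best is None or r < best[0]:
        best = (r, S)

print(best[1], support)        # recovered support vs. true support
```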
CS Signal Recovery
• Reconstruction/decoding: given y = Φx (ill-posed inverse problem), find the sparse signal x
• L2 fast, wrong
• L0 correct, slow
• L1 correct, mild oversampling (a linear program) [Candes et al., Donoho]
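The L1 decoder is a linear program; here is a minimal sketch using SciPy (assumed available), with min ||x||_1 subject to Φx = y rewritten as an LP by splitting x = u − v with u, v ≥ 0:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
N, M, K = 64, 32, 3
x = np.zeros(N)
x[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N))
y = Phi @ x

# min ||x||_1  s.t.  Phi x = y, as an LP over [u; v] with x = u - v, u, v >= 0
c = np.ones(2 * N)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]

print(np.max(np.abs(x_hat - x)))   # near zero when recovery succeeds
```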
Why L1 Works
minimum-L1 solution = sparsest solution (with high probability) if M ≥ cK log(N/K) random measurements
Universality
• Gaussian white noise basis is incoherent with any fixed orthonormal basis (with high probability)
• Signal sparse in time domain: measure y = Φx directly
• Signal sparse in frequency domain: x = Ψα, and the product ΦΨ remains white Gaussian
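The last bullet can be sanity-checked numerically (a sketch; here Ψ is an arbitrary random orthonormal basis standing in for the Fourier basis): multiplying a white Gaussian matrix by an orthonormal Ψ leaves its statistics unchanged, which is why one random Φ serves every sparsity basis.

```python
import numpy as np

rng = np.random.default_rng(4)
M, N = 64, 256

Phi = rng.standard_normal((M, N))                    # white Gaussian measurements
Psi, _ = np.linalg.qr(rng.standard_normal((N, N)))   # random orthonormal basis

B = Phi @ Psi   # by rotation invariance of the Gaussian, still i.i.d. N(0, 1)
print(round(float(B.mean()), 3), round(float(B.std()), 3))  # approx 0 and 1
```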