
Fast Compressive Sampling Using Structurally Random Matrices

Presented by: Thong Do (thongdo@jhu.edu), The Johns Hopkins University

A joint work with:

Prof. Trac Tran, The Johns Hopkins University

Dr. Lu Gan, Brunel University, UK

Compressive Sampling Framework

• Main assumption: a K-sparse representation of the input signal:

  x = Ψα  (x: N×1, Ψ: N×N, α: N×1)

  Ψ: sparsifying transform; α: transform coefficients, with K nonzero entries.

• Compressive sampling:

  y = Φx  (y: M×1, Φ: M×N)

  Φ (the sensing matrix): a random matrix, a random row subset of an orthogonal matrix (partial Fourier), etc.

• Reconstruction from the compressed measurements y: L1-minimization (Basis Pursuit):

  α̂ = argmin_α ‖α‖₁  s.t.  y = ΦΨα,  then  x̂ = Ψα̂
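The sampling model above can be sketched in a few lines of NumPy. The sizes (N = 256, M = 64, K = 8), the Gaussian Φ, and the identity Ψ are illustrative assumptions, not the settings used later in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 8

# K-sparse transform coefficients alpha (exactly K nonzero entries)
alpha = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
alpha[support] = rng.standard_normal(K)

# x = Psi alpha; Psi is taken as the identity here for simplicity
Psi = np.eye(N)
x = Psi @ alpha

# y = Phi x with a Gaussian random sensing matrix Phi (M x N)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

assert y.shape == (M,) and np.count_nonzero(alpha) == K
```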

A Wish-list of the Sampling Operator

• Optimal performance: requires the minimal number of compressed measurements
• Universality: incoherent with various families of signals
• Practicality: fast computation, memory efficiency, hardware friendliness, streaming capability

Current Sensing Matrices

• Random matrices [Candès, Tao, Donoho]
  – Optimal performance
  – Huge memory and computational complexity; not appropriate in large-scale applications
• Partial Fourier [Candès et al.]
  – Fast computation
  – Non-universality: only incoherent with signals sparse in time; not incoherent with smooth signals such as natural images
• Other methods (Scrambled FFT, Random Filters, …)
  – Either lack universality or have no theoretical guarantee

Motivation

• Significant performance improvement of the scrambled FFT over partial Fourier: a well-known fact [Baraniuk, Candès], but with no theoretical justification.
• Partial Fourier pipeline: original 512×512 Lena image → FFT → random downsampler → compressed measurements → reconstruction (Basis Pursuit).
• Reconstruction from 25% of measurements: 16.5 dB

Motivation

• Scrambled FFT pipeline: original 512×512 Lena image → scrambled FFT → random downsampler → compressed measurements → reconstruction (Basis Pursuit).
• Reconstruction from 25% of measurements: 29.4 dB

Our Contributions

• Propose the concept of structurally random ensembles, an extension of the Scrambled Fourier Ensemble
• Provide theoretical guarantees for this novel sensing framework
• Design sensing ensembles with practical features: fast computation, memory efficiency, hardware friendliness, streaming capability, etc.

Proposed CS System

• Pre-randomizer:
  – Global randomizer: random permutation of sample indices
  – Local randomizer: random sign reversal of sample values
• Pipeline: input signal → pre-randomizer → fast transform (FFT, WHT, DCT, …) → random downsampler → compressed measurements → reconstruction (signal recovery by Basis Pursuit)

Proposed CS System

• Compressive sampling: y = D(T(P(x))) = A(x)
  – Pre-randomize the input signal (P)
  – Apply a fast transform (T)
  – Pick a random subset of the transform coefficients (D)
• Reconstruction: Basis Pursuit with the sensing operator and its adjoint:

  A(·) = D(T(P(Ψ(·))))    A*(·) = Ψ*(P*(T*(D*(·))))

  where x = Ψα.
• Pipeline: input signal → pre-randomizer P → fast transform T (FFT, WHT, DCT, …) → random downsampler D → compressed measurements → reconstruction (Basis Pursuit)
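A minimal sketch of the forward operator y = D(T(P(x))) and its adjoint, assuming a local randomizer for P, a Walsh-Hadamard matrix for T, Ψ = identity, and a √(N/M) scaling; all of these choices are illustrative, not the talk's exact settings.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 64, 16

# P: local randomizer (random sign flips)
signs = rng.choice([-1.0, 1.0], N)

# T: orthonormal Walsh-Hadamard matrix (Sylvester construction)
T = np.array([[1.0]])
while T.shape[0] < N:
    T = np.block([[T, T], [T, -T]])
T /= np.sqrt(N)

# D: random downsampler keeping M of the N transform coefficients
rows = rng.choice(N, size=M, replace=False)

def A(x):
    """Forward operator: y = D(T(P(x)))."""
    return np.sqrt(N / M) * (T @ (signs * x))[rows]

def A_adj(y):
    """Adjoint: scatter (D*), inverse transform (T*), un-flip (P*)."""
    z = np.zeros(N)
    z[rows] = np.sqrt(N / M) * y
    return signs * (T.T @ z)

x = rng.standard_normal(N)
y = A(x)
# <A x, y> == <x, A* y> verifies the adjoint pairing Basis Pursuit relies on
assert np.isclose(np.dot(A(x), y), np.dot(x, A_adj(y)))
```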

Proposed CS System

• The cascade of the three operators in the diagram — pre-randomizer (P), fast transform (T), and random downsampler (D) — is the structurally random matrix.

Structurally Random Matrices

• Structurally random matrix with local randomizer: a product of three matrices
  – Random downsampler: diagonal, Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
  – Fast transform: FFT, WHT, DCT, …
  – Local randomizer: diagonal, Pr(d′_ii = ±1) = 1/2

Structurally Random Matrices

• Structurally random matrix with global randomizer: a product of three matrices
  – Random downsampler: diagonal, Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
  – Fast transform: FFT, WHT, DCT, …
  – Global randomizer: a uniformly random permutation matrix
• With the FFT as the fast transform, the downsampler-plus-transform pair is the familiar partial Fourier ensemble.
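The product-of-three-matrices structure can be written out explicitly. The sizes, the WHT as the fast transform, and the √(N/M) row normalization are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 32, 8

# Fast transform F: orthonormal Walsh-Hadamard matrix (Sylvester construction)
F = np.array([[1.0]])
while F.shape[0] < N:
    F = np.block([[F, F], [F, -F]])
F /= np.sqrt(N)

# Global randomizer P: uniformly random permutation matrix
P = np.eye(N)[rng.permutation(N)]

# Random downsampler D: keeps M of the N rows
D = np.eye(N)[rng.choice(N, size=M, replace=False)]

# The structurally random matrix as an explicit M x N product
Phi = np.sqrt(N / M) * D @ F @ P
assert Phi.shape == (M, N)
# Rows stay orthogonal because D, F, P all have orthonormal rows
assert np.allclose(Phi @ Phi.T, (N / M) * np.eye(M))
```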

Sparse Structurally Random Matrices

• With local randomizer: a product of three matrices
  – Random downsampler: Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
  – Block-diagonal fast transform: WHT, DCT, FFT, etc.
  – Local randomizer: Pr(d′_ii = ±1) = 1/2
• Features: fast computation, memory efficiency, hardware friendliness, streaming capability

Sparse Structurally Random Matrices

• With global randomizer: a product of three matrices
  – Random downsampler: Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
  – Block-diagonal fast transform: WHT, DCT, FFT, etc.
  – Global randomizer: a uniformly random permutation matrix
• Features: fast computation, memory efficiency, hardware friendliness, nearly streaming capability
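A sketch of the sparse variant, assuming a block-diagonal WHT with block size B = 8 and a global randomizer; the number of nonzeros per row equals the block size, which is what makes these matrices memory efficient.

```python
import numpy as np

rng = np.random.default_rng(3)
N, B, M = 64, 8, 16   # signal length, block size, number of measurements

# B x B orthonormal WHT block (Sylvester construction)
Hb = np.array([[1.0]])
while Hb.shape[0] < B:
    Hb = np.block([[Hb, Hb], [Hb, -Hb]])
Hb /= np.sqrt(B)

# Block-diagonal fast transform: N/B copies of the B x B block
F = np.kron(np.eye(N // B), Hb)

# Global randomizer (column permutation) and random downsampler (row subset)
perm = rng.permutation(N)
rows = rng.choice(N, size=M, replace=False)
Phi = np.sqrt(N / M) * F[rows][:, perm]

# Each row has only B nonzero entries: a fraction B/N (1/8 here) of nonzeros
assert np.count_nonzero(Phi) == M * B
```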

Theoretical Analysis

• Theorem 1: Assume the maximum absolute entries of a structurally random matrix Φ (M×N) and an orthonormal matrix Ψ (N×N) are suitably bounded. With high probability, the coherence of Φ and Ψ is not larger than

  O(√(log N / s))

  – s: the average number of nonzero entries per row of Φ
• Proof: Bernstein's concentration inequality for sums of independent random variables.
• For comparison, the optimal coherence (Gaussian/Bernoulli random matrices) is O(√(log N / N)).
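A quick empirical check of the dense case (s = N) of Theorem 1, assuming a WHT-based SRM core with a local randomizer and an orthonormal DCT-II basis for Ψ; the constant 4 in the final bound is an arbitrary safety factor for this sketch, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 256

# Orthonormal DCT-II matrix as the sparsifying basis Psi
k = np.arange(N)
Psi = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
Psi[0, :] /= np.sqrt(2.0)

# Dense SRM core F R: WHT after random sign flips (s = N nonzeros per row)
F = np.array([[1.0]])
while F.shape[0] < N:
    F = np.block([[F, F], [F, -F]])
F /= np.sqrt(N)
R = np.diag(rng.choice([-1.0, 1.0], N))

# Largest inner product between a randomized-transform row and a basis column
mu = np.abs(F @ R @ Psi).max()
# Theorem 1 predicts O(sqrt(log N / N)) in this dense case
assert mu <= 4 * np.sqrt(np.log(N) / N)
```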

Theoretical Analysis

• Theorem 2: Under the previous assumption, sampling a signal with a structurally random matrix guarantees exact reconstruction (by Basis Pursuit) with high probability, provided that

  M ~ O((KN/s) log² N)

  – s: average number of nonzero entries per row of the sampling matrix
  – N: length of the signal
  – K: sparsity of the signal
• Proof: follows the proof framework of [Candès2007] together with the coherence theorem above.
• For comparison, the optimal number of measurements required by Gaussian/Bernoulli dense random matrices is O(K log N).

[Candès2007] E. Candès and J. Romberg, "Sparsity and incoherence in compressive sampling", Inverse Problems, 23(3), pp. 969–985, 2007.

Simulation Results: Sparse 1D Signals

• Input signal sparse in the DCT domain: N = 256, K = 30
• Reconstruction: Orthogonal Matching Pursuit (OMP)
• Sensing matrices:
  – WHT256 + global randomizer: random permutation of sample indices + 256×256 Walsh-Hadamard transform
  – WHT256 + local randomizer: random sign reversal of sample values + 256×256 Walsh-Hadamard transform
  – WHT8 + global randomizer: random permutation of sample indices + 8×8 block-diagonal Walsh-Hadamard transform; fraction of nonzero entries 1/32, i.e., 32 times sparser than the scrambled FFT, an i.i.d. Gaussian ensemble, …
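A toy version of the 1D experiment with a plain OMP implementation. The Gaussian sensing matrix, the identity sparsifying basis, and the smaller sparsity (K = 5) are stand-ins chosen so the sketch stays short and reliably recovers the signal; the talk uses SRMs, the DCT basis, and K = 30.

```python
import numpy as np

rng = np.random.default_rng(4)
N, M, K = 256, 128, 5

# K-sparse coefficient vector (identity basis assumed for simplicity)
alpha = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
alpha[support] = rng.standard_normal(K)

# Gaussian sensing matrix stands in for the SRMs of the experiment
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ alpha

def omp(Phi, y, K):
    """Orthogonal Matching Pursuit: greedily select K atoms,
    re-fitting all selected coefficients by least squares each step."""
    residual, idx = y.copy(), []
    for _ in range(K):
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    out = np.zeros(Phi.shape[1])
    out[idx] = coef
    return out

alpha_hat = omp(Phi, y, K)
assert np.allclose(alpha_hat, alpha, atol=1e-8)
```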

Simulation Results: Compressible 2D Signals

• Experiment set-up:
  – Test images: 512×512 Lena and Boat
  – Sparsifying transform Ψ: Daubechies 9/7 wavelet transform
  – L1-minimization solver: GPSR [Mário, Robert, Stephen]
  – Structurally random sensing matrices Φ:
    • WHT512 + local randomizer: random sign reversal of sample values + 512×512 block-diagonal Hadamard transform; full streaming capability
    • WHT32 + global randomizer: random permutation of sample indices + 32×32 block-diagonal Hadamard transform; highly sparse: the fraction of nonzero entries is only 1/2^13

Rate-Distortion Performance: Lena

• Benchmark: partial FFT in the wavelet domain: transform the image into wavelet coefficients, then sense these coefficients (rather than the image pixels directly) with a partial FFT.
  – No universality and more computational complexity; serves as a benchmark.
• WHT512 + local randomizer: full streaming capability.
• WHT32 + global randomizer: 8000 times sparser than the scrambled FFT.

[Figure: R-D performance of 512×512 Lena]

Reconstructed Images: Lena

• Reconstruction from 25% of measurements using GPSR:
  – Partial FFT: 16 dB
  – Partial FFT in wavelet domain: 30.1 dB
  – Scrambled FFT: 29.3 dB
  – WHT512 + local randomizer: 28.4 dB
  – WHT32 + global randomizer: 29 dB

[Figure: original 512×512 Lena image and the reconstructions]

Rate-Distortion Performance: Boat

[Figure: R-D performance of the 512×512 Boat image]

Future Research

• Develop theoretical analysis of structurally random matrices with greedy, iterative reconstruction algorithms such as OMP
• Apply structurally random matrices to high-dimensionality reduction: a fast Johnson-Lindenstrauss transform using structurally random matrices
• Move closer to deterministic compressive sampling:
  – Replace the random downsampler by a deterministic downsampler or a deterministic lattice of measurements*
  – Develop theoretical analysis for this nearly deterministic framework

* Basarab Matei and Yves Meyer, "A variant of the compressed sensing of Emmanuel Candès", Preprint, 2008.

Conclusions

• Structurally random matrices are:
  – Fast computable
  – Memory efficient: the highly sparse solution is random permutation ⇒ fast block-diagonal transform ⇒ random sampling
  – Streaming capable: the solution with full streaming capability is random sign flipping ⇒ fast block-diagonal transform ⇒ random sampling
  – Hardware friendly
  – Strong performers: nearly optimal theoretical bounds, and comparable with completely random matrices in numerical simulations

References

• E. Candès and J. Romberg, "Sparsity and incoherence in compressive sampling", Inverse Problems, 23(3), pp. 969–985, 2007.
• E. Candès, J. Romberg, and T. Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information", IEEE Trans. on Information Theory, vol. 52, pp. 489–509, Feb. 2006.
• E. Candès, J. Romberg, and T. Tao, "Stable signal recovery from incomplete and inaccurate measurements", Communications on Pure and Applied Mathematics, vol. 59, pp. 1207–1223, Aug. 2006.
• M. F. Duarte, M. B. Wakin, and R. G. Baraniuk, "Fast reconstruction of piecewise smooth signals from incoherent projections", SPARS'05, Rennes, France, Nov. 2005.
• Basarab Matei and Yves Meyer, "A variant of the compressed sensing of Emmanuel Candès", Preprint, 2008.
• Mário A. T. Figueiredo, Robert D. Nowak, and Stephen J. Wright, "Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems", IEEE Journal of Selected Topics in Signal Processing: Special Issue on Convex Optimization Methods for Signal Processing, 1(4), pp. 586–598, 2007.

• Code: thanglong.ece.jhu.edu/CS/fast_cs_SRM.rar