7/27/2019 Apm598 Cs Intro
http://slidepdf.com/reader/full/apm598-cs-intro 1/54
AN INTRODUCTION TO COMPRESSIVE SENSING
Rodrigo B. Platte
School of Mathematical and Statistical Sciences
APM/EEE598 Reverse Engineering of Complex Dynamical Networks
INTRODUCTION INCOHERENCE RIP POLYNOMIAL MATRICES DYNAMICAL SYSTEMS
OUTLINE
1 INTRODUCTION
2 INCOHERENCE
3 RIP
4 POLYNOMIAL MATRICES
5 DYNAMICAL SYSTEMS
AN INTRODUCTION TO COMPRESSIVE SENSING R. PLATTE MATHEMATICS AND STATISTICS 2 / 37
THE RICE DSP WEBSITE
Resources for papers, code, and more:
http://www.dsp.ece.rice.edu/cs/
References:
Emmanuel Candès, "Compressive sampling." Proc. International Congress of Mathematicians, 3, pp. 1433-1452, Madrid, Spain, 2006.
Richard Baraniuk, "A Lecture on Compressive Sensing." IEEE Signal Processing Magazine, July 2007.
Emmanuel Candès and Michael Wakin, "An introduction to compressive sampling." IEEE Signal Processing Magazine, 25(2), pp. 21-30, March 2008.
m-files and some links are available on the course page.
UNDERDETERMINED SYSTEMS
Solve

Ax = b,

where A is m × N and m < N.
In CS we want to obtain sparse solutions, i.e., x_j ≈ 0 for most indices j.
One option: minimize ||x||_1 subject to Ax = b, where

||x||_p = (|x_1|^p + |x_2|^p + · · · + |x_N|^p)^{1/p}

Why p = 1?

Remark: the location of the nonzero x_j's is not known in advance.
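The slides solve this minimization with CVX in MATLAB; as a sketch, the same ℓ1 problem can be posed as an ordinary linear program in Python via the standard splitting x = u − v with u, v ≥ 0 (the problem sizes here are illustrative, not from the slides).

```python
# Basis pursuit: min ||x||_1 subject to Ax = b, written as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, N = 25, 60
A = rng.standard_normal((m, N))          # random underdetermined system
x_true = np.zeros(N)
x_true[rng.choice(N, 3, replace=False)] = rng.standard_normal(3)
b = A @ x_true

# Variables [u; v]: minimize 1^T (u + v) subject to A(u - v) = b, u, v >= 0.
res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=b, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]
print(np.linalg.norm(x_hat - x_true))    # small when the sparse vector is recovered
```

The splitting doubles the number of variables but keeps the problem linear, which is why ℓ1 (and not ℓ0 or ℓ1/2) is computationally attractive.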
WHY ℓ1? Unit balls for p = 0, 1/2, 1, 2, 4, ∞:

||x||_p = (|x_1|^p + · · · + |x_N|^p)^{1/p},   or, for 0 ≤ p < 1,   ||x||_p = |x_1|^p + · · · + |x_N|^p

||x||_0 = # of nonzero entries in x: ideal (?) but leads to an NP-complete problem.

ℓ_p with p < 1 is not a norm (the triangle inequality fails). Also not practical.

ℓ2 is computationally easy but does not lead to sparse solutions. The unique solution of minimum ℓ2 norm is given by the pseudo-inverse:

x = A^T (A A^T)^{-1} b
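A quick numerical check, as a sketch with an arbitrary random system: the formula above agrees with NumPy's pseudo-inverse, and the resulting solution is generically fully dense, which is exactly why minimum ℓ2 is the wrong criterion for sparsity.

```python
# Minimum ell-2 norm solution of an underdetermined system:
# x = A^T (A A^T)^{-1} b coincides with the pseudo-inverse solution.
import numpy as np

rng = np.random.default_rng(1)
m, N = 10, 40
A = rng.standard_normal((m, N))
b = rng.standard_normal(m)

x_formula = A.T @ np.linalg.solve(A @ A.T, b)
x_pinv = np.linalg.pinv(A) @ b
print(np.allclose(x_formula, x_pinv))               # the two agree
print(np.count_nonzero(np.abs(x_formula) > 1e-8))   # typically all N entries: not sparse
```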
SPARSITY AND THE ℓ1-NORM (2D CASE)

EXAMPLE: ℓ2

min_{x_1, x_2} x_1^2 + x_2^2   subject to   a_1 x_1 + a_2 x_2 = b_1

[Figure: the constraint line in the (x_1, x_2) plane, separating the regions x_1^2 + x_2^2 > 0.8944 and x_1^2 + x_2^2 < 0.8944.]
MINIMIZING ||x||_2

Recall Parseval's formula: if f(t) = Σ_{k=0}^N x_k φ_k(t), with the φ_k orthonormal in L^2, then

||f||_2^2 = Σ_{k=0}^N |x_k|^2.

Also, ℓ2 penalizes large values heavily, while small values barely affect the norm. In general it will not give a sparse representation!

See the MATLAB experiment (Test-l1-l2.m).
MINIMIZING ||x||_1

MATLAB experiment (Test-l1-l2.m).

Note: the solution may not be unique!

Solve an optimization problem (in practice O(N^3) operations).

Several codes are available for CS; see:
http://www.dsp.ece.rice.edu/cs/
A SIMPLE EXAMPLE
[Figure: the sampled signal f(t) on [0, 1].]

f(t) = (1/√N) Σ_{k=1}^N x_k sin(πkt)

N = 1024, number of samples: m = 50
System of equations:

f(t_j) = (1/√1024) Σ_{k=1}^{1024} x_k sin(πkt_j),   j = 1, . . . , 50

SOLVE: min ||x||_1 subject to Ax = b, where A has 50 rows and 1024 columns,

A_{j,k} = (1/√1024) sin(πkt_j),   b_j = f(t_j).

MATLAB code on Blackboard: "SineExample.m" (uses CVX)
[Figure: original and decoded coefficient vectors, k = 1, . . . , 1024.]

Recovery of the coefficients is accurate to almost machine precision!

||x − x_0||_2 / ||x_0||_2 = 7.9611... × 10^{−11}
WHY SPARSITY?
Sparsity is often a good regularization criterion because most signals have structure.
Gray scale please!
[Figure: 512 × 512 test image.]

Find wavelet coefficients: Daubechies(6,2), 3 vanishing moments.
Restored image from 25% of the coefficients.
[Figure: restored 512 × 512 image.]

Relative error ≈ 3%.
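A small one-dimensional stand-in for this experiment (the slides use a Daubechies wavelet on an image; here a hand-built orthonormal Haar matrix on a piecewise-constant signal): keep only the largest 25% of the coefficients and measure the relative reconstruction error.

```python
# Compress a piecewise-constant signal by thresholding Haar coefficients.
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix (n must be a power of two)."""
    if n == 1:
        return np.array([[1.0]])
    H = haar_matrix(n // 2)
    top = np.kron(H, [1.0, 1.0]) / np.sqrt(2)                 # coarse averages
    bot = np.kron(np.eye(n // 2), [1.0, -1.0]) / np.sqrt(2)   # detail differences
    return np.vstack([top, bot])

n = 256
H = haar_matrix(n)
x = np.repeat([1.0, -0.5, 2.0, 0.3], n // 4)   # piecewise-constant signal

c = H @ x                                      # Haar coefficients
idx = np.argsort(np.abs(c))[:-int(0.25 * n)]   # all but the 25% largest
c_thr = c.copy()
c_thr[idx] = 0.0                               # discard 75% of the coefficients
x_rec = H.T @ c_thr                            # inverse transform (H is orthogonal)

rel_err = np.linalg.norm(x_rec - x) / np.linalg.norm(x)
print(rel_err)
```

Because a piecewise-constant signal has very few nonzero Haar coefficients, 25% of them already reconstruct it essentially exactly; a natural image is only approximately sparse, hence the slide's ≈3% error.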
Reconstructed image from 2% of the coefficients.
SPARSITY IS NOT SUFFICIENT FOR CS TO WORK!
Example: A is a finite difference matrix. A maps a sparse vector x into another sparse vector y:

(0, 0, . . . , 1, −1, 0, . . . )^T =
[  1   0   0  · · ·   0 ]
[ −1   1   0  · · ·   0 ]
[  0  −1   1  · · ·   0 ]
[  . . . . . . . . . .  ]
[  0   0  · · ·  −1   1 ]
(0, 0, . . . , 1, 0, 0, . . . )^T
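A sketch of the point above: the bidiagonal finite-difference matrix maps a 1-sparse spike to a 2-sparse vector, so the measurement side y = Ax is itself almost entirely zero and a few samples of y carry essentially no information about where the spike sits.

```python
# The finite-difference matrix from the slide maps e_j to e_j - e_{j+1}.
import numpy as np

N = 16
A = np.eye(N) - np.eye(N, k=-1)   # 1 on the diagonal, -1 on the subdiagonal
x = np.zeros(N)
x[7] = 1.0                        # a single spike
y = A @ x
print(np.count_nonzero(y))        # 2: y is itself sparse
```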
The image below is sparse both in the physical domain and in its Haar wavelet coefficients.
A GENERAL APPROACH
Sample coefficients in a representation by random vectors.
y = Σ_{k=1}^N ⟨y, ψ_k⟩ ψ_k,

where the ψ_k are obtained from orthogonalized Gaussian matrices.

Ax = y   ⇒   ΨAx = Ψy   ⇒   Θx = z
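The orthogonalized Gaussian vectors ψ_k mentioned above can be sketched as the first m columns of the Q factor of a Gaussian matrix (sizes are illustrative):

```python
# Orthonormal random measurement vectors via QR of a Gaussian matrix.
import numpy as np

rng = np.random.default_rng(3)
N, m = 128, 32
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))  # columns are orthonormal
Psi = Q[:, :m].T                                  # m random orthonormal rows
y = rng.standard_normal(N)                        # a stand-in signal
z = Psi @ y                                       # z_k = <y, psi_k>
print(np.allclose(Psi @ Psi.T, np.eye(m)))        # rows are orthonormal
```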
INCOHERENCE + SPARSITY IS NEEDED
NUMERICAL EXPERIMENT
Signal recovered from Fourier coefficients:
[Figure: original and decoded signals.]
Code ”FourierSampling.m”.
INCOHERENT SAMPLING
Let (Φ, Ψ) be a pair of orthonormal bases of R^n.

f(t) = Σ_{i=1}^n x_i ψ_i(t)   and   y_k = ⟨f, φ_k⟩,   k = 1, . . . , m.

Representation matrix: Ψ = [ψ_1 ψ_2 · · · ψ_n]
Sensing matrix: Φ = [φ_1 φ_2 · · · φ_n]

COHERENCE BETWEEN Φ AND Ψ

μ(Φ, Ψ) = √n · max_{1≤j,k≤n} |⟨φ_k, ψ_j⟩|.

Remark: μ(Φ, Ψ) ∈ [1, √n].
Upper bound: Cauchy-Schwarz.
Lower bound: Ψ^T Φ is also orthonormal, hence Σ_j |⟨φ_k, ψ_j⟩|^2 = 1 ⇒ max_j |⟨φ_k, ψ_j⟩| ≥ 1/√n.
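The two extreme cases of this definition can be checked numerically: spikes against the unitary DFT basis give μ = 1 (maximal incoherence, since every DFT entry has magnitude 1/√n), while a basis against itself gives μ = √n.

```python
# Coherence mu(Phi, Psi) = sqrt(n) * max |<phi_k, psi_j>| for two basis pairs.
import numpy as np

n = 64
F = np.fft.fft(np.eye(n)) / np.sqrt(n)        # unitary DFT basis (columns)
I = np.eye(n)                                 # spike basis

mu_spike_dft = np.sqrt(n) * np.max(np.abs(I.T @ F))
mu_self = np.sqrt(n) * np.max(np.abs(I.T @ I))
print(mu_spike_dft, mu_self)                  # 1.0 and sqrt(n)
```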
A GENERAL RESULT FOR SPARSE RECOVERY
f(t) = Σ_{i=1}^n x_i ψ_i(t)   and   y_k = ⟨f, φ_k⟩,   k = 1, . . . , m.

Consider the optimization problem:

min_{x ∈ R^n} ||x||_1   subject to   y_k = ⟨Ψx, φ_k⟩,   k = 1, . . . , m.

THEOREM (CANDÈS AND ROMBERG, 2007)

Fix f ∈ R^n and suppose that the coefficient sequence x of f in the basis Ψ is s-sparse. Select m measurements in the Φ domain uniformly at random. Then if

m ≥ C μ^2(Φ, Ψ) s log(n/δ)

for some positive constant C, the solution of the problem above is exact with probability exceeding 1 − δ.
MULTIPLE SOLUTIONS OF MIN 1-NORM
f(t) = a_0/2 + Σ_{k=1}^N a_k cos(πkt) + Σ_{k=1}^N b_k sin(πkt),   t ∈ [−1, 1]

Data: f(−1) = 1, f(0) = 1, f(1) = 1.

Even function: b_k = 0.

Solutions of min ℓ1: {a_2 = 1, a_k = 0 (k ≠ 2)}, {a_4 = 1, a_k = 0 (k ≠ 4)}, . . .
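A quick check that these are genuinely distinct minimizers: the two expansions {a_2 = 1} and {a_4 = 1} both interpolate the data f(−1) = f(0) = f(1) = 1 and have the same ℓ1 coefficient norm.

```python
# Both single-coefficient cosine expansions match the three data points.
import numpy as np

t = np.array([-1.0, 0.0, 1.0])
f2 = np.cos(2 * np.pi * t)   # expansion with a_2 = 1, all other coefficients 0
f4 = np.cos(4 * np.pi * t)   # expansion with a_4 = 1, all other coefficients 0
print(f2, f4)                # both interpolate the data (1, 1, 1)
```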
POLYNOMIAL MATRICES
Back to Dr. Lai's dynamical system problem:

dx/dt = F(x(t)),

with

[F(x(t))]_j = Σ_{k_1, k_2, . . . , k_m} (a_j)_{k_1 k_2 ··· k_m} x_1^{k_1}(t) · · · x_m^{k_m}(t).

This does not fit the classical CS results:
the monomial basis becomes ill-conditioned even for small powers;
we know the condition numbers of Vandermonde matrices depend on where x is evaluated.

Some CS results are available for orthogonal polynomials.
ORTHOGONAL POLYNOMIALS
For Chebyshev polynomial expansions we have

f(x) ≈ Σ_{k=0}^N λ_k cos(k arccos(x)).

If we let y = arccos(x), or x = cos(y),

f(cos(y)) ≈ Σ_{k=0}^N λ_k cos(ky).

A Chebyshev expansion is equivalent to a cosine expansion in the variable y. Results carry over from Fourier expansions, but with samples chosen independently according to the Chebyshev measure

dν(x) = π^{−1}(1 − x^2)^{−1/2} dx.
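Samples from this measure can be drawn by pushing uniform samples through the change of variables above, x = cos(πu) with u uniform on (0, 1); the samples cluster near the endpoints ±1, unlike uniform samples.

```python
# Draw samples from the Chebyshev measure and check the endpoint clustering.
import numpy as np

rng = np.random.default_rng(4)
u = rng.uniform(0.0, 1.0, 100_000)
x = np.cos(np.pi * u)

# Under the Chebyshev measure, P(x > 0.9) = arccos(0.9)/pi (about 0.14),
# versus 0.05 for the uniform measure on (-1, 1).
print(np.mean(x > 0.9))
```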
SPARSE LEGENDRE EXPANSIONS
Rauhut and Ward (2010) proved that the same type of sampling applies for Legendre expansions.
How about polynomial expansions as power series?
ROBERT THOMPSON’S EXPERIMENTS
[Figure: recovery success rate as a function of m/N and k/m; "1d polynomial recovery for N = 36, uniform sampling" versus "1d polynomial recovery for N = 36, Chebyshev sampling".]

For each pixel, 50 experiments: choose a random polynomial with k nonzero Gaussian i.i.d. coefficients, measure m samples, and attempt to recover the polynomial coefficients. Sampling at Chebyshev points gives (very) slightly better results than sampling at uniform points.
Consider linear combinations of Chebyshev polynomials:

y = Σ_{i=1}^N x_i T_i(t),   T_i(t) = cos(i arccos(t)).

Φ_m: m randomly chosen rows of the identity matrix. Assume that x is K-sparse, with the t_d drawn according to some distribution in (−1, 1).

y_m = [f(t_{d1}), f(t_{d2}), . . . , f(t_{dm})]^T = Φ_m ·
[ T_0(t_0)  T_1(t_0)  . . .  T_N(t_0) ]
[ T_0(t_1)  T_1(t_1)  . . .  T_N(t_1) ]
[    ...                              ]
[ T_0(t_N)  T_1(t_N)  . . .  T_N(t_N) ]
· [x_1, x_2, . . . , x_N]^T
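The system matrix in the display above can be built directly from the cosine formula; as a sanity check, the sketch below (random uniform sample points are an assumption) compares it against NumPy's own Chebyshev Vandermonde routine.

```python
# Build the Chebyshev system matrix T_i(t_j) = cos(i * arccos(t_j)).
import numpy as np
from numpy.polynomial import chebyshev

rng = np.random.default_rng(5)
N, m = 36, 12
t = rng.uniform(-1.0, 1.0, m)
i = np.arange(N + 1)
T = np.cos(i[None, :] * np.arccos(t)[:, None])   # m x (N + 1) system matrix
T_ref = chebyshev.chebvander(t, N)               # columns T_0(t), ..., T_N(t)
print(np.allclose(T, T_ref))
```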
[Figure: recovery success rates. Left, Vandermonde basis: "1d polynomial recovery for N = 36, Chebyshev sampling". Right, Chebyshev basis: "Sparse 1d Chebyshev polynomial recovery, N = 36".]
Using Chebyshev basis functions, we realize improvement as m
increases.
Columns of C are orthogonal.
All vectors are distinguishable if we use the full C.
If we use less than the full C, orthogonality is lost and some vectors start to become indistinguishable.
[Figure: entries of the Vandermonde matrix V and the Chebyshev matrix C, color-coded on a common scale.]
What about 2-D polynomials?
In the natural basis: f(t, u) = Σ_{i+j=0..Q} x_{ij} t^i u^j,

with the (t_d, u_d) drawn according to some distribution in (−1, 1) × (−1, 1).

y_m = [f(t_{d1}, u_{d1}), f(t_{d2}, u_{d2}), . . . , f(t_{dm}, u_{dm})]^T = Φ_m ·
[ 1  t_0  u_0  t_0 u_0  t_0^2  u_0^2  . . . ]
[ 1  t_1  u_1  t_1 u_1  t_1^2  u_1^2  . . . ]
[   ...                                     ]
[ 1  t_N  u_N  t_N u_N  t_N^2  u_N^2  . . . ]
· [x_00, x_10, x_01, x_11, x_20, . . . ]^T
[Figure: recovery success rate, "2d polynomial recovery, N = 36".]
Similar to 1-d results.
Again increasing m doesn’t change much.
ROBERT THOMPSON’S EXPERIMENTS (BACK TO DYNAMICAL SYSTEMS)
x_{n+1} = f(x_n) = r x_n(1 − x_n)

Coefficient vector: (0, r, −r, 0, . . . ).

We can recover the system equation in the chaotic regime using about 10 sample pairs or more.
[Figures: "Sampling the logistic map, m = 10" (orbit x_n versus n) and "Recovery error for logistic map, r = 3.7" (||c* − c||_2 versus m, log scale).]
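The logistic-map recovery above can be sketched as follows: generate a short orbit, form the monomial system matrix, and solve the ℓ1 problem as a linear program. The degree-12 cutoff and the starting point x_0 = 0.4 are assumptions, not from the slides.

```python
# Recover the logistic-map coefficients (0, r, -r, 0, ...) from 10 sample
# pairs by ell-1 minimization over a monomial basis.
import numpy as np
from scipy.optimize import linprog

r = 3.7
x = [0.4]
for _ in range(12):
    x.append(r * x[-1] * (1.0 - x[-1]))    # iterate the logistic map
x = np.array(x)

m, deg = 10, 12                             # 10 sample pairs, 13 unknown coefficients
Phi = x[:m, None] ** np.arange(deg + 1)     # row j: (1, x_j, x_j^2, ..., x_j^deg)
y = x[1:m + 1]

# min ||c||_1 subject to Phi c = y, via the splitting c = u - v, u, v >= 0.
n1 = deg + 1
res = linprog(np.ones(2 * n1), A_eq=np.hstack([Phi, -Phi]), b_eq=y, bounds=(0, None))
c = res.x[:n1] - res.x[n1:]
print(np.round(c, 3))                       # close to (0, r, -r, 0, ..., 0)
```

Because the orbit stays in the chaotic regime, the 10 sample points spread over the interval and the sparse coefficient vector is pinned down, matching the slide's observation.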
[Figure: bifurcation diagram of the logistic map for r between about 2.4 and 4.]
Sensitive to the dynamics determined by r.
(Bifurcation diagram: Wikipedia).
FINAL REMARKS
As previously pointed out by Dr. Lai, recovery seems impractical with a monomial basis of large degree; a change of basis to orthogonal polynomials results in full coefficient vectors.

Considering small-degree expansions in high dimensions: what is the optimal sampling strategy?
How about a system of PDEs? For example,

u_t = u(1 − u) − uv + Δu
v_t = v(1 − v) + uv + Δv
Thanks! In particular to Robert Thompson and Wen Xu.