Low Rank Approximation with Entrywise ℓ1-Norm Error
Zhao Song, David P. Woodruff, Peilin Zhong
UT-Austin, IBM Almaden, Columbia University
Song-Woodruff-Zhong Low Rank Approximation with Entrywise `1-Norm Error 1 / 31
ℓ1-Low Rank Approximations

Given: A ∈ R^{n×d}, n ≥ d, k ∈ N, α ≥ 1

Output: a rank-k matrix Â ∈ R^{n×d} s.t.

    ‖Â − A‖₁ ≤ α · min_{rank-k A′} ‖A′ − A‖₁

where ‖A′ − A‖₁ = Σ_{i,j} |A′_{i,j} − A_{i,j}|, and the minimum on the right is denoted OPT.

Equivalently, output factors U ∈ R^{n×k}, V ∈ R^{k×d} s.t. ‖UV − A‖₁ ≤ α · OPT.
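The objective is simple to state in code: the entrywise ℓ1 error of a factorization UV is just the sum of absolute entries of the residual. A minimal numpy sketch (the toy matrices below are illustrative, not from the talk):

```python
import numpy as np

def l1_error(A, U, V):
    """Entrywise l1 error: ||UV - A||_1 = sum_{i,j} |(UV)_{i,j} - A_{i,j}|."""
    return np.abs(U @ V - A).sum()

# Toy example: a rank-1 factorization that matches A everywhere
# except the bottom-right entry (9 vs 9.5), so the l1 error is 0.5.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.5]])
U = np.array([[1.0], [2.0], [3.0]])   # n x k with k = 1
V = np.array([[1.0, 2.0, 3.0]])       # k x d
print(l1_error(A, U, V))              # 0.5
```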
Why is ℓ1-Low Rank Approximation Interesting?

- It is more robust than the Frobenius norm in the presence of outliers
- It is indicated in models where Gaussian assumptions on the noise may not apply
- The problem was shown to be NP-hard by Gillis-Vavasis'15, and a number of heuristics have been proposed
- It was asked in multiple places whether there are any approximation algorithms
Visualization of the ℓp-norm

[Figure: unit balls of the ℓp-norm for p = ∞, p = 2, 2 > p > 1, p = 1, 1 > p > 0, p = 0]
Thought Experiments

Given: A set of points in 2 dimensions
Goal: Find a line to fit those points
Output: the ℓ2-minimizer line, the ℓ1-minimizer line

[Figure: the points together with the ℓ2- and ℓ1-minimizer lines]
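The robustness contrast in this thought experiment is easy to reproduce. Below is an illustrative numpy sketch (the data points and grid-search ranges are my own choices): four points lie exactly on y = x, and one outlier pulls the least-squares line far off, while the ℓ1-best line, found here by brute-force grid search, stays on y = x.

```python
import numpy as np

# Points on the line y = x, plus one outlier at (4, 20).
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([0.0, 1.0, 2.0, 3.0, 20.0])

# l2 (least squares) line: pulled heavily toward the outlier.
X = np.column_stack([xs, np.ones_like(xs)])
slope_l2, icpt_l2 = np.linalg.lstsq(X, ys, rcond=None)[0]

# l1 line by brute-force grid search over (slope, intercept).
slopes = np.linspace(0.0, 5.0, 501)
icpts = np.linspace(-4.0, 4.0, 801)
E = np.abs(slopes[:, None, None] * xs + icpts[None, :, None] * 1.0
           + 0.0 * ys - (slopes[:, None, None] * 0.0 + ys)).sum(axis=2)
E = np.abs(slopes[:, None, None] * xs + icpts[None, :, None] - ys).sum(axis=2)
i, j = np.unravel_index(E.argmin(), E.shape)
slope_l1, icpt_l1 = slopes[i], icpts[j]

print(slope_l2, slope_l1)   # about 4.2 vs 1.0
```

The ℓ2 slope is dragged to about 4.2 by the single outlier; the ℓ1 line recovers slope 1 and intercept 0.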
Thought Experiments

Given: k = 1 and A =
    [ 4 0 0 0
      0 1 1 1
      0 1 1 1
      0 1 1 1 ]

Output: the rank-1 ‖·‖F-solution Â =
    [ 4 0 0 0
      0 0 0 0
      0 0 0 0
      0 0 0 0 ]

Output: the rank-1 ‖·‖1-solution Â =
    [ 0 0 0 0
      0 1 1 1
      0 1 1 1
      0 1 1 1 ]
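This example can be verified numerically: the Frobenius-optimal rank-1 approximation (the truncated SVD) keeps the single 4 and pays an entrywise ℓ1 error of 9, while the rank-1 matrix that keeps the all-ones block pays only 4. A short numpy check:

```python
import numpy as np

A = np.zeros((4, 4))
A[0, 0] = 4.0
A[1:, 1:] = 1.0          # the 3x3 all-ones block

# Best rank-1 approximation in Frobenius norm: truncate the SVD.
# Singular values are 4 (from the corner entry) and 3 (from the block),
# so the truncation keeps the entry 4 and discards the nine 1s.
U, s, Vt = np.linalg.svd(A)
A_frob = s[0] * np.outer(U[:, 0], Vt[0, :])

# Rank-1 candidate that keeps the all-ones block instead.
A_ones = A.copy()
A_ones[0, 0] = 0.0

err_frob = np.abs(A_frob - A).sum()   # = 9 (the nine discarded 1s)
err_ones = np.abs(A_ones - A).sum()   # = 4 (the discarded entry 4)
print(err_frob, err_ones)
```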
Real-life Applications and Datasets

- Recovering images with outliers
- 3D model reconstruction from a sequence of 2D images
- Background modeling from surveillance video
- More applications (e.g., the milk, glass, and handwritten datasets)
Other ℓ1 Problems

- Linear regression: Clarkson'05, Sohler-W'11, Clarkson-Drineas-Magdon-Ismail-Mahoney-Meng-W'13, Clarkson-W'13, Li-Miller-Peng'13, W-Zhang'13, Mahoney-Meng'13, Cohen-Peng'15, Yang-Chow-Re-Mahoney'16
  - Given A, b, solve min_x ‖Ax − b‖₁
- Sparse recovery: Cormode-Muthukrishnan'04, Gilbert-Strauss-Tropp-Vershynin'06, Indyk-Ruzic'08, Berinde-Indyk-Ruzic'08, Berinde-Gilbert-Indyk-Karloff-Strauss'08, Do Ba-Indyk-Price-W'10, Nelson-Nguyen-W'12, Bhattacharyya-Nair'14, Osher-Yin'14
  - Given Ax, find x̂ s.t. ‖x̂ − x‖₁ ≤ C · min_{k-sparse x′} ‖x′ − x‖₁
- Regularizers for sparsity (Lasso): Tibshirani'96, Breiman'95, Tibshirani'97
  - Given A, b, solve min_x ‖Ax − b‖₂² + λ‖x‖₁
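For intuition on the first bullet: min_x ‖Ax − b‖₁ can be solved exactly as a linear program; the sketch below instead uses iteratively reweighted least squares (IRLS), a simple heuristic that repeatedly solves a weighted ℓ2 problem with weights 1/|residual|. This is an illustrative sketch, not one of the cited algorithms:

```python
import numpy as np

def l1_regression_irls(A, b, iters=100, eps=1e-8):
    """Approximate argmin_x ||Ax - b||_1 by iteratively reweighted least squares."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # start from the l2 solution
    for _ in range(iters):
        r = A @ x - b
        # Weighted l2 with weights 1/|r_i|, so sum w_i r_i^2 ~ sum |r_i|.
        sw = np.sqrt(1.0 / np.maximum(np.abs(r), eps))
        x = np.linalg.lstsq(A * sw[:, None], sw * b, rcond=None)[0]
    return x

# 1-D example: fitting a constant, the l1 minimizer is the median.
A = np.ones((3, 1))
b = np.array([1.0, 2.0, 9.0])
x = l1_regression_irls(A, b)
print(x)   # close to 2 (the median), not 4 (the mean)
```

In this 1-D example the ℓ1 minimizer of |x − 1| + |x − 2| + |x − 9| is the median 2, illustrating the same outlier-robustness as in the line-fitting thought experiment.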
Previous Work

Non-ℓ1 setting: low rank approximation and related problems
  - Frobenius norm: Sarlos'06, Mahoney-Meng'13, Li-Miller-Peng'13, Clarkson-W'09, Clarkson-W'13, Nelson-Nguyen'13, Bourgain-Dirksen-Nelson'15, Cohen'16
  - Spectral norm: W'14, Cohen-Lee-Musco-Musco-Peng-Sidford'15, Cohen-Elder-Musco-Musco-Persu'15
  - Weighted low rank approximation: Srebro-Jaakkola'03, Gillis-Glineur'11, Razenshteyn-Song-W'16, Li-Liang-Risteski'16
  - Matrix completion: Candes-Recht'09, Candes-Recht'10, Keshavan'12, Hardt-Wootters'14, Jain-Netrapalli-Sanghavi'13, Hardt'15, Sun-Luo'15, Peeters'96, Gillis-Glineur'11, Hardt-Meka-Raghavendra-Weitz'14
Heuristics and Nearly ℓ1 Setting

ℓ1-setting
  - Heuristic algorithms: Ke-Kanade'05, Ding-Zhou-He-Zha'06, Kwak'08, Brooks-Dula-Boone'13
  - Robust PCA: Candes-Li-Ma-Wright'11, Wright-Ganesh-Rao-Peng-Ma'09, Netrapalli-Niranjan-Sanghavi-Anandkumar-Jain'14, Yi-Park-Chen-Caramanis'16

For any β ∈ (0, 0.5) and γ > 0, let k = 3 and construct A ∈ R^{(2n+2)×(2n+2)} as follows:

    A = [ n^{2+γ}  0          0  0
          0        n^{1.5+β}  0  0
          0        0          B  0
          0        0          0  B ]

where B ∈ R^{n×n} is the all-1s matrix.

Then none of these methods can achieve an approximation ratio better than α = n^{min(γ, 0.5−β)}!
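The hard instance is easy to build and probe. The SVD (Frobenius-optimal) rank-3 truncation must keep the two large diagonal entries plus one all-ones block and discard the other block, at entrywise ℓ1 cost about n²; discarding the n^{1.5+β} entry instead costs only n^{1.5+β}. A numpy sketch with n = 50, γ = 1, β = 0.1 (parameter values chosen here for illustration):

```python
import numpy as np

n, gamma, beta = 50, 1.0, 0.1
B = np.ones((n, n))

# A = diag(n^{2+gamma}, n^{1.5+beta}, B, B), of size (2n+2) x (2n+2).
A = np.zeros((2 * n + 2, 2 * n + 2))
A[0, 0] = n ** (2 + gamma)
A[1, 1] = n ** (1.5 + beta)
A[2:n + 2, 2:n + 2] = B
A[n + 2:, n + 2:] = B

# Frobenius-optimal rank-3 truncation: keeps the two large diagonal
# entries plus (a combination of) the all-ones blocks, and must discard
# one block direction of singular value n.
U, s, Vt = np.linalg.svd(A)
A3 = (U[:, :3] * s[:3]) @ Vt[:3, :]
err_svd = np.abs(A3 - A).sum()        # at least about n^2

# Alternative rank-3 solution: discard the n^{1.5+beta} entry instead.
A_alt = A.copy()
A_alt[1, 1] = 0.0
err_alt = np.abs(A_alt - A).sum()     # = n^{1.5+beta}

print(err_svd / err_alt)              # at least about n^{0.5-beta}
```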
Main Question

Question: Given a matrix A ∈ R^{n×d}, is there a provable algorithm that outputs a (factorization of a) rank-k matrix Â such that

    ‖Â − A‖₁ ≤ α · min_{rank-k A′} ‖A′ − A‖₁?

(From the other perspective) Is there any inapproximability hardness?

This question has been asked in at least four different places:
  - Kannan-Vempala'09
  - Stack Exchange'13
  - W'14
  - Gillis-Vavasis'15
  - Some machine learning and computer vision papers

Very recent, independent work: Chierichetti-Gollapudi-Kumar-Lattanzi-Panigrahy'16
Main Results - Algorithms

Upper bounds (algorithms)
  - poly(k, log n)-approximation algorithm for an arbitrary n × d matrix A
    - Polynomial time
  - poly(k) log d-approximation algorithm for an arbitrary n × d matrix A
    - Polynomial time
  - O(k)-approximation algorithm for an arbitrary n × d matrix A
    - Running time exponential in k
  - CUR decomposition algorithm for an arbitrary n × d matrix A
    - C has O(k log k) columns from A
    - R has O(k log k) rows from A
    - rank-k U
    - poly(k) log d-approximation, polynomial running time
  - Bicriteria algorithm: rank-2k, O(1)-approximation
Main Results - Applications

- For any p ∈ (1, 2), ℓp-low rank approximation
- Earth mover's distance: EMD-low rank approximation
- ℓ1-low rank approximation with limited independent variables
- Streaming setting: receive a data stream of updates (i, j, Δ), e.g.
    (3,4,61), (1,7,23), (5,2,44), (2,3,16), (4,1,98), (2,8,54), ...
- Distributed setting
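In the streaming setting the matrix is given by additive entry updates (i, j, Δ). Below is a minimal sketch that just materializes A from such a stream (1-indexed, as on the slide); an actual streaming algorithm would of course avoid storing A explicitly:

```python
import numpy as np

def apply_stream(n, d, stream):
    """Materialize A from a stream of additive updates (i, j, delta), 1-indexed."""
    A = np.zeros((n, d))
    for i, j, delta in stream:
        A[i - 1, j - 1] += delta
    return A

# The update stream from the slide.
stream = [(3, 4, 61), (1, 7, 23), (5, 2, 44), (2, 3, 16), (4, 1, 98), (2, 8, 54)]
A = apply_stream(5, 8, stream)
print(A[2, 3], A[0, 6])   # 61.0 23.0
```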
Input Sparsity Time Algorithm

Given: A ∈ R^{n×d}, k ∈ N

Output: matrices U ∈ R^{n×k}, V ∈ R^{k×d} s.t.

    ‖UV − A‖₁ ≤ poly(k) log d · OPT

with prob. 9/10, in nnz(A) + (n + d) · poly(k) time
O(k)-Approximation Algorithm

Given : A ∈ R^{n×d}, k ∈ N
Output : matrices U ∈ R^{n×k}, V ∈ R^{k×d} s.t. ‖UV − A‖1 ≤ O(k) · OPT
with prob. 9/10, in poly(n) · d^{O(k)} · 2^{O(k²)} time
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 15 / 31
CUR Decomposition Algorithm

Given : A ∈ R^{n×d}, k ∈ N
Output : C ∈ R^{n×s}, rank-k U ∈ R^{s×r}, R ∈ R^{r×d} s.t. ‖CUR − A‖1 ≤ poly(k) log d · min_{rank-k A′} ‖A′ − A‖1
with prob. 9/10, in nnz(A) + (n + d) poly(k) time
C has s = O(k) columns from A; R has r = O(k) rows from A
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 16 / 31
Bicriteria O(1)-Approximation Algorithm

Given : A ∈ R^{n×d}, k ∈ N
Output : matrices U ∈ R^{n×2k}, V ∈ R^{2k×d} s.t. ‖UV − A‖1 ≤ O(1) · min_{rank-k A′} ‖A′ − A‖1
with prob. 9/10, in (nd)^{O(k²)} time
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 17 / 31
Algorithm

1. Choose sketching matrix S′ (a Cauchy matrix or sparse Cauchy matrix).
2. Compute S′A; form C by C_i ← arg min_x ‖x S′A − A_i‖1. Form B = C · S′A.
3. Choose sketching matrices T1, R, S, T2 (Cauchy matrices or sparse Cauchy matrices).
4. Solve min_{X,Y} ‖T1 B R X Y S B T2 − T1 B T2‖F.
5. Output BRX, YSB.
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 18 / 31
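The pipeline above can be sketched numerically. The following is a minimal numpy sketch, not the paper's algorithm: it uses a dense Cauchy sketch, replaces the ℓ1 regression step with its least-squares relaxation, and takes a plain truncated SVD in place of the second sketched stage. The sketch size m = 10 and all matrix sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cauchy(m, n):
    # Dense Cauchy sketch: each entry i.i.d. standard Cauchy.
    return rng.standard_cauchy((m, n))

def l1_lra_sketch(A, k, m):
    """One pass of the meta-algorithm, with a least-squares (l2) stand-in
    for the l1 regression step C_i <- argmin_x ||x S'A - A_i||_1."""
    n, d = A.shape
    Sp = cauchy(m, n)                                  # S' in the slides
    SA = Sp @ A                                        # m x d sketched matrix
    # Project each row of A onto the row span of S'A (l2 relaxation).
    C = np.linalg.lstsq(SA.T, A.T, rcond=None)[0].T    # n x m coefficients
    B = C @ SA                                         # rank <= m surrogate for A
    # Rank-k approximation of the small-rank B (here: truncated SVD stand-in).
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U[:, :k] * s[:k] @ Vt[:k]

A = rng.standard_normal((50, 30))
Bk = l1_lra_sketch(A, k=3, m=10)   # rank at most k, rows in the row span of S'A
```

The point of the structure, as on the slides, is that every expensive step touches only an m × d or n × m matrix, never a general n × d optimization.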
Main Ideas of Meta Algorithm

1. Multiple regression sketch
2. No closed form for ℓ1, but we can use an ℓ2 relaxation
3. There exists a solution in the row span of A
4. Obtain B by projecting rows of A onto SA
5. Repeatedly apply the multiple regression sketch
6. A low rank approximation to B gives a low rank approximation to A
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 19 / 31
Multiple Regression Sketch

Given : matrix A ∈ R^{n×d}
Choose : sketching matrix S ∈ R^{m×n}
Define : U∗, V∗ = arg min_{U,V} ‖UV − A‖1 and V′ = arg min_V ‖SU∗V − SA‖1
If : with prob. 9/10, for all V, ‖SU∗V − SA‖1 ≥ ‖U∗V − A‖1, and for any fixed V, ‖SU∗V − SA‖1 ≤ β‖U∗V − A‖1
Then : with prob. 9/10, ‖U∗V′ − A‖1 ≤ β‖U∗V∗ − A‖1
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 20 / 31
Existence Result

Given : matrix A ∈ R^{n×d}
Choose : sketching matrix S ∈ R^{m×n}
Define : U∗, V∗ := arg min_{U∈R^{n×k}, V∈R^{k×d}} ‖UV − A‖1
V̂_i := arg min_{V̂_i} ‖SU∗V̂_i − SA_i‖2 for all i ∈ [d]  ⇒  V̂_i = (SU∗)†SA_i  ⇒  V̂ = (SU∗)†SA
V′ := arg min_V ‖SU∗V − SA‖1
Then : ‖SU∗V̂ − SA‖1 ≤ √m · ‖SU∗V′ − SA‖1  (ℓ2-relaxation)
If : ‖U∗V′ − A‖1 ≤ β‖U∗V∗ − A‖1  (proved earlier)
Then : ‖U∗V̂ − A‖1 ≤ √m · β‖U∗V∗ − A‖1
Thus, there exists a √m·β-approximation in the row span of SA.
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 21 / 31
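The √m loss in the ℓ2-relaxation step comes from the standard vector norm comparison ‖v‖2 ≤ ‖v‖1 ≤ √m · ‖v‖2 for v ∈ R^m, applied column by column: each column of SU∗V̂ − SA is ℓ2-optimal, hence at most √m worse in ℓ1 than the corresponding column of SU∗V′ − SA. A quick numpy check of the comparison (sample size and m are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 25
# For any v in R^m: ||v||_2 <= ||v||_1 <= sqrt(m) * ||v||_2.
# Heavy-tailed (Cauchy) samples stand in for sketched residual columns.
V = rng.standard_cauchy((1000, m))
l1 = np.abs(V).sum(axis=1)
l2 = np.sqrt((V ** 2).sum(axis=1))
ratio = l1 / l2          # always lands in [1, sqrt(m)]
```

Both ends of the range are tight: a one-hot vector has ratio 1, and the all-ones vector has ratio exactly √m.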
Repeatedly Apply Multiple Regression Sketch

Given : rank-t B ∈ R^{n×d}, k ≥ 1, t = poly(k); s, r, t1, t2 = poly(k)
Choose : sketching matrices S ∈ R^{s×n}, R ∈ R^{d×r}, T1 ∈ R^{t1×n}, T2 ∈ R^{d×t2}
By S, R : min_{X∈R^{r×k}, Y∈R^{k×s}} ‖BRXYSB − B‖1 ≤ β√m · min_{rank-k B′} ‖B′ − B‖1
By T1, T2 : ‖BRX∗Y∗SB − B‖1 ≤ γ · min_{X∈R^{r×k}, Y∈R^{k×s}} ‖BRXYSB − B‖1,
where X∗, Y∗ := arg min_{X∈R^{r×k}, Y∈R^{k×s}} ‖T1BRXYSBT2 − T1BT2‖1  (∗)
Thus, BRX∗Y∗SB gives a βγ√m-approximation to B.
We can solve (∗) either by using a polynomial system solver or by relaxing ‖ · ‖1 to ‖ · ‖F.
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 22 / 31
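When (∗) is relaxed from ‖ · ‖1 to ‖ · ‖F, the small sketched problem has a closed form: orthogonalize T1BR and (SBT2)ᵀ, project T1BT2 onto those bases, and truncate to rank k. Below is a hedged numpy sketch of that closed form; P, Q, M are synthetic stand-ins for T1BR, SBT2, T1BT2, and the sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def solve_sketched_frobenius(P, Q, M, k):
    """min over rank-k Z = XY of ||P Z Q - M||_F, returned as factors
    X (r x k), Y (k x s). Assumes P has full column rank, Q full row rank."""
    Up, Rp = np.linalg.qr(P)        # P = Up Rp, Up has orthonormal columns
    Uq, Rq = np.linalg.qr(Q.T)      # Q = Rq^T Uq^T
    # With W = Rp Z Rq^T: ||Up W Uq^T - M||_F is minimized over rank-k W
    # by the truncated SVD of the projected matrix Up^T M Uq.
    U, s, Vt = np.linalg.svd(Up.T @ M @ Uq, full_matrices=False)
    X = np.linalg.pinv(Rp) @ (U[:, :k] * s[:k])    # pull back through Rp
    Y = Vt[:k] @ np.linalg.pinv(Rq).T              # pull back through Rq^T
    return X, Y

P = rng.standard_normal((12, 6))    # stand-in for T1 B R
Q = rng.standard_normal((5, 9))     # stand-in for S B T2
M = rng.standard_normal((12, 9))    # stand-in for T1 B T2
X, Y = solve_sketched_frobenius(P, Q, M, k=2)
err = np.linalg.norm(P @ (X @ Y) @ Q - M)
```

Because every dimension here is poly(k), this relaxed step costs poly(k) time, which is what makes the overall nnz(A) + (n + d) poly(k) running time possible.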
Reducing Arbitrary Matrix A to Rank-r Matrix B

Given k ∈ N, r ≥ k, A ∈ R^{n×d}, and rank-r B, C ∈ R^{n×d}.
If ‖B − A‖1 ≤ β · min_{rank-k A′} ‖A′ − A‖1 and ‖C − B‖1 ≤ γ · min_{rank-k B′} ‖B′ − B‖1,
then ‖C − A‖1 ≤ O(βγ) · min_{rank-k A′} ‖A′ − A‖1.

Proof : ‖C − A‖1 ≤ ‖C − B‖1 + ‖B − A‖1
≤ γ‖B∗ − B‖1 + ‖B − A‖1   (B∗ := arg min_{rank-k B′} ‖B′ − B‖1)
≤ γ‖A∗ − B‖1 + ‖B − A‖1   (A∗ := arg min_{rank-k A′} ‖A′ − A‖1)
≤ γ‖A∗ − A‖1 + (γ + 1)‖B − A‖1
≤ O(βγ)‖A∗ − A‖1
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 23 / 31
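The chain above only uses the triangle inequality for the entrywise ℓ1 norm, so it can be sanity-checked on any A, B, C. In the quick numpy check below, Frobenius-optimal SVD truncations stand in for the ℓ1-approximate B and C of the lemma; this is an illustrative assumption, not the lemma's construction.

```python
import numpy as np

rng = np.random.default_rng(3)

def trunc(M, r):
    # Best rank-r approximation in Frobenius norm (stand-in for an
    # l1-approximate rank-r matrix).
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :r] * s[:r] @ Vt[:r]

def l1(M):
    return np.abs(M).sum()      # entrywise l1 norm

A = rng.standard_normal((40, 30))
k, r = 3, 6
B = trunc(A, r)                 # intermediate rank-r matrix, as in the lemma
C = trunc(B, k)                 # rank-k matrix computed from B alone
# First step of the proof: ||C - A||_1 <= ||C - B||_1 + ||B - A||_1
gap = l1(C - B) + l1(B - A) - l1(C - A)
```

The lemma is what licenses the two-stage design: first reduce A to a poly(k)-rank B, then solve the small problem on B, paying only a constant factor in the composition.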
Main Results - Summary of Hard Instances and Hardness

Lower bound (hard instance, hardness)
- Under the Exponential Time Hypothesis (ETH)
- Hard instance for row subset selection
- Hard instance for oblivious subspace embeddings (OSE)
- Hard instance for Cauchy embeddings
- Hard instance for row span
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 24 / 31
Hardness - Under the Exponential Time Hypothesis

ℓ1-Low Rank Approximation Hardness
Given : A ∈ R^{n×d}, k ∈ N, α ≥ 1
Output : U ∈ R^{n×k}, V ∈ R^{k×d} s.t. ‖UV − A‖1 ≤ α · OPT with prob. 9/10
Assume : the Exponential Time Hypothesis (ETH) [Impagliazzo-Paturi-Zane'98]
Then, for an arbitrarily small constant γ > 0, any algorithm running in (nd)^{O(1)} time requires α ≥ 1 + 1/log^{1+γ}(nd).
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 25 / 31
Hard Instance - Row Subset Selection

No O(√k)-approximation for row subset selection
Given : A ∈ R^{n×(n+k)}, k ∈ N, n = poly(k)
Output : a rank-k matrix Â in the row span of any n/2 rows of A s.t. ‖Â − A‖1 ≤ O(k^{0.5−ε}) · OPT with positive probability, where ε > 0 is a constant that can be arbitrarily small.
There is no such algorithm!
Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 26 / 31
Hard Instance - Oblivious Subspace Embeddings. .
. .
No (O(√
k))-approximation for any OSE
Let k > 1, n = poly(k), t = poly(k).There exist matrices A ∈ Rn×(k+n) s.t. for any oblivious matrix S ∈ Rt×n
with probability 9/10min
U∈Rd×t‖USA − A‖1 > Ω(k0.5−ε) min
rank−k A ′‖A ′ − A‖1
ε > 0 is a constant which can be arbitrarily small
U
S
A A−
1
Song-Woodruff-Zhong Low Rank Approximation with Entrywise `1-Norm Error 27 / 31
Hard Instance - Cauchy Embedding

No O(log d)-approximation in Row Span

There exist matrices A ∈ R^(d×d) s.t. for any 1 ≤ t ≤ o(log d), for random Cauchy matrices S ∈ R^(t×d), where each entry is sampled i.i.d. from the Cauchy distribution C(0, γ), γ ∈ R, with probability 9/10,
    min_{U ∈ R^(d×t)} ‖USA − A‖1 ≥ Ω(log d / (t log t)) · min_{rank-k A′} ‖A′ − A‖1.

Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 28 / 31
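The sketch-and-solve pipeline this bound concerns can be sketched in code. A minimal illustration, assuming an arbitrary input matrix rather than the hard instance, a standard Cauchy sketch (γ = 1), and a least-squares fit for U (an ℓ2 proxy; the ℓ1-optimal U would be found by a linear program per column):

```python
import numpy as np

rng = np.random.default_rng(0)

d, t = 100, 5
A = rng.standard_normal((d, d))  # illustrative input, not the hard instance

# Cauchy sketch S in R^{t x d} with i.i.d. standard Cauchy entries
S = rng.standard_cauchy((t, d))
SA = S @ A  # t x d sketched matrix

# Fit U in R^{d x t} by least squares: minimize ||U(SA) - A||_F
U, *_ = np.linalg.lstsq(SA.T, A.T, rcond=None)
U = U.T  # d x t

l1_err = np.abs(U @ SA - A).sum()  # entrywise l1 cost of the sketched fit
```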
Hard Instance - Row Span

No (2 − ε)-approximation in the entire row span

Given : A ∈ R^((d−1)×d), k ∈ N
Output : a rank-k matrix Â in the row span of A s.t.
    ‖Â − A‖1 ≤ 2(1 − 1/Θ(d)) · OPT
with prob. 9/10.

There is no such algorithm!

A =
[ 1 1 0 0 · · · 0 ]
[ 1 0 1 0 · · · 0 ]
[ 1 0 0 1 · · · 0 ]
[ · · · · · · · · ]
[ 1 0 0 0 · · · 1 ]

Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 29 / 31
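The displayed hard instance (an all-ones first column with the identity in the remaining d − 1 columns) is easy to generate; a minimal sketch:

```python
import numpy as np

def row_span_hard_instance(d):
    """Build A in R^{(d-1) x d}: first column all ones,
    remaining d-1 columns form the identity matrix."""
    A = np.zeros((d - 1, d))
    A[:, 0] = 1.0
    A[:, 1:] = np.eye(d - 1)
    return A

A = row_span_hard_instance(6)
# Each row has exactly two ones: the shared first coordinate
# and its own unit coordinate.
```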
Experimental Results

Input matrix A = diag(n^(2+0.25), n^(1.5+0.25), B, B) ∈ R^((2n+2)×(2n+2)), where B ∈ R^(n×n) is the all-1s matrix. Find a rank-3 solution.

Our algorithm achieves the best accuracy. All of the algorithms (including ours) finish within 3 seconds, except BDB13 and KK05.

[Figure: ℓ1-norm cost ‖Â − A‖1 vs matrix dimension 2n+2, comparing Ours, BDB13, KK05r, KK05s, Kwak08r, Kwak08s, DZHZ06]
[Figure: Running time vs matrix dimension 2n+2, same algorithms]

Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 30 / 31
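The experimental input is simple to reproduce. A minimal sketch that builds A and reports the ℓ1 cost of the rank-3 truncated SVD, used here only as a Frobenius-optimal baseline, not the paper's algorithm:

```python
import numpy as np

def experiment_matrix(n):
    """A = diag(n^{2+0.25}, n^{1.5+0.25}, B, B), B the n x n all-ones matrix."""
    A = np.zeros((2 * n + 2, 2 * n + 2))
    A[0, 0] = n ** 2.25
    A[1, 1] = n ** 1.75
    A[2:n + 2, 2:n + 2] = 1.0   # first all-ones block B
    A[n + 2:, n + 2:] = 1.0     # second all-ones block B
    return A

def l1_cost_rank3_svd(A):
    # Truncated SVD gives the Frobenius-optimal rank-3 fit; here it
    # serves only as an illustrative baseline for the l1 cost.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A3 = (U[:, :3] * s[:3]) @ Vt[:3, :]
    return np.abs(A3 - A).sum()

A = experiment_matrix(50)
cost = l1_cost_rank3_svd(A)
```

Since A has rank 4 (two diagonal entries plus two rank-1 blocks), any rank-3 approximation must give up one of these components, which is what makes the instance a useful accuracy benchmark.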
Thank you!

Questions?

Song-Woodruff-Zhong Low Rank Approximation with Entrywise ℓ1-Norm Error 31 / 31