My Seminar on Linear Systems


Transcript
Graphs, Linear Systems and Laplacian Matrices
A seminar

Dilawar Singh (Version 0.5)
Department of Electrical Engineering, Indian Institute of Technology Bombay

November 29, 2010

An old Chinese philosopher never said: "Words about a graph are worth a thousand pictures."

Bending the Curve, On Language, William Safire [1]

[1] http://www.nytimes.com/2009/09/13/magazine/13FOB-OnLanguage-t.html
Outline

Solve Ax = b. Quick!
Goal: the special case when A is the Laplacian matrix of a graph.

Connections between spectral graph theory and numerical linear algebra. [2]

[2] Daniel A. Spielman, Proceedings of ICM, Hyderabad, 2010.

Why?

Classical Approach!
Recent Development
Connections!
?

Graphs and Laplacian Matrix

Definition
A graph G is given by a triple (V, E, w).
The Laplacian matrix L of a graph is naturally defined by the quadratic form it induces: for a vector $x \in \mathbb{R}^V$, the Laplacian quadratic form of G is

    $x^T L x = \sum_{(u,v) \in E} w_{u,v} \, (x(u) - x(v))^2$.  [3]

The more x jumps across the edges, the larger the quadratic form. Thus L provides a measure of the smoothness of x over the edges of G, as the sketch below illustrates.

[3] If w represents conductance in the corresponding electrical network and x is any potential vector, then x^T L x is the total power consumed in the network.
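A minimal numerical sketch of this quadratic form, assuming NumPy; the tiny graph, weights and vectors below are made up purely for illustration:

```python
import numpy as np

# Illustrative weighted graph: edges (u, v, w_uv) on vertices 0..2.
edges = [(0, 1, 1.0), (1, 2, 2.0)]

# Laplacian quadratic form: sum over edges of w_uv * (x[u] - x[v])^2.
def quadratic_form(edges, x):
    return sum(w * (x[u] - x[v]) ** 2 for u, v, w in edges)

x_smooth = np.array([1.0, 1.0, 1.0])   # constant over edges: no "jumps"
x_rough  = np.array([1.0, -1.0, 1.0])  # jumps across every edge

print(quadratic_form(edges, x_smooth))  # 0.0
print(quadratic_form(edges, x_rough))   # 1*(2)^2 + 2*(2)^2 = 12.0
```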

Laplacian continued ...

Define D to be the diagonal matrix whose diagonal contains the weighted degrees d, and define the weighted adjacency matrix of G by

    $A(u, v) = \begin{cases} w_{u,v} & \text{if } (u, v) \in E \\ 0 & \text{otherwise.} \end{cases}$

Example: a weighted graph on the vertices a, b, c, d, e (edge weights 1 and 2); its Laplacian matrix is

$$L = \begin{pmatrix}
 2 & -1 &  0 &  0 & -1 \\
-1 &  3 & -1 & -1 &  0 \\
 0 & -1 &  2 & -1 &  0 \\
 0 & -1 & -1 &  4 & -2 \\
-1 &  0 &  0 & -2 &  3
\end{pmatrix}$$

The construction of D, A and L for this example is sketched below.
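A sketch of this construction for the five-vertex example, assuming NumPy. The edge list is read off the matrix above, and L is formed by the standard relation L = D - A, which is consistent with the quadratic form given earlier:

```python
import numpy as np

# Example graph from the slide: vertices a..e as indices 0..4, weighted edges.
edges = [(0, 1, 1.0), (0, 4, 1.0), (1, 2, 1.0),
         (1, 3, 1.0), (2, 3, 1.0), (3, 4, 2.0)]
n = 5

A = np.zeros((n, n))             # weighted adjacency matrix
for u, v, w in edges:
    A[u, v] = A[v, u] = w

D = np.diag(A.sum(axis=1))       # diagonal matrix of weighted degrees
L = D - A                        # Laplacian matrix

print(L)                         # matches the 5x5 matrix shown above

# The matrix quadratic form agrees with the edge-sum definition.
x = np.random.randn(n)
edge_sum = sum(w * (x[u] - x[v]) ** 2 for u, v, w in edges)
assert np.isclose(x @ L @ x, edge_sum)
```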

Properties of Laplacian Matrix

Consider the example Laplacian above.

They are symmetric, have zero row-sums, and have non-positive off-diagonal entries.

Positive semi-definite: every eigenvalue is non-negative.

Connectivity: let $0 = \lambda_1 \le \lambda_2 \le \dots \le \lambda_n$ be the eigenvalues. Then $\lambda_2 > 0$ iff G is connected.

When are two graphs NOT isomorphic?

These properties are checked numerically in the sketch below.
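A quick numerical check of the listed properties, assuming NumPy, on the example Laplacian from the previous section:

```python
import numpy as np

# Example Laplacian from the previous section.
L = np.array([[ 2., -1.,  0.,  0., -1.],
              [-1.,  3., -1., -1.,  0.],
              [ 0., -1.,  2., -1.,  0.],
              [ 0., -1., -1.,  4., -2.],
              [-1.,  0.,  0., -2.,  3.]])

print(np.allclose(L, L.T))            # symmetric
print(np.allclose(L.sum(axis=1), 0))  # zero row-sums

eigvals = np.sort(np.linalg.eigvalsh(L))
print(np.all(eigvals >= -1e-12))      # positive semi-definite
print(eigvals[1] > 1e-12)             # lambda_2 > 0, so the graph is connected
```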

Some definitions

Definition
Given a set of vertices $S \subseteq V$, define the boundary of S, written $\partial(S)$, to be the set of edges of G with exactly one endpoint in S.

It is of great interest in computer science to minimize or maximize $|\partial(S)|$.

Conductance of a graph
Let $\phi(S) \overset{\mathrm{def}}{=} \dfrac{w(\partial(S))}{\min(d(S),\, d(V \setminus S))}$, where d(S) is the sum of the weighted degrees of the vertices in S. The conductance of the graph G is $\phi_G \overset{\mathrm{def}}{=} \min_{S \subset V} \phi(S)$.

Iso-perimeter number
Let $i(S) \overset{\mathrm{def}}{=} \dfrac{|\partial(S)|}{\min(|S|,\, |V \setminus S|)}$. The iso-perimeter number of G is $i_G \overset{\mathrm{def}}{=} \min_{S \subset V} i(S)$.

Both quantities are evaluated for one example cut in the sketch below.
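A sketch of these definitions in plain Python, evaluated for a single candidate cut S of the running example graph. The true conductance and iso-perimeter number minimize over all cuts, which is not attempted here:

```python
# Edge list of the running example graph: (u, v, weight).
edges = [(0, 1, 1.0), (0, 4, 1.0), (1, 2, 1.0),
         (1, 3, 1.0), (2, 3, 1.0), (3, 4, 2.0)]
n = 5
V = set(range(n))
S = {0, 1}                                    # one candidate vertex set

# Boundary of S: edges with exactly one endpoint in S.
boundary = [(u, v, w) for u, v, w in edges if (u in S) != (v in S)]
w_boundary = sum(w for _, _, w in boundary)   # total weight crossing the cut

degree = [0.0] * n                            # weighted degrees d(v)
for u, v, w in edges:
    degree[u] += w
    degree[v] += w

d_S = sum(degree[v] for v in S)
d_rest = sum(degree[v] for v in V - S)

phi_S = w_boundary / min(d_S, d_rest)             # conductance of the cut S
i_S = len(boundary) / min(len(S), len(V - S))     # iso-perimetric ratio of S
print(phi_S, i_S)
```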

Fast solution of Linear Equations

Conductance and sparsity:

The Conjugate Gradient Method is fast when the conductance is high!

Elimination is fast when the conductance is low on G and on all of its subgraphs.

PROBLEM: fast solutions when the graph is in between the extremes.

Sparsify the graph. (Both extreme approaches are sketched below.)
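A sketch of the two extremes on the running example, assuming SciPy: direct elimination (a sparse LU factorization) versus Conjugate Gradient, both applied to the Laplacian system grounded at one vertex so the matrix is non-singular. Which approach wins on a large instance depends on the structure discussed above:

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import splu, cg

# Example Laplacian, grounded at vertex 0 (drop its row and column) so the
# remaining matrix is symmetric positive definite.
L = np.array([[ 2., -1.,  0.,  0., -1.],
              [-1.,  3., -1., -1.,  0.],
              [ 0., -1.,  2., -1.,  0.],
              [ 0., -1., -1.,  4., -2.],
              [-1.,  0.,  0., -2.,  3.]])
Lg = csc_matrix(L[1:, 1:])
b = np.array([1.0, 0.0, 0.0, -1.0])   # illustrative right-hand side

x_elim = splu(Lg).solve(b)            # elimination: direct factorization
x_cg, info = cg(Lg, b)                # Conjugate Gradient: iterative
print(np.allclose(x_elim, x_cg, atol=1e-4), info)  # same solution, info == 0
```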

Solving Linear Equations in Laplacians

The Conjugate Gradient Method multiplies the matrix A by some vector at each iteration. The number of iterations depends on the eigenvalues of A, specifically on its condition number. [4]

Preconditioned Iterative Methods: can one find a matrix B such that the preconditioned system BAx = Bb has the same solution but BA has a smaller condition number? We can! B is known as a pre-conditioner. Preconditioners have proved incredibly useful in practice. [5] (A small illustration follows below.)

[4] The ratio of the largest to the smallest eigenvalue.
[5] Gérard Meurant, Computer Solution of Large Linear Systems, North-Holland.
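A sketch of this effect, assuming SciPy. An incomplete-LU factorization stands in for the preconditioner B here purely for illustration (it is not the graph-based preconditioner discussed later); the point is only that supplying M to cg cuts the iteration count on an ill-conditioned Laplacian-like matrix:

```python
import numpy as np
from scipy.sparse import diags, csc_matrix
from scipy.sparse.linalg import cg, spilu, LinearOperator

# Tridiagonal second-difference matrix: the Laplacian of a path graph with
# both endpoints grounded. SPD but badly conditioned for large n.
n = 500
A = csc_matrix(diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)))
b = np.random.randn(n)

def count_iters(M=None):
    it = 0
    def cb(xk):                    # called once per CG iteration
        nonlocal it
        it += 1
    cg(A, b, M=M, callback=cb)
    return it

ilu = spilu(A)                     # stand-in preconditioner, illustration only
M = LinearOperator((n, n), matvec=ilu.solve)

print("plain CG iterations:         ", count_iters())
print("preconditioned CG iterations:", count_iters(M))
```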

Preconditioned Iterative Methods

When A and B are Laplacian matrices of connected graphs, a similar analysis holds. [6]

The relative condition number of $AB^{+}$, written $\kappa(A, B)$, in this case measures how well those graphs approximate each other. $B^{+}$ is the Moore-Penrose pseudo-inverse of B. (A sketch of this quantity follows below.)

[6] The condition number is now the ratio of the largest to the smallest non-zero eigenvalue.
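A sketch of this quantity, assuming NumPy. B is taken here to be the Laplacian of a spanning subgraph of the running example (an illustrative choice only), and the relative condition number is read off the non-zero eigenvalues of $AB^{+}$:

```python
import numpy as np

def laplacian(edge_list, n):
    L = np.zeros((n, n))
    for u, v, w in edge_list:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

edges_G = [(0, 1, 1.0), (0, 4, 1.0), (1, 2, 1.0),
           (1, 3, 1.0), (2, 3, 1.0), (3, 4, 2.0)]
edges_H = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 4, 2.0)]  # spanning subgraph

A = laplacian(edges_G, 5)
B = laplacian(edges_H, 5)

# Non-zero eigenvalues of A B^+; both graphs are connected, so the only zero
# eigenvalue corresponds to the all-ones vector.
vals = np.linalg.eigvals(A @ np.linalg.pinv(B))
nonzero = np.sort(vals.real[np.abs(vals) > 1e-9])
kappa = nonzero[-1] / nonzero[0]        # relative condition number kappa(A, B)
print(nonzero, kappa)
```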

Approximation by Sparse Graph

Sparsification
Sparsification is the process of approximating a given graph G by a sparse graph H. H is said to be an $\epsilon$-approximation of G if $\kappa(L_G, L_H) \le 1 + \epsilon$, where $L_G$ and $L_H$ are the Laplacian matrices of G and H.

G and H are similar in many ways. They have similar eigenvalues, and the effective resistance between every pair of nodes is approximately the same.

Solving a system in the dense matrix then becomes solving one in a sparse matrix, and Conjugate Gradient methods are faster on sparse matrices.

Find an H with O(n) non-zero entries, then solve systems in $L_G$ by using the preconditioned Conjugate Gradient with $L_H$ as the preconditioner, solving the inner systems in $L_H$ by Conjugate Gradient (this two-level scheme is sketched below). Each solve in H takes $O(n^2)$, and for accuracy $\epsilon$ the PCG takes $O(\log(1/\epsilon))$ iterations. The total complexity would be $O((m + n^2)\log(1/\epsilon))$.
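A sketch of the two-level scheme just described, assuming SciPy: outer preconditioned CG on the grounded $L_G$, with a preconditioner whose application is an inner CG solve in the grounded $L_H$. The sparse graph H here is simply a spanning subgraph of the running example; no real sparsifier is constructed:

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def laplacian(edge_list, n):
    L = np.zeros((n, n))
    for u, v, w in edge_list:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

edges_G = [(0, 1, 1.0), (0, 4, 1.0), (1, 2, 1.0),
           (1, 3, 1.0), (2, 3, 1.0), (3, 4, 2.0)]
edges_H = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 4, 2.0)]  # sparse "H"

LG = laplacian(edges_G, 5)[1:, 1:]     # ground vertex 0: SPD for connected graphs
LH = laplacian(edges_H, 5)[1:, 1:]

# Preconditioner: each application performs an inner CG solve in L_H.
def apply_LH_inverse(r):
    y, _ = cg(LH, r)
    return y

M = LinearOperator(LG.shape, matvec=apply_LH_inverse)
b = np.array([1.0, 0.0, 0.0, -1.0])    # illustrative right-hand side

x, info = cg(LG, b, M=M)               # outer preconditioned CG on L_G
print(info, np.allclose(LG @ x, b, atol=1e-4))
```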

More on Sparsification

Does such a good sparsifier exist?

Benczúr and Karger seem to have developed something very similar. [7]

[7] Spielman, Proceedings of ICM 2010.

    Subgraph Preconditioner and Support theory

Vaidya's idea: precondition Laplacian matrices by the Laplacian of one of the graph's own subgraphs. The tools used to analyse such preconditioners are known as support theory. [8]

Vaidya did not publish his results. He built a software package, and his student used the work in his dissertation.

The Laplacian of a maximum spanning tree is used as the preconditioner (sketched below). Lower bounds were not proved, and it is a bit inefficient.

Low-stretch spanning trees were proven to be good preconditioners. A few edges may be added to the maximum spanning tree to improve the preconditioning. [9]

Good preconditioners existed [10], but the challenge was to construct them quickly.

[8] http://www.sandia.gov/~bahendr/support.html
[9] Algebraic Tools for Analyzing Preconditioners, Bruce Hendrickson (with Erik Boman). Invited talk at Preconditioning 2003.
[10] Kolla et al., CoRR.
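A sketch of this starting point, assuming NumPy/SciPy: build a maximum-weight spanning tree of the running example with a small Kruskal routine, and use its grounded Laplacian as the preconditioner for CG on the grounded Laplacian of the full graph. A real implementation would exploit the tree structure for the inner solve; the dense solve here is only illustrative:

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

edges = [(0, 1, 1.0), (0, 4, 1.0), (1, 2, 1.0),
         (1, 3, 1.0), (2, 3, 1.0), (3, 4, 2.0)]
n = 5

def laplacian(edge_list, n):
    L = np.zeros((n, n))
    for u, v, w in edge_list:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

# Maximum-weight spanning tree by Kruskal's algorithm (heaviest edges first).
parent = list(range(n))
def find(a):
    while parent[a] != a:
        parent[a] = parent[parent[a]]
        a = parent[a]
    return a

tree = []
for u, v, w in sorted(edges, key=lambda e: -e[2]):
    ru, rv = find(u), find(v)
    if ru != rv:                        # adding this edge creates no cycle
        parent[ru] = rv
        tree.append((u, v, w))

Lg  = laplacian(edges, n)[1:, 1:]       # grounded Laplacian of G
LTg = laplacian(tree,  n)[1:, 1:]       # grounded Laplacian of the spanning tree

M = LinearOperator(Lg.shape, matvec=lambda r: np.linalg.solve(LTg, r))
b = np.array([1.0, 0.0, 0.0, -1.0])
x, info = cg(Lg, b, M=M)                # tree-preconditioned CG
print(info, np.allclose(Lg @ x, b, atol=1e-4))
```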

Overview: Applications of the Laplacian

Regression on Graphs. A function f is given on a subset of the vertices of G; compute f over the remaining vertices by minimizing x^T L x (see the sketch below).

Spectral Graph Theory. The study of eigenvalues and their relation to graphs. Adiabatic computing!

Interior Point Methods and the Maximum Flow Problem. Equivalent to solving systems of linear equations that can be reduced to restricted Laplacian systems. [11]

Resistor Networks. Measurement of the effective resistance between two vertices.

Partial Differential Equations. The Laplacian matrix of a path graph corresponds to the vibration of a string. FEM solves Laplace's equation using a triangulation with no obtuse angles. [12]

[11] Daniel A. Spielman and Shang-Hua Teng, Smoothed analysis of termination of linear programming algorithms, Mathematical Programming B, 2003.
[12] Gilbert Strang, Introduction to Applied Mathematics, 1986.
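A sketch of the first application, assuming NumPy: minimizing x^T L x with some vertex values held fixed leads to a linear system in the principal submatrix of L indexed by the unknown vertices. The graph is the running five-vertex example and the fixed values are made up for illustration:

```python
import numpy as np

L = np.array([[ 2., -1.,  0.,  0., -1.],
              [-1.,  3., -1., -1.,  0.],
              [ 0., -1.,  2., -1.,  0.],
              [ 0., -1., -1.,  4., -2.],
              [-1.,  0.,  0., -2.,  3.]])

known   = [0, 4]                 # vertices where f is given
unknown = [1, 2, 3]              # vertices where f must be estimated
f_known = np.array([1.0, 0.0])   # illustrative given values

# Minimizing x^T L x with x fixed on `known` gives
#   L[unknown, unknown] x_U = -L[unknown, known] f_known.
L_UU = L[np.ix_(unknown, unknown)]
L_UB = L[np.ix_(unknown, known)]
x_U = np.linalg.solve(L_UU, -L_UB @ f_known)

x = np.zeros(5)
x[known] = f_known
x[unknown] = x_U
print(x)                         # a smooth interpolation of f over the graph
```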

    Conclusion

There is a connection!

Koutis, Miller and Peng [13] made tremendous progress on this problem: they produce ultra-sparsifiers that lead to an algorithm for solving linear systems in Laplacians in time $O(m \log^2 n \, (\log\log n)^2 \log(1/\epsilon))$. This is much faster than any algorithm known to date!

To Do: solve electrical networks using interior point methods. Plug in the Laplacian.

[13] http://arxiv.org/abs/1003.2958v1
