Network Characterisation and Similarity
Richard C. Wilson Dept. of Computer Science
University of York
Outline
1. Brief recap of spectral graph theory
2. Spectral Similarity
3. Random Walks and Differential Equations
4. Quantum Graphs
Notation
Common notation
V is the set of vertices (|V| is the order of the graph)
E is the set of edges (|E| is the size of the graph)
X is an attribute function, mapping vertices and edges onto their attributes

G = (V, E, X)

e = (u, v), u ∈ V, v ∈ V   directed edge
e = {u, v}, u ∈ V, v ∈ V   undirected edge
Graph Spectrum
Matrix Representation
A matrix representation X of a network is a matrix with entries representing the vertices and edges

Adjacency matrix (vertices labelled 1–5):

        [ 0 0 1 1 0 ]
        [ 0 0 1 0 0 ]
    A = [ 1 1 0 1 0 ]
        [ 1 0 1 0 1 ]
        [ 0 0 0 1 0 ]

Degree matrix:

    D = diag(2, 1, 3, 3, 1)
Matrix Representation
The Laplacian is L = D − A
The signless Laplacian is Ls = D + A

For the example graph:

        [  2  0 −1 −1  0 ]
        [  0  1 −1  0  0 ]
    L = [ −1 −1  3 −1  0 ]
        [ −1  0 −1  3 −1 ]
        [  0  0  0 −1  1 ]
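These definitions can be checked directly in numpy; a minimal sketch using the 5-vertex example graph from the slides:

```python
import numpy as np

# Adjacency matrix of the 5-vertex example graph
# (edges 1-3, 1-4, 2-3, 3-4, 4-5; vertices labelled 1..5).
A = np.array([[0, 0, 1, 1, 0],
              [0, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [1, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix: diag(2, 1, 3, 3, 1)
L = D - A                    # Laplacian
Ls = D + A                   # signless Laplacian

# Every row of L sums to zero, so L @ 1 = 0.
row_sums = L.sum(axis=1)
```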
Matrix Representation
Normalized Laplacian: L̂ = D^(−1/2) L D^(−1/2) = I − D^(−1/2) A D^(−1/2)

Entries are

             { 1                  u = v
  L̂(u, v) =  { −1/√(d_u d_v)     (u, v) ∈ E
             { 0                  otherwise
Incidence matrix
The incidence matrix of a graph is a matrix describing the relationship
between vertices and edges
Example: the path on vertices 1, 2, 3 with edges e_{1,2}, e_{2,3} has incidence matrix

        [ 1 0 ]
    M = [ 1 1 ]
        [ 0 1 ]

Relationship to adjacency, Laplacian and signless Laplacian:

    A = M Mᵀ − D
    L = 2D − M Mᵀ
    Ls = M Mᵀ
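These identities can be verified numerically; a short sketch for the 3-vertex path used above:

```python
import numpy as np

# Unoriented incidence matrix of the path 1-2-3
# (rows = vertices 1..3, columns = edges e_{1,2}, e_{2,3}).
M = np.array([[1, 0],
              [1, 1],
              [0, 1]], dtype=float)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))

Ls = M @ M.T         # signless Laplacian, equals D + A
L = 2 * D - M @ M.T  # Laplacian, equals D - A
```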
Matrix Representation
Consider the Laplacian (L) of this network
Clearly if we label the network differently, we get a different matrix
In fact L' = P L Pᵀ represents the same graph for any permutation matrix P of the n labels. For example, swapping labels 4 and 5:

        [  2  0 −1 −1  0 ]               [  2  0 −1  0 −1 ]
        [  0  1 −1  0  0 ]               [  0  1 −1  0  0 ]
    L = [ −1 −1  3 −1  0 ]          L' = [ −1 −1  3  0 −1 ]
        [ −1  0 −1  3 −1 ]               [  0  0  0  1 −1 ]
        [  0  0  0 −1  1 ]               [ −1  0 −1 −1  3 ]
Characterisations
Are two networks the same? (Graph Isomorphism), or is
there a bijection between the vertices such that all the edges
are in correspondence?
An interesting problem in computational complexity theory: its complexity is
unknown, but it is hypothesised to form its own class in the NP
hierarchy (problems at least as hard are called GI-hard)
Graph Automorphism: Isomorphism between a graph and
itself. There is an equivalence between GI and counting
number of GAs
For the disjoint union of two graphs G1 and G2:

    G1 ≅ G2  ⟺  #Aut(G1 ∪ G2) = 2 · #Aut(G1) · #Aut(G2)
Characterisations
An equivalent statement: Two networks are isomorphic iff
there exists a permutation matrix P such that
X should contain all information about the network
– Applies to L, A etc not to D
P is a relabelling; changes the order in which we label the
vertices
Our measurements from a matrix representation should be
invariant under this transformation (similarity transform)
X₂ = P X₁ Pᵀ    (X is a full matrix representation)
Spectral Graph Theory
Properties of the graph from the eigenvalues (eigenvectors) of
a matrix representation of the graph
X = U Λ Uᵀ        symmetric (undirected): U orthogonal, Λ = diag(λ₀, …, λ_{n−1})
X = U_R Λ U_L     non-symmetric: separate right and left eigenvector matrices, U_L = U_R^{−1}
Perron-Frobenius Theorem
Perron-Frobenius Theorem:
If X is an irreducible square matrix with non-negative entries, then there exists an eigenpair (λ,u) such that
Applies to both left and right eigenvector
•Key theorem: if our matrix is non-negative, we can find a principal(largest) eigenvalue which is positive and has a non-negative eigenvector
•Irreducible implies associated digraph is strongly connected
λ ∈ ℝ,   λ ≥ |λᵢ| for all i,   and uⱼ ≥ 0 for all j
Spectrum
The graph has an ordered set of eigenvalues (λ₀, λ₁, …, λ_{n−1}),
ordered by size (I will use smallest first).
The (ordered) set of eigenvalues is called the spectrum of
the graph.
Theorem: The spectrum is unchanged by the relabelling
transform
Corollary: If two graphs are isomorphic, they have the
same spectrum
This does not solve the isomorphism problem, as two
different graphs may have the same spectrum
X₂ = P X₁ Pᵀ  ⟹  Λ₂ = Λ₁
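The invariance of the spectrum under relabelling is easy to confirm numerically; a minimal sketch with a random graph and a random permutation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Laplacian of a random undirected graph on 6 vertices.
A = rng.integers(0, 2, size=(6, 6)).astype(float)
A = np.triu(A, 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

# Random permutation matrix P; L2 is a relabelling of the same graph.
P = np.eye(6)[rng.permutation(6)]
L2 = P @ L @ P.T

spec1 = np.sort(np.linalg.eigvalsh(L))
spec2 = np.sort(np.linalg.eigvalsh(L2))
```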
Spectrum
These two graphs have the same spectrum using the
Laplacian representation
This is a cospectral pair. Necessary but not sufficient to
determine isomorphism.
The matrix representation we use has a big effect on how
many of these cospectral graphs there are
Both graphs have Laplacian spectrum: 5.24, 3 (multiplicity 2), 2, 0.76
Cospectral graphs
How many such graphs are there and how does it depend
on representation? (Zhu & Wilson 2008)
* 50 trillion graphs of size 13
Cospectrality
Open problem: Is there a representation in which nearly all
graphs are determined by the spectrum (non-cospectral)?
Answer for trees: no, nearly all trees are cospectral with some other tree
In practice cospectrality is not a problem, since two randomly
selected graphs have a tiny chance of being cospectral
If we pick graphs from a specialised family it may be a
problem, for example regular and strongly regular graphs
Spectrum of A
Spectrum of A: Positive and negative eigenvalues
Bipartite graph: If λ is an eigenvalue, then so is –λ, Sp(A)
symmetric around 0
Perron-Frobenius Theorem (A is non-negative): λ_{n−1} is the
largest-magnitude eigenvalue, and the corresponding eigenvector
x_{n−1} is non-negative

−λ_{n−1} ≤ λ₀ ≤ … ≤ λ_{n−1}

d̄ ≤ λ_{n−1} ≤ d_max
Spectrum of L
Spectrum of L: L positive semi-definite
There always exists an eigenvector 1 with eigenvalue 0,
because of the zero row-sums
The number of zeros in the spectrum is the number of
connected components of the graph

0 = λ₀ ≤ λ₁ ≤ … ≤ λ_{n−1} ≤ n

Σᵢ λᵢ = 2|E|
Spectrum of L
A spanning tree of a graph is a tree containing only edges
in the graph and all the vertices
Example
Kirchhoff’s theorem
The number of spanning trees of a graph is

t(G) = (1/n) ∏_{i=1}^{n−1} λᵢ
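Kirchhoff's theorem can be checked on a small example; a sketch for the 4-cycle, which has exactly 4 spanning trees:

```python
import numpy as np

# Laplacian of the 4-cycle; a cycle on n vertices has exactly n spanning trees.
n = 4
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
L = np.diag(A.sum(axis=1)) - A

# Kirchhoff: t(G) = (1/n) * product of the non-zero Laplacian eigenvalues.
eigs = np.sort(np.linalg.eigvalsh(L))
n_trees = np.prod(eigs[1:]) / n
```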
Spectrum of normalised L
Spectrum of L̂: positive semi-definite

0 = λ₀ ≤ λᵢ ≤ 2,    Σᵢ λᵢ ≤ |V|

As with the Laplacian, the number of zeros in the spectrum is the
number of connected components of the graph.
An eigenvector with eigenvalue 0 exists with entries
(√d₁, √d₂, …, √dₙ)ᵀ — ‘scale invariance’
References
Spectra of Graphs, Brouwer & Haemers, Springer
Graph Spectra for Complex Networks, Van Mieghem,
Cambridge University Press
Spectral Graph Theory, Fan Chung, American
Mathematical Society
Coding Attributes
So far, we have considered edges only as present or absent
{0,1}. If we have more edge information, we can encode it in a
variety of ways: edges can be weighted to encode attributes, and
diagonal entries can encode vertex attributes

        [ 0   0   1   1    0   ]
        [ 0   0   1   0    0   ]
    A = [ 1   1   0   1    0   ]
        [ 1   0   1   0.6  0.2 ]
        [ 0   0   0   0.2  0.4 ]

(edge 4–5 weighted 0.2; vertex attributes 0.6 and 0.4 on the diagonal)
Coding Attributes
• Note: when using the Laplacian, add the diagonal elements after forming L
• Label attributes: code labels into [0,1]
• Example: chemical structures

Edges:    single (─) 0.5, double (═) 1.0, aromatic 0.75
Vertices: C 0.7, N 0.8, O 0.9
Coding Attributes
Spectral theory works equally well for complex matrices
Matrix entry is x+iy so can encode two independent
attributes per entry, x and y. Symmetric matrix becomes
Hermitian matrix
Eigenvalues real, eigenvectors complex
A = A†    (Hermitian: a_vu = a̅_uv)

For example, off-diagonal entries such as 0.5 + 0.3i and 0.1 + 0.2i,
paired with their conjugates across the diagonal
Coding Attributes
Example: Shape skeletons
The shock graph has vertices where shocks meet, and edges
with lengths l and angles θ
Encode as a complex weight:

A_ij = l_ij e^{iθ_ij} = l_ij cos θ_ij + i l_ij sin θ_ij

Naturally Hermitian, as θ_ji = −θ_ij and l_ji = l_ij
Similarity
Similarity of Networks
How can we measure the similarity of two networks?
Key idea: Graph Edit Distance (GED) uses edit operations
– Vertex insertion, deletion
– Edge insertion, deletion
– Relabelling a vertex
Associate a cost with each operation. Find a sequence of
edit operations which transforms one network into the
other. The minimum possible cost of a sequence is the
graph edit distance.
NP-hard, so in practice we cannot compute it exactly except for small graphs
GED - example
Edge deletion (cost ed) → vertex deletion (cost vd) →
edge insertion (cost ei) → vertex relabel (cost vl)

The sequence of edit operations taking G1 to G2 is an edit path E, with cost

c(E) = ed + vd + ei + vl

GED(G1, G2) = min_E c(E)
Graph similarity
The simplest form of GED assigns zero cost to vertex
operations and relabelling
It is then equivalent to Maximum Common Subgraph [Bunke,
PAMI 1999]
Since we cannot compute GED exactly, we generally resort to
approximate methods: either compute matches or compare
features. If we can get good features, we can use them to
compare graphs
Eigenperturbation
Let Y = X + N, where X and Y are two similar networks and N is small

The change in each eigenvalue is bounded by the difference between X and Y:

λ_k − ‖N‖ ≤ λ'_k ≤ λ_k + ‖N‖,    ‖N‖₂ ≤ ‖N‖_F

To first order:

λ'_k = λ_k + u_kᵀ N u_k

u'_k = u_k + Σ_{j≠k} (u_jᵀ N u_k)/(λ_k − λ_j) u_j
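The eigenvalue bound (Weyl's inequality) is easy to check numerically; a sketch with a random symmetric matrix and small symmetric perturbation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetric "network" matrix X and a small symmetric perturbation N.
X = rng.standard_normal((8, 8))
X = (X + X.T) / 2
N = 0.01 * rng.standard_normal((8, 8))
N = (N + N.T) / 2
Y = X + N

lam_x = np.sort(np.linalg.eigvalsh(X))
lam_y = np.sort(np.linalg.eigvalsh(Y))

# Weyl: each eigenvalue moves by at most ||N||_2, which is <= ||N||_F.
max_shift = np.max(np.abs(lam_y - lam_x))
spectral_norm = np.linalg.norm(N, 2)
frob_norm = np.linalg.norm(N, 'fro')
```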
Spectral Similarity
How good is the spectrum for similarity comparisons?
[Zhu, Wilson 2008]
Theorem: The eigenvector components are permuted by the
relabelling transform
The columns of U are ordered by the eigenvalues, but the
rows still depend on the labelling
X₂ = P X₁ Pᵀ
X₂ = P U₁ Λ U₁ᵀ Pᵀ = (P U₁) Λ (P U₁)ᵀ
⟹ U₂ = P U₁
T
Non-uniqueness of U
The eigenvectors of a graph are not unique. They have a
sign ambiguity, if u is an eigenvector, so is –u
Need to apply sign correction
Repeated eigenvalues make the situation worse: if (λ, u₁)
and (λ, u₂) are eigenpairs, then so is (λ, au₁ + bu₂)

X(−u) = −Xu = −λu = λ(−u)
X(au₁ + bu₂) = aXu₁ + bXu₂ = λ(au₁ + bu₂)
Spectral Features
Theorem:
All graphs which have simple spectra can be distinguished
from each other in polynomial time
A simple spectrum means that there are no repeated eigenvalues in the
spectrum, hence the eigendecomposition is unique (up to signs).
Then we can order the components of the eigenvectors in polynomial
time (for example by sorting)
Open Problem: Repeated eigenvalues, difficult
graphs for isomorphism and labelling ambiguity
are all connected in a way not yet understood
Random walks, Diffusions and Differential Equations
Random Walks
•Spectral features are not tightly coupled to structure
•Can we explore the structure of the network?
•A random walker travels between vertices by choosing an
edge at random
•At each time step, a step is taken down an edge
Discrete Time Random Walk
Imagine that we are standing at vertex u_i
At each time step, we choose one of the available edges with equal
probability
Then the probability of arriving at vertex u_j is

P(u_j | u_i) = A_ij / d_i   if (u_i, u_j) ∈ E,   0 otherwise

Therefore, at the next time step, the distribution is

P_{t+1}(u_j) = Σᵢ P(u_j | u_i) P_t(u_i) = Σᵢ (A_ij / d_i) P_t(u_i)
Discrete Time Random Walk
In matrix form, π_{t+1} = π_t D^{−1} A = π_t T, where
T = D^{−1} A is the transition matrix of the walk, a stochastic matrix
(rows sum to 1). The Perron-Frobenius theorem implies its largest
magnitude eigenvalue is 1.
If we start in state π₀, then at time t

π_t = π₀ Tᵗ
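A minimal sketch of iterating this walk on the 5-vertex example graph, checking that T is row-stochastic and that probability is conserved:

```python
import numpy as np

# Transition matrix T = D^{-1} A for the 5-vertex example graph.
A = np.array([[0, 0, 1, 1, 0],
              [0, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [1, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
T = A / d[:, None]

# Start at vertex 1 and step the distribution: pi_{t+1} = pi_t T.
pi = np.array([1.0, 0, 0, 0, 0])
for _ in range(3):
    pi = pi @ T

row_sums = T.sum(axis=1)   # T is stochastic: every row sums to 1
```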
Discrete Time Random Walk
What happens after a very long time?

π_s = lim_{t→∞} π₀ Tᵗ

Writing T = U_R Λ U_L (with U_L = U_R^{−1}),

Tᵗ = U_R Λᵗ U_L

λᵗ → 0 as t → ∞ when |λ| < 1, while 1ᵗ = 1, so only the
eigenvalue-1 term survives:

π_s = (π₀ · u_{R,n−1}) u_{L,n−1}
Discrete Time Random Walks
After a very long time, the walk becomes stationary: only the
largest (left) eigenvector of T survives

π = πT

This is the principal eigenvector of T (with λ = 1) and is easy
to solve; it is

π_i = d_i / 2|E|

After a long time, we are at each node with a probability
proportional to its degree. It is natural to think of this
probability as a measure of centrality. In this situation,
eigenvector centrality (of T) coincides with degree centrality
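A sketch confirming the stationary distribution on the (non-bipartite) 5-vertex example: running the walk from a uniform start converges to π_i = d_i / 2|E|:

```python
import numpy as np

A = np.array([[0, 0, 1, 1, 0],
              [0, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [1, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
T = A / d[:, None]

# Predicted stationary distribution: degree over twice the edge count.
pi_pred = d / d.sum()

# Iterate the walk from a uniform start until (approximate) convergence.
pi = np.full(5, 0.2)
for _ in range(500):
    pi = pi @ T
```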
PageRank
One important application for random walks is for the web
More central pages are more important
Idea:
Surfer clicks links to new pages at random
May also quit and start fresh at a random page (‘teleporting’)
Importance of page is prob of ending up there
Links are directed, but this makes no difference to the formulation

T = (1 − α) D^{−1} A + (α/n) J

J is the matrix of all-ones (teleportation transitions);
α is the probability of starting over
The eigenvector centrality of T is the PageRank (Google) of each
page
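A minimal power-iteration sketch of this construction on a small illustrative graph (the graph and α here are arbitrary choices, and the links are treated as undirected for simplicity):

```python
import numpy as np

# Small example graph; vertex 3 (index 2) has the highest degree.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
n = A.shape[0]
alpha = 0.15
d = A.sum(axis=1)

# T = (1 - alpha) D^{-1} A + (alpha / n) J
T = (1 - alpha) * (A / d[:, None]) + (alpha / n) * np.ones((n, n))

# Power iteration for the principal left eigenvector (the PageRank vector).
pi = np.full(n, 1.0 / n)
for _ in range(200):
    pi = pi @ T
pi = pi / pi.sum()
```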
Backtrackless Walks
The random walk suffers from the problem of tottering
This reduces expressive power and masks structural differences.
An alternative is to use a backtrackless walk where reverse steps
are not allowed.
We can treat backtrackless walks in much the same way by using a
graph transformation.
Backtrackless walks
•(Oriented) Line graph:
[Figure: a 4-vertex example graph, its line graph, and its oriented line graph, with directed edges e_uv]

Line graph (LG): random walk. Oriented line graph (OLG): random walk with no backtracking
Backtrackless walk
The adjacency matrix of the OLG is
A random walk using this transition matrix is a
backtrackless walk on the original graph.
The size of Q is ~|E|², so potentially very large, but sparse.
The mathematical structure allows us to find efficient ways
of evaluating properties of the walk.
[Backtrackless Walks on a Graph, Aziz, Wilson, Hancock 2012, IEEE
TNNLS]
Q[(a,b),(c,d)] = 1  if (a,b), (c,d) ∈ E, b = c and a ≠ d;   0 otherwise
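Building Q directly from the arc list is straightforward; a sketch for a triangle, where every arc has exactly one non-backtracking continuation:

```python
import numpy as np

# Each undirected edge {u, v} gives two arcs (u, v) and (v, u).
edges = [(1, 2), (2, 3), (3, 1)]           # a triangle
arcs = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]

m = len(arcs)
Q = np.zeros((m, m))
for i, (a, b) in enumerate(arcs):
    for j, (c, d) in enumerate(arcs):
        # Arc (a,b) can be followed by (c,d) iff b == c, excluding
        # immediate backtracking, i.e. (c,d) must not be (b,a).
        if b == c and a != d:
            Q[i, j] = 1
```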
Differential Equations
Differential Equations on Graphs
A whole host of important physical processes can be
described by differential equations:

Diffusion, or heat flow:   ∂p/∂t = ∇²p
Wave propagation:          ∂²p/∂t² = ∇²p
Schrödinger equation:      iħ ∂p/∂t = −(ħ²/2m) ∇²p + Vp
Laplacian
∇² is the Laplacian differential operator. In Euclidean space

∇² = ∂²/∂x² + ∂²/∂y² + ∂²/∂z²

(different in non-flat spaces)

Take a 1D discrete version of this, where i−1, i, i+1 denote
neighbouring points x_{i−1}, x_i, x_{i+1}:

∂²p/∂x² ≈ [p(x_{i+1}) − p(x_i)] − [p(x_i) − p(x_{i−1})]
        = p(x_{i+1}) − 2p(x_i) + p(x_{i−1})
Laplacian
Consider a graph which encodes the neighbourhood structure:
vertices i−1, i, i+1 on a path. The Laplacian of this graph is

        [  1 −1  0 ]
    L = [ −1  2 −1 ]
        [  0 −1  1 ]

Apply L to a vector p (a ‘function’ taking values on the vertices):

(Lp)_i = −p_{i−1} + 2p_i − p_{i+1} = −∇²p_i

So the graph Laplacian is a discrete representation of the calculus Laplacian
– Vertices are points in space
– Edges represent the neighbourhood structure of space
– Note the minus sign!
Diffusion
On a network, we identify the Laplacian operator ∇² with
minus the Laplacian of the network: ∇² ↔ −L
Discrete space, continuous time diffusion process:

∂p/∂t = −Lp
Heat Kernel
Solution: p(t) = H(t) p(0), with heat kernel

H(t) = exp(−Lt)

Check: ∂p/∂t = ∂H(t)/∂t p(0) = −L exp(−Lt) p(0) = −L p(t)

H_ij(t) describes the amount of heat flow from vertex i to j at
time t
Essentially another matrix representation, but we can vary time
to get different representations
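The heat kernel can be computed from the eigendecomposition of L; a sketch for a triangle graph, checking that H(0) = I and that heat is conserved (rows sum to 1, since L1 = 0):

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)   # triangle
L = np.diag(A.sum(axis=1)) - A

lam, U = np.linalg.eigh(L)

def heat_kernel(t):
    # H(t) = U exp(-Lambda t) U^T
    return U @ np.diag(np.exp(-lam * t)) @ U.T

H0 = heat_kernel(0.0)       # at t = 0 nothing has diffused: H(0) = I
H1 = heat_kernel(1.0)
row_sums = H1.sum(axis=1)   # heat is conserved along rows
```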
Diffusion as continuous time random walk
Consider the following walk on a k-regular graph. At each time step:
– stay at the same vertex with probability (1 − s)
– move with probability s to an adjacent vertex chosen uniformly at random
This is called a lazy random walk. Its transition matrix is

T = (1 − s)I + (s/k)A = I − (s/k)(kI − A) = I − (s/k)L
Diffusion as continuous time random walk
Let s be a time-step, so n = t/s is the number of steps needed to
reach time t. Then

Tⁿ = (I − (s/k)L)ⁿ = (I − (t/(nk))L)ⁿ

lim_{n→∞} Tⁿ = lim_{n→∞} (I − (t/(nk))L)ⁿ = exp(−tL/k)

so the lazy walk converges to the diffusion process (up to the scaling by k)
Spectral representation:

H(t) = exp(−Lt) = U exp(−Λt) Uᵀ = Σᵢ e^{−λᵢt} uᵢuᵢᵀ

Small times:

H(t) ≈ I − Lt + (t²/2)L² − …

Large times: only the smallest eigenvalues survive, λ₁ = 0 and λ₂:

H(t) ≈ (1/n)J + e^{−λ₂t} u₂u₂ᵀ

Behaves like the Fiedler vector (spectral cut)
Heat Kernel Signature
In all but some special cases, the heat kernel can be
reconstructed from what happens at the vertices only (the
self-heat) H(u,u)
Sun et al used the diagonal elements of the heat kernel to
characterise 3D object meshes
[A Concise and Provably Informative Multi-Scale Signature Based on Heat
Diffusion, Sun, Ovsjanikov, Guibas, Comput. Graph. Forum, 2009]
Describes a particular vertex (for matching) by its heat content at
various times, carefully chosen to be informative about the
diffusion process:

HKS(x) = [H_{t₀}(x,x), H_{t₁}(x,x), H_{t₂}(x,x), …]
A global version describing the whole network can be
constructed by histogramming over the vertices
GHKS(G) = [hist(H_{t₀}(u₁,u₁), …, H_{t₀}(uₙ,uₙ)),
           hist(H_{t₁}(u₁,u₁), …, H_{t₁}(uₙ,uₙ)), …]
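A short sketch of both signatures on a small illustrative graph (the graph, the time samples, and the histogram binning are arbitrary choices here):

```python
import numpy as np

# Example graph: edges 1-2, 2-3, 2-4, 3-4.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

times = [0.1, 1.0, 10.0]

def hks(x):
    # Self-heat H_t(x, x) = sum_k exp(-lam_k t) * u_k(x)^2
    return [float(np.sum(np.exp(-lam * t) * U[x, :] ** 2)) for t in times]

signatures = np.array([hks(x) for x in range(4)])

# Global signature: histogram the self-heat over vertices, one per time.
ghks = [np.histogram(signatures[:, i], bins=4, range=(0, 1))[0]
        for i in range(len(times))]
```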
Example: Financial network
Analysis of stock closing prices over a long period of time.
Correlation analysis reveals connections between stocks.
Threshold to get a complex network.
Analysis of the structure of the network through graph
signatures reveals interesting features.
Centrality
We can use the heat kernel to define another node centrality
measure. Consider the normalised adjacency matrix as a
weighted graph (with weights 1/√(d_u d_v) on the edges):

Â = D^{−1/2} A D^{−1/2}

The weighted sum of all paths of length k between two
vertices u and v is then given by (Âᵏ)_uv
Subgraph Centrality
The total communication between vertices is a sum over paths of
all lengths:

C_uv = Σ_{k≥0} α_k (Âᵏ)_uv

α_k allows us to control the weight of longer paths vs shorter.
What should α_k be? The number of possible paths grows
factorially with k, so longer paths should be weighted less:

C_uv = Σ_{k≥0} (tᵏ/k!) (Âᵏ)_uv
Subgraph centrality
Subgraph centrality (Estrada, Rodríguez-Velázquez 2005):
centrality is the ability of a vertex to communicate with the others

s_u = Σ_v C_uv = Σ_v Σ_k (tᵏ/k!) (Âᵏ)_uv

Relationship to the heat kernel: since L̂ = I − Â,

H(t) = exp(−tL̂) = e^{−t} exp(tÂ) = e^{−t} Σ_k (tᵏ/k!) Âᵏ

so s = eᵗ H(t) 1

Subgraph centrality generally uses A rather than Â, but the
results coincide exactly for regular graphs
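The identity exp(tÂ) = eᵗ exp(−tL̂) can be checked numerically; a sketch on a triangle graph (the value of t is arbitrary):

```python
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))   # D^{-1/2} A D^{-1/2}
L_hat = np.eye(3) - A_hat             # normalised Laplacian

t = 1.5

# Communicability matrix: sum_k (t^k / k!) A_hat^k = exp(t * A_hat).
lam, U = np.linalg.eigh(A_hat)
C = U @ np.diag(np.exp(t * lam)) @ U.T

# Heat kernel of the normalised Laplacian: exp(-t * L_hat).
mu, V = np.linalg.eigh(L_hat)
H = V @ np.diag(np.exp(-t * mu)) @ V.T

# Since L_hat = I - A_hat and I commutes with everything,
# exp(t * A_hat) = e^t * exp(-t * L_hat).
```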
Wave propagation
Wave equation in discrete space, continuous time:

∂²p/∂t² = ∇²p  →  ∂²p/∂t² = −Lp

The wave kernel (analogous to the heat kernel) is

W(t) = exp(i√L t)
Features of wave kernel
No dissipation: waves continue travelling and there is no steady
state at large times

W(t) = exp(i√L t) = U exp(i√Λ t) Uᵀ

The basic waves transmitted are the eigenvectors of L, and the
frequencies are √λ
What happens at small times t = δ?

W(δ) ≈ I + i√L δ − (δ²/2)L

All vertices are reached with small amplitude at any non-zero
time (infinite propagation speed)
Wave Kernel Signature
We can define a wave kernel signature analogous to the
heat kernel signature:

WKS(u, e) = Σ_k φ_k(u)² exp(−(e − log λ_k)²/(2σ²))

A global signature of the graph can again be constructed
with histograms.
[The wave kernel signature: A quantum mechanical approach to shape
analysis, Aubry, Schlickewei, Cremers, ICCV Workshops, 2011]
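A sketch of this style of descriptor on a 4-cycle (the bandwidth σ and the sampled energy values e are arbitrary choices; the zero eigenvalue is skipped since its log is undefined):

```python
import numpy as np

# 4-cycle; Laplacian eigenvalues are 0, 2, 2, 4.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)

pos = lam > 1e-10            # keep only non-zero eigenvalues
log_lam = np.log(lam[pos])
sigma = 0.5                  # Gaussian bandwidth in log-eigenvalue space

def wks(u, e):
    # Eigenvector energy at u, weighted by a Gaussian window around e.
    w = np.exp(-(e - log_lam) ** 2 / (2 * sigma ** 2))
    return float(np.sum(U[u, pos] ** 2 * w))

signature = [wks(0, e) for e in np.linspace(log_lam.min(), log_lam.max(), 5)]
```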
Example: PPIs
PPIs (protein-protein interaction networks) are often modelled
with a geometric model of complex networks
Biogrid human PPI:
|V| = 1923, |E| = 3866, density 0.00209, ⟨k⟩ = 4.02
What model fits this network? Use Monte Carlo sampling
of the model and examine similarities between samples, and
between the samples and the PPI
Example: PPIs
Wave kernel signature similarity, PPI-model and model-model,
for the GEO3D and configurational models
[Graph Signatures for Evaluating Network Models, Wilson, ICPR 2014]
Quantum Graphs
Quantum Graph
Free space Schrödinger equation:

iħ ∂p/∂t = −(ħ²/2m) ∇²p
[Figure: from free space with boundary conditions at infinity, through
increasingly constrained domains, to a quantum graph — 1D pieces of space
joined at vertices]
Graph Calculus
Key Idea: Geometric graph
Now the graph lives in two spaces
– Node-space V: point-like measure
– Edge-space E: intervals with lengths (Lebesgue measure)
Graph functions exist on vertices and on edges
Called a quantum graph or metric graph
Graph Laplacian
To solve problems like wave propagation on these graphs,
we need to know the Laplacian. The edge part is like a
normal 1D Laplacian along the edges
The vertex part is essentially the same as a type of
discrete Laplacian, except that it is adjacent to the
edges where they meet, not other vertices
Δ_E f = −d²f/dx²   along each edge

(Δ_V f)(u) combines the values and gradients of f on the edges
incident on u, in the same way that a discrete Laplacian at u
combines the adjacent vertices
Vertex- and Edge-functions
A function f on the graph exists both at vertices and along
the edge intervals.
If Δ_E f = 0, the function is said to be vertex-based;
similarly Δ_V f = 0 gives an edge-based function
Δ_E f = 0 implies the function is linear on edges
In this case we have
– f exists on vertices
– f is linear between vertices
– Continuity means f is fully defined by the vertex values
– The Laplacian is a discrete Laplacian
• The new and old frameworks coincide when the edge
functions are linear
Edge-based functions
Edge-based functions are more interesting and complex.
Depend on boundary conditions imposed at the vertices.
The most natural and interesting case is Neumann conditions:
the sum of the inward-pointing gradients must be zero at each
vertex, and f must be continuous. Given these conditions, what
are the eigenvalues and eigenfunctions of the graph Laplacian?
Σ_{e ∋ u} ∂f_e/∂x (u) = 0    at every vertex u
Eigenvalues
Assumption: all edge lengths are constant and equal. On each
edge, f must satisfy

d²f/dx² = −ω² f

so

f(e, x_e) = C(e) cos(ω x_e + B(e))

with B and C to be determined from the Neumann boundary
conditions
Edge-based Eigenfunctions
Let T = D^{−1}A be the row-normalised adjacency (transition)
matrix, where A is the adjacency matrix of the (combinatorial) graph.

If λ is an eigenvalue of T and λ ≠ ±1, then ω², with cos ω = λ,
is an eigenvalue of Δ_E.

B(e) and C(e) are determined by the corresponding eigenvector.
In particular, these eigenfunctions take the value of the
eigenvector on the vertices (vertex-supported).
They behave like eigenvectors of T (related to the random walk).
Edge-interior Eigenfunctions
The remaining eigenfunctions are edge-interior (zero on all
vertices) and have frequencies ω = π or 2π. There are two types:

• ω = π: the gradients must sum to zero at each vertex (symmetric)
• ω = 2π: the directed gradients must sum to zero, accounting for
the direction of the edge (antisymmetric)

B(e) = π/2, and C(e) must be determined from the conditions:

f(e, x_e) = C(e) cos(ω x_e + π/2)
Edge-interior Eigenfunctions
Theorem: Let Q be the adjacency matrix of the OLG. Then
the eigenvectors s with eigenvalue -1 of Q have
components equal to the values of C(e) for the symmetric
eigenfunctions. The eigenvectors with eigenvalue +1 have
components equal to C(e) in the antisymmetric case.
The structure of these eigenfunctions is determined by
certain eigenvectors of Q (related to the backtrackless
walk)
Eigenfunction summary
Two sorts of eigenfunction of the edge-based Laplacian
Vertex supported
– Determined by the eigensystem of the row-normalized adjacency
matrix, or equivalently the structure of the random walk
Edge interior
– Determined by selected eigenvectors of the OLG, i.e. the structure
of the backtrackless random walk
The edge-based Laplacian has a rich structure from both
types of walk
Applications
Wave equation:

∂²f/∂t² = −Δf

With the discrete Laplacian L, a signal is received
instantly (no ‘length’ between vertices), but the
edge-based Laplacian has a finite speed of
transmission
Wave Equation
A better model of transmission in networks?
[Figure: evolution of a Gaussian wave packet on a graph with 5 vertices and 7 edges]
Wave packet signature
Local signature for an edge, sampled at different times:

WPS(u) = [u(t₀), u(t₁), u(t₂), …, u(tₙ)]

The global signature histograms over the edges:

WPS(G) = hist[WPS₁, WPS₂, WPS₃, …, WPS_{|E|}]
Results on unweighted graphs (truncated-Laplacian wave packet signature):

Method                  Delaunay Triangulations   Gabriel Graphs   RN Graphs
Wave Packet Signature   0.9965                    0.9511           0.8235
Random Walk Kernel      0.9526                    0.9115           0.8197
Ihara Coefficients      0.9864                    0.8574           0.7541
References
Eigenfunctions of the edge-based Laplacian on a graph, Wilson, Aziz,
Hancock, Linear Algebra and its Applications, 2013
Edge-based Operators for Graph Characterization, Aziz, PhD thesis,
University of York, 2014. http://etheses.whiterose.ac.uk/6218/
Graph characterization using Gaussian wave packet signature, Aziz,
Wilson, Hancock, LNCS, 2013
Wave equations on graphs and the edge-based Laplacian, Friedman,
Tillich, Pacific Journal of Mathematics, 2004
Extra slides
General Solution of the Wave Equation
• With edge coordinate χ and time t, the edge-based wave equation is

∂²u/∂t² = −Δ_E u

• Seek separable solutions of the form u(e, χ, t) = φ(e, χ) g(t),
with φ the edge-based eigenfunctions

φ_n(e, χ) = C(e, n) cos(ω_n χ + B(e, n))

• This gives a temporal solution

g_n(t) = C_n cos(ω_n t) + B_n sin(ω_n t)

• By superposition, we obtain the general solution

u(e, χ, t) = Σ_n cos(ω_n χ + B(e, n)) [C_n cos(ω_n t) + B_n sin(ω_n t)]
Initial Conditions
• Since the wave equation is a second-order partial differential
equation, we can impose initial conditions on both position and speed:

u(e, χ, 0) = p(e, χ),    ∂u/∂t (e, χ, 0) = q(e, χ)

We can find the coefficients using the orthonormality of the
eigenfunctions: the cosine amplitudes come from projections
F_n(e) of the initial position p onto the eigenfunctions,

F_n(e) = ∫ p(e, χ) cos(ω_n χ + B(e, n)) dχ

and the sine amplitudes from the corresponding projections G_n(e)
of the initial speed q, scaled by 1/ω_n.
Gaussian Wave Packet
• We take the initial position to be a Gaussian wave packet

p(e, x) = exp{−a(x − μ)²}

on one particular edge and zero everywhere else.

• The projection integrals are then Gaussian integrals and can be
evaluated in closed form, giving mode amplitudes proportional to

(1/2)√(π/a) e^{−ω_n²/(4a)}

modulated by cos and sin terms in B(e, n): the amplitude of each
mode decays as a Gaussian in its frequency ω_n.
General Solution of the Wave Equation
In the general solution, the sum runs over Ω_a, the set of
vertex-supported eigenvalues, and Ω_b and Ω_c, the sets of
edge-interior eigenvalues (frequencies π and 2π). The figures show
the solution of the wave equation with a Gaussian wave packet as
the initial condition.