On 2D Inverse Problems/Special matrices

An important object containing information about a weighted graph G(V,E,w) is its Laplacian matrix. Given a numbering of the vertices of the graph, it is the n-by-n square matrix L_G, where n is the number of vertices of the graph, with entries:

l_{kl} :=
\begin{cases}
\sum_{m:\, v_k \rightarrow v_m} w_{km} & \mbox{if } k = l, \\
-w_{kl} & \mbox{if } v_k \rightarrow v_l, \\
0 & \mbox{otherwise,}
\end{cases}

where v_k → v_l means that there is a directed edge from the vertex v_k to the vertex v_l, and w is the weight function.
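As a quick illustration (not from the text), the following Python sketch builds L_G for a small, hypothetical weighted directed graph directly from the entry definition above; note that each row of the result sums to zero.

import numpy as np

def laplacian(n, edges):
    """Laplacian of a weighted directed graph; edges maps (k, l) to w_kl."""
    L = np.zeros((n, n))
    for (k, l), w in edges.items():
        L[k, l] -= w      # off-diagonal entry -w_kl for the edge v_k -> v_l
        L[k, k] += w      # diagonal entry: sum of weights of edges leaving v_k
    return L

# hypothetical 4-vertex example with a directed cycle 0 -> 1 -> 2 -> 0
edges = {(0, 1): 2.0, (1, 2): 1.0, (2, 0): 3.0, (2, 3): 0.5}
print(laplacian(4, edges))   # each row sums to zero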

Exercise (*). Given a directed graph G without cycles, prove that one can number its vertices so that the corresponding Laplacian matrix L_G is triangular.
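One natural numbering is a topological order: every edge then points from a lower index to a higher one, so the nonzero off-diagonal entries of L_G all lie above the diagonal. A minimal sketch of this, using the same hypothetical edge-dictionary representation as above:

import numpy as np
from graphlib import TopologicalSorter

def laplacian(n, edges):
    L = np.zeros((n, n))
    for (k, l), w in edges.items():
        L[k, l] -= w
        L[k, k] += w
    return L

# a small weighted DAG whose original labelling does not give a triangular matrix
edges = {(3, 0): 1.0, (3, 1): 2.0, (0, 2): 1.5, (1, 2): 0.5}
pred = {v: [] for v in range(4)}
for (k, l) in edges:
    pred[l].append(k)

order = list(TopologicalSorter(pred).static_order())  # sources come first
new = {v: i for i, v in enumerate(order)}             # topological renumbering
L = laplacian(4, {(new[k], new[l]): w for (k, l), w in edges.items()})
assert np.allclose(L, np.triu(L))                     # upper triangular under this numbering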

Given a weighted graph with boundary, it is often convenient to number its boundary vertices first and to write its Laplacian matrix in block form:


L_G = 
\begin{pmatrix}
A & B \\
C & D
\end{pmatrix}

The Schur complement of the matrix L_G with respect to the invertible block D is the matrix L_G/D = A - BD^{-1}C.

Exercise (*). Prove the following determinant identity: \det L_G = \det(L_G/D)\det D.
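A quick numerical sanity check of the identity (a sketch, not a proof): the identity holds for any block matrix with an invertible block D, and a generic random matrix avoids the degenerate case of a connected-graph Laplacian, for which both sides are zero.

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A, B = M[:2, :2], M[:2, 2:]
C, D = M[2:, :2], M[2:, 2:]

schur = A - B @ np.linalg.inv(D) @ C            # the Schur complement M/D
assert np.isclose(np.linalg.det(M),
                  np.linalg.det(schur) * np.linalg.det(D))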

The following matrix W(G), consisting of random walk exit probabilities (sums over weighted paths in a graph), plays an important role as a boundary measurement for inverse problems. Suppose a weighted graph G has N boundary nodes; then the kl'th entry of the N-by-N matrix equals the probability that the first boundary vertex occupied by a particle starting its random walk at the boundary vertex v_k is the boundary vertex v_l. For a finite connected graph the rows of the matrix W(G) add up to 1.

Exercise (**). Derive an explicit formula for the matrix W_G in terms of the blocks of the Laplacian matrix L_G of the graph G, where D_A denotes the diagonal part of the block A: W_G = I - D_A^{-1}(A - BD^{-1}C).
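A hedged sketch of this formula in action, on a hypothetical 3-vertex network with boundary vertices 0, 1, interior vertex 2, and conductivities w(0,2) = 1 and w(1,2) = 2. A walk started at vertex 0 must step to 2 and then exits at 0 or 1 with odds 1 : 2, so the first row of W_G should be (1/3, 2/3), and each row should sum to 1.

import numpy as np

L = np.array([[ 1.,  0., -1.],
              [ 0.,  2., -2.],
              [-1., -2.,  3.]])
A, B = L[:2, :2], L[:2, 2:]
C, D = L[2:, :2], L[2:, 2:]

Lam = A - B @ np.linalg.inv(D) @ C               # response matrix L_G/D
D_A = np.diag(np.diag(A))                        # diagonal part of the block A
W = np.eye(2) - np.linalg.inv(D_A) @ Lam

print(W)                                         # [[1/3, 2/3], [1/3, 2/3]]
assert np.allclose(W.sum(axis=1), 1.0)           # each row sums to 1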
Exercise (***). Prove the following expansion formulas for entries and blocks of the matrix W(G),
  • for two boundary vertices v_k and v_l of a graph G (a numerical illustration of this path sum appears after the list),
 
w_{kl} = \sum_{v_k\xrightarrow[]{path} v_l}\prod_{e=(p,q)\in path}w(e)/l_{pp},

Hint: use the Leibniz formula for the determinant.

  • for two distinct boundary vertices v_k and v_l of a graph G

w_{kl}\det D = \frac{1}{l_{kk}}\sum_{v_k\xrightarrow{path}v_l}\prod_{e\in path}l(e)\det D(\tilde{path},\tilde{path}),

where  \tilde{path} = V-(\partial G\cup path).

  • for two disjoint subsets of boundary vertices P and Q of size n of a graph G, see [6],[7] and [14]

\det W(P,Q) \det D = \pm\frac{\sum_{\sigma\in S_{n}}\operatorname{sgn}(\sigma)\sum_{p_k\xrightarrow{paths}q_{\sigma_k}}\prod_{e\in paths}l(e)\det D(\tilde{paths},\tilde{paths})}{\prod_{p\in P}l_{pp}},

where

 \tilde{paths} = V-(\partial G\cup paths).
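The first of these expansions is the random-walk reading of w_{kl}: each walk contributes the product of its one-step transition probabilities w(e)/l_{pp}. The sketch below (a hypothetical 4-vertex path network, not an example from the text) compares the matrix W_G computed from the Laplacian blocks with a Monte Carlo estimate obtained by sampling such walks.

import numpy as np

# path network 0 - 2 - 3 - 1 with conductivities 1, 2, 3; boundary = {0, 1}
n, boundary = 4, [0, 1]
w = np.zeros((n, n))
w[0, 2] = w[2, 0] = 1.0
w[2, 3] = w[3, 2] = 2.0
w[3, 1] = w[1, 3] = 3.0
L = np.diag(w.sum(axis=1)) - w

A = L[np.ix_(boundary, boundary)]
B = L[np.ix_(boundary, [2, 3])]
C = L[np.ix_([2, 3], boundary)]
D = L[np.ix_([2, 3], [2, 3])]
W = np.eye(2) - np.linalg.inv(np.diag(np.diag(A))) @ (A - B @ np.linalg.inv(D) @ C)

rng = np.random.default_rng(1)
trials, counts = 20000, np.zeros((2, 2))
for k, start in enumerate(boundary):
    for _ in range(trials):
        v = start
        while True:
            v = rng.choice(n, p=w[v] / w[v].sum())   # step with probability w(v,q)/l_vv
            if v in boundary:                        # first boundary vertex reached
                counts[k, boundary.index(v)] += 1
                break

print(W)                    # exit probabilities from the block formula
print(counts / trials)      # Monte Carlo estimate; agrees up to sampling error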

The exercises above provide a bridge between connectivity properties of a graph G and the ranks of submatrices of its Laplacian matrix L(G) and of the matrix of hitting probabilities W(G).

Exercise (*). Let G be a planar graph with natural boundary, numbered circularly. Let P and Q be two non-interlacing subsets of boundary nodes of size n. Prove that
 (-1)^{\frac{n(n+1)}{2}}\det\Lambda_G(P,Q) \ge 0,

with strict inequality if and only if there is a set of disjoint paths from P to Q.

Exercise (*). Show that the numbers of paths in the following graph are equal to the binomial coefficients.
[Figure: the numbers of paths of the graph are binomial coefficients.]
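The figure is not reproduced here. As a hedged illustration, assume the graph is the directed square grid in which every edge points right or up: the number of directed paths from the corner (0,0) to the node (i, j) then satisfies the Pascal recurrence and equals the binomial coefficient \binom{i+j}{i}, which the short sketch below checks.

from math import comb

N = 6
paths = [[0] * N for _ in range(N)]
paths[0][0] = 1                                  # one (empty) path to the corner
for i in range(N):
    for j in range(N):
        if i > 0:
            paths[i][j] += paths[i - 1][j]       # last step was upward
        if j > 0:
            paths[i][j] += paths[i][j - 1]       # last step was to the right
assert all(paths[i][j] == comb(i + j, i) for i in range(N) for j in range(N))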

Gluing graphs without loops corresponds to multiplication of their weighted path matrices.

Exercise (**). Use the result of the previous exercise to prove the following Pascal triangle identity, see [13]:

\begin{pmatrix}
 1 & 1 & 1  & 1  & \ldots \\
 1 & 2 & 3  & 4  & \ldots \\
 1 & 3 & 6  & 10 & \ldots \\
 1 & 4 & 10 & 20 & \ldots \\
 \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}
 = 
\begin{pmatrix}
 1 & 0 & 0 & 0 & \ldots \\
 1 & 1 & 0 & 0 & \ldots \\
 1 & 2 & 1 & 0 & \ldots \\
 1 & 3 & 3 & 1 & \ldots \\
 \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}
\begin{pmatrix}
 1 & 1 & 1 & 1 & \ldots \\
 0 & 1 & 2 & 3 & \ldots \\
 0 & 0 & 1 & 3 & \ldots \\
 0 & 0 & 0 & 1 & \ldots \\
 \vdots & \vdots & \vdots & \vdots & \ddots
\end{pmatrix}.
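A quick numerical check of a finite truncation of this identity (the size 6 below is an arbitrary choice): the symmetric Pascal matrix S, with entries \binom{i+j}{i}, factors as the product of the lower-triangular Pascal matrix, with entries \binom{i}{j}, and its transpose.

import numpy as np
from math import comb

n = 6
S = np.array([[comb(i + j, i) for j in range(n)] for i in range(n)])
Lo = np.array([[comb(i, j) for j in range(n)] for i in range(n)])   # lower triangular
assert np.array_equal(S, Lo @ Lo.T)   # the upper factor is the transpose of the lower one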