Repo for the sharing on applications of spectral graph theory in computer vision
Consider the adjacency matrix of a weighted 3-node graph, shown below, with non-negative entries $q, r, t \ge 0$:

$$A = \begin{pmatrix} 0 & q & r \\ q & 0 & t \\ r & t & 0 \end{pmatrix}$$
Then the Laplacian matrix is

$$L = D - A = \begin{pmatrix} q+r & -q & -r \\ -q & q+t & -t \\ -r & -t & r+t \end{pmatrix}$$

where $D$ is the diagonal degree matrix.
Now consider the bilinear form, with $v = (x, y, z)^\top$:

$$v^\top L v = q(x-y)^2 + r(x-z)^2 + t(y-z)^2$$
Since this form is non-negative for every $v$, the Laplacian is symmetric positive semi-definite, and all its eigenvalues are non-negative.
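This construction can be sketched in NumPy; the weights `q`, `r`, `t` below are arbitrary illustrative values:

```python
import numpy as np

# Example non-negative edge weights (arbitrary illustrative values)
q, r, t = 2.0, 1.0, 0.5

# Adjacency matrix of the weighted 3-node graph
A = np.array([[0, q, r],
              [q, 0, t],
              [r, t, 0]])

# Degree matrix and Laplacian L = D - A
D = np.diag(A.sum(axis=1))
L = D - A

# L is symmetric PSD: all eigenvalues are non-negative
eigvals = np.linalg.eigvalsh(L)
print(eigvals.min() >= -1e-9)

# The all-ones vector is an eigenvector with eigenvalue 0
# (each row of L sums to zero)
print(np.allclose(L @ np.ones(3), 0))
```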
We can draw a few observations from the above form:
1. The value is non-negative; its minimum value 0 is attained when $x = y = z$. Thus the all-ones vector $\mathbf{1}$ is an eigenvector with corresponding eigenvalue 0.
2. The second smallest eigenvalue can be seen as a minimization of the expression $q(x-y)^2 + r(x-z)^2 + t(y-z)^2$ over vectors orthogonal to $\mathbf{1}$. We need to find an interpretation of this expression.
   1. This expression can be written in a more human-readable form as the sum of $(v[i] - v[j])^2$ weighted by the weight of edge $i$-$j$.
   2. To minimize this sum, the weight corresponding to a large $v[i] - v[j]$ should be small.
   3. In other words, whenever the edge $i$-$j$ is weak (i.e. its weight is small), the difference between the coordinate values $v[i]$ and $v[j]$ can be large; they will most likely have opposite signs.
   4. Similarly, whenever the edge $i$-$j$ is strong, the difference between $v[i]$ and $v[j]$ will be small; they will most likely have similar signs.
This argument extends to weighted graphs with any number of vertices, i.e. to Laplacian matrices of any dimension.