Lecture 3: Graph Convolutional Filters (9/13 – 9/18)

We begin our exploration of the techniques we will use to devise learning parametrizations for graph signals. In this lecture we study graph convolutional filters. Although we already defined graph convolutions in Lecture 1, this lecture takes a more comprehensive approach. We formally define graphs, graph signals, and graph shift operators. We introduce the diffusion sequence, which we build through recursive application of the shift operator. This sequence is the basis for an alternative definition of graph filters as linear combinations of the components of the diffusion sequence. This definition is more operational than the definition of graph filters as polynomials on the shift operator. We close the lecture with the introduction of the graph Fourier transform and the frequency representation of graph filters.

Handout.

Script.

Access full lecture playlist.

Video 3.1 – Graphs

We define graphs and discuss different types: directed and symmetric, weighted and unweighted. This is just a review to set up notation; watch it quickly as a refresher.

• Covers Slides 1-6 in the handout.
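As a notational warm-up, here is a minimal NumPy sketch (the example graphs are made up for illustration) contrasting a symmetric, unweighted adjacency matrix with a directed, weighted one:

```python
import numpy as np

# Unweighted, symmetric (undirected) graph on 3 nodes: binary entries, A equals its transpose
A_sym = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 0, 0]])

# Weighted, directed graph on the same nodes: real-valued weights, not symmetric in general
A_dir = np.array([[0.0, 2.0, 0.0],
                  [0.0, 0.0, 1.5],
                  [0.5, 0.0, 0.0]])

print(np.array_equal(A_sym, A_sym.T))   # True: symmetric graph
print(np.array_equal(A_dir, A_dir.T))   # False: directed graph
```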

Video 3.2 – Graph Shift Operators

It is standard to represent graphs with adjacency and Laplacian matrices. In the context of graph signal processing, these matrix representations of a graph are called graph shift operators. The notational conventions used for graph shift operators differ slightly from the standard conventions of graph theory, so it is worth taking a look. We also define normalized adjacency and Laplacian matrices.

• Covers Slides 7-15 in the handout.
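As a rough illustration of these shift operators, the following NumPy sketch builds the adjacency matrix, the Laplacian, and their normalized versions for a small weighted graph (the example graph and variable names are ours, not the handout's):

```python
import numpy as np

# Symmetric weighted adjacency matrix of a 4-node example graph
A = np.array([[0., 1., 0., 2.],
              [1., 0., 3., 0.],
              [0., 3., 0., 1.],
              [2., 0., 1., 0.]])

deg = A.sum(axis=1)                   # node degrees
D = np.diag(deg)                      # degree matrix
L = D - A                             # Laplacian
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt  # normalized adjacency
L_norm = np.eye(len(deg)) - A_norm    # normalized Laplacian

# Any of A, L, A_norm, or L_norm can play the role of the graph shift operator S
S = A
```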

Video 3.3 – Graph Signals

Graph signals are the objects we process with graph convolutional filters and, in upcoming lectures, with graph neural networks. They are defined as vectors whose components are associated with the nodes of the graph. Given a graph signal, we can multiply it by the graph shift operator. This gives rise to the diffusion sequence, which is instrumental in coming up with operational definitions of graph filters.

• Covers Slides 16-20 in the handout.
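Computing the diffusion sequence only requires repeated multiplication by the shift operator. A minimal sketch, using a 4-node example graph of our choosing:

```python
import numpy as np

# Shift operator S (here, the adjacency matrix) and a graph signal x with one value per node
S = np.array([[0., 1., 0., 2.],
              [1., 0., 3., 0.],
              [0., 3., 0., 1.],
              [2., 0., 1., 0.]])
x = np.array([1., 0., 0., 0.])

# Diffusion sequence: x_0 = x, x_k = S x_{k-1} = S^k x
K = 4
diffusion = [x]
for _ in range(K - 1):
    diffusion.append(S @ diffusion[-1])
```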

Video 3.4 – Graph Convolutional Filters

We revisit the definition of graph convolutional filters. As before, we will write them as polynomials on a matrix representation of the graph. But we will also write them as linear combinations of the components of the diffusion sequence. This gives an alternative view of graph filters that is better connected to their practical implementation and applicability. We also establish a connection to familiar shift-register structures.

• Covers Slides 21-25 in the handout.
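A sketch of how such a filter can be implemented by accumulating the diffusion sequence, so that no matrix power is ever formed explicitly (the function name, example graph, and filter taps are illustrative choices of ours):

```python
import numpy as np

def graph_filter(S, x, h):
    """Graph convolution y = sum_k h[k] S^k x, computed by running
    through the diffusion sequence one shift at a time."""
    y = np.zeros_like(x)
    z = x.copy()                # z starts as S^0 x = x
    for hk in h:
        y += hk * z             # add the weighted diffusion component
        z = S @ z               # shift once more: next element of the sequence
    return y

# Example: a 3-tap filter on a small symmetric graph
S = np.array([[0., 1., 0., 2.],
              [1., 0., 3., 0.],
              [0., 3., 0., 1.],
              [2., 0., 1., 0.]])
x = np.array([1., -1., 2., 0.])
h = np.array([1.0, 0.5, 0.25])
y = graph_filter(S, x, h)
```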

Video 3.5 – Time Convolutions and Graph Convolutions

We revisit the fact that convolutions in time can be recovered as particular cases of graph convolutions when we use the adjacency matrix of a directed line graph as the shift operator. We take it more slowly here and show how the shift register we use for convolutions in time is modified to construct a graph shift register.

• Covers Slides 26-30 in the handout.
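The equivalence can be checked numerically: the sketch below (with arbitrary example values) builds the adjacency matrix of a directed line graph, runs the graph convolution, and verifies that it matches the usual zero-padded time convolution:

```python
import numpy as np

N = 8
# Adjacency matrix of the directed line graph: an edge from node n-1 to node n
A = np.zeros((N, N))
for n in range(1, N):
    A[n, n - 1] = 1.0

x = np.random.randn(N)           # a "time" signal viewed as a graph signal
h = np.array([1.0, 0.5, 0.25])   # filter taps

# Graph convolution with S = A: y = sum_k h[k] A^k x
y_graph = np.zeros(N)
z = x.copy()
for hk in h:
    y_graph += hk * z
    z = A @ z

# It coincides with the causal, zero-padded convolution in time
y_time = np.convolve(x, h)[:N]
assert np.allclose(y_graph, y_time)
```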

Video 3.6 – Graph Fourier Transform

The analysis of graph signals is an important component of this course. The fundamental tool we use to analyze graph signals is the graph Fourier transform (GFT). The GFT is defined as a projection of the signal onto the eigenvector space of the graph shift operator. An inverse transform is defined as well.

• Covers Slides 31-34 in the handout.
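A short sketch of the GFT and its inverse for a symmetric shift operator, using the eigendecomposition computed by NumPy (the example graph and signal are ours):

```python
import numpy as np

# Symmetric shift operator, e.g., the adjacency matrix of an undirected graph
S = np.array([[0., 1., 0., 2.],
              [1., 0., 3., 0.],
              [0., 3., 0., 1.],
              [2., 0., 1., 0.]])

# Eigendecomposition S = V diag(lam) V^T; V is orthonormal because S is symmetric
lam, V = np.linalg.eigh(S)

x = np.array([1., -1., 2., 0.])
x_tilde = V.T @ x        # graph Fourier transform: project onto the eigenvectors
x_back = V @ x_tilde     # inverse GFT recovers the original signal
assert np.allclose(x, x_back)
```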

Video 3.7 – Graph Frequency Response

The graph Fourier transform is leveraged to represent graph filters in the graph frequency domain. It will turn out that in this domain graph filters admit a pointwise representation in which individual frequency components of the input are scaled to produce individual frequency components of the output. This is another fundamental property that graph filters share with time convolutional filters.

• Covers Slides 35-40 in the handout.
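The pointwise property is easy to verify numerically: filtering in the node domain and scaling the GFT coefficients by the frequency response produce the same output. A sketch under the assumption of a symmetric shift operator (example values are ours):

```python
import numpy as np

# Symmetric shift operator, filter taps, and an input graph signal
S = np.array([[0., 1., 0., 2.],
              [1., 0., 3., 0.],
              [0., 3., 0., 1.],
              [2., 0., 1., 0.]])
h = np.array([1.0, 0.5, 0.25])
x = np.array([1., -1., 2., 0.])

lam, V = np.linalg.eigh(S)

# Node-domain filtering: y = sum_k h[k] S^k x
y = sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))

# Frequency-domain filtering: scale each GFT coefficient by the
# frequency response h_hat(lambda) = sum_k h[k] lambda^k
h_hat = np.polyval(h[::-1], lam)     # frequency response at each eigenvalue
y_freq = V @ (h_hat * (V.T @ x))     # filter in the GFT domain, then invert
assert np.allclose(y, y_freq)
```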