
Lecture 8: Midterm Review (10/20)

In this lecture, we will summarize the topics we have studied so far. We will start by defining signals supported on graphs, graph convolutional filters, and graph neural networks (GNNs). Then, we will show how to learn ratings in recommendation systems by casting the problem in an empirical risk minimization framework. Next, we will learn the ratings with different types of parameterizations and compare their performance. There is nothing new in this, as we have already seen in Lab 3 that GNNs outperform fully connected neural networks (FCNNs), graph filters, and linear regression. Later, we will delve into the permutation equivariance of graph filters and GNNs, showing how they successfully exploit signal symmetries. After this, we will cover the stability of graph filters to graph perturbations, arguing that integral Lipschitz filters are stable to relative deformations. Next, we will study the tradeoff between stability and discriminability. Finally, we will recap equivariance, stability, and transference.

Handout.

Script.

Access full lecture playlist.

Video 8.1 – First Midterm

In this lecture, we start by going over the topics we studied at the beginning of the class. We will delve into machine learning on graphs and argue why it is important to study it. We will give examples of signals supported on graphs and define graph convolutional filters. After this, we will show what a graph neural network is and how to construct it. We will finish by stating the theoretical guarantees that explain why GNNs should work, and we will present practical examples in which they do.

• Covers Slides 1-8 in the handout.
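As a pointer back to those definitions, here is a minimal numpy sketch of a graph convolutional filter and of the graph perceptron built from it. The shift operator S, the filter taps h, and the signal x are generic placeholders, not the course's lab code.

import numpy as np

def graph_filter(S, x, h):
    # Graph convolution: y = sum_k h[k] * S^k x, with S the graph shift operator.
    y = np.zeros_like(x)
    z = x.copy()          # z holds S^k x, starting at k = 0
    for hk in h:
        y = y + hk * z    # accumulate the k-th tap
        z = S @ z         # shift once more: z becomes S^{k+1} x
    return y

def graph_perceptron(S, x, h):
    # A single GNN layer: a graph filter followed by a pointwise nonlinearity.
    return np.maximum(graph_filter(S, x, h), 0.0)

Deeper GNNs are built by stacking several such layers, each with its own set of filter taps.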

Video 8.2 – Learning Ratings in Recommendation Systems

In this lecture, we formulate the recommendation system problem as an empirical risk minimization problem. We will define what a user and an item mean in this context and illustrate them with everyday examples. After this, we will progressively work through the solution of the problem in the language of graph signals. We will construct the product similarity graph and show how to construct the input and output signals.

• Covers Slides 9-16 in the handout.
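To make the construction concrete, the sketch below builds a product similarity graph from a user-by-item rating matrix using rating correlations, which is the spirit of the construction in the lecture; the exact normalization and sparsification used in the course may differ.

import numpy as np

def product_similarity_graph(X):
    # X is a user-by-item rating matrix, with 0 marking unrated entries.
    mask = X > 0
    counts = np.maximum(mask.sum(0), 1)
    mu = X.sum(0) / counts                        # per-item mean over observed ratings
    Xc = (X - mu) * mask                          # center observed entries only
    C = Xc.T @ Xc                                 # item-item covariance (up to scaling)
    d = np.sqrt(np.clip(np.diag(C), 1e-8, None))
    S = C / np.outer(d, d)                        # normalize to correlation weights
    np.fill_diagonal(S, 0.0)                      # no self-loops
    return S

# A training pair for user u and held-out item i: the input graph signal is the
# user's ratings with entry i zeroed out, and the target is the true rating at i.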

Video 8.3 – Learning Ratings with Graph Filters and GNNs

In this lecture, we will show the results of tackling the recommendation problem with different parameterizations. We will show two that do not work well and two that do. The ones that do not work well in practice are the fully connected neural network and linear regression. Owing to the underlying graph structure of the data, GNNs and graph filters can successfully solve the problem. To argue in favor of adding a nonlinearity, we will show how the GNN outperforms the graph filter. Finally, we will show the transferability properties of GNNs.

• Covers Slides 17-24 in the handout.
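Part of the explanation is visible in parameter counts alone, sketched below with hypothetical sizes; the masked squared loss is the kind of empirical risk minimized in the lab setup, not a verbatim excerpt from it.

import numpy as np

n, K = 200, 8                 # n items in the catalog, K filter taps (illustrative sizes)
params_linear = n * n         # linear regression / fully connected layer: grows as n^2
params_filter = K             # graph filter: K taps, independent of the graph size n

def masked_squared_loss(y_hat, y, i):
    # The empirical risk penalizes the prediction only at the held-out item i.
    return (y_hat[i] - y[i]) ** 2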

Video 8.4 – Permutation Equivariance

In this lecture, we come back to the theory. We will show that GNNs and graph filters are equivariant to permutations, so they are able to exploit signal symmetries. This fundamental property allows both graph filters and GNNs to outperform linear regression and FCNNs. We will state the permutation equivariance of graph filters and GNNs, and conclude that relabeling the input signal results in a consistent relabeling of the output signal.

• Covers Slides 25-30 in the handout.
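The theorem can be checked numerically: permuting the shift operator and the input signal permutes the filter output in exactly the same way. A self-contained toy check, with a random symmetric matrix standing in for a graph shift operator:

import numpy as np

rng = np.random.default_rng(0)
n, K = 6, 4
S = rng.random((n, n)); S = (S + S.T) / 2      # random symmetric shift operator
x = rng.standard_normal(n)                     # random graph signal
h = rng.standard_normal(K)                     # random filter taps

def graph_filter(S, x, h):
    y, z = np.zeros_like(x), x.copy()
    for hk in h:
        y, z = y + hk * z, S @ z
    return y

P = np.eye(n)[rng.permutation(n)]              # permutation matrix
lhs = graph_filter(P.T @ S @ P, P.T @ x, h)    # filter run on the relabeled graph
rhs = P.T @ graph_filter(S, x, h)              # relabeled output of the original filter
assert np.allclose(lhs, rhs)                   # equivariance: the two coincide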

Video 8.5 – Stability of Graph Filters to Graph Perturbations

In this lecture, we will revisit the topics of Lecture 7; namely, we will delve into the stability properties of integral Lipschitz filters with respect to relative deformations. By studying the frequency response of graph filters, we will build intuition for why the integral Lipschitz condition is required for stability. We will show that either the eigenvalue does not change much, because we are at low frequencies, or the frequency response does not change, because we are at high frequencies where integral Lipschitz filters are flat. Pictures are worth a thousand theorems!

• Covers Slides 31-37 in the handout.
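In symbols, and omitting the constants and higher-order terms that the handout states precisely: a filter with frequency response \hat{h}(\lambda) is integral Lipschitz if

|\lambda \, \hat{h}'(\lambda)| \le C \quad \text{for all } \lambda,

which forces |\hat{h}'(\lambda)| \le C / \lambda, i.e., the response must flatten as \lambda grows. For a relative perturbation \hat{S} = S + ES + SE with \|E\| \le \varepsilon, the filter outputs then satisfy a bound of the form

\| H(\hat{S}) x - H(S) x \| \lesssim C \, \varepsilon \, \|x\|.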

Video 8.6 – Stability and Discriminability are Incompatible in Graph Filters

In this lecture, we will delve into the compromise that graph filters incur: we will again study the frequency response of graph filters, now in the context of discriminability and stability. The tradeoff between stability and discriminability is a fundamental drawback of graph filters. Here we will state that discriminability and stability are plainly incompatible in the context of graph filters. That is to say, a graph filter can be discriminative or stable, but it cannot be both.

• Covers Slides 38-45 in the handout.
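In the same notation, a sketch of why the two properties clash: to discriminate two features concentrated at nearby high frequencies \lambda_i \approx \lambda_j, the response must change sharply between them, so |\hat{h}'(\lambda)| must be large near \lambda_i. But the integral Lipschitz condition enforces |\hat{h}'(\lambda)| \le C / \lambda, which vanishes as \lambda grows. A stable graph filter is therefore necessarily flat, and hence non-discriminative, at high frequencies.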

Video 8.7 – The Stability vs Discriminability Tradeoff of GNNs

In this lecture, we will understand the core idea that makes GNNs outperform graph filters: introducing a pointwise nonlinearity allows a GNN to be both stable and discriminative. This is because the pointwise nonlinearity scatters energy from high frequencies into lower parts of the spectrum, where it can be discriminated with stable filters. So, the most important idea of this lecture is that it is possible to be both discriminative and stable with a GNN, something that is impossible with its graph filter counterpart.

• Covers Slides 46-52 in the handout.
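This frequency-mixing effect shows up in a toy numpy experiment: take a pure high-frequency graph signal (a Laplacian eigenvector with large eigenvalue), rectify it pointwise, and inspect its graph Fourier transform. The random graph and the Laplacian shift operator here are illustrative choices, not the lecture's example.

import numpy as np

rng = np.random.default_rng(0)
n = 8
A = rng.random((n, n)); A = (A + A.T) / 2    # random symmetric edge weights
np.fill_diagonal(A, 0.0)
L = np.diag(A.sum(1)) - A                    # graph Laplacian as shift operator
lam, V = np.linalg.eigh(L)                   # GFT basis; eigenvalues in ascending order
x = V[:, -1]                                 # pure highest-frequency graph signal
x_hat = V.T @ x                              # GFT of x: a single spectral spike
relu_hat = V.T @ np.maximum(x, 0.0)          # GFT after the pointwise ReLU
print(np.round(np.abs(x_hat), 3))            # energy at one (high) frequency only
print(np.round(np.abs(relu_hat), 3))         # energy spread into low frequencies

The rectified signal carries nonzero energy at low frequencies, where stable integral Lipschitz filters can still discriminate it.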

Video 8.8 – Equivariance, Stability, and Transference

In this lecture, we present a summary of the summary. We do a quick recap of the topics covered in this lecture and, thus, throughout the course so far. We present the main theorems and the main practical results. This lecture is a wrap-up, intended to double-check that all topics are well understood.

• Covers Slides 53-60 in the handout.