Lectures

Lecture 0: Python Tutorial (8/29 – 9/1)

In the first week of this course, we will introduce students to training models with PyTorch. This will allow them to gain some valuable insights and hands-on experience with Python toolboxes that they will use throughout the semester.
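As a preview of what the tutorial covers, the sketch below shows a minimal PyTorch training loop on synthetic data. The model, data, and hyperparameters are illustrative placeholders rather than the lab code used in class.

```python
import torch
import torch.nn as nn

# Synthetic regression data (illustrative placeholder, not the lab dataset).
torch.manual_seed(0)
x = torch.randn(256, 10)                          # 256 samples, 10 features
y = x @ torch.randn(10, 1) + 0.1 * torch.randn(256, 1)

# A small fully connected model.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

# Standard training loop: forward pass, loss, backward pass, parameter update.
for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```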

Access PyTorch blog.

Lecture 1: Machine Learning on Graphs (9/5 – 9/8)

Graph Neural Networks (GNNs) are tools with broad applicability and very interesting properties. There is a lot that can be done with them and a lot to learn about them. In this first lecture we go over the goals of the course, explain why we should care about GNNs, and offer a preview of what is to come. We discuss the importance of leveraging structure in scalable learning and how convolutions do that for signals in Euclidean space. We further explain how to generalize convolutions to graphs, which leads to the generalization of convolutional neural networks to graph (convolutional) neural networks.

Access lecture page.

Handout.

Script.

Recording.

Lecture 2: Empirical Risk Minimization (9/12 – 9/15)

In Lecture 1 we saw that our interest in graph neural networks (GNNs) stems from their use in artificial intelligence and machine learning problems that involve graph signals. Before we move on to talk more about GNNs, we need to be more specific about what we mean by machine learning (ML) and artificial intelligence (AI). Our specific meaning is that ML and AI are synonyms of statistical and empirical risk minimization. These are the subjects we will study in this lecture. We have to disclaim that this is a somewhat narrow definition of ML and AI, but one that will be sufficient for our goals. In addition to studying statistical and empirical risk minimization, we will also have a brief discussion of stochastic gradient descent. Throughout, we will emphasize the importance of choosing appropriate learning parametrizations for successful learning.
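To make the discussion concrete, the following is a minimal sketch of empirical risk minimization solved with stochastic gradient descent. The linear parametrization, squared loss, and synthetic data are illustrative assumptions, not the lecture's specific examples.

```python
import numpy as np

# Synthetic samples (x_i, y_i) standing in for draws from an unknown distribution.
rng = np.random.default_rng(0)
N, p = 1000, 5
X = rng.normal(size=(N, p))
w_true = rng.normal(size=p)
y = X @ w_true + 0.1 * rng.normal(size=N)

# Linear parametrization Phi(x; w) = x^T w with the squared loss.
w = np.zeros(p)
lr, batch = 0.05, 32

# Stochastic gradient descent on the empirical risk (1/N) sum_i (x_i^T w - y_i)^2.
for step in range(500):
    idx = rng.integers(0, N, size=batch)                  # sample a minibatch
    grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / batch   # minibatch gradient
    w -= lr * grad

print(np.linalg.norm(w - w_true))   # should be small after training
```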

Access lecture page.

Handout.

Script.

Recording.

Lecture 3: Graph Convolutional Filters (9/19 – 9/22)

We begin our exploration of the techniques we will use to devise learning parametrizations for graph signals. In this lecture, we study graph convolutional filters. Although we already defined graph convolutions in Lecture 1, this lecture takes a more comprehensive approach. We formally define graphs, graph signals, and graph shift operators. We introduce the diffusion sequence, which we build through recursive application of the shift operator. This sequence is the basis for an alternative definition of graph filters as linear combinations of the components of the diffusion sequence, a definition that is more operational than the definition of graph filters as polynomials on the shift operator. We close the lecture with the introduction of the graph Fourier transform and the frequency representation of graph filters.
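The following sketch illustrates these definitions numerically: a graph filter computed from the diffusion sequence and verified to act pointwise in the graph frequency domain. The random graph, signal, and filter taps are illustrative.

```python
import numpy as np

# Illustrative random undirected graph; S is the graph shift operator (adjacency).
rng = np.random.default_rng(0)
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)
S = np.triu(A, 1) + np.triu(A, 1).T

x = rng.normal(size=n)                     # a graph signal: one value per node
h = np.array([1.0, 0.5, 0.25, 0.125])      # filter taps h_0, ..., h_{K-1}

# Diffusion sequence x, Sx, S^2 x, ... built by repeated application of S.
z = x.copy()
y = np.zeros(n)
for k, hk in enumerate(h):
    y += hk * z                            # graph convolution: y = sum_k h_k S^k x
    z = S @ z

# Graph Fourier transform: project onto the eigenvectors of S.
lam, V = np.linalg.eigh(S)                 # S symmetric -> orthonormal eigenvectors
x_hat = V.T @ x                            # GFT of the input
y_hat = V.T @ y                            # GFT of the output
# In the frequency domain the filter acts pointwise: y_hat = h(lam) * x_hat.
h_of_lam = sum(hk * lam**k for k, hk in enumerate(h))
print(np.allclose(y_hat, h_of_lam * x_hat))   # True
```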

Access lecture page.

Handout.

Script.

Recording.

Lecture 4: Graph Neural Networks (9/26 – 9/29)

This lecture is devoted to the introduction of graph neural networks (GNNs). We start from graph filters and build graph perceptrons by adding compositions with pointwise nonlinearities. We stack graph perceptrons to construct GNNs. These simple GNN architectures are expanded with the use of filter banks, as well as multiple-input-multiple-output graph filters, to produce multiple-feature GNNs. Multiple-feature GNNs are the workhorse of machine learning on graphs.
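As a rough illustration of this construction, the sketch below stacks graph perceptrons, each a multiple-input-multiple-output graph filter followed by a pointwise nonlinearity, into a small GNN. The layer class, dimensions, and random graph are illustrative choices, not the course's lab implementation.

```python
import torch
import torch.nn as nn

class GraphFilterLayer(nn.Module):
    """One graph perceptron: a MIMO graph filter followed by a pointwise nonlinearity."""
    def __init__(self, f_in, f_out, K):
        super().__init__()
        # One f_in x f_out coefficient matrix per filter tap k = 0, ..., K-1.
        self.H = nn.Parameter(0.1 * torch.randn(K, f_in, f_out))

    def forward(self, S, X):
        # X: n x f_in matrix of node features; S: n x n graph shift operator.
        Z, Y = X, 0.0
        for k in range(self.H.shape[0]):
            Y = Y + Z @ self.H[k]          # accumulate sum_k S^k X H_k
            Z = S @ Z
        return torch.relu(Y)               # pointwise nonlinearity

class GNN(nn.Module):
    """Stacking graph perceptrons yields a (multiple-feature) GNN."""
    def __init__(self, dims, K):
        super().__init__()
        self.layers = nn.ModuleList(
            GraphFilterLayer(dims[i], dims[i + 1], K) for i in range(len(dims) - 1)
        )

    def forward(self, S, X):
        for layer in self.layers:
            X = layer(S, X)
        return X

# Illustrative usage on a random graph with random input features.
n = 30
S = torch.rand(n, n); S = ((S + S.T) / 2 < 0.2).float()
X = torch.randn(n, 1)
print(GNN([1, 16, 4], K=3)(S, X).shape)    # torch.Size([30, 4])
```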

Access lecture page.

Handout.

Script.

Recording.

Lecture 5: Permutations and Dilations (10/3 – 10/6)

In this lecture, we will discuss the properties of permutation equivariance and stability to dilations of graph neural networks (GNNs). We start our discussion by introducing the notion of permutation equivariance for graph filters and show how this property is inherited by GNNs. Later, we study Lipschitz and integral Lipschitz filters, which are necessary to state formal results about stability to deformations for both graph filters and GNNs. Then, we formally state stability results for GNNs considering deformations associated with absolute (additive) and relative (multiplicative) perturbations. To illuminate these results, we discuss the details of the proofs and finally discuss in depth their implications and overall meaning.
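A quick numerical check of the permutation equivariance of graph filters, which the lecture establishes formally, is sketched below. The graph, signal, and filter taps are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 15
S = (rng.random((n, n)) < 0.3).astype(float)
S = np.triu(S, 1) + np.triu(S, 1).T           # illustrative undirected graph
x = rng.normal(size=n)                        # graph signal
h = rng.normal(size=5)                        # filter taps

def graph_filter(h, S, x):
    """Apply the graph filter y = sum_k h_k S^k x."""
    z, y = x.copy(), np.zeros_like(x)
    for hk in h:
        y += hk * z
        z = S @ z
    return y

# A random permutation matrix P.
P = np.eye(n)[rng.permutation(n)]

# Permutation equivariance: filtering the relabeled graph and signal gives the
# relabeled output, H(P^T S P)(P^T x) = P^T H(S) x.
lhs = graph_filter(h, P.T @ S @ P, P.T @ x)
rhs = P.T @ graph_filter(h, S, x)
print(np.allclose(lhs, rhs))                  # True
```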

Access lecture page.

Handout.

Script.

Recording.

Lecture 6: Stability Properties of GNNs (10/10 – 10/18)

In this lecture, we analyze the stability of graph filters to additive and relative perturbations. We start by defining the notion of an additive perturbation and discussing its meaning and consequences for graph filters. Later, we study the stability of Lipschitz filters to additive perturbations, providing concrete calculations. Then, we introduce the important concept of a relative perturbation, with a detailed discussion of its implications and of how it differs in nature from the additive perturbation model. After this, we study the stability of integral Lipschitz filters to relative perturbations.
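The sketch below gives a numerical feel for stability under the additive perturbation model: it measures how much a graph filter's output changes as the size of the perturbation of the shift operator shrinks. The filter, graph, and perturbations are illustrative, and the lecture's theorems make the corresponding bounds precise.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
S = (rng.random((n, n)) < 0.15).astype(float)
S = np.triu(S, 1) + np.triu(S, 1).T            # illustrative graph shift operator

def graph_filter(h, S, x):
    """Apply the graph filter y = sum_k h_k S^k x."""
    z, y = x.copy(), np.zeros_like(x)
    for hk in h:
        y += hk * z
        z = S @ z
    return y

h = np.array([1.0, -0.5, 0.25, -0.125])         # illustrative filter taps
x = rng.normal(size=n)
x /= np.linalg.norm(x)                          # unit-norm input signal

# Additive perturbation model: S_hat = S + E with a small symmetric error E.
for eps in [1e-3, 1e-2, 1e-1]:
    E = rng.normal(size=(n, n)); E = (E + E.T) / 2
    E *= eps / np.linalg.norm(E, 2)             # scale so that ||E|| = eps
    S_hat = S + E
    diff = np.linalg.norm(graph_filter(h, S, x) - graph_filter(h, S_hat, x))
    print(f"||E|| = {eps:.0e}  ->  output difference = {diff:.2e}")
```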

Access lecture page.

Handout.

Script.

Recording.

Lecture 7: Stability of Graph Filters – Proof

In this lecture, we prove the stability of graph filters to relative perturbations. We show that integral Lipschitz filters are stable to relative perturbations by showing that the operator norm modulo permutation of the difference between the graph filter instantiated on a graph and on a relative perturbation of that graph is bounded. Following the proof requires using almost all of the concepts learned so far, plus some linear algebra. After this, we do the same for the case of additive perturbations; this part, however, is included in the appendix of the lecture.

Access lecture page.

Handout.

Article.

Script.

Recording.


Lecture 8: Midterm Review (10/20)

In this lecture, we will summarize the topics we have studied so far. We will start by defining signals supported on graphs, graph convolutional filters, and graph neural networks (GNNs). Then, we will show how to learn ratings in recommendation systems by casting the problem in an empirical risk minimization framework. Later, we will learn the ratings with different types of parametrizations and compare their performance. There is nothing new in this, as we have already seen in Lab 3 that GNNs outperform fully connected neural networks, graph filters, and linear regression. Later, we will delve into the permutation equivariance of graph filters and GNNs, showing how they can successfully exploit signal symmetries. After this, we will cover the stability of graph filters to graph perturbations, arguing that integral Lipschitz filters are stable to relative deformations. Next, we will study the tradeoff between stability and discriminability. Finally, we will recap equivariance, stability, and transference.

Access lecture page.

Handout.

Script.

Recording.

Lecture 9: Graphon Signal Processing (10/24 – 10/27)

In this lecture we discuss graphon signal processing. A graphon is a bounded function defined on the unit square that can be conceived as the limit of a sequence of graphs whose numbers of nodes and edges grow to infinity. This framework provides a powerful set of tools and insights that facilitate the understanding of structures like GNNs when the number of nodes in the graph layers is large. By means of graphons we can translate graph signal processing into harmonic analysis on the unit interval, allowing us to exploit well-established tools from functional analysis. This indeed provides a beautiful connection between two notions of Fourier decomposition that appear, at first sight, disconnected.
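As a minimal illustration of how a graphon induces graphs, the sketch below evaluates a bounded symmetric function on a grid of points in [0, 1] to obtain a weighted template graph, and also uses those values as edge probabilities to obtain a random graph. The particular graphon is an illustrative assumption.

```python
import numpy as np

# An illustrative graphon: a bounded symmetric function on the unit square.
def W(u, v):
    return np.exp(-3.0 * np.abs(u - v))

rng = np.random.default_rng(3)

def graphs_from_graphon(n):
    """Build an n-node weighted template graph and a random graph from W."""
    u = (np.arange(n) + 0.5) / n                   # regular grid on [0, 1]
    U, V = np.meshgrid(u, u, indexing="ij")
    S_template = W(U, V)                           # weighted graph: S_ij = W(u_i, u_j)
    S_random = (rng.random((n, n)) < S_template)   # edges drawn with prob. W(u_i, u_j)
    S_random = np.triu(S_random, 1)
    S_random = (S_random + S_random.T).astype(float)
    np.fill_diagonal(S_template, 0.0)
    return S_template, S_random

# As n grows, the normalized graphs approach the graphon in a suitable sense: the
# leading eigenvalue of S / n approaches that of the integral operator of W.
for n in [50, 200, 800]:
    S_t, S_r = graphs_from_graphon(n)
    print(n, np.linalg.eigvalsh(S_t / n)[-1], np.linalg.eigvalsh(S_r / n)[-1])
```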

Access lecture page.

Handout.

Script.

Recording.

The graph Fourier transform converges to the graphon Fourier transform (Proof)

Frequency representation of graphon filters (Proof)

Lecture 10: Transferability of GNNs (11/1 – 11/5)

In this lecture we discuss the transferability properties of GNNs, that is to say, the ability to transfer a machine learning model across graphs with performance guarantees. To begin with, we delve into the convergence of graphon filters in both the spectral and the nodal domains. Later, we discuss graphon filters as generative models. Our journey continues as we introduce graphon neural networks (WNNs), a key element to explain why GNNs are transferable between deterministic graphs obtained from a graphon. We finish our journey by reaching the goal of the lecture: we show that GNNs inherit the transferability properties of graph filters.
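The sketch below illustrates the transferability phenomenon in its simplest form: the same filter taps are instantiated on graphs of increasing size induced by one graphon, and their outputs approach the output on a much larger graph used as a proxy for the graphon filter. The graphon, signal, normalization by n, and filter taps are illustrative assumptions rather than the lecture's exact construction.

```python
import numpy as np

def W(u, v):
    return np.exp(-3.0 * np.abs(u - v))            # illustrative graphon

def X(u):
    return np.cos(2 * np.pi * u)                   # illustrative graphon signal

h = np.array([1.0, -1.0, 0.5])                     # shared filter taps

def filter_on_induced_graph(n):
    """Instantiate the same filter on the n-node graph induced by W."""
    u = (np.arange(n) + 0.5) / n
    S = W(u[:, None], u[None, :]) / n              # normalized shift operator
    x = X(u)
    z, y = x.copy(), np.zeros(n)
    for hk in h:
        y += hk * z                                # y = sum_k h_k S^k x
        z = S @ z
    return u, y

# Compare each output with the output on a much larger graph (a proxy for the
# graphon filter): the difference of the induced outputs shrinks as n grows.
u_ref, y_ref = filter_on_induced_graph(2000)
for n in [25, 100, 400]:
    u, y = filter_on_induced_graph(n)
    y_interp = np.interp(u_ref, u, y)              # compare on the common fine grid
    err = np.sqrt(np.mean((y_interp - y_ref) ** 2))
    print(f"n = {n:4d}   difference to large-graph output = {err:.3e}")
```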

Handout.

Script.

Access full lecture playlist.

Convergence of filter response for Lipschitz continuous graph filters (proof).

Graph-Graphon filter approximation theorem (proof).

Graphon neural network approximation by graph neural network theorem (proof).

Lecture 11: Graph Recurrent Neural Networks

In this lecture, we will learn about yet another type of neural network architecture. In this case, we will go over recurrent neural networks, an architecture that is particularly useful when the data exhibits a time dependency. We will begin the lecture by going over machine learning on sequences and showing the limitations and problems that arise from not considering the time domain in our solution. Next, we will delve into recurrent neural networks, showing how they successfully exploit the history of the process. Then, we introduce the graph counterpart, the Graph Recurrent Neural Network (GRNN), and present its stability theorem. We finish the lecture by introducing a numerical example in which we model an epidemic.
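A minimal sketch of the GRNN recurrence is given below: the hidden state is updated by graph filters applied to the current input and to the previous state, followed by a pointwise nonlinearity. The graph, filter taps, and input process are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n, T = 20, 50
S = (rng.random((n, n)) < 0.2).astype(float)
S = np.triu(S, 1) + np.triu(S, 1).T
S /= np.linalg.norm(S, 2)                       # normalize the shift operator

def graph_filter(h, S, x):
    """Apply the graph filter y = sum_k h_k S^k x."""
    z, y = x.copy(), np.zeros_like(x)
    for hk in h:
        y += hk * z
        z = S @ z
    return y

# Illustrative taps for the input-to-state filter A(S) and state-to-state filter B(S).
a = np.array([0.5, 0.3, 0.1])
b = np.array([0.4, 0.2])

# A sequence of graph signals x_0, ..., x_{T-1} (illustrative random process).
X = rng.normal(size=(T, n))

# GRNN recurrence: the hidden state mixes the filtered new input with the
# filtered previous state, z_t = sigma(A(S) x_t + B(S) z_{t-1}).
z = np.zeros(n)
for t in range(T):
    z = np.tanh(graph_filter(a, S, X[t]) + graph_filter(b, S, z))
print(z.shape)   # (20,)
```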

Handout.

Script.

Proof of stability of GRNNs.

Access full lecture playlist.

Lecture 12: Algebraic Neural Networks (11/28 – 12/1)

In this lecture, we study algebraic neural networks (AlgNNs) as formal and natural generalizations of convolutional network architectures. Leveraging the representation theory of algebras and algebraic signal processing, we analyze and study any neural network architecture in which formal convolutions are used. We start by reviewing basic notions of linear algebra before introducing algebraic signal processing, where we formally define the notions of filtering and convolution. Additionally, we show how particular instantiations of the generic algebraic signal model lead to graph signal processing, graphon signal processing, and traditional signal processing. With these notions at hand, we formally define AlgNNs and discuss their stability properties. We also show how graph neural networks, graphon neural networks, and traditional CNNs are particular cases of AlgNNs, and how several results discussed in previous lectures can be obtained at the algebraic level.
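As a notational reference, and under the usual conventions of algebraic signal processing (the specific symbols below are illustrative, not necessarily the lecture's exact ones), an algebraic signal model can be summarized as follows, with graph signal processing recovered by taking the algebra of polynomials and the homomorphism that evaluates a polynomial at the graph shift operator.

```latex
% Algebraic signal model (illustrative notation): filters form an algebra A,
% signals live in a vector space M, and a homomorphism rho realizes filtering.
(\mathcal{A}, \mathcal{M}, \rho), \qquad
\rho : \mathcal{A} \to \operatorname{End}(\mathcal{M}), \qquad
y = \rho(a)\,x \quad \text{for } a \in \mathcal{A},\ x \in \mathcal{M}.
% Graph signal processing is recovered with \mathcal{A} the polynomial algebra
% and \rho(a) = a(\mathbf{S}) for the graph shift operator \mathbf{S}.
```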

Access lecture page.

Handout.

Script.

Proof of Stability of Algebraic Filters.

Recording.


Additional Lectures

Lecture: Processing Graphs with GNNs (11/7 – 12/10)

So far, we have discussed GNNs as the natural extension of CNNs that processes graph signals. However, in several interesting applications the signal of interest is the graph itself. In this lecture we will discuss how to use GNNs to process graphs. We will study the benefits of GNNs compared to other alternatives, their potential limitations, and ways to overcome them. Different GNN architectures will also be considered.

Lecture: Representation Power of GNNs (11/14 – 11/17)

In this lecture, we discuss the representational capabilities of GNNs. In particular, we study the ability of GNNs to solve the graph isomorphism problem, i.e., we investigate whether GNNs can differentiate between different graphs. We begin our discussion by analyzing GNNs as tools for processing graphs and discussing their potential limitations. This is in contrast to previous lectures, where GNNs were analyzed as tools for processing signals supported on a graph. Then we study GNNs with random inputs and show that they avoid the limitations caused by the absence of input signals. Finally, we reach the goal of the lecture and prove that GNNs can differentiate between the majority of real graphs.