Lecture 10: Transferability of GNNs (11/6 – 11/8)
In this lecture we discuss the transferability properties of GNNs, that is, the ability to transfer a machine learning model across graphs with performance guarantees. To begin, we delve into the convergence of graph filters to graphon filters in both the spectral and the node domain. Later, we discuss graphon filters as generative models. Our journey continues as we introduce graphon neural networks (WNNs), a key element in explaining why GNNs are transferable between deterministic graphs obtained from a graphon. We finish by reaching the goal of the lecture: showing that GNNs inherit the transferability properties of graph filters.
• Handout.
• Script.
• Access full lecture playlist.
• Convergence of filter response for Lipschitz continuous graph filters (proof).
• Graph-Graphon filter approximation theorem (proof).
• Graphon neural network approximation by graph neural network theorem (proof).
Video 10.1 – Convergence of Graph Filters in the Spectral Domain
In this part of the lecture, we consider convergent sequences of graphs along with associated sequences of graph filters. We show that if the graph sequence converges towards a graphon, the graph filter sequence converges towards a graphon filter in the frequency domain. Concretely, we present the convergence of graph filters in the frequency domain theorem, which states that the GFT representation of the graph filter sequence converges to the WFT representation of the limit graphon filter. This result is neither unexpected nor particularly strong, but this lecture is just the beginning.
• Covers Slides 1-5 in the handout.
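To see what this spectral convergence looks like numerically, here is a minimal sketch of our own, not part of the handout. It builds weighted template graphs of increasing size from an assumed graphon W(u, v) = min(u, v), scales the shift operator by 1/n so that graph and graphon spectra align, and evaluates an illustrative polynomial frequency response at the leading eigenvalues. The graphon, the filter taps, and the helper names are all assumptions made for illustration.

```python
import numpy as np

# Template graph from the (assumed) graphon W(u, v) = min(u, v): a sketch of
# spectral convergence, not code from the course.
def sampled_shift(n):
    u = (np.arange(n) + 0.5) / n            # regular sample points in (0, 1)
    return np.minimum.outer(u, u) / n       # 1/n scaling aligns graph/graphon spectra

def freq_response(lam, taps):
    # Polynomial frequency response h(lambda) = sum_k h_k * lambda**k
    return sum(h * lam**k for k, h in enumerate(taps))

taps = [1.0, 0.5, 0.25]                     # illustrative filter coefficients
for n in (50, 200, 800):
    lam = np.sort(np.linalg.eigvalsh(sampled_shift(n)))[::-1]
    # The leading eigenvalues approach those of the graphon's integral operator,
    # so the GFT responses h(lambda_i) approach the WFT responses.
    print(n, np.round(freq_response(lam[:3], taps), 4))
```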
Video 10.2 – Convergence of Graph Filters in the Node Domain
In this part of the lecture, we show that a sequence of graph filters converges towards a graphon filter for convergent graph signal sequences. Unlike the previous part, our statements here concern the node domain. Convergence in the node domain, however, initially requires the graphon signal to be bandlimited. To get around this restriction, we turn to Lipschitz graphon filters, which yield a stronger result: convergence for any graphon signal. The proof of this theorem can be found here.
• Covers Slides 6-12 in the handout.
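The following sketch, again our own with the same assumed graphon W(u, v) = min(u, v) and an assumed signal X(u) = cos(2πu), illustrates node-domain convergence: it filters the sampled signal on template graphs of growing size and compares the outputs, viewed as step functions on [0, 1], against the output on a much finer graph standing in for the graphon filter.

```python
import numpy as np

# Node-domain convergence sketch: filter a sampled graphon signal on template
# graphs of growing size and compare outputs as step functions in L2([0, 1]).
# Graphon, signal, and taps are illustrative assumptions.
def filter_output(n, taps):
    u = (np.arange(n) + 0.5) / n
    S = np.minimum.outer(u, u) / n          # shift from W(u, v) = min(u, v)
    x = np.cos(2 * np.pi * u)               # sampled graphon signal X(u)
    y, Sx = np.zeros(n), x.copy()
    for h in taps:                          # y = sum_k h_k S^k x
        y += h * Sx
        Sx = S @ Sx
    return y

taps = [1.0, 0.5, 0.25]
n_ref = 3200                                # fine graph standing in for the graphon
y_ref = filter_output(n_ref, taps)
for n in (50, 200, 800):
    y_step = np.repeat(filter_output(n, taps), n_ref // n)  # induce to the fine grid
    print(n, np.sqrt(np.mean((y_step - y_ref) ** 2)))       # L2 gap shrinks with n
```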
Video 10.3 – Graphon Filters are Generative Models for Graph Filters
In this part of the lecture, we discuss the conditions under which graph filters can approximate graphon filters, and we quantify how good that approximation is for graphs of different sizes. As the number of nodes increases, the graph filter behaves more and more like the graphon filter, so we can use graph filters on large graphs as approximations of the graphon filter, and conversely view the graphon filter as a generative model for these graph filters.
• Covers Slides 13-28 in the handout.
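One way to read this result: a fixed set of coefficients h_k defines a graphon filter, and instantiating h(S) = Σ_k h_k S^k with the shift operator of any graph drawn from the graphon generates a graph filter that approximates it. The sketch below is our own; as before, the graphon W(u, v) = min(u, v), the signal, and the taps are assumptions. It draws stochastic graphs of growing size from the graphon and measures how far their filter outputs are from the weighted template's.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_filter(S, x, taps):
    # Polynomial graph filter y = sum_k h_k S^k x
    y, Sx = np.zeros_like(x), x.copy()
    for h in taps:
        y += h * Sx
        Sx = S @ Sx
    return y

taps = [1.0, 0.5, 0.25]
for n in (100, 400, 1600):
    u = (np.arange(n) + 0.5) / n
    W = np.minimum.outer(u, u)              # edge probabilities from the graphon
    A = (rng.uniform(size=(n, n)) < W).astype(float)
    A = np.triu(A, 1); A = A + A.T          # symmetric stochastic graph, no loops
    x = np.cos(2 * np.pi * u)
    y_stoch = apply_filter(A / n, x, taps)  # filter generated on the random graph
    y_templ = apply_filter(W / n, x, taps)  # filter on the weighted template graph
    print(n, np.sqrt(np.mean((y_stoch - y_templ) ** 2)))  # error decreases with n
```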
Video 10.4 – Transferability of Graph Filters: Theorem
Our next goal is to prove the transferability of graph filters. That is to say, in this lecture we will show that if two graphs with different numbers of nodes are sampled from the same graphon, then graph filters with the same coefficients instantiated on each graph produce outputs that are close. After stating the assumptions and the definitions, we present the graph filter transferability theorem, whose proof can be found in the proof article.
• Covers Slides 29-34 in the handout.
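A small numerical counterpart to the theorem, under the same illustrative assumptions as the earlier sketches: the same filter taps are run on two template graphs of different sizes drawn from one graphon, and the gap between the induced outputs shrinks as both graphs grow.

```python
import numpy as np

# Transferability sketch: one set of taps, two graph sizes, one graphon.
def filter_on_template(n, taps):
    u = (np.arange(n) + 0.5) / n
    S = np.minimum.outer(u, u) / n          # W(u, v) = min(u, v), illustrative
    x = np.cos(2 * np.pi * u)
    y, Sx = np.zeros(n), x.copy()
    for h in taps:
        y += h * Sx
        Sx = S @ Sx
    return y

taps = [1.0, 0.5, 0.25]
for n in (50, 200, 800):
    y_n, y_2n = filter_on_template(n, taps), filter_on_template(2 * n, taps)
    # Compare as step functions: nested grids let us upsample by repetition.
    gap = np.sqrt(np.mean((np.repeat(y_n, 2) - y_2n) ** 2))
    print(n, "vs", 2 * n, "->", gap)        # the gap shrinks as both graphs grow
```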
Video 10.5 – Graphon Neural Networks
In this lecture, we introduce graphon neural networks (WNNs). We define them and compare them with their GNN counterparts, and in doing so we discuss their interpretation as generative models for GNNs. We also consider sequences of GNNs converging to a WNN. Finally, we present the GNN-WNN approximation theorem, which establishes an asymptotic error bound for approximating WNNs with GNNs.
• Covers Slides 35-41 in the handout.
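Since WNNs and GNNs share the same layered structure of filters composed with pointwise nonlinearities, a single set of coefficients defines both. The sketch below is our own; the graphon, the signal, the taps, and the two-layer architecture are assumptions. It evaluates a GNN with fixed weights on template graphs of growing size and uses a very large graph as a stand-in for the WNN.

```python
import numpy as np

def gnn(S, x, layers):
    # Minimal single-feature GNN: each layer is a polynomial graph filter
    # followed by a pointwise ReLU (a sketch, not the course implementation).
    for taps in layers:
        y, Sx = np.zeros_like(x), x.copy()
        for h in taps:
            y += h * Sx
            Sx = S @ Sx
        x = np.maximum(y, 0.0)
    return x

layers = [[1.0, 0.5], [0.8, -0.3]]          # shared weights fit any graph size

def run_on_template(n):
    u = (np.arange(n) + 0.5) / n
    S = np.minimum.outer(u, u) / n          # template from W(u, v) = min(u, v)
    return gnn(S, np.cos(2 * np.pi * u), layers)

# A very large template graph stands in for the WNN itself.
y_wnn = run_on_template(3200)
for n in (50, 200, 800):
    y_step = np.repeat(run_on_template(n), 3200 // n)
    print(n, np.sqrt(np.mean((y_step - y_wnn) ** 2)))  # GNN approaches the WNN
```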
Video 10.6 – Transferability of GNNs
In this lecture we discuss the transferability of GNNs. Using the GNN-WNN approximation theorem and considering a broad class of graphons and graphon signals, we show that GNNs are transferable between graphs of different sizes associated with the same graphon, and we derive an upper bound on the error that decreases asymptotically with the size of the graphs. This analysis shows that GNNs are scalable and robust to increases in graph size. The proof can be found here.
• Covers Slides 42-52 in the handout.
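To close, a transferability sketch in the same spirit as the filter example above, with all modeling choices again our assumptions: one fixed set of GNN weights is executed on pairs of stochastic graphs of sizes n and 2n drawn from the same graphon, and the gap between the induced outputs shrinks as the graphs grow.

```python
import numpy as np

rng = np.random.default_rng(1)

def gnn(S, x, layers):
    # Same minimal GNN as above: polynomial filters with ReLU nonlinearities.
    for taps in layers:
        y, Sx = np.zeros_like(x), x.copy()
        for h in taps:
            y += h * Sx
            Sx = S @ Sx
        x = np.maximum(y, 0.0)
    return x

def gnn_on_sampled_graph(n, layers):
    u = (np.arange(n) + 0.5) / n
    W = np.minimum.outer(u, u)              # illustrative graphon, as before
    A = (rng.uniform(size=(n, n)) < W).astype(float)
    A = np.triu(A, 1); A = A + A.T          # stochastic graph from the graphon
    return gnn(A / n, np.cos(2 * np.pi * u), layers)

layers = [[1.0, 0.5], [0.8, -0.3]]          # one set of weights, transferred
for n in (100, 400, 1600):
    y_n = gnn_on_sampled_graph(n, layers)
    y_2n = gnn_on_sampled_graph(2 * n, layers)
    gap = np.sqrt(np.mean((np.repeat(y_n, 2) - y_2n) ** 2))
    print(n, "vs", 2 * n, "->", gap)        # transference error shrinks with n
```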