Lecture 5: Permutations and Dilations (9/27 – 10/1)
In this lecture we discuss the properties of permutation equivariance and stability to deformations of graph neural networks (GNNs). We start our discussion by introducing the notion of permutation equivariance for graph filters and show how this property is inherited by GNNs. We then study Lipschitz and integral Lipschitz filters, which are needed to state formal results about stability to deformations for both graph filters and GNNs. Next, we formally state stability results for GNNs under deformations associated with absolute (additive) and relative (multiplicative) perturbations. To illuminate these results we discuss the details of the proofs, and finally we discuss in depth the implications and meaning of these results taken together.
Video 5.1 – Permutation Equivariance of Graph Filters
In this video we discuss the concept of permutation equivariance for graph filters. We start by introducing the basic notion of a permutation and then formally state permutation equivariance for the matrix operators associated with graph filters. We show formally why graph filters are permutation equivariant and what the implications of this fact are for the labeling of the information being processed.
• Covers Slides 1-7 in the handout.
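The equivariance claim above can be checked numerically. The sketch below, under the usual conventions, builds a polynomial graph filter H(S)x = Σ_k h_k S^k x and verifies that relabeling the graph and the input signal simply relabels the output; the coefficients and graph are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6

# Random symmetric shift operator (e.g. a weighted adjacency) and graph signal.
S = rng.random((n, n))
S = (S + S.T) / 2
x = rng.random(n)

# Polynomial graph filter: H(S) x = sum_k h_k S^k x.
h = np.array([1.0, 0.5, 0.25])

def graph_filter(S, x, h):
    y = np.zeros_like(x)
    Sk_x = x.copy()           # starts at S^0 x = x
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x       # advance to the next power of S
    return y

# Random permutation matrix P (a relabeling of the nodes).
P = np.eye(n)[rng.permutation(n)]

# Equivariance: filtering the relabeled graph and signal
# yields the relabeled output, H(P S P^T)(P x) = P H(S) x.
y_perm = graph_filter(P @ S @ P.T, P @ x, h)
print(np.allclose(y_perm, P @ graph_filter(S, x, h)))  # True
```

Any relabeling works here because P S P^T raised to the k-th power equals P S^k P^T, and P^T P is the identity.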
Video 5.2 – Permutation Equivariance of Graph Neural Networks
We have studied permutation equivariance and shown that graph filters satisfy this property. In this video we show formally that graph neural networks inherit permutation equivariance from graph filters, partly due to the properties of the pointwise nonlinearity functions mapping the information from one layer to the next in the GNN. Additionally, we discuss further implications of permutation equivariance when processing information on graphs that are symmetric or close to being symmetric.
• Covers Slides 8-15 in the handout.
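The inheritance argument hinges on the nonlinearity acting pointwise, so it commutes with permutations: σ(Px) = Pσ(x). A minimal single-layer sketch, with an arbitrary filter and ReLU as the pointwise nonlinearity, illustrates this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
S = rng.random((n, n))
S = (S + S.T) / 2
x = rng.random(n)
h = np.array([0.8, 0.3, 0.1])

def relu(z):
    # Pointwise nonlinearity: acts on each node entry independently.
    return np.maximum(z, 0.0)

def graph_filter(S, x):
    y, Sk_x = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x
    return y

def gnn_layer(S, x):
    # One GNN layer: graph filter followed by a pointwise nonlinearity.
    return relu(graph_filter(S, x))

P = np.eye(n)[rng.permutation(n)]

# Since relu(P z) = P relu(z), the layer inherits the filter's equivariance.
print(np.allclose(gnn_layer(P @ S @ P.T, P @ x), P @ gnn_layer(S, x)))  # True
```

Stacking layers preserves the property, since a composition of equivariant maps is equivariant.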
Video 5.3 – Lipschitz and Integral Lipschitz Filters
Before discussing stability, we define and study in depth Lipschitz and integral Lipschitz filters. In particular, we discuss in full detail their functional properties and their role in frames for the representation of general functions, with emphasis on their discriminability attributes.
• Covers Slides 16-23 in the handout.
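The two conditions can be probed numerically on a filter's frequency response h(λ): a Lipschitz filter bounds |h'(λ)|, while an integral Lipschitz filter bounds |λ h'(λ)|, forcing the response to flatten at large |λ|. The sketch below uses a hypothetical response h(λ) = λ/(1 + λ²), chosen only for illustration, and estimates both constants on a grid:

```python
import numpy as np

# Hypothetical frequency response: h(lam) = lam / (1 + lam^2).
# It flattens for large |lam|, so lam * h'(lam) stays bounded.
lam = np.linspace(-10.0, 10.0, 2001)
h = lam / (1.0 + lam**2)

# Numerical derivative of the response on the grid.
dh = np.gradient(h, lam)

lipschitz_const = np.max(np.abs(dh))            # estimates sup |h'(lam)|
int_lipschitz_const = np.max(np.abs(lam * dh))  # estimates sup |lam h'(lam)|

print(lipschitz_const, int_lipschitz_const)
```

For this response both constants are finite, but the integral Lipschitz constant is much smaller than the Lipschitz one: the condition |λ h'(λ)| ≤ C allows sharp variation near λ = 0 while demanding flatness at high frequencies, which is exactly the behavior the stability results exploit.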
Video 5.4 – Stability of Graph Filters to Scaling
In this segment we show that graph filters are stable to deformations that can be modeled by means of scalings of the shift operator. We show how this type of deformation affects the spectrum and how the Lipschitz and integral Lipschitz conditions determine whether a graph filter is stable or not.
• Covers Slides 24-32 in the handout.
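The spectral effect of a scaling deformation is easy to verify: replacing S with (1 + ε)S dilates every eigenvalue by the same factor, so the filter's frequency response is sampled at shifted frequencies. A minimal check, with an arbitrary symmetric shift operator:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
S = rng.random((n, n))
S = (S + S.T) / 2

eps = 0.05  # small relative scaling of the shift operator

# Eigenvalues of the original and the scaled shift operator.
lam = np.linalg.eigvalsh(S)
lam_scaled = np.linalg.eigvalsh((1.0 + eps) * S)

# Scaling S dilates the whole spectrum by the factor (1 + eps).
print(np.allclose(lam_scaled, (1.0 + eps) * lam))  # True
```

Large eigenvalues move by (roughly) ελ, an amount that grows with |λ|, which is why integral Lipschitz filters, flat at large |λ|, are the ones whose outputs change little under this deformation.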
Video 5.5 – Stability of Graph Neural Networks to Scaling
We have studied scalings of the shift operator when considering graph filters. Now, we study the same type of deformation for graph neural networks (GNNs). We show that GNNs, like graph filters, are stable to scaling, but that in the case of GNNs the limitation imposed by the tradeoff between stability and selectivity is overcome as a consequence of the pointwise nonlinearity functions mapping information between the layers of the GNN. Graph neural networks can therefore be both stable and discriminative.
• Covers Slides 33-38 in the handout.
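Because common pointwise nonlinearities such as ReLU are 1-Lipschitz, a GNN layer's output can move at most as much as the underlying filter's output under a perturbation, so the layer inherits the filter's stability bound. A small numeric sketch, with arbitrary normalized data and coefficients, shows the output difference under a 1% scaling is itself on the order of 1%:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6
S = rng.random((n, n))
S = (S + S.T) / 2
S /= np.linalg.norm(S, 2)       # normalize so powers of S stay controlled
x = rng.random(n)
x /= np.linalg.norm(x)
h = np.array([0.5, 0.3, 0.2])

def gnn_layer(S, x):
    # Graph filter followed by ReLU, which is pointwise and 1-Lipschitz.
    y, Sk_x = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x
    return np.maximum(y, 0.0)

eps = 0.01  # 1% scaling of the shift operator
diff = np.linalg.norm(gnn_layer((1.0 + eps) * S, x) - gnn_layer(S, x))
print(diff)  # small, on the order of eps
```

The nonlinearity does not worsen the bound, yet by spreading spectral energy across frequencies it lets later layers discriminate signals that a stable (flat-at-high-frequency) linear filter alone could not.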