# Algebraic Neural Networks

In this lecture, we study algebraic neural networks (AlgNNs) as formal and natural generalizations of convolutional network architectures. Leveraging the representation theory of algebras and algebraic signal processing, we analyze any neural network architecture in which formal convolutions are used. We start by reviewing basic notions of linear algebra before turning to algebraic signal processing, where we formally define the notions of filtering and convolution. We then show how particular instantiations of the generic algebraic signal model lead to graph signal processing, graphon signal processing, and traditional signal processing. With these notions at hand, we formally define AlgNNs and discuss their stability properties. We also show how graph neural networks, graphon neural networks, and traditional CNNs are particular cases of AlgNNs, and how several results discussed in previous lectures can be obtained at the algebraic level.

• Handout.

• Script.

• Proof of the stability of algebraic filters.

• Access full lecture playlist.

### Video 12.1 – Linear Algebra

In this part of the lecture we review basic concepts in linear algebra that are necessary for the analysis of algebraic signal models. We emphasize the notions of fields, vector spaces, and algebras, and then discuss their role in the linear processing of information in general.
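As a concrete warm-up for the notion of an algebra, the sketch below (with illustrative names, not taken from the lecture) checks that polynomials of a fixed matrix `S` form a commutative algebra sitting inside the algebra of all endomorphisms of R^n:

```python
import numpy as np

rng = np.random.default_rng(0)
S = rng.standard_normal((4, 4))  # an arbitrary endomorphism of R^4

def poly(coeffs, S):
    """Evaluate p(S) = sum_k coeffs[k] * S^k."""
    out = np.zeros_like(S)
    P = np.eye(S.shape[0])
    for c in coeffs:
        out = out + c * P
        P = P @ S
    return out

A = poly([1.0, 2.0, 0.5], S)   # p(S) = I + 2S + 0.5 S^2
B = poly([0.0, -1.0, 3.0], S)  # q(S) = -S + 3 S^2

# Closure and commutativity inside this subalgebra: p(S) q(S) = q(S) p(S)
assert np.allclose(A @ B, B @ A)
```

Generic endomorphisms do not commute; restricting to polynomials of a single operator is what produces this commutative structure.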

• Covers Slides 1-9 in the handout.

### Video 12.2 – Algebraic Signal Processing

In the previous video we saw that the linear processing of a signal can be expressed as the application of an endomorphism on a vector space. We also highlighted that the space of all endomorphisms of a vector space is itself an algebra, but one that does not allow for the exploitation of signal structure. We know that the introduction of convolutional filters is necessary to leverage structure. In this video we explain the use of algebras and homomorphisms to restrict the set of allowable linear transformations that can be applied to a signal.

• Covers Slides 10-16 in the handout.

### Video 12.3 – Polynomials in an Algebra and Polynomial Functions

Polynomials and polynomial functions play a central role in algebraic signal processing. This section is a short aside to introduce definitions that we will use later on.

• Covers Slides 17-21 in the handout.

### Video 12.4 – Generators, Shift Operators and Frequency Representations

Different forms of convolutional signal processing can be recast into the common abstract framework of algebraic signal processing. We use algebras and homomorphisms to define different types of convolutional filters. Three central components appear in the analysis of these filters: generators, shift operators, and frequency representations. We cover these three concepts in this section.
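The frequency representation can be illustrated numerically. Assuming a symmetric shift operator `S = V diag(lam) V^T` (an illustrative choice; names below are not from the lecture), a polynomial filter `h(S)` acts in the spectral domain as pointwise multiplication by the frequency response `h(lam)`:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6))
S = (A + A.T) / 2                     # symmetric shift operator
lam, V = np.linalg.eigh(S)            # eigendecomposition S = V diag(lam) V^T

h = np.array([0.5, 1.0, -0.25])       # filter coefficients h_k

# Filter in the vertex/operator domain: h(S) = sum_k h_k S^k
HS = sum(c * np.linalg.matrix_power(S, k) for k, c in enumerate(h))

# Frequency response: h(lam_i) = sum_k h_k lam_i^k, evaluated per eigenvalue
freq_response = sum(c * lam**k for k, c in enumerate(h))

# h(S) = V diag(h(lam)) V^T
assert np.allclose(HS, V @ np.diag(freq_response) @ V.T)
```

The eigenvalues of the shift operator play the role of frequencies, and the scalar function `h` evaluated at them is the filter's frequency representation.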

• Covers Slides 22-34 in the handout.

### Video 12.5 – Convolutional Information Processing

Algebraic filters provide a generic framework out of which we can extract the commonalities of different forms of convolutional information processing. To substantiate this claim we have to show that, indeed, it is possible to express familiar convolutional filters in the language of algebraic filters. Doing so requires the specification of vector spaces, algebras, and homomorphisms, which we do in this section for graph, time, and image processing.
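For the time-processing instantiation, a quick check (an illustrative sketch, not the lecture's notation): taking the cyclic shift matrix `C` as the shift operator, the algebraic filter `h(C) = sum_k h_k C^k` reproduces ordinary circular convolution.

```python
import numpy as np

n = 8
C = np.roll(np.eye(n), 1, axis=0)     # cyclic shift: (C x)[i] = x[(i-1) mod n]

x = np.arange(n, dtype=float)         # a discrete time signal
h = np.array([1.0, -2.0, 0.5])        # filter taps

# Algebraic filter: polynomial in the shift operator applied to x
y_alg = sum(c * np.linalg.matrix_power(C, k) @ x for k, c in enumerate(h))

# Classical circular convolution via the FFT (h zero-padded to length n)
y_conv = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, n)))

assert np.allclose(y_alg, y_conv)
```

Replacing `C` with a graph shift operator (e.g., an adjacency or Laplacian matrix) turns the same polynomial expression into a graph convolution, which is the sense in which the algebraic model unifies these settings.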

• Covers Slides 35-45 in the handout.

### Video 12.6 – Algebraic Neural Networks

With the concept of algebraic filtering in place, we introduce the algebraic neural network architecture in this video.
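A minimal sketch of the layered structure, under illustrative assumptions (single feature per node, ReLU as the pointwise nonlinearity, hard-coded coefficients): each layer composes an algebraic filter, i.e., a polynomial in the shift operator, with a pointwise nonlinearity.

```python
import numpy as np

def algnn_layer(x, S, coeffs):
    """One layer: algebraic filter h(S) x followed by a pointwise ReLU."""
    z = np.zeros_like(x)
    P = np.eye(S.shape[0])
    for c in coeffs:
        z = z + c * (P @ x)   # accumulate c_k S^k x
        P = P @ S
    return np.maximum(z, 0.0)  # pointwise nonlinearity

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
S = (A + A.T) / 2              # illustrative symmetric shift operator
x = rng.standard_normal(5)

# A two-layer AlgNN: compose (filter + nonlinearity) twice
y = algnn_layer(algnn_layer(x, S, [0.2, 0.5]), S, [1.0, -0.3])
assert y.shape == (5,) and np.all(y >= 0)
```

Swapping in different shift operators and algebras recovers GNNs, graphon neural networks, and CNNs from this same template.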

• Covers Slides 46-52 in the handout.

### Video 12.7 – Perturbation Models

In this video we generalize the notion of perturbation to generic algebraic signal processing models. In particular, we show that changes in the filters over the algebra of endomorphisms can be expressed as perturbation functions defined on the set of shift operators. We also provide some examples considering particular instantiations of the algebraic signal model.
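A small numerical illustration of a perturbation defined on the set of shift operators: here the relative model `S' = S + eps * S` is an illustrative choice, and we observe that a small perturbation of the shift operator produces a small change in the filter output.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))
S = (A + A.T) / 2
eps = 1e-3
S_pert = S + eps * S                  # perturbation as a function of S

def filt(S, x, coeffs=(1.0, 0.8, -0.2)):
    """Algebraic filter h(S) x = sum_k h_k S^k x."""
    return sum(c * np.linalg.matrix_power(S, k) @ x
               for k, c in enumerate(coeffs))

x = rng.standard_normal(5)
diff = np.linalg.norm(filt(S_pert, x) - filt(S, x))

# The output change is on the order of eps, not order one
assert diff < 1e-1
```

The stability theorems in the next video make this observation quantitative, bounding the output deviation in terms of the size of the perturbation.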

• Covers Slides 53-58 in the handout.

### Video 12.8 – Stability Theorems

In this part of the lecture, we define stability in the context of algebraic signal processing and use this definition to show that algebraic filters and algebraic neural networks are stable. The results closely resemble those derived for the stability of GNNs, since the latter can be obtained as a particular case of the stability theorems derived for algebraic signal models. The proof of the stability of algebraic filters can be found here.

• Covers Slides 59-66 in the handout.

### Video 12.9 – Spectral Representations

Central to the analysis of algebraic signal models is the notion of spectral or Fourier decompositions. In this video we introduce this notion based on the concept of decompositions in terms of irreducible subrepresentations.
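In the simplest instantiation, with a symmetric shift operator whose eigendecomposition is `S = V diag(lam) V^T` (an illustrative setup; the lecture treats the general case via irreducible subrepresentations), the spectral decomposition amounts to projecting a signal onto the eigenvectors of `S`, and this transform is invertible:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((6, 6))
S = (A + A.T) / 2
lam, V = np.linalg.eigh(S)   # orthonormal eigenvectors: the spectral basis

x = rng.standard_normal(6)
x_hat = V.T @ x              # "Fourier" coefficients of x with respect to S
x_rec = V @ x_hat            # inverse transform recovers the signal

assert np.allclose(x_rec, x)
```

Each eigenvector spans a one-dimensional invariant subspace of `S`, which is the commutative-case shadow of the decomposition into irreducible subrepresentations.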

• Covers Slides 67-75 in the handout.