Meetings

Live Lecture Questions

We meet on Tuesdays and Thursdays at 3:30 pm in the AGH auditorium. These meetings are organized around 4 or 5 questions that I will pose and answer. You can access the questions on this page. Do remember that in these meetings I will assume you have already watched the video lectures. You will not be able to understand what I am saying if you haven't done so.

Questions for Lecture 1. There are two questions in the slides that have to do with the connections between convolutions in time, convolutions on images, and convolutions on graphs. It may be a good idea to read this short paper.
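
A rough guide to these questions: all three convolutions can be written as polynomials of a shift operator $\mathbf{S}$, which is what makes the graph case a genuine generalization of the other two. Schematically,

\[ \mathbf{y} \;=\; \sum_{k=0}^{K-1} h_k \, \mathbf{S}^k \mathbf{x}, \]

where $\mathbf{S}$ is the cyclic time shift for signals in time, a translation operator for images, and the graph shift operator (e.g., an adjacency or Laplacian matrix) for graph signals.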

Questions for Lecture 2. The definition of AI as empirical risk minimization.
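
Schematically, empirical risk minimization searches for a function that minimizes an average loss over observed data pairs; the symbols below are generic placeholders rather than the exact notation of the lecture:

\[ \hat{f} \;=\; \operatorname*{argmin}_{f \in \mathcal{F}} \; \frac{1}{N} \sum_{n=1}^{N} \ell\big( f(\mathbf{x}_n), \mathbf{y}_n \big), \]

where $\mathcal{F}$ is the family of admissible functions (for us, graph filters and GNNs) and $\ell$ is a loss function.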

Questions for Lecture 3. Introduction of graph signal processing concepts.
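
In rough terms, the central object in these questions is the graph Fourier transform (GFT). Assuming the graph shift operator is normal, $\mathbf{S} = \mathbf{V} \boldsymbol{\Lambda} \mathbf{V}^H$, the GFT of a graph signal $\mathbf{x}$ is its projection onto the eigenvectors of $\mathbf{S}$:

\[ \tilde{\mathbf{x}} \;=\; \mathbf{V}^H \mathbf{x}, \qquad \mathbf{x} \;=\; \mathbf{V} \tilde{\mathbf{x}}. \]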

Questions for Lecture 4. The momentous lecture in which we introduce the definition of graph neural networks. We have several architectural questions along with some conceptual questions on permutation equivariance and the behavior of graph filter banks in the GFT domain.
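
For reference, permutation equivariance admits a compact statement. Writing $\Phi(\mathbf{x}; \mathbf{S})$ for the output of a graph filter or GNN with shift operator $\mathbf{S}$, and $\mathbf{P}$ for any permutation matrix,

\[ \Phi\big( \mathbf{P}^T \mathbf{x} ;\, \mathbf{P}^T \mathbf{S} \mathbf{P} \big) \;=\; \mathbf{P}^T \, \Phi(\mathbf{x}; \mathbf{S}). \]

That is, relabeling the nodes of the graph simply relabels the output. In the GFT domain, each filter in a filter bank acts by multiplying the GFT coefficients by its frequency response $h(\lambda) = \sum_k h_k \lambda^k$ evaluated at the corresponding eigenvalues.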

Questions for Lecture 5. Likely my favorite lecture of the term. We get to discuss permutation equivariance formally and introduce a basic analysis of the stability properties of graph filters and GNNs. There are only 4 questions that we will cover on Wednesday. On Friday, we will go over the proof of the stability of graph filters to scalings. A group of students will be contacted by the instructor to help with this session.
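
To fix ideas for Friday's session, the result we are after says, roughly, that if the shift operator is scaled as $\tilde{\mathbf{S}} = (1+\varepsilon)\mathbf{S}$ and the filter has an integral Lipschitz frequency response, then the filter outputs differ by a term proportional to $\varepsilon$; schematically,

\[ \big\| \mathbf{H}(\tilde{\mathbf{S}})\,\mathbf{x} - \mathbf{H}(\mathbf{S})\,\mathbf{x} \big\| \;\leq\; C\,\varepsilon\,\|\mathbf{x}\| \;+\; \mathcal{O}(\varepsilon^2), \]

where the constant $C$ depends on the integral Lipschitz property of the filter. The precise constants are part of what we will work out in class.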

Questions for Lecture 6. We will cover Questions 1-9 on Wednesday, October 11. This week is devoted to studying the stability properties of graph neural networks. The only reason this is not my favorite lecture of the term is that Lecture 5 covers similar concepts at a more intuitive level. However, I intend to make professionals out of you, and professionalism is in the details. We are going over the details this week.
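
A schematic preview of where the details lead: if every filter in every layer satisfies a stability bound of the form above and the pointwise nonlinearities are normalized Lipschitz, then the stability of the individual filters propagates through the layers and the GNN inherits a bound of the same form,

\[ \big\| \Phi(\mathbf{x}; \tilde{\mathbf{S}}) - \Phi(\mathbf{x}; \mathbf{S}) \big\| \;\lesssim\; L\, C\, \varepsilon\, \|\mathbf{x}\|, \]

where $L$ is the number of layers and the symbol $\lesssim$ hides architecture-dependent constants. This is a summary of the shape of the result, not the exact statement we will prove.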

Questions for Lecture 7. No questions.

Questions for Lecture 8. This week is devoted to a review. We have some interesting questions to cover on the stability vs. discriminability tradeoff of graph filters and GNNs. We will go over them on Wednesday and Friday. We also have some interesting digressions on the performance of graph filters and GNNs relative to linear regression and fully connected neural networks. We will go over them on Friday.
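
A hint at the tradeoff questions: stability requires filters whose frequency response $h(\lambda)$ is integral Lipschitz,

\[ \big| \lambda \, h'(\lambda) \big| \;\leq\; C, \]

which forces the response to flatten at large $|\lambda|$ and therefore limits how finely a linear filter can discriminate high graph-frequency components. GNNs partly escape this tradeoff because pointwise nonlinearities spread energy from high frequencies into lower ones, where stable filters can still discriminate.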

Questions for Lecture 9. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. This week sees the introduction of graphons, graphon signals, and the foundational concepts of graphon signal processing: a Fourier transform and the definition of graphon filters. Graphons are limit objects of graphs and, consequently, graphon Fourier transforms and graphon filters are limit objects of graph Fourier transforms and graph filters. This is instrumental in explaining the transferability of graph filters and GNNs, which we will briefly touch upon this week and elaborate on extensively next week.
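
For orientation, a graphon $W: [0,1]^2 \to [0,1]$ plays the role of the shift operator through the integral operator it induces on graphon signals $X: [0,1] \to \mathbb{R}$, and a graphon filter applies a polynomial of this operator, mirroring the definition of a graph filter:

\[ (T_W X)(v) \;=\; \int_0^1 W(u, v)\, X(u)\, du, \qquad Y \;=\; \sum_{k=0}^{K-1} h_k \, T_W^k X. \]

The graphon Fourier transform is, in turn, the expansion of $X$ in the eigenfunctions of $T_W$.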

Questions for Lecture 10. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. We leverage our introduction of graphons to study the transferability of graph filters and GNNs. Transferability is proven by comparing graph filters and GNNs with graphon filters and graphon neural networks.
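
In qualitative terms, the argument interprets a graph $\mathbf{S}_n$ with $n$ nodes as drawn from, or induced by, a graphon $W$, and shows that under Lipschitz-type conditions on the filters the graph and graphon outputs approach each other as $n$ grows; very schematically,

\[ \big\| \Phi(\mathbf{x}_n; \mathbf{S}_n) - \Phi(X; W) \big\| \;\to\; 0 \quad \text{as } n \to \infty, \]

where the comparison is made through the graphon signals induced by $\mathbf{x}_n$ and $\Phi(\mathbf{x}_n; \mathbf{S}_n)$. Two large graphs associated with the same graphon therefore produce nearly the same outputs, which is what we mean by transferability.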

Questions for Lecture 11. We study Markov models and hidden Markov models to motivate the introduction of recurrent neural networks and graph recurrent neural networks. We also discuss gating in time and space.
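
As a schematic reference, a vanilla RNN updates a hidden state with learned linear maps, and a graph recurrent neural network (GRNN) replaces those linear maps with graph filters so that the hidden state is itself a graph signal; the notation below is generic:

\[ \text{RNN:}\;\; \mathbf{z}_t = \sigma\big( \mathbf{A}\mathbf{x}_t + \mathbf{B}\mathbf{z}_{t-1} \big), \qquad \text{GRNN:}\;\; \mathbf{z}_t = \sigma\big( \mathbf{A}(\mathbf{S})\,\mathbf{x}_t + \mathbf{B}(\mathbf{S})\,\mathbf{z}_{t-1} \big), \]

where $\mathbf{A}(\mathbf{S})$ and $\mathbf{B}(\mathbf{S})$ are polynomial graph filters and $\sigma$ is a pointwise nonlinearity. Gating, in time or in space, modulates these updates with learned multiplicative factors.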

Questions for Lecture 12. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. This lecture introduces algebraic signal processing (ASP) along with its two fundamental tools: algebraic filters and algebraic neural networks. ASP is an abstract formulation from which we can recover graph, time, and image signal processing, among others, as particular cases. We will see how the fundamental stability results of graph neural networks and convolutional neural networks are manifestations of a common phenomenon that holds for general algebraic filters and algebraic neural networks.
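
A rough picture of the abstraction: an algebraic signal model is a triplet $(\mathcal{A}, \mathcal{M}, \rho)$ in which $\mathcal{A}$ is an algebra of abstract filters, $\mathcal{M}$ is a vector space of signals, and $\rho$ is a homomorphism that instantiates each abstract filter as a concrete operator on signals,

\[ a \in \mathcal{A} \;\longmapsto\; \rho(a): \mathcal{M} \to \mathcal{M}, \qquad \mathbf{y} \;=\; \rho(a)\,\mathbf{x}. \]

Graph signal processing is recovered, for instance, by taking $\mathcal{A}$ to be an algebra of polynomials and $\rho$ the map that evaluates a polynomial at the graph shift operator $\mathbf{S}$. Algebraic neural networks compose such filters with pointwise nonlinearities.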