Interactive Meetings
We meet on Wednesdays and Fridays at 8:30 am and 10:15 am Eastern Time at 3401 Walnut St., room 401B, or through Zoom. To access these meetings you need the link and password provided by your instructor. These meetings are interactive Q&A sessions. You can access the questions on this page.
Questions
You will be assigned questions to respond to during lectures. You will be randomly assigned to groups of 3-4 students, which will be announced at the beginning of the semester. Each week, the students in each group should prepare 1-3 slides to answer the questions they are assigned (question assignments will be announced before the beginning of every week). Groups will then deliver their responses during the meetings on Wednesdays or Fridays, with a 5-minute response per group. Subsequently, the instructors will provide feedback on the question and the presented answer, and they will open the floor for discussion.
Questions for Week 1. Answers to be prepared by Tuesday, August 30. If you want to reuse my drawings when preparing your responses, you can download the LaTeX source of the slides. This is optional. Two questions in the slides have to do with the connections between convolutions in time, convolutions on images, and convolutions on graphs. It may be a good idea to read this short paper before you prepare your responses.
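As a warm-up for those questions, here is a minimal numpy sketch (mine, not part of the assigned questions) showing that convolution in time is a graph convolution on the directed cycle: the filter is a polynomial in the cyclic shift matrix, and replacing that matrix with an arbitrary graph shift operator yields the graph convolution.

```python
import numpy as np

# Shift operator of a directed cycle on n nodes: (S x)[i] = x[i-1].
n = 8
S = np.roll(np.eye(n), 1, axis=0)

h = np.array([1.0, 0.5, 0.25])      # filter taps h_0, h_1, h_2
x = np.random.randn(n)              # signal supported on the cycle

# Graph convolution: y = sum_k h_k S^k x
y = sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))

# This coincides with the circular convolution of h and x in time
h_pad = np.concatenate([h, np.zeros(n - len(h))])
y_time = np.real(np.fft.ifft(np.fft.fft(h_pad) * np.fft.fft(x)))
assert np.allclose(y, y_time)
```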
Questions for Week 2. Answers to be prepared by Tuesday, September 6. The definition of AI as empirical risk minimization.
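For reference, the empirical risk minimization formulation the lecture builds on can be stated as follows; the notation here is my own and may differ slightly from the slides.

```latex
% Empirical risk minimization: choose the function in a hypothesis
% class C that minimizes the average loss over N training samples.
\[
    \Phi^\ast \;=\; \operatorname*{argmin}_{\Phi \in \mathcal{C}}
    \;\frac{1}{N} \sum_{i=1}^{N} \ell\big(\Phi(x_i),\, y_i\big)
\]
```

Restricting the hypothesis class to graph filters or graph neural networks is what turns this generic formulation into machine learning on graphs.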
Questions for Week 3. Answers to be prepared by Tuesday, September 13. Introduction of graph signal processing concepts.
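A small numpy sketch of the central object of graph signal processing, the graph Fourier transform, may help when preparing answers. The names here are mine, and the choice of a symmetric adjacency as shift operator is one of several used in the course.

```python
import numpy as np

# Graph Fourier transform: project a graph signal onto the
# eigenvectors of a symmetric graph shift operator S = V diag(lam) V^T.
n = 6
A = np.random.rand(n, n)
S = np.triu(A, 1) + np.triu(A, 1).T          # symmetric weighted adjacency

lam, V = np.linalg.eigh(S)
x = np.random.randn(n)

x_hat = V.T @ x                              # GFT
assert np.allclose(x, V @ x_hat)             # inverse GFT recovers x

# A graph filter y = sum_k h_k S^k x acts pointwise in the GFT domain:
h = np.array([1.0, -0.5, 0.1])
y = sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))
assert np.allclose(V.T @ y, np.polyval(h[::-1], lam) * x_hat)
```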
Questions for Week 4. Answers to be prepared by Tuesday, September 20. The momentous lecture in which we introduce the definition of graph neural networks. We have several architectural questions along with some conceptual questions on permutation equivariance and the behavior of graph filter banks in the GFT domain.
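To make the architectural questions concrete, here is a sketch of a single GNN layer built from a graph filter bank followed by a pointwise nonlinearity. The function names and tensor shapes are my own convention, not the course's.

```python
import numpy as np

def graph_filter(S, X, H):
    """Graph filter bank with taps H of shape (K, F_in, F_out):
    Y = sum_k S^k X H_k, mapping F_in features to F_out features."""
    Y = np.zeros((X.shape[0], H.shape[2]))
    SkX = X.copy()
    for k in range(H.shape[0]):
        Y += SkX @ H[k]
        SkX = S @ SkX            # next diffusion step: S^{k+1} X
    return Y

def gnn_layer(S, X, H):
    """One GNN layer: graph filter bank plus pointwise ReLU."""
    return np.maximum(graph_filter(S, X, H), 0.0)

# Toy usage on a random symmetric graph
n, F_in, F_out, K = 10, 3, 4, 3
S = np.random.rand(n, n); S = (S + S.T) / 2
X = np.random.randn(n, F_in)
H = np.random.randn(K, F_in, F_out) / K
Y = gnn_layer(S, X, H)           # shape (n, F_out)
```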
Questions for Week 5. Answers to be prepared by Tuesday, September 27. Likely my favorite lecture of the term. We get to discuss permutation equivariance formally and introduce a basic analysis of the stability properties of graph filters and GNNs. There are only 4 questions that we will cover on Wednesday. On Friday, we will go over the proof of stability of graph filters to scalings. A group of students will be contacted by the instructor to help with this session.
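The equivariance claim is easy to verify numerically. The following sketch (mine, with hypothetical helper names) checks that relabeling the graph and the input relabels the filter output in the same way.

```python
import numpy as np

# Permutation equivariance of graph filters: for H(S)x = sum_k h_k S^k x
# and any permutation matrix P,  H(P^T S P)(P^T x) = P^T H(S) x.
n = 7
S = np.random.rand(n, n); S = (S + S.T) / 2
x = np.random.randn(n)
h = np.array([0.3, 0.5, -0.2])

def filt(S, x, h):
    return sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))

P = np.eye(n)[np.random.permutation(n)]      # random permutation matrix
lhs = filt(P.T @ S @ P, P.T @ x, h)          # filter on the relabeled graph
rhs = P.T @ filt(S, x, h)                    # relabeled filter output
assert np.allclose(lhs, rhs)
```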
Questions for Week 6. Answers to be prepared by Tuesday, October 4. We will cover Questions 1-5 on Wednesday and Questions 6-9 on Friday. This week is devoted to studying the stability properties of graph neural networks. The only reason this is not my favorite lecture of the term is that Lecture 5 covers similar concepts at a more intuitive level. However, I intend to make professionals out of you. And professionals are in the details. We are going over details this week.
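If you want a feel for the stability statements before the lecture, a quick numerical experiment along the following lines (my own sketch, not one of the assigned questions) shows the output perturbation of a graph filter shrinking with the size of the perturbation of the shift operator.

```python
import numpy as np

n = 20
S = np.random.rand(n, n); S = (S + S.T) / 2
x = np.random.randn(n); x /= np.linalg.norm(x)
h = np.array([1.0, 0.5, 0.25, 0.125])

def filt(S, x, h):
    return sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))

# Perturb S by a symmetric matrix E of operator norm eps; compare outputs
for eps in [1e-1, 1e-2, 1e-3]:
    E = np.random.rand(n, n); E = (E + E.T) / 2
    E *= eps / np.linalg.norm(E, 2)
    diff = np.linalg.norm(filt(S + E, x, h) - filt(S, x, h))
    print(f"eps = {eps:.0e}: ||H(S+E)x - H(S)x|| = {diff:.2e}")
```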
Questions for Week 8. Answers to be prepared by Wednesday, October 20. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. This week is devoted to a review. We have some interesting questions to cover on the stability vs. discriminability tradeoff of graph filters and GNNs. We will go over them on Wednesday. We also have some interesting digressions on the performance of graph filters and GNNs relative to linear regression and fully connected neural networks. We will go over them on Friday.
Questions for Week 9. Answers to be prepared by Wednesday, October 27. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. This week sees the introduction of graphons, graphon signals, and the foundational concepts of graphon signal processing: a Fourier transform and the definition of graphon filters. Graphons are limit objects of graphs and, consequently, graphon Fourier transforms and graphon filters are limit objects of graph Fourier transforms and graph filters. This is instrumental in explaining the transferability of graph filters and GNNs, which we will briefly touch upon this week and elaborate on extensively next week.
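A graphon is just a bounded symmetric kernel on [0,1]^2, and the limit behavior is easy to observe numerically. In the sketch below (with an example graphon of my choosing), the spectra of graphs induced by the graphon stabilize as the graphs grow, which is the mechanism that makes graphon Fourier transforms limits of graph Fourier transforms.

```python
import numpy as np

# An example graphon W : [0,1]^2 -> [0,1] (my choice, for illustration)
W = lambda u, v: np.exp(-np.abs(u - v))

for n in [50, 200, 800]:
    u = (np.arange(n) + 0.5) / n             # regular grid on [0, 1]
    S = W(u[:, None], u[None, :])            # graph induced by the graphon
    lam = np.linalg.eigvalsh(S / n)          # normalized graph spectrum
    print(n, np.round(np.sort(lam)[-3:], 4)) # top eigenvalues stabilize
```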
Questions for Week 10. Answers to be prepared by Wednesday, November 3. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. We leverage our introduction of graphons to study the transferability of graph filters and GNNs. Transferability is proven by comparing graph filters and GNNs with graphon filters and graphon neural networks.
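A transferability experiment in miniature, using the same example graphon as above (again a sketch of mine, not assigned material): fixed filter taps are run on induced graphs of two different sizes, with inputs drawn from one graphon signal, and the outputs are compared as step functions on [0,1].

```python
import numpy as np

W = lambda u, v: np.exp(-np.abs(u - v))      # example graphon (assumed)
X = lambda u: np.cos(2 * np.pi * u)          # example graphon signal (assumed)
h = np.array([1.0, 0.5, 0.25])               # filter taps reused on both graphs

def filter_output(n):
    u = (np.arange(n) + 0.5) / n
    S = W(u[:, None], u[None, :]) / n        # normalized induced graph
    x = X(u)
    return sum(hk * np.linalg.matrix_power(S, k) @ x for k, hk in enumerate(h))

# Compare a small and a large graph as step functions in L2([0,1])
y_small, y_large = filter_output(100), filter_output(1000)
y_small_up = np.repeat(y_small, 10)          # upsample the small output
err = np.sqrt(np.mean((y_small_up - y_large) ** 2))
print("L2 difference between outputs:", err)
```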
Questions for Week 11. Answers to be prepared by Wednesday, November 10. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. We study Markov models and hidden Markov models to motivate the introduction of recurrent neural networks and graph recurrent neural networks. We also discuss gating in time and space.
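For intuition ahead of the lecture, a graph recurrent neural network can be sketched as a vanilla RNN whose input-to-state and state-to-state maps are graph filters, so the hidden state is itself a graph signal. Names and tap choices below are mine.

```python
import numpy as np

def graph_filter(S, x, taps):
    return sum(a * np.linalg.matrix_power(S, k) @ x for k, a in enumerate(taps))

def grnn(S, xs, a, b):
    """GRNN recursion h_t = tanh( A(S) x_t + B(S) h_{t-1} ), where
    A(S) and B(S) are graph filters with taps a and b."""
    h = np.zeros(S.shape[0])
    for x in xs:                              # sequence of graph signals
        h = np.tanh(graph_filter(S, x, a) + graph_filter(S, h, b))
    return h

n, T = 8, 5
S = np.random.rand(n, n); S = (S + S.T) / 2
xs = [np.random.randn(n) for _ in range(T)]
h_final = grnn(S, xs, a=[0.5, 0.3], b=[0.4, 0.1])
```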
Questions for Week 12. Answers to be prepared by Wednesday, November 17. We will cover Questions 1-4 on Wednesday and Questions 5-8 on Friday. This lecture introduces algebraic signal processing (ASP) along with its two fundamental tools: algebraic filters and algebraic neural networks. ASP is an abstract formulation from which we can recover, among others, graph, time, and image processing as particular cases. We will see how fundamental stability results of graph neural networks and convolutional neural networks are manifestations of a common phenomenon that holds for general algebraic filters and algebraic neural networks.
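The punchline of ASP can be compressed into a few lines: a convolutional filter is an abstract polynomial, and the signal model is fixed by the shift operator on which the polynomial is instantiated. The sketch below (my own) instantiates one polynomial on the cyclic shift, recovering convolution in time, and on a graph shift, recovering graph convolution.

```python
import numpy as np

p = np.array([1.0, 0.5, 0.25])               # abstract filter p(t) = sum_k p_k t^k

def instantiate(p, S, x):
    """Instantiate the abstract filter p on a concrete shift operator S."""
    return sum(pk * np.linalg.matrix_power(S, k) @ x for k, pk in enumerate(p))

n = 8
x = np.random.randn(n)

S_time = np.roll(np.eye(n), 1, axis=0)       # cyclic shift: time convolution
A = np.random.rand(n, n)
S_graph = (A + A.T) / 2                      # graph shift: graph convolution

y_time = instantiate(p, S_time, x)
y_graph = instantiate(p, S_graph, x)
```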