Transferability of Spectral Graph Convolutional Neural Networks

Staff - Faculty of Informatics

Date: 4 June 2021 / 16:00 - 17:00

Online event

Speaker:
Ron Levie, LMU Munich

Abstract:
Graph neural networks (GNNs) are generalizations of grid-based deep learning techniques to graph-structured data. The field of GNNs has grown tremendously in the past few years, leading to many practical applications with commercial impact. In this talk we focus on spectral graph convolutional neural networks (CNNs), where convolution is defined as elementwise multiplication in the frequency domain of the graph, and review the mathematical foundations of their generalization capabilities.
In machine learning settings where the dataset consists of signals defined on many different graphs, the trained CNN should generalize to signals on graphs outside the training set. It is thus important to transfer trained filters from one graph to another. GNN transferability, which is a certain type of generalization capability, can be loosely defined as follows: if two graphs represent the same underlying phenomenon, then a single filter/CNN should have similar repercussions on both graphs. In this talk we will discuss the different approaches to modeling mathematically the notions of “graphs representing the same phenomenon” and “filters/CNNs having similar repercussions on graphs.” We will then derive corresponding transferability error bounds, proving that spectral methods are transferable.
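To make the notion of a spectral filter concrete: on a graph with Laplacian L = U Λ Uᵀ, filtering a signal x by a function g means computing U g(Λ) Uᵀ x, i.e. elementwise multiplication by g in the graph frequency domain. Because g is a function on the real line rather than on any particular graph, the same trained filter can be applied to graphs of any size, which is the starting point for the transferability results the talk describes. The sketch below is illustrative only (it is not code from the speaker); the filter choice g(λ) = exp(−λ) is an assumed example of a low-pass filter.

```python
import numpy as np

def spectral_filter(adjacency, signal, g):
    """Apply a spectral graph filter: x -> U g(Lambda) U^T x,
    i.e. pointwise multiplication by g in the graph frequency domain."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency   # combinatorial graph Laplacian
    eigvals, eigvecs = np.linalg.eigh(laplacian)  # graph Fourier basis U, spectrum Lambda
    x_hat = eigvecs.T @ signal                 # graph Fourier transform of the signal
    x_hat_filtered = g(eigvals) * x_hat        # filtering = elementwise multiplication
    return eigvecs @ x_hat_filtered            # inverse graph Fourier transform

# Example: a heat-kernel low-pass filter (an assumed choice of g)
# applied to a highly oscillatory signal on a 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, -1.0, 1.0, -1.0])
y = spectral_filter(A, x, lambda lam: np.exp(-lam))
```

Note that `spectral_filter` takes the graph as an argument: the same `g` can be reused on a different adjacency matrix without retraining, which is precisely the transfer of a filter between graphs discussed in the abstract.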

Biography:
Ron Levie received the Ph.D. degree in applied mathematics in 2018 from Tel Aviv University, Israel. From 2018 to 2020, he was a postdoctoral researcher with the Research Group Applied Functional Analysis, Institute of Mathematics, TU Berlin, Germany. Since 2021, he has been a researcher in the Bavarian AI Chair for Mathematical Foundations of Artificial Intelligence, Department of Mathematics, LMU Munich, Germany. Since 2021, he has also been a consultant on the project Radio-Map Assisted Pathloss Prediction at the Communications and Information Theory Chair, TU Berlin. He won excellence awards for his M.Sc. and Ph.D. studies, as well as a Minerva postdoctoral fellowship. His current research interests are in the areas of applied harmonic analysis, the mathematics of deep learning, graph neural networks, explainability in deep learning, application areas of deep learning such as wireless communication, and randomized algorithms.

Host: Prof. Cesare Alippi

You can join here