GNTK: Graph Neural Tangent Kernel

Contact: Marc Lelarge <marc.lelarge@ens.fr>

Abstract: This project is about a model that has attracted a lot of attention in the past year (https://www.offconvex.org/2019/10/03/NTK/): deep learning in the regime where the width (the number of channels in convolutional filters, or the number of neurons in fully-connected internal layers) goes to infinity. In mathematics and physics there is a long tradition of gaining insight into a question by studying it in an infinite limit, and here too the infinite-width limit is more tractable for theory. The goal of the project is to understand the connection between infinitely wide neural networks and kernel methods, and to adapt it to graph neural networks.

Refs:
- On Exact Computation with an Infinitely Wide Neural Net. Sanjeev Arora, Simon S. Du, Wei Hu, Zhiyuan Li, Ruslan Salakhutdinov, Ruosong Wang. https://arxiv.org/abs/1904.11955
- Graph Neural Tangent Kernel: Fusing Graph Neural Networks with Graph Kernels. Simon S. Du, Kangcheng Hou, Barnabás Póczos, Ruslan Salakhutdinov, Ruosong Wang, Keyulu Xu. https://arxiv.org/abs/1905.13192
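
The infinite-width / kernel correspondence the abstract refers to can be made concrete on the simplest case: for a one-hidden-layer ReLU network, the empirical NTK (the inner product of parameter gradients at two inputs) converges, as the width grows, to a closed-form kernel built from the standard ReLU arc-cosine expectations (this is the setting analyzed in the Arora et al. reference). Below is a minimal NumPy sketch; the specific width `m`, the 1/sqrt(m) normalization, and the variable names are illustrative choices, not taken from the project description:

```python
import numpy as np

rng = np.random.default_rng(0)

def empirical_ntk(x1, x2, W, v):
    """Empirical NTK of f(x) = v @ relu(W @ x) / sqrt(m):
    inner product of the gradients of f w.r.t. all parameters (W and v)."""
    m = W.shape[0]
    def grads(x):
        pre = W @ x
        # grad w.r.t. v_i is relu(w_i . x)/sqrt(m);
        # grad w.r.t. row w_i is v_i * 1[w_i . x > 0] * x / sqrt(m)
        gv = np.maximum(pre, 0.0) / np.sqrt(m)
        gW = np.outer(v * (pre > 0), x) / np.sqrt(m)
        return gv, gW
    gv1, gW1 = grads(x1)
    gv2, gW2 = grads(x2)
    return gv1 @ gv2 + np.sum(gW1 * gW2)

def analytic_ntk(x1, x2):
    """Infinite-width NTK for one ReLU hidden layer:
    Theta = Sigma1 + Sigma0 * Sigma1_dot, with the Gaussian ReLU
    expectations in arc-cosine form."""
    n1, n2 = np.linalg.norm(x1), np.linalg.norm(x2)
    rho = np.clip(x1 @ x2 / (n1 * n2), -1.0, 1.0)
    phi = np.arccos(rho)
    sigma1 = n1 * n2 * (np.sqrt(1.0 - rho**2) + (np.pi - phi) * rho) / (2.0 * np.pi)
    sigma1_dot = (np.pi - phi) / (2.0 * np.pi)
    return sigma1 + (x1 @ x2) * sigma1_dot

d, m = 5, 200_000                      # input dimension, hidden width
x1 = rng.standard_normal(d); x1 /= np.linalg.norm(x1)
x2 = rng.standard_normal(d); x2 /= np.linalg.norm(x2)
W = rng.standard_normal((m, d))        # N(0,1) init (NTK parameterization)
v = rng.standard_normal(m)

# At large m the two values should agree closely.
print(empirical_ntk(x1, x2, W, v), analytic_ntk(x1, x2))
```

One sanity check built into the formulas: for a unit-norm input, Theta(x, x) = 1/2 + 1/2 = 1. The GNTK of the second reference follows the same recipe, with the kernel recursion additionally aggregating over graph neighborhoods.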