
Tackling Over-Smoothing for General Graph Convolutional Networks

----------------------------------------------------------------------------------------------------
 
 
             Department of Systems Engineering and Engineering Management
                     The Chinese University of Hong Kong
 
 
----------------------------------------------------------------------------------------------------
 
Date: Friday, November 06, 2020, 16:30 to 17:30
 
Title: Tackling Over-Smoothing for General Graph Convolutional Networks
 
Speaker: Dr. Yu Rong
 
Abstract:
 
As is well known in computer vision, the depth of a Convolutional Neural Network (CNN) plays a crucial role in its performance. However, increasing the depth of Graph Convolutional Networks (GCNs), which is expected to yield more expressive power, has been shown to hurt performance, especially on node classification. The main cause lies in over-smoothing: it drives the output of a GCN towards a space that contains little distinguishing information among nodes, leading to poor expressivity. Over-smoothing is one of the main obstacles to constructing deep and complex Graph Neural Networks.
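
To make the effect concrete, the short NumPy sketch below (an illustrative example of ours, not material from the talk) repeatedly applies the GCN propagation step A_hat = D^{-1/2} (A + I) D^{-1/2} to random node features on a toy graph. The spread of the features across nodes shrinks as the number of propagation steps grows, which is the over-smoothing behaviour described above.

    import numpy as np

    # Toy graph on 4 nodes (an assumed example): edges 0-1, 0-2, 0-3, 1-2, 2-3.
    A = np.array([[0, 1, 1, 1],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)

    # Add self-loops and symmetrically normalize: A_hat = D^{-1/2} (A + I) D^{-1/2}.
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt

    # Random node features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 3))

    # Applying only the propagation step of a GCN layer (no weights, no
    # non-linearity) k times: the per-feature standard deviation across nodes
    # shrinks, i.e. the nodes become harder to distinguish.
    for k in (1, 2, 8, 32):
        H = np.linalg.matrix_power(A_hat, k) @ X
        print(f"{k:>2} propagation steps, mean spread across nodes = {H.std(axis=0).mean():.4f}")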
In this talk, we will review the current progress on relieving the over-smoothing issue in Graph Convolutional Networks, including architecture refinement, normalization, and augmentation techniques (a sketch of one such augmentation follows below). We will also introduce our recent theoretical findings about over-smoothing.
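
As one concrete instance of the augmentation style of technique mentioned above, the sketch below (our own illustration; the name drop_edges and the parameter drop_rate are assumptions, and this is not claimed to be the speaker's method) removes a random fraction of edges from the adjacency matrix before each training step, which slows the mixing of node features under repeated propagation.

    import numpy as np

    def drop_edges(A, drop_rate=0.2, rng=None):
        """Randomly remove a fraction of edges from a symmetric adjacency matrix.

        Illustrative sketch of an edge-dropping augmentation; names and
        defaults are our own assumptions.
        """
        rng = np.random.default_rng() if rng is None else rng
        # Work on the upper triangle so each undirected edge is considered once.
        upper = np.triu(A, k=1)
        rows, cols = np.nonzero(upper)
        keep = rng.random(rows.size) >= drop_rate
        A_dropped = np.zeros_like(A)
        A_dropped[rows[keep], cols[keep]] = A[rows[keep], cols[keep]]
        return A_dropped + A_dropped.T  # restore symmetry

    # Usage: resample a sparser adjacency at every training epoch, then add
    # self-loops and renormalize before the GCN propagation step.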
 
Biography:
 
Dr. Yu Rong is a senior researcher at the Machine Learning Center of Tencent AI Lab. He received his B.E. degree from Sun Yat-sen University, Guangzhou, China in 2012 and his Ph.D. degree from The Chinese University of Hong Kong in 2016. He joined Tencent AI Lab in June 2017, where he works on building a large-scale graph learning framework and applying deep graph learning models to various applications, such as ADMET prediction and malicious detection. His main research interests include social network analysis, graph neural networks, and large-scale graph systems, with a particular focus on the design and efficient training of deep and complex graph learning models. He has published several papers at top data mining and machine learning conferences such as KDD, WWW, NeurIPS, ICLR, CVPR, and ICCV. He has served as a reviewer for conferences including KDD, WWW, CVPR, CIKM, WSDM, and SDM, and for journals such as VLDBJ and TKDE.
 