Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/98235
Title: Learning on graphs with graph convolution
Authors: Li, Qimai
Degree: Ph.D.
Issue Date: 2023
Abstract: Graph convolutional neural networks (GCNNs) have become the model of choice for graph representation learning, mainly due to the effective design of graph convolution, which computes the representation of a node by aggregating those of its neighbors. This thesis reveals the mechanisms behind graph convolutional neural networks from the perspective of graph signal processing and focuses on developing theoretically grounded algorithms for modeling complex, richly labeled, and large-scale graph-structured data, with applications spanning computer vision, natural language processing, human action understanding, smart transportation, and malware detection.
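To make the neighbor-aggregation idea concrete, the following is a minimal sketch of a single graph convolution layer of the kind analyzed in the thesis, written with NumPy over a dense adjacency matrix. The function name, the ReLU nonlinearity, and the dense-matrix setting are illustrative assumptions, not the thesis's exact formulation.

    import numpy as np

    def gcn_layer(A, X, W):
        # One graph convolution layer: aggregate each node's own and
        # neighbors' features with symmetric degree normalization,
        # then apply a linear transform and a ReLU.
        A_hat = A + np.eye(A.shape[0])          # adjacency with self-loops
        d = A_hat.sum(axis=1)                   # augmented node degrees
        D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # D_hat^{-1/2}
        S = D_inv_sqrt @ A_hat @ D_inv_sqrt     # normalized propagation matrix
        return np.maximum(S @ X @ W, 0.0)       # aggregate, transform, ReLU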
We conducted systematic research on analyzing and extending GCNNs from different theoretical perspectives, including graph signal processing and spectral graph theory. Our spatial analysis shows that the graph convolution in GCN is a special form of Laplacian smoothing, which is the key reason why GCN works but also causes the over-smoothing problem in deep GCN models. Our spectral analysis revisits GCN and classical label propagation methods under a unified graph filtering framework and shows that both extract useful data representations by applying a low-pass graph filter.
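Both analyses can be illustrated numerically. The sketch below, assuming a toy 4-node graph and random features, shows that the GCN propagation matrix acts as a low-pass filter (one step of Laplacian smoothing) and that applying it many times, as a very deep model would, drives all node features toward near-proportional rows, i.e., over-smoothing. The graph and the number of iterations are arbitrary choices for illustration.

    import numpy as np

    # Toy 4-node graph and random 2-dimensional node features.
    A = np.array([[0., 1., 1., 0.],
                  [1., 0., 1., 0.],
                  [1., 1., 0., 1.],
                  [0., 0., 1., 0.]])
    X = np.random.rand(4, 2)

    A_hat = A + np.eye(4)                        # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt          # GCN propagation matrix

    # S = I - L_tilde, where L_tilde is the normalized Laplacian of the
    # self-looped graph, so multiplying by S is one step of Laplacian
    # smoothing, i.e., low-pass graph filtering.
    H = X
    for _ in range(50):                          # emulate a very deep stack
        H = S @ H
    print(H)  # rows become nearly proportional: features are over-smoothed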
Our research also contributes efficient and more powerful GCNN models, as well as various high-impact real-world applications. With these new theoretical insights, we developed new, efficient, and more powerful models based on graph convolution for semi-supervised and unsupervised learning, including Improved Graph Convolutional Networks (IGCN), Generalized Label Propagation (GLP), and Adaptive Graph Convolution (AGC). We also extended 1-D GCNN to 2-D GCNN to exploit relational information among object attributes, and proposed Dimensionwise Separable 2-D Graph Convolution (DSGC).
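As a hedged illustration of the unsupervised setting, the sketch below applies a k-step low-pass graph filter to node features and then clusters the smoothed features with k-means, in the spirit of the GLP/AGC line of work. The exact filter, the value of k, and the use of scikit-learn's KMeans are assumptions for illustration rather than the thesis's precise algorithms.

    import numpy as np
    from sklearn.cluster import KMeans

    def low_pass_filter(A, X, k=4):
        # Smooth node features by k applications of the normalized
        # propagation matrix; larger k yields smoother features.
        A_hat = A + np.eye(A.shape[0])
        D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
        S = D_inv_sqrt @ A_hat @ D_inv_sqrt
        for _ in range(k):
            X = S @ X
        return X

    def cluster_nodes(A, X, n_clusters, k=4):
        # Unsupervised node clustering: filter first, then run k-means
        # on the smoothed features (illustrative pipeline only).
        X_smooth = low_pass_filter(A, X, k)
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X_smooth)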
The results have been published in various top AI conferences, including AAAI-18 [1], IJCAI-19 [2], CVPR-19 [3], KDD-21 [4], and WWW-22 [5].
Subjects: Machine learning
Neural networks (Computer science)
Hong Kong Polytechnic University -- Dissertations
Pages: xiv, 193 pages : color illustrations
Appears in Collections: Thesis
