DEMO-Net: Degree-specific Graph Neural Networks for
Node and Graph Classification
Jun Wu
Arizona State University
junwu6@asu.edu
Jingrui He
Arizona State University
Jingrui.he@asu.edu
Jiejun Xu
HRL Laboratories, LLC
jxu@hrl.com
ABSTRACT
Graph data are ubiquitous in many high-impact applications. Inspired
by the success of deep learning on grid-structured data, graph neural
network models have been proposed to learn powerful node-level
or graph-level representations. However, most existing graph
neural networks suffer from the following limitations: (1) there is
limited analysis of graph convolution properties, such
as seed-oriented, degree-aware, and order-free; (2) a node's degree-
specific graph structure is not explicitly expressed in graph con-
volution for distinguishing structure-aware node neighborhoods;
(3) the theoretical explanation of graph-level pooling
schemes is unclear.
To address these problems, we propose a generic degree-specific
graph neural network named DEMO-Net, motivated by the Weisfeiler-
Lehman graph isomorphism test that recursively identifies 1-hop
neighborhood structures. In order to explicitly capture the graph
topology integrated with node attributes, we argue that graph con-
volution should have three properties: seed-oriented, degree-aware,
and order-free. To this end, we propose a multi-task graph convolution
where each task represents node representation learning for nodes
with a specific degree value, thereby preserving the degree-
specific graph structure. In particular, we design two multi-task
learning methods: degree-specific weight and hashing functions
for graph convolution. In addition, we propose a novel graph-level
pooling/readout scheme for learning graph representations that provably
lie in a degree-specific Hilbert kernel space. The experimental
results on several node and graph classification benchmark data
sets demonstrate the effectiveness and efficiency of our proposed
DEMO-Net over state-of-the-art graph neural network models.
KEYWORDS
Graph Neural Network, Degree-specific Convolution, Multi-task
Learning, Graph Isomorphism Test
ACM Reference Format:
Jun Wu, Jingrui He, and Jiejun Xu. 2019. DEMO-Net: Degree-specific Graph
Neural Networks for Node and Graph Classification. In The 25th ACM
SIGKDD Conference on Knowledge Discovery and Data Mining (KDD '19),
August 4–8, 2019, Anchorage, AK, USA. ACM, New York, NY, USA, 10 pages.
https://doi.org/10.1145/3292500.3330950
1 INTRODUCTION
Nowadays, graph data is being generated across multiple high-
impact application domains, ranging from bioinformatics [4] to
financial fraud detection [27, 28], and from genome-wide association
study [21] to social network analysis [5]. In order to leverage the
rich information in graph-structured data, it is of great impor-
tance to learn effective node or graph representations from both
node/edge attributes and the graph topological structure. To this
end, numerous graph neural network models have been proposed
recently, inspired by the success of deep learning architectures on
grid-structured data (e.g., images, videos, languages, etc.). One intu-
ition behind this line of approaches is that the topological structure
as well as node attributes can be integrated by recursively aggre-
gating and compressing the continuous feature vectors from local
neighborhoods in an end-to-end training architecture.
One key component of graph neural networks [4, 6] is the graph
convolution (or feature aggregation function) that aggregates and
transforms the feature vectors from a node's local neighborhood. By
integrating the node attributes with the graph structure information
using Laplacian smoothing [9, 12] or advanced attention mecha-
nisms [18], graph neural networks learn the node representation in
a low-dimensional feature space where nearby nodes in the graph
share similar representations. Moreover, in order to learn the
representation for the entire graph, researchers have proposed
graph-level pooling schemes [1] that compress the nodes' represen-
tations into a global feature vector. The node or graph representations
learned by graph neural networks have achieved state-of-the-art per-
formance in many downstream graph mining tasks, such as node
classification [26], graph classification [22], etc.
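
To make the aggregation and readout described above concrete, the following minimal sketch (plain NumPy with hypothetical function names; an illustration of generic smoothing-based graph convolution, not the implementation proposed in this paper) performs one graph convolution layer followed by a simple mean readout:

import numpy as np

def graph_conv_layer(A, X, W):
    # One smoothing-style graph convolution layer: average each node's
    # neighborhood (including itself), then apply a shared linear
    # transform and a ReLU nonlinearity.
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)   # per-node degree (with self-loop)
    H = (A_hat / deg) @ X                    # mean-aggregate neighbor attributes
    return np.maximum(H @ W, 0.0)            # linear transform + ReLU

def mean_readout(H):
    # Graph-level readout: compress all node representations into one
    # global feature vector by averaging.
    return H.mean(axis=0)

# Toy graph with 4 nodes and 3-dimensional node attributes.
A = np.array([[0., 1., 1., 0.],
              [1., 0., 0., 1.],
              [1., 0., 0., 1.],
              [0., 1., 1., 0.]])
X = np.random.randn(4, 3)
W = np.random.randn(3, 2)

H = graph_conv_layer(A, X, W)   # node-level representations, shape (4, 2)
g = mean_readout(H)             # graph-level representation, shape (2,)

The row-normalized aggregation above simply averages neighbor attributes, which is exactly the smoothing behavior whose limitations are discussed next.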
However, most of the existing graph neural networks suffer
from the following limitations. (L1) There is limited analysis of
graph convolution properties that could guide the design of graph
neural networks when learning node representations. (L2) In or-
der to preserve node proximity, the graph convolution applies
a special form of Laplacian smoothing [12], which simply mixes
the attributes from a node's neighborhood. This leads to the loss of
degree-specific graph structure information in the learned repre-
sentations. An illustrative example is shown in Figure 1: although
nodes 4 and 5 are structurally different, they would be mapped
to similar representations due to first-order node proximity using
existing methods. Moreover, the neighborhood sub-sampling meth-
ods used to improve model efficiency [5] significantly degrade
the discrimination of degree-specific graph structure. (L3) The
theoretical explanation regarding graph-level pooling schemes
is largely missing.
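
As a concrete illustration of limitation (L2), the short sketch below (assuming simple mean aggregation; the toy numbers are hypothetical and only echo the spirit of Figure 1) shows that two nodes with different degrees but identically distributed neighbor attributes receive indistinguishable aggregated representations, so the degree information is washed out:

import numpy as np

def mean_aggregate(neighbor_feats):
    # Mean-aggregate the attribute vectors of a node's neighbors,
    # as smoothing-based graph convolutions effectively do.
    return np.mean(neighbor_feats, axis=0)

# Node u has 2 neighbors and node v has 5 neighbors; all neighbors
# happen to carry the same attribute vector.
neighbor_feat = np.array([1.0, 0.0])
h_u = mean_aggregate(np.tile(neighbor_feat, (2, 1)))
h_v = mean_aggregate(np.tile(neighbor_feat, (5, 1)))

print(np.allclose(h_u, h_v))  # True: the degree difference leaves no trace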
To address the above problems, in this paper, we propose a
generic graph neural network model DEMO-Net that considers the