Accepted Papers

HGCN: A Heterogeneous Graph Convolutional Network-Based Deep Learning Model Toward Collective Classification

Zhihua Zhu, Xinxin Fan, Xiaokai Chu, Jingping Bi: Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China


Collective classification, an important technique for studying networked data, aims to exploit label autocorrelation among groups of inter-connected entities with complex dependencies. With the emergence of various heterogeneous information networks (HINs), collective classification now faces several severe challenges stemming from the heterogeneity of HINs, such as complex relational hierarchies, potentially incompatible semantics, and node-context relational semantics. To address these challenges, this paper proposes a novel heterogeneous graph convolutional network-based deep learning model, called HGCN, to collectively categorize the entities in HINs. Our work makes three primary contributions: i) HGCN not only learns latent relations from relation-sophisticated HINs via multi-layer heterogeneous convolutions, but also captures the semantic incompatibility among relations with properly learned edge-level filter parameters; ii) to preserve the fine-grained relational semantics of different node types, we propose a heterogeneous graph convolution that operates directly on the original HIN, without any advance transformation of the network from heterogeneity to homogeneity; iii) we perform extensive experiments on four real-world datasets to validate the proposed HGCN; the multi-faceted results show that HGCN significantly improves collective classification performance compared with state-of-the-art baseline methods.
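The paper's HGCN model itself is not reproduced here, but the core idea the abstract describes (convolving over a heterogeneous graph with relation-specific, rather than shared, filter parameters) can be sketched roughly as follows. All names, shapes, and the aggregation scheme below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def hetero_conv_layer(features, rel_adjs, rel_weights):
    """Hypothetical single heterogeneous graph convolution layer.

    features:    (N, d_in) node feature matrix
    rel_adjs:    dict mapping relation name -> (N, N) row-normalized adjacency
    rel_weights: dict mapping relation name -> (d_in, d_out) relation-specific
                 filter parameters (one weight matrix per relation, so
                 incompatible relation semantics are not forced to share filters)
    """
    out = None
    for rel, adj in rel_adjs.items():
        # aggregate neighbor features along this relation only,
        # then project with the relation's own filter
        msg = adj @ features @ rel_weights[rel]
        out = msg if out is None else out + msg
    return np.maximum(out, 0.0)  # ReLU nonlinearity

# Tiny toy example: 3 nodes, 2 relation types
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
adjs = {"self": np.eye(3), "coauthor": np.ones((3, 3)) / 3.0}
Ws = {r: rng.standard_normal((4, 2)) for r in adjs}
H = hetero_conv_layer(X, adjs, Ws)
print(H.shape)  # (3, 2)
```

Stacking several such layers, as the abstract's "multi-layer heterogeneous convolutions" suggests, lets labels propagate through multi-hop, multi-relation dependencies without first collapsing the HIN into a homogeneous graph.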
