Jensen-Shannon Divergence Message-Passing for Rich-Text Graph Representation Learning
By: Zuo Wang, Ye Yuan
In this paper, we investigate how the widely present contextual and structural divergence influences representation learning on rich-text graphs. To this end, we propose Jensen-Shannon Divergence Message-Passing (JSDMP), a new learning paradigm for rich-text graph representation learning. Beyond structural and textual similarity, JSDMP also captures the corresponding dissimilarity via the Jensen-Shannon divergence. Similarity and dissimilarity are then jointly used to compute new message weights among text nodes, enabling representations to incorporate contextual and structural information from truly correlated text nodes. Building on JSDMP, we propose two novel graph neural networks, the Divergent Message-Passing Graph Convolutional Network (DMPGCN) and the Divergent Message-Passing PageRank Graph Neural Network (DMPPRG), for learning representations on rich-text graphs. DMPGCN and DMPPRG have been extensively tested on well-established rich-text datasets and compared with several state-of-the-art baselines. The experimental results show that DMPGCN and DMPPRG outperform these baselines, demonstrating the effectiveness of the proposed Jensen-Shannon Divergence Message-Passing paradigm.
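The core idea of weighting messages by both similarity and Jensen-Shannon dissimilarity can be illustrated with a minimal sketch. The abstract does not specify the exact weighting formula, so the combination below (cosine similarity blended with one minus the JSD of softmax-normalized node features, balanced by a hypothetical mixing coefficient `alpha`) is an assumption for illustration only, not the authors' actual method.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    # Jensen-Shannon divergence between two discrete distributions.
    # Symmetric, and bounded above by ln(2) when using natural logs.
    p = p / (p.sum() + eps)
    q = q / (q.sum() + eps)
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def message_weights(features, edges, alpha=0.5):
    # Hypothetical JSDMP-style weighting: blend cosine similarity
    # with (1 - JSD) dissimilarity-awareness; alpha is an assumed
    # trade-off parameter, not taken from the paper.
    weights = {}
    for (i, j) in edges:
        xi, xj = features[i], features[j]
        sim = xi @ xj / (np.linalg.norm(xi) * np.linalg.norm(xj) + 1e-12)
        # Softmax turns raw feature vectors into distributions for the JSD.
        pi = np.exp(xi - xi.max())
        pj = np.exp(xj - xj.max())
        dis = js_divergence(pi, pj)
        weights[(i, j)] = alpha * sim + (1 - alpha) * (1 - dis)
    return weights
```

In this sketch, a pair of nodes receives a large message weight only when their features are both directionally similar and distributionally close, which mirrors the abstract's goal of passing messages between truly correlated text nodes.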