Published January 1, 2024 | Version v1
Journal article | Open Access

Text-RGNNs: Relational Modeling for Heterogeneous Text Graphs

Description

Text Graph Convolutional Network (TextGCN) is the foundational work on representing a corpus as a heterogeneous text graph, and its application of GCNs to text classification has earned wide recognition. However, GCNs are inherently designed for homogeneous graphs, which can limit their performance on such data. To address this limitation, we present Text Relational Graph Neural Networks (Text-RGNNs), which use heterogeneous GNNs to assign a dedicated weight matrix to each relation in the graph. Leveraging relational GNNs in this way enables nuanced modeling of the relationships inherent in heterogeneous text graphs and ultimately yields performance gains. We present a theoretical framework for the relational modeling of GNNs in document classification and demonstrate its effectiveness through extensive experiments on benchmark datasets. These experiments show that, by incorporating relational modeling into heterogeneous text graphs, Text-RGNNs outperform the existing state of the art both with fully labeled nodes and with minimal proportions of labeled training data, surpassing the second-best models by up to 10.61% on the corresponding evaluation metric.
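The per-relation weighting described above follows the standard relational GNN (RGCN-style) formulation, in which each relation type contributes its own transformation to the message passing. A minimal numpy sketch of one such layer is given below; the function name, the toy relations ("doc-word", "word-word"), and the placeholder adjacencies are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def relational_gnn_layer(h, adjs, weights, w_self):
    """One relational GNN layer: each relation r has a dedicated
    weight matrix W_r, the core idea Text-RGNNs bring to
    heterogeneous text graphs.

    h       : (N, d_in) node feature matrix
    adjs    : dict mapping relation name -> (N, N) normalized adjacency
    weights : dict mapping relation name -> (d_in, d_out) matrix W_r
    w_self  : (d_in, d_out) weight matrix for the self-connection
    """
    out = h @ w_self  # self-connection term
    for rel, a in adjs.items():
        # relation-specific message passing: A_r H W_r
        out += a @ h @ weights[rel]
    return np.maximum(out, 0.0)  # ReLU activation

# Toy heterogeneous text graph with 4 nodes (e.g. 2 documents, 2 words)
rng = np.random.default_rng(0)
n, d_in, d_out = 4, 8, 4
h = rng.standard_normal((n, d_in))
relations = ["doc-word", "word-word"]  # illustrative relation types
adjs = {r: np.eye(n) for r in relations}  # placeholder adjacencies
weights = {r: 0.1 * rng.standard_normal((d_in, d_out)) for r in relations}
w_self = 0.1 * rng.standard_normal((d_in, d_out))

h1 = relational_gnn_layer(h, adjs, weights, w_self)
print(h1.shape)  # (4, 4)
```

In contrast, a plain GCN would apply a single shared weight matrix across all edges, collapsing the distinction between relation types that this layer keeps separate.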

Files

bib-0cca5a78-52c6-4df9-85c4-4544c45e8c4c.txt (158 Bytes, md5:464d9b944b2170e6e6afde8a58b5a586)