Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation
Temporal Aggregation and Propagation Graph Neural Networks (TAP-GNN)
Background Introduction
A temporal graph is a graph whose nodes interact over continuous time, so its topology evolves as new interactions arrive. These dynamics let nodes exhibit different preferences at different times, which is critical for capturing user preferences and detecting abnormal behaviors. However, existing methods typically generate dynamic representations from only a limited number of temporal neighbors, which both hurts predictive performance and causes high latency in online inference. To address these challenges, this paper proposes a novel temporal graph convolution method, Temporal Aggregation and Propagation Graph Neural Networks (TAP-GNN). The paper first analyzes the computational complexity of dynamic representation under message passing by unrolling the temporal graph, and then designs an Aggregation and Propagation (AP) block that removes redundant computation over historical neighbors.
Source and Author Information
The paper was written by Tongya Zheng, Xinchao Wang, Zunlei Feng, Jie Song, Yunzhi Hao, Mingli Song, Xingen Wang, Xinyu Wang, and Chun Chen, who are affiliated with the BigGraph Center at Hangzhou City University, the School of Computer Science and Technology at Zhejiang University, the Department of Electrical and Computer Engineering at the National University of Singapore, and the Shanghai Advanced Research Institute at Zhejiang University. The research was published in IEEE Transactions on Knowledge and Data Engineering in October 2023.
Research Optimization and Workflow
Research Process
The key procedures in the study include temporal graph unrolling, the design of the AP block, the implementation of the TAP-GNN framework, and experimental evaluation.
Temporal Graph Unrolling
The study first unrolls the temporal graph under the message-passing paradigm into a Message-Passing Temporal Graph (MPTG), in which every temporal node is densely connected to its temporal (historical) neighbors. This unrolling, however, blows up the number of edges and therefore the computational cost.
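To make the unrolling concrete, here is a minimal Python sketch (not the authors' implementation; the edge-list format and function name are assumptions) that expands a temporal edge list into MPTG edges by connecting each new temporal node to all historical temporal neighbors of the same node, which is exactly where the edge count explodes.

```python
from collections import defaultdict

def unroll_to_mptg(temporal_edges):
    """Unroll a temporal edge list [(u, v, t), ...] into MPTG edges.

    Each (node, timestamp) pair is a temporal node; a new temporal node
    receives an MPTG edge from every historical temporal neighbor of the
    same node, which is what makes the naive unrolling so expensive.
    """
    history = defaultdict(list)   # node -> [(neighbor, time), ...] in time order
    mptg_edges = []
    for u, v, t in sorted(temporal_edges, key=lambda e: e[2]):
        for node, other in ((u, v), (v, u)):
            for nbr, t_past in history[node]:
                mptg_edges.append(((nbr, t_past), (node, t)))
            history[node].append((other, t))
    return mptg_edges

# toy usage: one node interacting 5 times already yields 0+1+2+3+4 = 10 MPTG edges
events = [(0, i, float(i)) for i in range(1, 6)]
print(len(unroll_to_mptg(events)), "MPTG edges from", len(events), "interactions")
```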
Aggregation and Propagation (AP) Block
To reduce this cost, an Aggregation and Propagation (AP) block was designed and implemented. The AP block consists of an aggregation layer and a propagation layer:
- Aggregation layer: aggregates node information over a lightweight message-passing graph.
- Propagation layer: propagates the historical information forward along each node's temporal copies, so that every temporal node carries what has already been aggregated from its past neighbors without recomputing it.
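The following PyTorch sketch illustrates the aggregate-then-propagate idea under simplifying assumptions (single-interaction aggregation, a per-node running state standing in for the propagation layer, illustrative class and tensor names); it is not the authors' implementation.

```python
import torch
import torch.nn as nn

class APBlock(nn.Module):
    """Sketch of an aggregation + propagation block over temporal events.

    aggregate: each event's "self" node pulls a message from the
    counterpart of that single interaction (a lightweight message graph).
    propagate: a per-node running state carries the aggregated history
    forward in time, so earlier neighbors are never re-aggregated.
    """

    def __init__(self, dim):
        super().__init__()
        self.agg = nn.Linear(2 * dim, dim)    # aggregation layer
        self.prop = nn.Linear(2 * dim, dim)   # propagation layer

    def forward(self, h_self, h_neighbor, node_ids):
        # h_self, h_neighbor: [num_events, dim], events sorted by timestamp
        # node_ids: [num_events] integer id of each event's "self" node
        messages = torch.relu(self.agg(torch.cat([h_self, h_neighbor], dim=-1)))

        state = {}                            # node id -> propagated state
        outputs = []
        for i in range(messages.size(0)):
            nid = int(node_ids[i])
            prev = state.get(nid, torch.zeros_like(messages[i]))
            cur = torch.relu(self.prop(torch.cat([messages[i], prev], dim=-1)))
            state[nid] = cur
            outputs.append(cur)
        return torch.stack(outputs, dim=0)
```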
TAP-GNN Framework
Finally, the TAP-GNN framework was designed; it stacks multiple AP blocks and ends with a projection layer. TAP-GNN also introduces a time encoding and temporal activation functions to generate time-aware node embeddings, and it supports online inference in graph-stream scenarios.
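Composing the pieces, here is a rough sketch of how stacked AP blocks, a cosine-based time encoding (assumed here for illustration), and a final projection layer could fit together; it reuses the APBlock sketch above, and the class names are not the authors'.

```python
import torch
import torch.nn as nn

class TimeEncoding(nn.Module):
    """Illustrative functional time encoding (cosine of learned frequencies)."""

    def __init__(self, dim):
        super().__init__()
        self.freq = nn.Linear(1, dim)

    def forward(self, t):                      # t: [num_events]
        return torch.cos(self.freq(t.unsqueeze(-1)))

class TAPGNNSketch(nn.Module):
    """Stack of AP blocks followed by a projection layer (illustrative)."""

    def __init__(self, dim, num_blocks=2, out_dim=1):
        super().__init__()
        self.time_enc = TimeEncoding(dim)
        self.blocks = nn.ModuleList([APBlock(dim) for _ in range(num_blocks)])
        self.project = nn.Linear(dim, out_dim)

    def forward(self, h_self, h_neighbor, t, node_ids):
        h = h_self + self.time_enc(t)          # inject event time into features
        for block in self.blocks:
            h = block(h, h_neighbor, node_ids)
        return self.project(h)                 # temporal node embeddings / scores
```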
Experimental Design and Results
To verify the effectiveness of TAP-GNN, experiments were conducted on a range of real-world temporal networks, covering future link prediction and dynamic node classification tasks. The study found that, compared with existing methods, TAP-GNN significantly improves prediction performance while reducing online inference latency.
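For concreteness, a common way to set up the future link prediction task on top of temporal embeddings is sketched below; the MLP scorer and random negative sampling are standard choices assumed here, not necessarily the paper's exact protocol.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeScorer(nn.Module):
    """Scores whether two nodes, given their temporal embeddings at query
    time, will interact in the future (illustrative)."""

    def __init__(self, dim):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))

    def forward(self, z_src, z_dst):
        return self.mlp(torch.cat([z_src, z_dst], dim=-1)).squeeze(-1)

def link_prediction_loss(scorer, z_src, z_dst, z_neg):
    # observed future edges are positives; randomly sampled destinations
    # serve as negatives
    pos = scorer(z_src, z_dst)
    neg = scorer(z_src, z_neg)
    return (F.binary_cross_entropy_with_logits(pos, torch.ones_like(pos))
            + F.binary_cross_entropy_with_logits(neg, torch.zeros_like(neg)))
```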
Detailed Experimental Data and Discussion
Results Explanation
Experiments were conducted across the nodes of each graph while varying the graph convolution kernel and the batch size. The TAP-GNN framework improved performance while maintaining linear growth in computational cost.
Future Link Prediction
Experiments showed that TAP-GNN performed strongly on all of the temporal graphs tested, with particularly large gains on datasets with low redundancy.
Dynamic Node Classification
TAP-GNN performed well on tasks involving extremely imbalanced labels and was able to capture dynamic changes in nodes effectively.
Research Significance and Value
- Scientific Value: This study reveals the importance of capturing the entire temporal neighborhood for temporal graph representation, providing new insights for fields such as citation networks, e-commerce recommendation, and social network behavior detection.
- Application Value: TAP-GNN introduces the AP block, which significantly reduces the complexity and latency of computing temporal graph representations, making it suitable for online inference and large-scale industrial applications such as e-commerce recommendation systems and real-time social network monitoring.
Notable Achievements
- Algorithm Efficiency: The AP block significantly reduces redundant computations, turning the computational cost from exponential to linear growth (a rough comparison is sketched after this list).
- Model Performance: Performance verification on different datasets shows TAP-GNN outperforming other state-of-the-art methods.
- Module Scalability: The online inference time for nodes is almost independent of the graph size, indicating that TAP-GNN has excellent scalability, suitable for large-scale temporal graphs.
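As a rough back-of-the-envelope comparison of the two regimes (the notation below is introduced here for illustration and is not taken verbatim from the paper): let $|E|$ be the number of temporal edges, $\bar{d}$ the average number of historical neighbors, and $L$ the number of message-passing layers.

```latex
% Naive L-layer message passing over the unrolled MPTG revisits every
% historical neighbor recursively at each layer (exponential in L),
% while stacked AP blocks touch each temporal edge a constant number
% of times per layer (linear in |E| and L).
\[
  \underbrace{\mathcal{O}\bigl(|E|\,\bar{d}^{\,L}\bigr)}_{\text{unrolled MPTG}}
  \;\longrightarrow\;
  \underbrace{\mathcal{O}\bigl(L\,|E|\bigr)}_{\text{stacked AP blocks}}
\]
```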
Future Research Directions
Although the proposed method achieves strong results in temporal graph representation, some limitations remain. Future research could develop adaptive models that infer whether the causes behind a node's dynamic changes are permanent or temporary, thereby improving the accuracy of predicting future interactions.
Through the presented research and experimental results, TAP-GNN opens new possibilities for temporal graph embedding in academia and shows strong potential in practical applications.