Recent advances in graph neural networks (GNNs) address critical challenges in real-world applications, particularly the handling of complex structures and data scarcity. New frameworks like the Riemannian Liquid Spatio-Temporal Graph Network improve the modeling of non-Euclidean graphs, raising representation quality for dynamic systems. Concurrently, approaches such as the Transfer-Oriented Spatiotemporal Graph Framework improve sample efficiency and generalization across domains, which is vital for industries that rely on multivariate time series forecasting. Methods like AdvSynGNN harden GNNs against structural noise, preserving performance under perturbed topologies. Self-supervised approaches such as BHyGNN+ are also noteworthy: they learn effectively from unlabeled data, addressing the scarcity of annotations in many domains. Collectively, these developments signal a shift towards more resilient, efficient, and interpretable GNN architectures, poised to tackle pressing commercial problems in sectors ranging from finance to healthcare.
A comprehensive understanding of heat transport is essential for optimizing various mechanical and engineering applications, including 3D printing. Recent advances in machine learning, combined with p...
Adapting transformer positional encoding to meshes and graph-structured data presents significant computational challenges: exact spectral methods require cubic-complexity eigendecomposition and can i...
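The cubic-complexity eigendecomposition referred to here is the standard route to spectral positional encodings: take the eigenvectors of the normalized graph Laplacian with the smallest nonzero eigenvalues as per-node position features. A minimal sketch follows; the 4-node cycle graph and the choice k=2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def laplacian_positional_encoding(A, k):
    """Spectral positional encoding: the k eigenvectors of the
    symmetric-normalized graph Laplacian with the smallest nonzero
    eigenvalues.  np.linalg.eigh is O(n^3) -- the cubic cost that
    motivates approximate alternatives on large meshes."""
    d = A.sum(axis=1)                                    # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L = np.eye(A.shape[0]) - D_inv_sqrt @ A @ D_inv_sqrt # L = I - D^-1/2 A D^-1/2
    eigvals, eigvecs = np.linalg.eigh(L)                 # ascending eigenvalues
    return eigvecs[:, 1:k + 1]                           # skip the trivial eigenvector

# Illustrative 4-node cycle graph.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
pe = laplacian_positional_encoding(A, k=2)
print(pe.shape)  # (4, 2)
```

Note that eigenvector signs are arbitrary, which is one reason (besides the cubic cost) that such encodings are awkward to use directly at scale.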
Liquid Time-Constant networks (LTCs), a type of continuous-time graph neural network, excel at modeling irregularly-sampled dynamics but are fundamentally confined to Euclidean space. This limitation ...
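The strength of LTCs on irregularly-sampled dynamics comes from an input-dependent time constant: a gate modulates how fast the state decays, and the integration step size can simply match the gap between observations. A minimal explicit-Euler sketch is below; the sizes, random weights, and unit time constant are illustrative assumptions, not from the paper.

```python
import numpy as np

def ltc_step(x, inp, dt, tau, W, U, b, A):
    """One explicit-Euler step of a liquid time-constant cell:
    dx/dt = -(1/tau + f) * x + f * A, where the gate
    f = sigmoid(W x + U inp + b) modulates the effective time
    constant, so the cell handles irregular sampling simply by
    varying dt between observations."""
    f = 1.0 / (1.0 + np.exp(-(W @ x + U @ inp + b)))
    dx = -(1.0 / tau + f) * x + f * A
    return x + dt * dx

rng = np.random.default_rng(0)
n, m = 4, 3                                  # state and input sizes (illustrative)
x = np.zeros(n)
W = rng.normal(size=(n, n)) * 0.1
U = rng.normal(size=(n, m)) * 0.1
b, A, tau = np.zeros(n), np.ones(n), 1.0

# Irregularly spaced observations: dt differs at every step.
for dt, inp in [(0.1, rng.normal(size=m)), (0.7, rng.normal(size=m)),
                (0.25, rng.normal(size=m))]:
    x = ltc_step(x, inp, dt, tau, W, U, b, A)
print(x.shape)  # (4,)
```

Because the state update is defined in continuous time, nothing in this loop assumes a fixed sampling rate; the Euclidean limitation the abstract mentions lies in the state space itself, not in the time handling.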
Message Passing Graph Neural Networks (MP-GNNs) have garnered attention for addressing various industry challenges, such as user recommendation and fraud detection. However, they face two major hurdle...
Graph Neural Networks (GNNs) are highly vulnerable to adversarial perturbations in both topology and features, making the learning of robust representations a critical challenge. In this work, we brid...
Multivariate time series forecasting in graph-structured domains is critical for real-world applications, yet existing spatiotemporal models often suffer from performance degradation under data scarci...
The rapid rise of large language models (LLMs) and their ability to capture semantic relationships has led to their adoption in a wide range of applications. Text-attributed graphs (TAGs) are a notabl...
Message passing is a core mechanism in Graph Neural Networks (GNNs), enabling the iterative update of node embeddings by aggregating information from neighboring nodes. Graph Convolutional Networks (G...
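The aggregation step described here can be sketched in a few lines. Below is one round of message passing with GCN-style symmetric normalization; the 3-node path graph, identity weight matrix, and 2-dimensional features are illustrative assumptions, not from any of the works above.

```python
import numpy as np

def gcn_message_pass(A, H, W):
    """One round of message passing with GCN-style symmetric
    normalization: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # degrees (with self-loops)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # D^-1/2
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)   # aggregate, transform, ReLU

# Illustrative 3-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
W = np.eye(2)  # identity transform, to make the aggregation visible

H_next = gcn_message_pass(A, H, W)
print(H_next.shape)  # (3, 2)
```

Each output row is a degree-normalized mix of a node's own features and its neighbors'; stacking such rounds lets information propagate over longer paths, which is the mechanism all the message-passing variants above build on.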
Handling missing node features is a key challenge for deploying Graph Neural Networks (GNNs) in real-world domains such as healthcare and sensor networks. Existing studies mostly address relatively be...
The integration of Graph Neural Networks (GNNs) with Large Language Models (LLMs) has emerged as a promising paradigm for Graph Question Answering (GraphQA). However, effective methods for encoding co...