An Asymmetric Cross-Attention Fusion Module integrates complementary information from different sources, such as retrieved subgraphs and student history, using a cross-attention mechanism in which one source supplies the queries and the other supplies the keys and values. It enhances prediction by selectively focusing on high-value signals while mitigating attention diffusion in complex, heterogeneous graph structures.
In plain terms, this module helps AI models combine different types of information, such as a student's learning history and related knowledge concepts. It uses an attention mechanism that lets one information source selectively focus on the most relevant parts of the other, which improves predictions and keeps the model from being distracted by irrelevant data.
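The asymmetry described above can be sketched as attention that flows in only one direction: student-history tokens query the retrieved subgraph, but not vice versa. The following minimal NumPy sketch is illustrative only; the function name, projection matrices, and dimensions are assumptions for demonstration, not the module's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def asymmetric_cross_attention(history, subgraph, d_k=16, seed=0):
    """One-directional cross-attention (illustrative sketch):
    student-history tokens act as queries; retrieved-subgraph
    nodes act as keys/values. The subgraph never attends back,
    which is the 'asymmetric' part."""
    rng = np.random.default_rng(seed)
    d = history.shape[-1]
    # Hypothetical learned projections, randomly initialized here.
    W_q = rng.standard_normal((d, d_k)) / np.sqrt(d)
    W_k = rng.standard_normal((d, d_k)) / np.sqrt(d)
    W_v = rng.standard_normal((d, d_k)) / np.sqrt(d)
    Q = history @ W_q    # queries come only from student history
    K = subgraph @ W_k   # keys come only from the retrieved subgraph
    V = subgraph @ W_v   # values come only from the retrieved subgraph
    scores = Q @ K.T / np.sqrt(d_k)
    attn = softmax(scores, axis=-1)  # each history step focuses on a few nodes
    return attn @ V, attn            # fused representation + attention weights

history = np.random.default_rng(1).standard_normal((5, 32))   # 5 interaction steps
subgraph = np.random.default_rng(2).standard_normal((8, 32))  # 8 concept nodes
fused, attn = asymmetric_cross_attention(history, subgraph)
print(fused.shape, attn.shape)  # (5, 16) (5, 8)
```

Because only one side issues queries, the attention map stays small (history steps × subgraph nodes), which is one way to limit attention diffusion over large heterogeneous graphs.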
ACFM, Asymmetric Cross-Attention