A Convolutional Neural Network With Time-Aware Channel Weighting for Temporal Knowledge Graph Completion

Bibliographic Details
Main Authors: Kesheng Zhang, Guige Ouyang, Yongzhong Huang
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access:https://ieeexplore.ieee.org/document/11008654/
Description
Summary: Temporal Knowledge Graphs (TKGs) extend traditional knowledge graphs by incorporating a temporal dimension into triples, enabling more precise modeling of dynamic relationships. However, in real-world applications TKGs often suffer from data sparsity and incomplete information, which can hinder their performance in downstream tasks. To address the limitations of existing static completion models in capturing temporal information, we propose a novel Temporal Knowledge Graph Completion model, Conv-TA (Convolutional Network with Time-Aware Channel Weighting). In this model, temporal embeddings are represented as the concatenation of three independent embeddings: year, month, and date. These embeddings are then passed through linear transformations to generate time-aware weights that dynamically adjust the significance of the convolutional output channels. This design allows the model to effectively capture the complex and dynamic characteristics inherent in temporal relationships. We evaluate Conv-TA on publicly available datasets, including ICEWS and GDELT. Experimental results demonstrate that Conv-TA achieves superior performance compared to existing baselines, highlighting its potential for addressing the challenges of temporal knowledge graph completion.
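The core mechanism the abstract describes — concatenating year, month, and date embeddings, projecting them linearly to per-channel weights, and gating the convolutional output channels — can be sketched as a forward pass. The following is a minimal NumPy illustration, not the authors' implementation: all dimensions, the sigmoid gating, and the embedding tables are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not specified in the abstract).
EMB_DIM = 8        # size of each independent time embedding
N_CHANNELS = 16    # number of convolutional output channels
N_YEARS, N_MONTHS, N_DAYS = 30, 12, 31

# Independent embedding tables for year, month, and date.
year_emb = rng.normal(size=(N_YEARS, EMB_DIM))
month_emb = rng.normal(size=(N_MONTHS, EMB_DIM))
day_emb = rng.normal(size=(N_DAYS, EMB_DIM))

# Linear transformation from the concatenated time embedding
# to one weight per convolutional output channel.
W = rng.normal(size=(3 * EMB_DIM, N_CHANNELS))
b = np.zeros(N_CHANNELS)

def time_aware_weights(year_idx, month_idx, day_idx):
    """Concatenate the three time embeddings and map them to
    per-channel gating weights in (0, 1) via a sigmoid."""
    t = np.concatenate([year_emb[year_idx],
                        month_emb[month_idx],
                        day_emb[day_idx]])
    return 1.0 / (1.0 + np.exp(-(t @ W + b)))

def reweight_channels(conv_out, year_idx, month_idx, day_idx):
    """Scale each convolutional output channel by its time-aware
    weight; conv_out has shape (N_CHANNELS, feature_len)."""
    w = time_aware_weights(year_idx, month_idx, day_idx)
    return conv_out * w[:, None]

# Example: gate a dummy convolutional feature map for one timestamp.
conv_out = rng.normal(size=(N_CHANNELS, 10))
out = reweight_channels(conv_out, year_idx=5, month_idx=3, day_idx=14)
```

Because the gating weights lie in (0, 1), each channel's activations are attenuated by an amount determined by the timestamp, so the same convolutional filters can emphasize different channels at different times.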
ISSN:2169-3536