This code reproduces the results in "Tensor factorization for temporal knowledge graph forecasting", accepted at Neurocomputing.
If you use the code in this repository or find the work interesting, please cite the following work:
@article{Dileo2026tensor,
title = {Tensor factorization for temporal knowledge graph forecasting},
journal = {Neurocomputing},
volume = {674},
pages = {132846},
year = {2026},
issn = {0925-2312},
doi = {10.1016/j.neucom.2026.132846},
url = {https://www.sciencedirect.com/science/article/pii/S0925231226002432},
author = {Manuel Dileo and Pasquale Minervini and Matteo Zignani and Sabrina Gaito},
keywords = {Temporal knowledge graphs, Tensor factorization, Temporal graph learning},
abstract = {Tensor factorization has long been a cornerstone of knowledge graph (KG) reasoning, achieving state-of-the-art performance on static link prediction tasks with models such as ComplEx. Despite their effectiveness and scalability in KG reasoning, these approaches have been largely overlooked for temporal knowledge graph (TKG) forecasting, i.e., predicting links in future unseen timestamps, given historical facts in the form of quadruples (h,r,v,t), where t represents the timestamp of the relation. Most of the recent research has instead focused on deep architectures such as recurrent and graph neural networks, or transformers, which achieve strong accuracy but incur substantial computational costs at training and inference time. In this work, we revisit tensor factorization for TKG forecasting and investigate whether these lightweight models, if carefully designed and tuned, can rival deep learning architectures. Building on TNTComplEx, we propose an extended factorization model that incorporates a radial basis function (RBF) timestamp encoder to generate embeddings for unseen timestamps and a temporal regularizer to enforce smoothness in the embedding space. We conduct an extensive evaluation of tensor factorization methods for TKG forecasting, benchmarking on the five most common datasets in the literature. Our results show that tensor factorization models can achieve performance comparable to or exceeding that of state-of-the-art deep learning models, while being substantially more efficient in terms of training and inference time. Furthermore, our approach improves over previously reported factorization baselines by +5 to +30 MRR points. These findings position tensor factorization as a scalable and computationally attractive alternative for temporal knowledge graph forecasting, motivating further extensions to inductive reasoning on entities and relations.}
}
Create a conda environment with PyTorch and scikit-learn:
conda create --name tkbf_env python=3.7
source activate tkbf_env
conda install --file requirements.txt -c pytorch
Then install the kbc package into this environment:
python setup.py install
- Download the datasets from the NEC research repository into a tkbc/src_data/DATASET_NAME folder.
- Once the datasets are downloaded, preprocess them to obtain normalized timestamps with, e.g.:
python tkbc/src_data/revert_icews14.py
- Finally, add the dataset to the package with, e.g.:
python tkbc/process_icews.py
This will create the files required to compute the filtered metrics.
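Conceptually, this preprocessing step maps entities, relations, and timestamps to contiguous integer ids so that quadruples can index embedding tables. A minimal sketch of the idea (the dataset format and the actual logic in process_icews.py may differ):

```python
# Sketch of quadruple-to-id preprocessing, assuming (head, relation, tail,
# timestamp) tuples; illustrative only, not the exact process_icews.py logic.
def build_id_maps(quadruples):
    """Assign contiguous integer ids to entities, relations, and timestamps."""
    entities, relations, timestamps = {}, {}, {}
    for h, r, t, ts in quadruples:
        for key, table in ((h, entities), (t, entities), (r, relations)):
            table.setdefault(key, len(table))
        timestamps.setdefault(ts, len(timestamps))
    return entities, relations, timestamps

def encode(quadruples, entities, relations, timestamps):
    """Rewrite string quadruples as integer-id quadruples."""
    return [(entities[h], relations[r], entities[t], timestamps[ts])
            for h, r, t, ts in quadruples]

quads = [("Obama", "visits", "France", "2014-01-02"),
         ("France", "hosts", "Obama", "2014-01-02"),
         ("Obama", "meets", "Merkel", "2014-01-05")]
ents, rels, tss = build_id_maps(quads)
print(encode(quads, ents, rels, tss))
# [(0, 0, 1, 0), (1, 1, 0, 0), (0, 2, 2, 1)]
```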
To reproduce the best results reported in the paper, run the following commands:
python tkbc/learner.py --dataset ICEWS14 --model TNTComplEx --rank 2000 --time_reg linear --time_enc rbf --time_reg_w 1e-3 --time_norm Np --p_norm 3 --emb_reg 1e-2
python tkbc/learner.py --dataset ICEWS18 --model TNTComplEx --rank 2000 --time_reg linear --time_enc rbf --time_reg_w 1e-2 --time_norm Np --p_norm 3 --emb_reg 1e-2 --no_time_emb_reg
python tkbc/learner.py --dataset GDELT --model TNTComplEx --rank 2000 --time_reg linear --time_enc rbf --time_reg_w 1e-3 --time_norm Np --p_norm 3 --emb_reg 1e-2
python tkbc/learner.py --dataset WIKI --model TNTComplEx --rank 2000 --time_reg linear --time_enc rbf --time_reg_w 1e-2 --time_norm Np --p_norm 3 --emb_reg 1e-4 --no_time_emb_reg
python tkbc/learner.py --dataset yago --model TNTComplEx --rank 2000 --time_reg linear --time_enc rbf --time_reg_w 1e-4 --time_norm Np --p_norm 3 --emb_reg 1e-3
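The --time_enc rbf option refers to the RBF timestamp encoder described in the paper, which makes it possible to produce embeddings for unseen future timestamps by interpolating over reference timestamps seen during training. The NumPy sketch below illustrates the idea only; the centers, bandwidth, and projection are assumptions for illustration, not the exact implementation in this repository:

```python
import numpy as np

def rbf_timestamp_embedding(t, centers, gamma, W):
    """Encode a (possibly unseen) timestamp t as a normalized combination of
    RBF activations over fixed reference timestamps, projected to the
    embedding dimension. centers: (K,) timestamps; W: (K, d) projection."""
    phi = np.exp(-gamma * (t - centers) ** 2)  # (K,) RBF activations
    phi = phi / phi.sum()                      # normalize the weights
    return phi @ W                             # (d,) timestamp embedding

rng = np.random.default_rng(0)
centers = np.linspace(0, 100, 11)   # reference timestamps from training
W = rng.normal(size=(11, 4))        # learnable projection in the real model
emb = rbf_timestamp_embedding(103.0, centers, gamma=0.05, W=W)
print(emb.shape)  # (4,)
```

Because the encoder only depends on the distance between a query timestamp and the fixed centers, it extrapolates to timestamps beyond the training range, which is the key requirement for the forecasting setting.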
tkbc-reg is CC-BY-NC licensed, as found in the LICENSE file.