Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis

Publication date

2021-06-01

Authors

Peng, Xutan
Chen, Guanyi (ISNI 0000 0004 9285 2701)
Lin, Chenghua
Stevenson, Mark

Editors

Toutanova, Kristina
Rumshisky, Anna
Zettlemoyer, Luke
Hakkani-Tur, Dilek
Beltagy, Iz
Bethard, Steven
Cotterell, Ryan
Chakraborty, Tanmoy
Zhou, Yichao

Document Type

Part of book

License

CC BY

Abstract

Knowledge Graph Embeddings (KGEs) have been intensively explored in recent years due to their promise for a wide range of applications. However, existing studies focus on improving the final model performance without acknowledging the computational cost of the proposed approaches, in terms of execution time and environmental impact. This paper proposes a simple yet effective KGE framework which can reduce the training time and carbon footprint by orders of magnitude compared with state-of-the-art approaches, while producing competitive performance. We highlight three technical innovations: full batch learning via relational matrices, closed-form Orthogonal Procrustes Analysis for KGEs, and non-negative-sampling training. In addition, as the first KGE method whose entity embeddings also store full relation information, our trained models encode rich semantics and are highly interpretable. Comprehensive experiments and ablation studies involving 13 strong baselines and two standard datasets verify the effectiveness and efficiency of our algorithm.
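The closed-form Orthogonal Procrustes Analysis mentioned in the abstract refers to a classical linear-algebra result: given two point sets `A` and `B`, the orthogonal matrix `R` minimizing `||A @ R - B||_F` can be computed in one SVD, with no iterative optimization. The sketch below illustrates that generic closed-form solution only; it is not the authors' full KGE framework, and the function and variable names are illustrative.

```python
import numpy as np

def orthogonal_procrustes(A, B):
    """Closed-form solution to min_R ||A @ R - B||_F over orthogonal R.

    The optimal R is U @ Vt, where U, Vt come from the SVD of the
    cross-covariance matrix A.T @ B.
    """
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

# Sanity check on synthetic data: rotate embeddings by a known
# orthogonal matrix, then recover it in closed form.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 8))          # 100 "entity embeddings" of dim 8
R_true, _ = np.linalg.qr(rng.standard_normal((8, 8)))  # random orthogonal matrix
B = A @ R_true

R = orthogonal_procrustes(A, B)
print(np.allclose(R, R_true))              # the rotation is recovered exactly
```

Because the solution is a single SVD rather than a gradient-descent loop, it is easy to see how building a KGE objective around it could cut training time dramatically, which is the efficiency argument the abstract makes.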

Citation

Peng, X, Chen, G, Lin, C & Stevenson, M 2021, Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis. in K Toutanova, A Rumshisky, L Zettlemoyer, D Hakkani-Tur, I Beltagy, S Bethard, R Cotterell, T Chakraborty & Y Zhou (eds), Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Association for Computational Linguistics, Online, pp. 2364-2375. https://doi.org/10.18653/v1/2021.naacl-main.187