Construction of Hyper-Relational Knowledge Graphs Using Pre-Trained Large Language Models

Datta, P., Vitiugin, F., Chizhikova, A., & Sawhney, N. (2024). Construction of Hyper-Relational Knowledge Graphs Using Pre-Trained Large Language Models. arXiv preprint arXiv:2403.11786.

Abstract

Extracting hyper-relations is crucial for constructing comprehensive knowledge graphs, but there are limited supervised methods available for this task. To address this gap, we introduce a zero-shot prompt-based method using OpenAI’s GPT-3.5 model for extracting hyper-relational knowledge from text. Comparing our model with a baseline, we achieved promising results, with a recall of 0.77. Although our precision is currently lower, a detailed analysis of the model outputs has uncovered potential pathways for future research in this area.


More information

Building detailed knowledge graphs (i.e., structures that organize and connect information) is important, but few good methods exist for handling complex relationships that involve more than two entities. To help solve this problem, we developed a new approach that uses OpenAI's GPT-3.5 model to extract these complex connections from text without any task-specific training (a technique called "zero-shot learning").

When we tested our method against a baseline model, it performed well, correctly identifying 77% of the relevant information (a recall of 0.77). While it still makes some mistakes, our analysis of the results has revealed useful insights that could guide future improvements.
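To illustrate the general idea, the sketch below shows one way a zero-shot hyper-relation extraction pipeline could be wired up: an instruction prompt asks the model for a main triple plus qualifier key-value pairs, and the reply is parsed into structured facts. The prompt wording, JSON schema, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import json


def build_prompt(passage: str) -> str:
    # Hypothetical zero-shot instruction; the paper's real prompt may differ.
    return (
        "Extract hyper-relational facts from the text below. Return a JSON "
        "list where each item has 'subject', 'relation', and 'object' fields "
        "for the main triple, plus a 'qualifiers' list of key-value pairs "
        "that contextualize it.\n\n"
        f"Text: {passage}"
    )


def parse_hyper_relations(model_output: str):
    """Parse the model's JSON reply into (triple, qualifiers) pairs."""
    facts = []
    for item in json.loads(model_output):
        triple = (item["subject"], item["relation"], item["object"])
        qualifiers = [(q["key"], q["value"]) for q in item.get("qualifiers", [])]
        facts.append((triple, qualifiers))
    return facts


# A call to a chat model could then look like (requires an API key):
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": build_prompt(text)}],
# )
# facts = parse_hyper_relations(reply.choices[0].message.content)
```

The qualifier list is what distinguishes hyper-relational facts from plain triples: a fact like (Marie Curie, educated at, University of Paris) can carry a qualifier such as (academic degree, doctorate) that contextualizes the main relation.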
