Abstract
Knowledge graphs (KGs) can structurally organize large-scale information in the form of triples and significantly support many real-world applications. While most KG embedding algorithms assume that all triples are correct, considerable errors are inevitably injected during the construction process. It is urgent to develop effective error-aware KG embedding, since errors in KGs lead to significant performance degradation in downstream applications. To this end, we propose a novel framework named Attributed Error-aware Knowledge Embedding (AEKE). It leverages the semantics contained in entity attributes to guide KG embedding model learning against the impact of erroneous triples. We design two triple-level hypergraphs to model the topological structures of the KG and its attributes, respectively. The confidence score of each triple is jointly calculated based on self-contradiction within the triple, consistency between local and global structures, and homogeneity between structures and attributes. We leverage confidence scores to adaptively update the weighted aggregation in the multi-view graph learning framework and the margin loss in KG embedding, so that potential errors contribute little to KG learning. Experiments on three real-world KGs demonstrate that AEKE outperforms state-of-the-art KG embedding and error detection algorithms.
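As a rough illustration of the confidence-weighted margin loss idea mentioned in the abstract (not the paper's actual formulation), the sketch below scales a TransE-style margin ranking loss by a per-triple confidence score, so that low-confidence triples contribute little to the embedding update. All function and variable names are hypothetical.

```python
import torch
import torch.nn.functional as F

def confidence_weighted_margin_loss(pos_scores, neg_scores, confidences, margin=1.0):
    """Margin ranking loss where each observed triple's contribution is
    scaled by its confidence, so likely-erroneous triples are down-weighted.

    pos_scores:  (N,) distances of observed triples (lower = more plausible)
    neg_scores:  (N,) distances of corrupted (negative) triples
    confidences: (N,) per-triple confidence scores in [0, 1]
    """
    per_triple = F.relu(margin + pos_scores - neg_scores)  # standard margin loss
    return (confidences * per_triple).mean()                # down-weight suspect triples

# Toy usage: three triples, the last one flagged as likely erroneous.
pos = torch.tensor([0.3, 0.5, 2.0])
neg = torch.tensor([1.5, 1.8, 0.4])
conf = torch.tensor([0.95, 0.90, 0.10])
print(confidence_weighted_margin_loss(pos, neg, conf))
```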
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-16 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| DOIs | |
| Publication status | Accepted/In press - 2023 |
Keywords
- anomaly detection
- Correlation
- graph neural network
- Knowledge graph
- Knowledge graphs
- Measurement uncertainty
- node representation learning
- Noise measurement
- Representation learning
- Semantics
- Task analysis
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Computational Theory and Mathematics