
Knowledge Representation Learning for Xiaomi Knowledge Graph: Algorithms and Applications

This article introduces Xiaomi's knowledge graph architecture, presents text‑graph joint representation learning methods, and demonstrates their practical use in entity linking, entity recommendation, and knowledge completion, highlighting model designs, feature engineering, and experimental results.

DataFunTalk

The talk, presented by Xiaomi algorithm engineer Lv Rongrong and edited by Meng Hangcheng, outlines the role of knowledge representation learning as the foundation for knowledge acquisition and application within Xiaomi's large‑scale knowledge graph.

Business Introduction

Xiaomi's knowledge graph team builds high‑quality graphs that support services such as XiaoAi, Xiaomi.com, and information feeds, providing entity search, linking, and concept graphs.

Examples show how the graph powers XiaoAi answers, e.g., answering queries about a person's birthplace by extracting entities, intents, and attributes through language recognition, intent analysis, entity matching, and database retrieval.

Algorithm Introduction

Knowledge representation learning maps entities and relations into dense low‑dimensional vectors, enabling semantic similarity calculations. Traditional translation‑based models struggle with long‑tail entities; the solution integrates additional information such as textual descriptions, entity types, and graph structure.
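The translation-based family mentioned above builds on a simple idea: for a true triple (h, r, t), the relation vector should translate the head embedding close to the tail embedding. A minimal sketch of the TransE scoring function, with toy illustrative embeddings rather than any trained Xiaomi model:

```python
# Minimal TransE-style scoring sketch (illustrative toy values, not
# production code). TransE models a triple (h, r, t) as h + r ≈ t and
# scores it by the distance between the translated head and the tail.

def transe_score(h, r, t):
    """L1 distance between (h + r) and t; lower means more plausible."""
    return sum(abs(hi + ri - ti) for hi, ri, ti in zip(h, r, t))

# Toy 3-dimensional embeddings: a triple that fits the translation well
# scores lower than one that does not.
h = [0.1, 0.2, 0.3]
r = [0.4, 0.0, -0.1]
t_good = [0.5, 0.2, 0.2]   # ≈ h + r
t_bad  = [-0.9, 0.8, 0.7]

good = transe_score(h, r, t_good)
bad = transe_score(h, r, t_bad)
```

Training pushes `good`-style scores down and `bad`-style scores up with a margin loss; long-tail entities suffer because they appear in too few triples to get such gradient signal, which is why the article turns to text and type information.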

Two classic joint text‑graph models are described:

Jointly model: aligns entity, relation, and word embeddings in a shared space using skip‑gram for words and Trans‑E for triples.

DKRL model: extends Trans‑E with continuous bag‑of‑words and deep convolutional encoders for textual descriptions, learning both semantic and structural embeddings.
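To make the DKRL idea concrete, here is a hedged sketch of its simpler encoder: a continuous-bag-of-words average of description word vectors, producing a text-based entity embedding that can then be scored with the same h + r ≈ t translation as the structural embeddings. The word vectors and descriptions below are made-up toy values, not anything from the article:

```python
# Illustrative sketch of the DKRL text side: encode an entity description
# with a continuous-bag-of-words average (the simpler of DKRL's two
# encoders; the other is a deep CNN). Word vectors here are toy values.

def cbow_encode(description, word_vecs):
    """Average the word vectors of a description into one entity vector."""
    vecs = [word_vecs[w] for w in description.split() if w in word_vecs]
    dim = len(next(iter(word_vecs.values())))
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

word_vecs = {
    "smartphone": [0.2, 0.4],
    "company":    [0.6, 0.0],
    "device":     [0.2, 0.2],
}

# Text-based embedding for an entity described as "smartphone device".
h_desc = cbow_encode("smartphone device", word_vecs)
```

Because the description encoder works for any entity with text, it gives long-tail entities a usable embedding even when their structural embedding is poorly trained.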

Algorithm Applications

Entity Linking

The goal is to link textual mentions to the correct knowledge‑base entity. The pipeline includes coarse ranking (using context, coherence, and link count features fused by an MLP) followed by fine ranking with a BERT‑based sentence‑pair classifier.
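The coarse-ranking step can be sketched as a score that fuses the per-candidate features before the expensive BERT fine ranker runs. The article uses a trained MLP for the fusion; the single logistic layer and the weights below are a hypothetical stand-in to show the shape of the computation:

```python
# Hedged sketch of coarse ranking in entity linking: fuse per-candidate
# features (context similarity, coherence with other mentions, link-count
# prior) into one score. The article fuses these with an MLP; a single
# logistic layer with made-up weights illustrates the idea.
import math

def coarse_score(features, weights, bias=0.0):
    """Weighted sum of candidate features squashed to (0, 1)."""
    z = sum(f * w for f, w in zip(features, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

weights = [1.5, 1.0, 0.5]    # illustrative, not learned values

# Two candidate entities for one mention.
cand_a = [0.9, 0.7, 0.8]     # strong context match and coherence
cand_b = [0.2, 0.1, 0.9]     # popular entity but poor context fit

ranked = sorted([("a", coarse_score(cand_a, weights)),
                 ("b", coarse_score(cand_b, weights))],
                key=lambda x: x[1], reverse=True)
```

Only the top few candidates from this stage would be passed to the BERT sentence-pair classifier, keeping the heavy model off the long candidate tail.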

Entity Recommendation

Given an entity, the system recommends related entities based on learned embeddings (DKRL) and a DSSM matching model that computes cosine similarity between entity vectors.
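At serving time this reduces to a nearest-neighbor lookup: rank candidate entities by cosine similarity to the query entity's vector. A minimal sketch, with toy three-dimensional vectors standing in for the learned DKRL/DSSM embeddings:

```python
# Minimal sketch of embedding-based entity recommendation: given a query
# entity vector, rank candidates by cosine similarity. Vectors are toy
# stand-ins for learned embeddings; entity names are illustrative.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(query_vec, candidates, k=2):
    """Return the top-k candidate names by cosine similarity to the query."""
    scored = sorted(candidates.items(),
                    key=lambda kv: cosine(query_vec, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]

candidates = {
    "Mi 10":      [0.9, 0.1, 0.0],
    "Redmi Note": [0.8, 0.3, 0.1],
    "rice":       [0.0, 0.1, 0.9],
}
top = recommend([1.0, 0.2, 0.0], candidates)
```

In production the exhaustive sort would be replaced by an approximate nearest-neighbor index, but the ranking criterion is the same.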

Knowledge Completion

To fill missing triples, the approach uses schema to infer tail entity types, generates candidate triples, and scores them with a BERT‑based classifier (similar to KG‑BERT), selecting high‑confidence triples for graph augmentation.
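The completion loop described above can be sketched as: constrain tail candidates by the schema's expected type, score each candidate triple, and keep those above a confidence threshold. Everything below (schema, entities, scores) is hypothetical toy data, and the scorer is a stand-in for the BERT-based (KG-BERT-like) classifier:

```python
# Illustrative sketch of schema-guided knowledge completion. The schema
# restricts tail candidates to the type a relation expects; a classifier
# score then filters candidates. All data here is made up, and score_fn
# is a hypothetical stand-in for the article's BERT-based scorer.

SCHEMA = {"born_in": "City"}                     # relation -> expected tail type
ENTITY_TYPES = {"city_a": "City", "company_x": "Company", "city_b": "City"}

def candidate_tails(relation):
    """All entities whose type matches the relation's schema constraint."""
    want = SCHEMA[relation]
    return [e for e, t in ENTITY_TYPES.items() if t == want]

def complete(head, relation, score_fn, threshold=0.8):
    """Score (head, relation, tail) candidates; keep the confident ones."""
    return [(head, relation, t) for t in candidate_tails(relation)
            if score_fn(head, relation, t) >= threshold]

# Toy scores standing in for classifier outputs.
fake_scores = {("person_1", "born_in", "city_a"): 0.95,
               ("person_1", "born_in", "city_b"): 0.30}
score_fn = lambda h, r, t: fake_scores.get((h, r, t), 0.0)

kept = complete("person_1", "born_in", score_fn)
```

The type constraint matters in practice: it shrinks the candidate set from the whole entity vocabulary to a few type-compatible tails before the expensive classifier runs.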

Conclusion and Outlook

The article summarizes how knowledge representation learning supports entity linking, recommendation, and completion, emphasizing industrial practicality, model simplicity, and the need for continued research in this area.

Tags: machine learning, AI, knowledge graph, representation learning, entity linking, entity recommendation, knowledge completion
Written by DataFunTalk

Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.
