Why Ontology Is the Hidden Grammar Behind Knowledge Graphs
The article explains that ontology is not merely a list of terms but a formal model defining concepts, relationships, and constraints, outlines three quality standards, shows how it enables data integration and reasoning, compares it with simple taxonomies, and warns of common misconceptions.
01 Ontology is not a glossary but a “grammar of knowledge”
When first learning about knowledge graphs, many people mistake an ontology for a simple list of terms or a classification catalog. In reality, an ontology is concerned not only with which concepts exist, but also with how those concepts are defined, classified, and related, and with what rules they must obey.
Analogy: a glossary is like a dictionary that tells you what “mammal” means, while an ontology is like a grammar book that states “mammal” is a subclass of “animal”, “has hair” is an attribute, and therefore any mammal must be an animal.
Ontology = formal modeling of a domain’s knowledge structure.
02 Three standards for a good ontology: shared, explicit, formal
Shared – the vocabulary is a consensus within the domain, not a personal invention. Example: everyone uses the term “student” with the same meaning.
Explicit – concepts, relations, and rules are clearly stated, not vague. The description must be precise enough for a machine to understand.
Formal – expressed in a normative way that computers can recognize, validate, and reason over. The machine can not only read the model but also “think” with it.
Thus an ontology is an abstract model of “how knowledge is organized”, not a concrete object.
03 Why knowledge graphs need ontology
Without a unified conceptual system, data from different sources speak different dialects and cannot be integrated. Ontology provides four kinds of unification:
① Unified concepts
In the same domain, different sources may use different names for the same thing. The ontology supplies stable definitions and hierarchies, reducing the problem of every source speaking its own language.
② Standardized relations
It not only lists objects but also specifies which relations can exist between them, making the knowledge structure traceable.
③ Constrained attributes
It defines which attributes apply to which classes and their value domains, preventing messy data entry; a short sketch after this list illustrates the idea.
④ Support for inference
When concepts, relations, and constraints are formalized, the system can derive new conclusions based on logical rules.
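To make point ③ concrete, here is a minimal plain-Python sketch of how a loader might check incoming attribute values against the ontology's declared domains and value types. The class names, property names, and data layout are illustrative assumptions, not part of any standard.

ONTOLOGY = {
    "hasAge":     {"domain": "Person",  "range": int},
    "enrolledAt": {"domain": "Student", "range": str},
}
SUBCLASS_OF = {"Student": "Person"}  # constraint: Student is a subclass of Person

def is_a(cls, target):
    # True if cls equals target or is (transitively) a subclass of it
    while cls is not None:
        if cls == target:
            return True
        cls = SUBCLASS_OF.get(cls)
    return False

def validate(entity_class, prop, value):
    # Reject attribute values that the ontology does not allow for this class
    spec = ONTOLOGY.get(prop)
    if spec is None:
        return False, f"unknown property {prop!r}"
    if not is_a(entity_class, spec["domain"]):
        return False, f"{prop!r} does not apply to {entity_class!r}"
    if not isinstance(value, spec["range"]):
        return False, f"{prop!r} expects a {spec['range'].__name__}"
    return True, "ok"

print(validate("Student", "hasAge", 21))        # allowed, via Student -> Person
print(validate("University", "hasAge", 120))    # rejected: wrong domain
print(validate("Student", "hasAge", "twenty"))  # rejected: wrong value type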
04 What an ontology actually contains
A complete ontology includes at least four elements. The diagram below shows the skeleton:
┌─────────────────────────────────────────────┐
│                  Ontology                   │
├──────────┬──────────┬──────────┬────────────┤
│  Class   │ Property │ Relation │ Constraint │
└──────────┴──────────┴──────────┴────────────┘
Class – e.g., Person, Student, University, Mammal.
Property – e.g., hasAge, hasNationality, creationTime.
Relation – e.g., teacher‑teaches‑course, painter‑creates‑work.
Constraint – e.g., “Student” is a subclass of “Person”; the domain of “hasAge” is “Person”.
Ontology is the “skeleton”; the knowledge graph is the “flesh” that fills the skeleton with concrete facts.
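As a rough sketch of how these four elements can be written down in practice, the snippet below uses the rdflib library and the RDFS vocabulary, one common way to express such models. The namespace, class, and property names are illustrative; real ontologies often use richer languages such as OWL.

from rdflib import Graph, Namespace, RDF, RDFS, XSD

EX = Namespace("http://example.org/ontology#")
g = Graph()
g.bind("ex", EX)

# Classes
for cls in (EX.Person, EX.Student, EX.University, EX.Course):
    g.add((cls, RDF.type, RDFS.Class))

# Constraint: Student is a subclass of Person
g.add((EX.Student, RDFS.subClassOf, EX.Person))

# Property: hasAge applies to Person and takes an integer value
g.add((EX.hasAge, RDF.type, RDF.Property))
g.add((EX.hasAge, RDFS.domain, EX.Person))
g.add((EX.hasAge, RDFS.range, XSD.integer))

# Relation: teaches links a Person to a Course
g.add((EX.teaches, RDF.type, RDF.Property))
g.add((EX.teaches, RDFS.domain, EX.Person))
g.add((EX.teaches, RDFS.range, EX.Course))

print(g.serialize(format="turtle"))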
05 Ontology vs. conceptual taxonomy: from human‑readable to machine‑processable
Focus: taxonomy – what concepts exist and how they are layered; ontology – how concepts are defined, linked, and constrained.
Nature: taxonomy – knowledge organization; ontology – formal modeling.
Machine friendliness: taxonomy – low; ontology – high.
Example:
Listing “Animal, Mammal, Bird, Fish” is only a conceptual taxonomy.
Stating “Mammal is a subclass of Animal”, “Bird is a subclass of Animal”, “hasHair applies to Mammal” turns it into an ontology.
Thus ontology upgrades a human‑understandable taxonomy into a machine‑processable model.
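A compact way to see the upgrade: the same concepts, first as a bare taxonomy and then as ontology-style statements. This is an illustrative plain-Python sketch whose triple vocabulary loosely mirrors RDFS.

taxonomy = {"Animal": ["Mammal", "Bird", "Fish"]}   # human-readable hierarchy only

ontology_statements = [
    ("Mammal",  "subClassOf", "Animal"),
    ("Bird",    "subClassOf", "Animal"),
    ("Fish",    "subClassOf", "Animal"),
    ("hasHair", "domain",     "Mammal"),            # the attribute is tied to a class
]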
06 How ontology enables reasoning
Beyond describing knowledge, ontology lets the system generate new knowledge from descriptions.
Example 1: Hierarchical reasoning
Known: "Student" is a subclass of "Person"
Known: XiaoMing is a "Student"
Infer: XiaoMing is also a "Person"
Example 2: Transitive reasoning
Known: "Ancestor" relation is transitive
Known: A is ancestor of B, B is ancestor of C
Infer: A is also ancestor of C
Because ontologies embed class hierarchies, property domains, and logical rules, machines can draw conclusions just like humans.
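The two inferences above can be reproduced with a very small forward-chaining loop. The sketch below is illustrative plain Python, not a production reasoner; the fact names simply mirror the examples.

facts = {
    ("Student", "subClassOf", "Person"),
    ("XiaoMing", "type", "Student"),
    ("A", "ancestorOf", "B"),
    ("B", "ancestorOf", "C"),
}
TRANSITIVE = {"ancestorOf"}  # the ontology declares this relation transitive

def infer(start_facts):
    # Apply the two rules repeatedly until no new triples appear (fixed point)
    facts = set(start_facts)
    while True:
        new = set()
        for s, p, o in facts:
            if p == "type":
                # Rule 1: x type C  and  C subClassOf D  =>  x type D
                for c, p2, d in facts:
                    if p2 == "subClassOf" and c == o:
                        new.add((s, "type", d))
            if p in TRANSITIVE:
                # Rule 2: x R y  and  y R z  =>  x R z  (R transitive)
                for y, p2, z in facts:
                    if p2 == p and y == o:
                        new.add((s, p, z))
        if new <= facts:
            return facts
        facts |= new

derived = infer(facts) - facts
print(derived)  # the two new triples: XiaoMing type Person, A ancestorOf C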
07 Ontology vs. knowledge graph: skeleton and flesh
Focus: ontology – structure and rules; knowledge graph – objects and facts.
Content: ontology – classes, properties, relations, constraints; knowledge graph – specific entities, concrete relations, attribute values.
Role: ontology – provides a modeling framework; knowledge graph – organizes concrete facts.
Example:
Ontology defines classes “Painter”, “Work” and relation “creates”.
Knowledge graph records facts such as “Van Gogh created ‘Starry Night’ in 1889”.
Relationship: ontology supplies the “grammar”, the knowledge graph writes the “sentences”.
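A brief sketch of this skeleton-and-flesh split, again using rdflib with illustrative URIs: the first block is the ontology (schema), the second is knowledge-graph instance data.

from rdflib import Graph, Namespace, Literal, RDF, RDFS, XSD

EX = Namespace("http://example.org/art#")
g = Graph()

# Ontology: structure and rules (the skeleton)
g.add((EX.Painter, RDF.type, RDFS.Class))
g.add((EX.Work,    RDF.type, RDFS.Class))
g.add((EX.creates, RDF.type, RDF.Property))
g.add((EX.creates, RDFS.domain, EX.Painter))
g.add((EX.creates, RDFS.range,  EX.Work))

# Knowledge graph: concrete entities and facts (the flesh)
g.add((EX.VanGogh,     RDF.type, EX.Painter))
g.add((EX.StarryNight, RDF.type, EX.Work))
g.add((EX.VanGogh,     EX.creates, EX.StarryNight))
g.add((EX.StarryNight, EX.creationTime, Literal("1889", datatype=XSD.gYear)))

print(g.serialize(format="turtle"))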
08 Four common pitfalls when learning ontology
Misconception : Ontology = glossary. Reality : Ontology must also specify relations, properties, and constraints.
Misconception : Ontology = simple classification. Reality : Classification is the foundation; semantic rules are the core.
Misconception : Ontology only for theory. Reality : Widely used in KG construction, data integration, semantic search.
Misconception : Having an ontology means a complete KG. Reality : Ontology is a framework; massive instance data are still needed.
One‑sentence summary: Ontology is not the object itself but a formal model that organizes the object world, providing the shared, inferable, and evolving foundation for knowledge graphs.