The word “ontology” comes from philosophy, where it refers to the study of being — the nature of what exists. In computer and information science, the meaning shifted: ontologies became formal tools for expressing structured knowledge in a way that machines can reason about.

For humans, concepts are intuitive — we understand “a cat is an animal” without needing formal logic. Computers, however, need these relationships defined explicitly. Ontologies fill that gap.

They describe entities (things that exist) and the logical relationships between them, forming a network of meaning. This network can be visualized as a graph: nodes represent concepts, and edges represent relationships.
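The idea of concepts-as-nodes and relationships-as-edges can be sketched in a few lines of code. This is a minimal illustration, not any standard ontology format; the concept and relationship names ("Cat", "is_a", etc.) are made up for the example.

```python
# A tiny concept graph: nodes are concepts, labeled edges are relationships.
# All names here are illustrative, not drawn from any standard vocabulary.
edges = [
    ("Cat", "is_a", "Animal"),
    ("Dog", "is_a", "Animal"),
    ("Animal", "is_a", "LivingThing"),
]

def neighbors(node):
    """Return (relationship, target) pairs directly reachable from `node`."""
    return [(rel, target) for source, rel, target in edges if source == node]

print(neighbors("Cat"))  # [('is_a', 'Animal')]
```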

Thought Experiment — The Robot at the Salad Bar

Imagine you build a robot and ask it to make a salad.

It must know:

  • what a restaurant is and how to find one,

  • what a salad is and which ingredients belong in it,

  • that a salad goes into a bowl and serves one or more people.

Even this simple task requires a vast amount of linked knowledge. Ontologies make that linkage explicit, allowing machines to reason about the world and act intelligently.

The modern use of ontologies on the web traces back to Tim Berners-Lee, the inventor of the World Wide Web. He proposed the idea of the Semantic Web — a web not just of documents, but of meaningful data that machines can interpret and connect.

To achieve this, the World Wide Web Consortium (W3C) developed technologies such as:

  • RDF (Resource Description Framework) – a graph-based model for representing data as subject–predicate–object triples.

  • OWL (Web Ontology Language) – an extension of RDF for describing richer, logical relationships.

These standards underpin the Semantic Web and form the foundation of most modern ontology frameworks, including EMMO.
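The triple model and the kind of reasoning OWL enables can be sketched without any RDF library, using plain tuples in place of real RDF terms (which would be URIs). This is a toy illustration of one RDFS-style inference rule, not a real reasoner; the individual "Felix" and the class names are invented for the example.

```python
# RDF represents data as subject-predicate-object triples.
# Plain tuples stand in for URIs; all names here are illustrative.
triples = {
    ("Felix", "rdf:type", "Cat"),
    ("Cat", "rdfs:subClassOf", "Animal"),
    ("Animal", "rdfs:subClassOf", "LivingThing"),
}

def infer_types(triples):
    """Apply one RDFS-style rule to a fixpoint:
    if X rdf:type C and C rdfs:subClassOf D, then X rdf:type D."""
    triples = set(triples)
    while True:
        new = {
            (x, "rdf:type", d)
            for (x, p1, c) in triples if p1 == "rdf:type"
            for (c2, p2, d) in triples if p2 == "rdfs:subClassOf" and c2 == c
        }
        if new <= triples:          # nothing new derived: done
            return triples
        triples |= new

inferred = infer_types(triples)
print(("Felix", "rdf:type", "LivingThing") in inferred)  # True
```

The point is that the machine never stored "Felix is a living thing"; it derived it from explicitly declared relationships — exactly the kind of reasoning ontology languages are designed to support at scale.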

---