Eckher
Your guide to what's next.
Dec 25, 2020

The building blocks of OWL

What makes up OWL ontologies and how do they support logical inference?

The Web Ontology Language (OWL) is a component of the Semantic Web used to author ontologies. In simple terms, OWL ontologies are "smart" vocabularies for RDF data. They enable reasoning over the data through a set of explicitly defined rules such as concept hierarchies, disjointness relations, and other types of logical statements.

OWL ontologies are formal knowledge models grounded in strict mathematical logic. Their formal semantics is based on a description logic, which gives them the ability to enrich RDF datasets in a consistent, deterministic, and decidable way.

The main components of OWL

Formally, OWL ontologies are made up of three types of building blocks described below.

Firstly, entities, or terms, encompass the classes, properties, and individuals that make up an ontology and can be thought of as OWL's basic primitives. Examples of entities include ex:Place, ex:friendsWith, and ex:TaylorSwift, which can be interpreted as a class, a property, and an individual, respectively.

Secondly, expressions are used to construct various notions and, for example, define classes. In particular, a class expression defines a class by "selecting" individuals that satisfy certain rules.

Finally, axioms or assertions are statements about entities that are asserted to be true. An example of an axiom is the statement asserting that the class ex:SportsEvent is a subclass of the class ex:Event.

Examples of OWL axioms and inferred statements

Below are some examples of OWL axioms and statements that can be derived by means of semantic reasoning.

Subclass hierarchies

A statement of the form

ex:C1 rdfs:subClassOf ex:C2 .

asserts that all the instances of ex:C1 are also instances of ex:C2. The rdfs:subClassOf property is transitive, which means that if ex:C1 is a subclass of ex:C2 and ex:C2 is a subclass of ex:C3, then ex:C1 is also a subclass of (contained in) ex:C3.

For example, based on the following ontology:

# Electric sports car is a subclass of Sports car
ex:ElectricSportsCar rdfs:subClassOf ex:SportsCar .

# Sports car is a subclass of Car
ex:SportsCar rdfs:subClassOf ex:Car .

ex:ElectricSportsCar can be inferred to be a subclass of ex:Car, i.e. that all electric sports cars are cars.
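As an illustration only (not how a production reasoner is implemented), the transitivity rule behind this inference can be sketched in a few lines of Python, with prefixed names kept as plain strings:

```python
def subclass_closure(axioms):
    """Transitive closure of a set of (subclass, superclass) pairs:
    repeatedly apply the rdfs:subClassOf transitivity rule until
    no new pairs can be derived."""
    closure = set(axioms)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

# The asserted axioms from the example above
asserted = {
    ("ex:ElectricSportsCar", "ex:SportsCar"),
    ("ex:SportsCar", "ex:Car"),
}
inferred = subclass_closure(asserted)
# The pair ("ex:ElectricSportsCar", "ex:Car") is now entailed.
```

Real reasoners use far more efficient algorithms, but the fixed-point iteration above captures the semantics of the rule.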

Transitivity of custom properties

In OWL, a property is defined as transitive by making it an instance of the built-in OWL class owl:TransitiveProperty. Prototypical examples of such properties include "located in", "occurred during", "ancestor of", and other part-whole relations. As an example, the following excerpt

ex:locatedIn a owl:TransitiveProperty .

# Auckland is located in New Zealand
ex:Auckland ex:locatedIn ex:NewZealand .

# New Zealand is located in Oceania
ex:NewZealand ex:locatedIn ex:Oceania .

implicitly states that Auckland is located in Oceania.
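A minimal sketch of how such triples could be materialised, assuming triples are represented as plain (subject, predicate, object) string tuples:

```python
# The triples from the example above, including the axiom that
# declares ex:locatedIn to be a transitive property.
triples = {
    ("ex:locatedIn", "rdf:type", "owl:TransitiveProperty"),
    ("ex:Auckland", "ex:locatedIn", "ex:NewZealand"),
    ("ex:NewZealand", "ex:locatedIn", "ex:Oceania"),
}

def materialise_transitive(triples):
    """Add (s, p, o) whenever p is declared transitive
    and both (s, p, x) and (x, p, o) are present."""
    transitive = {s for (s, p, o) in triples
                  if p == "rdf:type" and o == "owl:TransitiveProperty"}
    result = set(triples)
    changed = True
    while changed:
        changed = False
        for (s, p, x) in list(result):
            if p not in transitive:
                continue
            for (x2, p2, o) in list(result):
                if p2 == p and x2 == x and (s, p, o) not in result:
                    result.add((s, p, o))
                    changed = True
    return result

inferred = materialise_transitive(triples)
# ("ex:Auckland", "ex:locatedIn", "ex:Oceania") is now present.
```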

Class disjointness

A statement of the form

ex:C1 owl:disjointWith ex:C2 .

asserts that ex:C1 and ex:C2 have no individuals in common. Imposing disjointness constraints helps ensure the consistency of datasets and of the ontologies themselves, and reasoners can use them to derive negative conclusions. For example, given the following excerpt:

# Fruit and Vegetable are disjoint classes
ex:Fruit owl:disjointWith ex:Vegetable .

# Apple is a Fruit
ex:Apple a ex:Fruit .

an OWL reasoner should be able to derive that apples are not vegetables.
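To make the rule concrete, here is a toy sketch (not a real reasoner) that derives which classes an individual is entailed not to belong to, given class memberships and disjointness axioms:

```python
# Axioms from the example above, as plain string tuples.
disjoint = {("ex:Fruit", "ex:Vegetable")}
types = {("ex:Apple", "ex:Fruit")}

def excluded_classes(individual, types, disjoint):
    """Classes the individual is entailed NOT to be an instance of,
    based on owl:disjointWith axioms. Disjointness is symmetric,
    so both orderings of each pair are checked."""
    member_of = {c for (i, c) in types if i == individual}
    out = set()
    for (c1, c2) in disjoint:
        if c1 in member_of:
            out.add(c2)
        if c2 in member_of:
            out.add(c1)
    return out

not_classes = excluded_classes("ex:Apple", types, disjoint)
# ex:Vegetable is among the excluded classes: apples are not vegetables.
```

The same machinery flags an inconsistency if the data ever asserts membership in both classes at once.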

Equivalence using owl:sameAs

Statements such as

ex:A owl:sameAs ex:B .

indicate that two names refer to the exact same thing. In OWL Full, the linked entities may themselves be classes, properties, or individuals; in OWL DL, owl:sameAs is restricted to individuals, with owl:equivalentClass and owl:equivalentProperty playing the analogous roles for classes and properties.

For example, the statement

ex:NewZealand owl:sameAs ex:Aotearoa .

asserts that ex:NewZealand and ex:Aotearoa share the same "identity" and can be treated as aliases of the same entity. This means that any statements made about New Zealand can be combined with those made about Aotearoa.
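This alias-merging (sometimes called "smushing") can be sketched with a union-find structure; note that ex:capital and ex:Wellington below are hypothetical names introduced purely for illustration:

```python
# Union-find over entity names: aliases linked by owl:sameAs
# end up in the same group, sharing a canonical representative.
parent = {}

def find(x):
    """Canonical representative of x's sameAs group."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(a, b):
    """Record an owl:sameAs link between a and b."""
    parent[find(a)] = find(b)

# The owl:sameAs axiom from the example above
union("ex:NewZealand", "ex:Aotearoa")

# A statement made about one alias (hypothetical fact)...
facts = {("ex:NewZealand", "ex:capital", "ex:Wellington")}

# ...is found under the canonical representative, so queries
# about ex:Aotearoa see it too.
canonical = {(find(s), p, find(o)) for (s, p, o) in facts}
assert (find("ex:Aotearoa"), "ex:capital", find("ex:Wellington")) in canonical
```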

Bottom line

OWL ontologies are used to represent knowledge using explicit computable semantics. Overall, they help improve the quality of data, detect inconsistencies, and discover new relationships. By automatically analysing the data, they facilitate various knowledge management tasks and are a powerful tool in the hands of a knowledge engineer.

See also
An introduction to Eckher Semantic Web Browser
Navigating the Semantic Web and retrieving the structured data about entities made easy with Eckher Semantic Web Browser.
Eckher RDF Graph Editor
Author and visualize RDF-based knowledge graphs.
RDF* and the onset of Linked Data* and the Semantic Web*
The evolution of RDF and the related technologies fuelled by the need to make statements about statements.
The RDF model of the Gene Ontology, demystified
An outline of the structure of the Gene Ontology RDF graph and ways to query it.
One schema, one API: Inside the world of Data Commons
Data Commons brings thousands of public datasets together into one data graph to give data analysts and researchers a jump-start on analysing open data.
Why federation is a game-changing feature of SPARQL
SPARQL federation is an incredibly useful feature for querying distributed RDF graphs.
What does a knowledge engineer do?
An overview of knowledge engineering and the core competencies and responsibilities of a knowledge engineer.
Copyright © 2021 Eckher. Various trademarks held by their respective owners.