Semantic Computing

The Holy Grail of Artificial Intelligence

“Cognitive computing refers to the ability of automated systems to handle conscious, critical, logical, attentive, reasoning modes of thought. Semantic Computing facilitates and automates the cognitive processes involved in defining, modelling, translating, transforming, and querying the deep meanings of words, phrases, and concepts. Semantic computing is what natural-language processing, the heart of cognitive computing, is doing.”

James Kobielus - IBM Data Evangelist

The Business Problem

Billions of Internet-connected devices and systems, collectively referred to as the Internet of Things (IoT), are generating massive amounts of data (known as Big Data). There are currently two key challenges to overcome before we can maximise the economic benefits of this data:

  1. The inability of devices and systems to exchange data in a mutually readable form, i.e. to be “Interoperable”
  2. The inability of current data analytics systems to make sense of the sheer volume of data to find the information needles in the data haystack

Hence, while massive amounts of data are available, only a fraction is being collected, and even less is being usefully exploited.

In June 2015, McKinsey Global Institute published a report called “Unlocking the potential of the Internet of Things.”

“Interoperability among IoT systems is required to capture 40 percent of the potential value. In our analysis, of the total potential value that can be unlocked through the use of IoT, 40 percent of this value, on average, requires multiple IoT systems to work together. In the worksite setting, 60 percent of the potential value requires the ability to integrate and analyse data from various IoT systems. Interoperability is required to unlock more than $4 trillion per year in potential economic impact from IoT use in 2025, out of a total potential impact of $11.1 trillion across the nine settings that we analysed.”

“Most of the IoT data collected today are not used at all, and data that are used are not fully exploited. For instance, less than 1 percent of the data being generated by the 30,000 sensors on an offshore oil rig is currently used to make decisions. And of the data that are actually used—for example, in manufacturing automation systems on factory floors—most are used only for real-time control or anomaly detection. A great deal of additional value remains to be captured, by using more data, as well as deploying more sophisticated IoT applications, such as using performance data for predictive maintenance or to analyse workflows to optimize operating efficiency. Indeed, IoT can be a key source of big data that can be analysed to capture value, and open data, which can be used by more than one entity.”

http://www.mckinsey.com/business-functions/digital-mckinsey/our-insights/the-internet-of-things-the-value-of-digitizing-the-physical-world

The Solutions

As emphasised by McKinsey, the two challenges discussed above will be met by making computers Interoperable, i.e. able to communicate with each other effectively, seamlessly and automatically in a data exchange that:

  1. Uses a common language understood by both source and target systems
  2. Includes the semantics of the data, to provide the context in which the data is intended to be usefully applied

These two forms of Interoperability are:

  1. Syntactic Interoperability
  2. Semantic Interoperability

Syntactic Interoperability enables the flow of data from billions of connected devices that may use differing data syntaxes, by transforming and consolidating that data into a common, usable data exchange format.
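
As a simple illustration of the idea (not of any particular product), the Python sketch below consolidates two device readings that arrive in different syntaxes, XML and CSV, into one common JSON exchange format. The device names, fields and values are invented for the example.

    # Minimal sketch of syntactic interoperability: two devices report the
    # same kind of reading in different syntaxes (XML and CSV), and both
    # are transformed into one common exchange format (JSON). All field
    # names, device IDs and values here are illustrative.
    import csv
    import io
    import json
    import xml.etree.ElementTree as ET

    xml_payload = "<reading><device>rig-042</device><temp>81.5</temp></reading>"
    csv_payload = "device,temp\nrig-043,79.2\n"

    def from_xml(payload):
        root = ET.fromstring(payload)
        return {"device": root.findtext("device"),
                "temp": float(root.findtext("temp"))}

    def from_csv(payload):
        row = next(csv.DictReader(io.StringIO(payload)))
        return {"device": row["device"], "temp": float(row["temp"])}

    # Consolidate both sources into the common exchange format.
    common = [from_xml(xml_payload), from_csv(csv_payload)]
    print(json.dumps(common, indent=2))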

Semantic Interoperability enriches data with semantic metadata that provides the meaning and context required by an Artificial Intelligence technique known as Semantic Computing.
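
As an illustration, the sketch below uses the open-source Python library rdflib to enrich a raw sensor value with semantic metadata describing what the value is, what produced it, and what kind of thing the producer is. The example.org namespace and its terms are invented for the example; they are not a published ontology.

    # Minimal sketch of semantic enrichment using the open-source rdflib
    # library: the raw value 81.5 is wrapped in triples stating what it is,
    # which device produced it, and what kind of device that is. The EX
    # namespace and its terms are illustrative, not a real ontology.
    from rdflib import Graph, Literal, Namespace, RDF
    from rdflib.namespace import XSD

    EX = Namespace("http://example.org/iot#")
    g = Graph()
    g.bind("ex", EX)

    g.add((EX.reading1, RDF.type, EX.TemperatureReading))
    g.add((EX.reading1, EX.producedBy, EX.sensor42))
    g.add((EX.reading1, EX.hasValue, Literal(81.5, datatype=XSD.double)))
    g.add((EX.sensor42, RDF.type, EX.Sensor))

    # The enriched data is self-describing: any RDF-aware system can read it.
    print(g.serialize(format="turtle"))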

Proposed to the W3C as a third-generation “Semantic Web” by Professor Sir Tim Berners-Lee et al. in 1998, the vision of Semantic Computing across the World Wide Web was to enable the automated, intelligent exchange of data between machines in a self-describing form that computers could understand without human instruction. Such an exchange would enable enhanced computer learning and elementary computer thinking.

“Cognitive computing”, a generic term, is enabled by the combination of multiple Artificial Intelligence techniques, including:

  • Semantic Computing (essential for cognitive computing)
  • Reasoning/Inferencing
  • Knowledge Representation (e.g. RDF/XML and Ontologies)
  • Machine Learning/Pattern Matching
  • Deep Learning (or Deep Neural Networking)
  • Natural Language Processing (for unstructured data)

Company Focus

In 2012 Semantic Software commissioned London-based technology analysts Bloor Research to write a White Paper describing our patented data interoperability technology. Bloor highlighted that our technology uniquely facilitates both Syntactic and Semantic Interoperability, that it represents the next generation of core technology for Data Integration, and that it is a core foundation technology for Semantic Computing.

On receiving this advice, our company began a research program specifically tasked with confirming, unambiguously, that our technology could enable a Semantic Computing platform for the intelligent analysis of massive amounts of heterogeneous data sourced from the Internet of Things.

In 2013 we began building our platform of data interoperability products that collect data, transform it for syntactic and semantic interoperability, and store it in graph databases for advanced analysis.

Because Semantic Computing relies on Ontologies and Graph Databases, we are also rolling out suites of tools for Ontology generation and management, query generators and application development.

These are being enhanced with other Artificial Intelligence techniques to build a fully functioning, world-leading cognitive computing platform with Semantic Computing at its core.

We are evaluating potential Partnerships with innovative technology leaders to deliver advanced Semantic Data Modelling, Ontology Engineering and Natural Language Processing, where we believe such Partnerships can speed delivery of our platform over the coming years.

Benefits of Semantic Computing

In traditional computing using databases and taxonomies, a relationship between entities is direct and one-to-one. The relationship is created, known and understood by a computer programmer; it is not understood by the computer. In traditional computing we rely on a human programmer for intelligence. The computer itself is not intelligent.

e.g. Bob owns a Mercedes. In a database, a programmer links Bob to the Mercedes, and it is the programmer who knows that a Mercedes is a German car. The computer is unaware of what the link means: that Bob owns the car, that a Mercedes is a car made by a German manufacturer, that a car is a Vehicle, or that a truck is another type of Vehicle.
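
The sketch below, using Python and SQLite with an invented schema, shows this traditional form: the link between Bob and the Mercedes is just a foreign key, and everything it means lives only in the programmer's head.

    # Sketch of the traditional approach: the relationship is a bare
    # foreign key. Nothing in the data says what the link means, that a
    # Mercedes is a car, or that a car is a Vehicle. Schema is illustrative.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE car (id INTEGER PRIMARY KEY, make TEXT,
                          owner_id INTEGER REFERENCES person(id));
        INSERT INTO person VALUES (1, 'Bob');
        INSERT INTO car VALUES (1, 'Mercedes', 1);
    """)

    # The computer can follow the key, but attaches no meaning to it.
    print(db.execute(
        "SELECT p.name, c.make FROM person p JOIN car c ON c.owner_id = p.id"
    ).fetchall())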

In Semantic Computing, a relationship between entities and their properties is defined using a name and a definition that a computer can understand, a process called semantic enrichment. The relationship is not limited to entities: it can be defined by an entity and/or its properties (type, age, gender, ownership, chemical composition, and so on) and, critically, it is defined within a context, to provide meaning. A relationship can be direct or indirect, with unlimited degrees of separation, so two entities can be linked by their properties, or by the properties of their properties. This ability to understand the meaning of data is what gives a computer intelligence.

e.g. The computer knows that Bob is a person. It knows that a Mercedes is a car, that a car is a type of Vehicle, and that there are many other types of Vehicle. It understands the concept of ownership, with different forms such as own, lease and rent. It could find all the people named Bob who lease a Mercedes, or all the people named Bob who have purchased a car manufactured by a German company, without the intervention of a computer programmer.
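
The sketch below restates the same facts in semantic form, again using the open-source rdflib library; the tiny ontology and the example.org namespace are invented for the example. A SPARQL query then finds every person labelled Bob who holds any form of ownership over any kind of Vehicle made by Mercedes, traversing the ontology (leasing is a kind of owning, a car is a kind of Vehicle) rather than a programmer-coded join.

    # Sketch of the Bob example in semantic form, using rdflib. The
    # ontology (Vehicle, Car, owns, leases) and namespace are illustrative.
    from rdflib import Graph, Literal, Namespace, RDF, RDFS

    EX = Namespace("http://example.org/demo#")
    g = Graph()
    g.bind("ex", EX)

    # A tiny ontology: a Car is a kind of Vehicle; leasing is a kind of owning.
    g.add((EX.Car, RDFS.subClassOf, EX.Vehicle))
    g.add((EX.leases, RDFS.subPropertyOf, EX.owns))

    # The facts themselves.
    g.add((EX.bob, RDF.type, EX.Person))
    g.add((EX.bob, RDFS.label, Literal("Bob")))
    g.add((EX.bob, EX.leases, EX.car1))
    g.add((EX.car1, RDF.type, EX.Car))
    g.add((EX.car1, EX.madeBy, EX.mercedesBenz))

    # Property paths let the query traverse the ontology: it matches
    # ex:leases because it is a subproperty of ex:owns, and matches
    # ex:Car because it is a subclass of ex:Vehicle.
    results = g.query("""
        PREFIX ex: <http://example.org/demo#>
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?person WHERE {
            ?person rdfs:label "Bob" .
            ?person ?rel ?vehicle .
            ?rel rdfs:subPropertyOf* ex:owns .
            ?vehicle a/rdfs:subClassOf* ex:Vehicle .
            ?vehicle ex:madeBy ex:mercedesBenz .
        }
    """)
    for row in results:
        print(row.person)  # -> http://example.org/demo#bob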

In the real world of modern computing and massive amounts of data, Semantic Computing can resolve data ambiguity without the need for human intervention. It can also find relationships between entities separated by many degrees, by linking entities via their properties and the properties of other entities. This is already yielding immense benefits in applications such as medical research and compliance (tax avoidance, policing, counter-terrorism).