From Information to Knowledge with Semantic Technologies
You have probably observed many times that if you ask several people to model a domain, whether with classes, database schemas, XSD or anything else, you are likely to obtain as many models as there are modelers. That is normal: we cannot have a single view of the world, and most modeling depends on the intended usage of the model. But we know the consequences of such variability: lack of standardization, lack of interoperability, data silos, complexity of reuse and migration, and so on. That variability is good for IT service providers, because they can increase their presence at clients with experts who help companies transform, clean up and aggregate their data, but everybody can feel that something better could be done if we achieved a “universal” description.
The problem is well known to philosophers, who, since Plato and Aristotle, have tried to find the fundamental categories of existence, the nature of properties, the relation between concrete entities and abstractions, and the roots of reasoning.
Researchers in Artificial Intelligence (AI) have studied ways to represent knowledge for many years, borrowing from philosophers the term ontology to represent entities, concepts, properties and relations, but also rules, variability, inconsistencies, etc. This work converged into what we now call “semantic technologies”.
With the emergence of the World Wide Web, these technologies were reused and standardized in order to create a Web of Data, keeping the AAA slogan: “Anybody can write Anything about Any topic”. This work led to a fantastic stack, the “Semantic Web”, aimed at loosely linking data represented as a graph of URIs through ontologies. These ontologies can sit at different levels, from a simple database schema to sophisticated models of complex concepts.
A pragmatic implementation is “Linked Data”: a set of principles and tools to link, aggregate, correlate, query and publish any data sets on the Internet, turning descriptions of entities currently locked in silos into knowledge. Related projects include schema.org, an ontology recognized by Web search engines, DBpedia, a Semantic Web enabled version of Wikipedia, and OSLC, a standard to ease tool integration.
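To make the “graph of URIs” idea concrete, here is a minimal sketch of how RDF-style data works: facts are (subject, predicate, object) triples, and querying is pattern matching over them, much like variables in a SPARQL query. The URIs below are illustrative examples (the schema.org property names are assumptions for the sake of the sketch), and real systems would use an RDF library and a triple store rather than plain tuples.

```python
# Illustrative sketch, not a real Linked Data client: RDF-style data is a
# graph of (subject, predicate, object) triples, where subjects and
# predicates are identified by URIs. Example URIs are hypothetical.
triples = [
    ("http://example.org/alice", "http://schema.org/name", "Alice"),
    ("http://example.org/alice", "http://schema.org/knows",
     "http://example.org/bob"),
    ("http://example.org/bob", "http://schema.org/name", "Bob"),
]

def match(triples, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    playing the role of a variable in a SPARQL query."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Who does Alice know?" -- bind the subject and predicate,
# leave the object as a wildcard.
for _, _, known in match(triples,
                         s="http://example.org/alice",
                         p="http://schema.org/knows"):
    print(known)
```

Because every entity is named by a URI, triples published by different sites about the same URI can be merged into one graph, which is exactly what lets Linked Data aggregate descriptions across silos.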
We also teach machines using human-made ontologies. Machine Learning algorithms were developed to find categories and relationships in ever larger amounts of uncorrelated data; they allow e-commerce and e-advertising companies to “know” who you are in order to propose finely profiled products and services.
The combination of ontology-based domain modeling, Machine Learning and the Web of Data, together with the ever increasing power of computers and networks, makes possible some incredible products, such as voice-commanded intelligent personal assistants. For example, Apple Siri or Tempo AI use ontologies to know what you are talking about when you say “meeting”, “restaurant” or “reservation”, and they are able to find the relevant information in the Web of Data, aggregate it and propose the best options according to what they have learned about you.
I started my professional career 25 years ago working on Expert Systems, and now I see that the AI ambition underpins more and more of the IT business. Semantic technologies bring a new field of opportunities to improve user experience, analyze data, handle system complexity and deliver new services. The AI Winter* is over, as knowledge increasingly emerges from information.

* The AI Winter refers to a period of reduced funding and interest in Artificial Intelligence research.