Digital Twins for PLM – Part 2: Sibling Power through Machine Learning
Changing business models and technological trends both indicate that the Digital Twin will lead the way in solving real-life issues across the product lifecycle. However, a digital twin is no magic wand with which business issues are automatically solved. One of its key aspects is making the physical and virtual worlds (products/assets) learn from each other.
Learning is targeted at the siblings, to help them constantly improve targeted characteristics and value outcomes. Machine Intelligence plays a crucial role in building this learning process. But what exactly is Machine Intelligence, and what is its role in building sibling power? Forbes describes Machine Intelligence (MI) as the current application of Artificial Intelligence (AI), AI itself being the broader concept of machines carrying out tasks in a way we would consider “smart”.
Most previous-generation products worked completely offline, so real behaviour in the field was barely captured, and it was difficult to compare the characteristics of a physical product against those of its virtual sibling. With today’s products getting smarter, however, seamless communication with the virtual sibling enables real-time analytics. Carefully thought through, both siblings, along with their data, can and do bring benefits to end users across the product and service value chain.
One can use ontologies and semantics in combination with neural networks, which are designed to classify information in a way loosely modelled on the human brain. These networks can be taught to recognize, for example, images, and classify them according to the elements they contain. The ultimate objective is to predict behaviour and performance, exploiting the true capability of the products and assets under observation.
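To make the idea of a network “learning to classify” concrete, here is a minimal sketch, not from the article, of the smallest possible neural network: a single neuron trained to separate two classes of readings. The data points and class meanings are illustrative assumptions.

```python
def train_neuron(samples, labels, lr=0.1, epochs=100):
    """Train a single neuron (perceptron) with two inputs.

    The neuron learns weights and a bias so that its output matches
    the 0/1 label of each training example.
    """
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred                     # 0 when correct
            w[0] += lr * err * x1              # nudge weights toward the label
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def classify(w, b, x):
    """Apply the trained neuron to a new point."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Two toy classes: "normal" readings near (0, 0), "abnormal" near (5, 5).
samples = [(0, 0), (1, 0), (0, 1), (5, 5), (4, 5), (5, 4)]
labels = [0, 0, 0, 1, 1, 1]
w, b = train_neuron(samples, labels)
print(classify(w, b, (0.5, 0.5)))  # → 0 (normal)
print(classify(w, b, (4.5, 4.5)))  # → 1 (abnormal)
```

Real digital-twin classifiers stack many such neurons into deep networks, but the learning loop — predict, compare with the known answer, adjust — is the same.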
Essentially, to make this a reality, four building blocks need to be brought together so that physical and virtual siblings become more predictable:
- Connectivity: to the edge (product) devices in the field, making real-time data available.
- Microservices framework: to extract relevant enterprise data from applications such as PLM (as designed), SLM (as serviced), etc.
- IoT platform: to orchestrate raw data from the field and mash it up with enterprise data through semantic relationships.
- Analytics engine: to capture specific events, build analytical models, learn from real-time data and predict behaviour.
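The “mash-up” step in the IoT platform can be pictured as joining raw field readings with enterprise (as-designed) data through a shared identifier. The sketch below is a hypothetical illustration: the record shapes, field names and limits are assumptions, not a real platform API.

```python
# Raw telemetry arriving from edge devices in the field.
field_data = [
    {"asset_id": "PUMP-42", "temp_c": 87.5, "vibration_mm_s": 4.1},
    {"asset_id": "PUMP-43", "temp_c": 61.0, "vibration_mm_s": 1.2},
]

# As-designed limits extracted from PLM via a microservice.
plm_data = {
    "PUMP-42": {"max_temp_c": 85.0, "max_vibration_mm_s": 5.0},
    "PUMP-43": {"max_temp_c": 85.0, "max_vibration_mm_s": 5.0},
}

def mash_up(field_data, plm_data):
    """Enrich each field reading with its design limits and flag exceedances."""
    enriched = []
    for reading in field_data:
        limits = plm_data[reading["asset_id"]]  # join on the semantic key
        enriched.append({
            **reading,
            **limits,
            "over_limit": reading["temp_c"] > limits["max_temp_c"],
        })
    return enriched

for row in mash_up(field_data, plm_data):
    print(row["asset_id"], "over design limit:", row["over_limit"])
```

The enriched records are what the analytics engine consumes: real-time behaviour already placed in the context of what the product was designed to do.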
Shifting focus to the analytics engine: its key component is machine learning, that is, computer programs that can access data and use it to learn continuously. The learning process begins with observations or data, such as events, triggers, instructions or enterprise data from backend systems. From these, the program looks for patterns so it can make better decisions in the future. The aim is to let the model learn automatically, without human intervention or assistance, and adjust its actions accordingly.
Machine learning algorithms are typically classified into two broad categories, depending on the nature of the signal or feedback available to a learning system. These are:
- Supervised learning: The computer is presented with a dataset of example inputs and their desired outputs, and the goal is to learn a general rule that maps inputs to outputs.
- Unsupervised learning: No labels are given to the learning algorithm, leaving it on its own to find structure in its input. Unsupervised learning can be used to discover hidden patterns in data.
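The two categories above can be illustrated side by side with toy data (an assumption for illustration, not from the article): supervised learning fits a rule from labelled input/output pairs, while unsupervised learning finds structure with no labels at all.

```python
def fit_line(xs, ys):
    """Supervised: learn the rule y = a*x + b from labelled (input, output)
    pairs, via least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def two_clusters(values, iters=10):
    """Unsupervised: 1-D k-means with k=2; discovers two hidden groups
    without being told which value belongs where."""
    c1, c2 = min(values), max(values)
    for _ in range(iters):
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)
    return sorted([c1, c2])

a, b = fit_line([1, 2, 3, 4], [2, 4, 6, 8])  # learns the rule y = 2x
print(round(a, 2), round(b, 2))              # → 2.0 0.0
print(two_clusters([1.0, 1.2, 0.9, 9.8, 10.1, 10.0]))  # finds the two groups
```

In a digital-twin context the supervised case corresponds to learning from labelled history (e.g. past failures), and the unsupervised case to spotting previously unknown groupings in raw sensor streams.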
Some examples of Machine Learning incorporated in Digital Twin are as follows:
- Predicting product failure: supervised learning techniques, such as a network trained through regression models, support proactive measures to prevent failure.
- Training a network to respond to “what-if” analyses.
- Identifying patterns in sensor data and letting the network learn from those patterns.
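A hedged sketch of the first bullet: a simple regression model (standing in for the trained network) fitted to historical wear readings and extrapolated to estimate when a part will cross its failure threshold. The sensor values and the 2.0 mm limit are made-up illustrative numbers.

```python
def linear_fit(xs, ys):
    """Least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def hours_to_failure(hours, wear_mm, limit_mm):
    """Extrapolate the wear trend to estimate when the limit is reached."""
    a, b = linear_fit(hours, wear_mm)
    return (limit_mm - b) / a  # operating hour at which wear hits the limit

# Bearing wear measured every 100 operating hours; design limit is 2.0 mm.
hours = [0, 100, 200, 300, 400]
wear = [0.10, 0.32, 0.49, 0.71, 0.90]
print(round(hours_to_failure(hours, wear, 2.0)))  # predicted failure hour
```

With such an estimate in hand, the twin can schedule maintenance before the predicted hour rather than after the failure — the proactive measure the bullet describes.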
Today’s businesses essentially need to answer three questions as effectively as possible: the current state (“what happened”), the future state (“what will happen”) and the process (“how to make it happen”). It would be impossible to predict anything without machine learning, supported by historic data sets and complemented with real-time data from the product and its services.