Introduction
The focus of this report is simulation – a branch of computing that imitates the action of a system or process, and covers a wide range of mathematical techniques. Simulations may be based on discrete events or specific moments in time, or run as a continuous process. They may be based on constant deterministic rules, or on a stochastic approach that introduces variance. Different use cases may also require multiple approaches. The overall aim of a simulation is to test the possible outcome of a set of variables, and ask: Will this beam take this load over time? Can this pilot fly a plane under these conditions? Can we produce a better-quality, less-expensive product by changing the recipe? How long until this machine breaks?
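As a minimal sketch of the stochastic approach described above – the function name and the exponential failure model are illustrative assumptions, not a specific product or method – a Monte Carlo answer to "how long until this machine breaks?" might look like:

```python
import random

def estimate_time_to_failure(mean_hours, trials=10_000, seed=42):
    """Stochastic (Monte Carlo) estimate of mean time to failure.

    Assumes, purely for illustration, that failures follow an
    exponential distribution with the given mean. Each trial draws
    one random failure time; averaging many trials approximates
    the expected lifetime.
    """
    rng = random.Random(seed)  # seeded so the run is repeatable
    samples = [rng.expovariate(1.0 / mean_hours) for _ in range(trials)]
    return sum(samples) / trials

# With a 500-hour mean, the estimate converges near 500 hours.
estimate = estimate_time_to_failure(mean_hours=500.0)
```

A deterministic simulation would instead apply a fixed wear rule each timestep; the stochastic version above captures the variance that real machinery exhibits.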
The 451 Take
Following the trajectory of IIoT adoption from initial instrumentation and data gathering, cloud-based analytics and, increasingly, edge compute-based machine learning (ML) provide a substrate for integration. On that substrate, there is opportunity to explore the next areas of development. IoT is a silo buster, bringing together data from different departments, machine manufacturers, processes and organizations. A focus on the 'as is,' or simply historical, nature of digital twins misses the potential for richer computing solutions from the simulation world, to be applied at any stage in the life of a digital twin. One example is a product or system that is the result of complex CAD simulation modeling; yet on the shop floor, it may be expressed as a printed manual. A highly filtered skim of the deep data representing the rules the product is expected to operate under now meets the actual product in a real situation. That product or process may be highly instrumented, with IIoT giving real-time data. There should be a common link between the design simulation used to create the conceptual optimum and the data representing the current state. The digital thread of the digital twin should inform every part of the lifecycle.
Simulation across the Digital Thread
These separate departments may hand information to one another, but it has often been a one-way dialog. It is the whole integration from creation to expiration that provides the potential for richer insight. Naturally, focus has tended to be on the runtime data component because it forms the bulk of the lifespan of a process. Initially, brownfield IoT instrumentation is added to a plant, or existing instrumentation is opened up to analysis, with the next wave of adoption focusing on how to analyze that data. This may include moving machine learning closer to the plant itself with edge computing.
However, this forms a focused, but still insular, thread of a process. This works, but it could work even better by taking a broader holistic approach. The concept of a digital twin helps in understanding the whole or part of a system; but this is still just data, which in most cases represents a 'now' state, or is kept as a log of the history.
The coming next wave of IoT advances needs to offer closer integration of the environments that understand the context in which a digital twin exists, and can manipulate the entirety of the data in and around it. The extension or the aggregation of existing simulation environments will be able to offer this. Any process, machinery, object or building that is designed digitally can then be put through various levels of simulation. Once in production, instantiated in the physical world, it is then being exercised almost as in another highly detailed simulation, but with real consequences.
Changes to the physical instantiation today are trialed through a localized 'what if' approach – a quick simulation that lacks the features of the design simulation that created it. This simplification has been due to the difficulty of integrating data from design, with spreadsheets or basic models used instead for shop-floor adjustments.
Compute power and its distribution are evolving. So as edge computing provides machine learning close to the machinery, we can expect more simulation compute power on the shop floor and in process controls. For that to work, it should be fed by the richest models possible – those from the initial design stages, boosted and augmented with the real-world status created through actual physics, which in turn feeds back to the design models.
We are seeing this breaking out of data from CAD to provide information for augmented reality (AR) applications on the shop floor, primarily around the 3-D model visual data, mixed with live IoT data, because AR is the user interface for IoT. Training simulations have also existed for many years, from the most basic of operator consoles, but are starting to pull resources from both initial designs and real-time digital twins.
Artificial intelligence and ML are important in the context of simulation because they are the tools that can learn what the model of a process is, or act as engines to sift through multiple what-if scenarios and provide a curated list for decision-makers to evaluate. This can be seen in the design industry, where parameters for a product can be entered and multiple viable design form factors generated.
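The sifting of what-if scenarios into a curated shortlist can be sketched as a parameter sweep over a process model; the model, parameter names and thresholds below are purely hypothetical stand-ins for a real simulation engine:

```python
import itertools

def run_scenario(temp, speed, batch):
    """Hypothetical process model: returns (cost, quality) for one
    parameter combination. A stand-in for a real simulation engine."""
    cost = 0.5 * temp + 2.0 * speed + 10.0 / batch
    quality = 100 - abs(temp - 180) * 0.3 - abs(speed - 40) * 0.5
    return cost, quality

def curated_shortlist(top_n=3, min_quality=90.0):
    """Sweep the what-if space and return the cheapest scenarios
    that still meet the quality threshold, for a decision-maker
    to evaluate."""
    temps = range(160, 201, 10)      # illustrative parameter ranges
    speeds = range(30, 51, 5)
    batches = (10, 20, 50)
    viable = []
    for t, s, b in itertools.product(temps, speeds, batches):
        cost, quality = run_scenario(t, s, b)
        if quality >= min_quality:
            viable.append((cost, quality, {"temp": t, "speed": s, "batch": b}))
    viable.sort(key=lambda v: v[0])  # cheapest viable scenarios first
    return viable[:top_n]

shortlist = curated_shortlist()
```

In practice, an ML surrogate model would replace the exhaustive sweep for large parameter spaces, but the shape of the task – generate scenarios, score them, curate a shortlist – is the same.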
For troubleshooting and planning use cases, what-if simulation can find an expedient solution to a problem. Simulating a dangerous situation, or creating a management training tool to illustrate the deep impact of choices, is also relevant. These should ideally be created and drawn from the same base of truth spanning design, production and expiration. With disparate systems, many examples today tend to be custom creations based on snippets of the real-world model.
Conclusion

Ian Hughes is a Senior Analyst for the Internet of Things practice at 451 Research. He has 30 years of experience in emerging technology as a developer, architect and consultant through key technology trends.

Raymond Huo is a Research Associate at 451 Research. Prior to joining 451 Research, Raymond worked as a Resource Operations Associate at healthcare nonprofit Health Leads, helping to develop the company’s customer relationship management tool to meet the business needs of clients.

Aaron Sherrill is a Senior Analyst for 451 Research covering emerging trends, innovation