The Industrial Internet of Things (IIoT), artificial intelligence (AI) and user interface technologies such as augmented reality and virtual reality can enhance the form and function of digital twins to improve training, operations and outcomes.
Human intelligence has been creating and maintaining complex systems since the beginning of civilization. In modern times, digital twins have emerged to aid the operation of complex systems, as well as to improve design and production. Artificial intelligence (AI) and extended reality (XR) – including augmented reality (AR) and virtual reality (VR) – have become tools that can help manage complex systems. Digital twins can be enhanced with AI, and emerging user interface (UI) technologies like XR can improve people’s ability to manage complex systems via digital twins.
Digital twins can marry human and artificial intelligence to produce something far greater by creating a usable representation of complex systems. End users do not need to worry about the formulas behind machine learning (ML), predictive modeling and artificially intelligent systems, yet they can still capitalize on their power as an extension of their own knowledge and abilities. Digital twins combined with AR, VR and related technologies provide a framework to overlay intelligent decision making onto day-to-day operations, as shown in Figure 1.
The operations of a physical twin can be digitized by sensors, cameras and other such devices, but those digital streams are not the only sources of data that can feed the digital twin. In addition to streaming data, accumulated historical data can inform a digital twin. Relevant data could include data not generated from the asset itself, such as weather and business cycle data. Also, computer-aided design (CAD) drawings and other documentation can help the digital twin provide context. AI and other analytical models can take raw data and process it into forms that help humans understand the system.
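As a minimal sketch of this data fusion, the snippet below combines a streaming sensor reading, accumulated historical readings and an external (non-asset) value into one feature record a digital twin model could consume. The function name, field names and units are illustrative assumptions, not part of any particular product.

```python
from statistics import mean

# Hypothetical example: fuse streaming, historical and external data
# into one feature record for a digital twin model. All field names
# and values here are illustrative assumptions.
def build_feature_record(live_temp_c, history_temps_c, ambient_temp_c):
    """Combine live, historical and contextual data into model features."""
    baseline = mean(history_temps_c)          # accumulated historical data
    return {
        "live_temp_c": live_temp_c,           # streaming sensor value
        "baseline_temp_c": baseline,
        "deviation_c": live_temp_c - baseline,
        "ambient_temp_c": ambient_temp_c,     # non-asset data, e.g. weather
    }

record = build_feature_record(78.4, [70.1, 71.3, 69.8], 24.0)
```

Derived features such as the deviation from a historical baseline are one way raw data can be processed into a form that helps humans understand the system.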
AI also can make intelligent choices of content on the user’s behalf. Such guidance can be welcome because input methods in AR and VR differ greatly from the typical keyboard and mouse. As displayed in the upper right corner of Figure 1, humans can perceive the system as an intelligent reality – a technologically enhanced reality that can aid their cognition and judgment.
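One simple form of choosing content on the user’s behalf is ranking candidate overlay items by urgency so the display stays uncluttered. The sketch below is a hypothetical illustration; the item labels, urgency scores and cutoff are assumptions.

```python
# Hypothetical sketch: select which overlay items to show an AR user,
# ranking candidates by an urgency score so the view is not cluttered.
def pick_overlay_items(items, max_items=2):
    """items: list of (label, urgency 0-1); return the most urgent few."""
    ranked = sorted(items, key=lambda it: it[1], reverse=True)
    return [label for label, _ in ranked[:max_items]]

shown = pick_overlay_items([("motor vibration", 0.9),
                            ("filter change due", 0.4),
                            ("bearing temperature", 0.7)])
```

In practice the urgency scores themselves could come from analytical models rather than being supplied by hand.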
With the blueprint in Figure 1 as a basis, it’s possible to create digital twins that use AI and reality technologies to achieve operational benefits. Any number of operations could be enhanced with the techniques described here.
For example, the paper “Augmented Reality (AR) Predictive Maintenance System with Artificial Intelligence (AI) for Industrial Mobile Robot” details how a machine learning model can classify the state of a robot motor, which can then be presented to factory personnel with AR. This article applies the blueprint concepts to facilities management after first exploring each concept in depth. While the various data streams reach their conclusions in human perception, the starting point of a digital twin for a user is how it is perceived. Thus, the starting point for this exploration is user interfaces for digital twins, followed by a discussion of AI.
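To make the motor-state idea concrete, here is an illustrative stand-in, not the cited paper’s actual model: a simple rule-based classifier that maps vibration and temperature readings to a label an AR display could render next to the physical robot. The thresholds and state names are assumptions.

```python
# Illustrative stand-in (not the cited paper's model): classify a motor's
# state from vibration and temperature readings. The resulting label is
# the kind of output an AR overlay could present to factory personnel.
# All thresholds below are assumed for illustration.
def classify_motor_state(vibration_mm_s, temp_c):
    if vibration_mm_s > 7.0 or temp_c > 90.0:
        return "fault"
    if vibration_mm_s > 4.5 or temp_c > 75.0:
        return "maintenance due"
    return "normal"

state = classify_motor_state(vibration_mm_s=5.2, temp_c=68.0)
```

A trained ML model would replace the hand-set thresholds, but the interface contract is the same: sensor readings in, a human-readable state out.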
Humans have a long history of interfacing with data and data visualization, starting with William Playfair’s inventions of line, bar and pie charts in the late 1700s. Digital twins can present data in such familiar forms, but the traditions of the late eighteenth century should not restrain the digital twin’s power.
When using mobile technologies such as tablets, smart phones and AR headsets, the digital reality is overlaid on the physical reality into one view, as shown in Figure 2. AR headsets may be the obvious choice for this use case, but they are not the only option. Traditional interfaces rendering 3D models also allow workers to take advantage of digital twins.
The first step in creating intelligent realities for digital twins is understanding data visualization options across the user interface (UI) spectrum. Next, a reporting integration approach is considered, which can operationalize analytics and AI without requiring a new hardware paradigm such as an AR headset. AR headsets have the potential to benefit operations, but only if applications are designed for usability, which is the next consideration. Finally, an outline is given of how to build a digital twin interface for remote experts.
In Capgemini’s “Augmented and Virtual Reality in Operations” report, Jan Pflueger from Audi’s Center of Competence for AR/VR encouraged a business-first approach for reality projects: “First, focus on your use case and not on the technology itself. After you identify your use case, focus on your information handling and data so you can deliver the right information to the technology.”
Consider five technological approaches for rendering digital twins and their respective capabilities: traditional desktop; smart phone or tablet; monocle AR; stereoscopic AR, including mixed reality (MR) devices; and immersive VR. See the table in Figure 3 for a comparison.
Within each class of device, capabilities vary, and the variance may affect a product’s viability for different use cases. This is especially true for AR headsets: display resolution, field of view and computational power differ from product to product. In addition, design decisions about whether to put the battery and compute units on the headset or in a separate tethered module can affect comfort and practicality. One practical concern for AR headsets is how they integrate with work clothing and uniforms, such as those required for clean room and food processing operations.
Read the full story and more related stories on Control Engineering