Digital Twins Offer Unmatched Insights For Design Engineers
By Michael Parks for Mouser Electronics
The value of a new technology is not always obvious from the get-go. Absent a “killer application”
that makes the use case blatantly evident, innovative ideas can sometimes remain just that: ideas. That is,
unless a market develops for them or additional innovations come along that make the whole greater than the
sum of the individual technologies.
The Internet of Things (IoT) is arguably one
such innovation, one that critics have dismissed as a solution in search of a problem. The term IoT was coined by technology
pioneer Kevin Ashton back in 1999, though only recently have enough factors—wide availability of inexpensive
embedded sensors and proliferation of wireless Internet, for example—coalesced to make the IoT ready for mass
adoption. A technology, however, is not necessarily a solution by itself. A subset of the IoT, known as the
Industrial Internet of Things (IIoT), has seen some respectable success in the manufacturing market. Still, the cost
of implementing IIoT technologies, especially in circumstances where the technology would have to be retrofitted
into operational facilities, is not chump change. That said, the IoT and the IIoT have plenty of forward momentum
and appear to be on a course for a rendezvous with another innovative concept that has been percolating for over a
decade itself—the idea of the Digital Twin.
History of the Digital Twin
The idea of the digital twin is the brainchild of Dr. Michael Grieves and John Vickers, experts in manufacturing
and product lifecycle management (PLM); the concept was first introduced in 2003 in a course at the University of Michigan.
The basic notion is that, for every physical product, there is a virtual counterpart that can perfectly mimic the
physical attributes and dynamic performance of its physical twin. The virtual twin exists in a simulated environment
that can be controlled in very exact ways that cannot be easily duplicated in the real world, such as speeding up
time so that years of use can be simulated in a fraction of the time. These hyper-accurate models and simulations
offer engineers and product designers unmatched insights across the entire product development cycle. Still, digital
twins are more than just an evolution of digital models, although their goal is similar: higher-quality products and
better product support at lower cost and with less effort.
From the Factory…
For decades, engineers and designers have heavily relied on software design applications to digitally capture their
ideas for physical objects as parametric models. Even today, more complex software allows for the simulation of
certain characteristics, such as thermal properties or stresses and strains. While the use of simulations in product
design is nothing new, they have historically relied on relatively small data sets or engineering assumptions when
making predictions. Digital twins, however, have access to enormous data sets thanks to the IIoT: sensors can monitor
virtually every facet of a product's lifecycle, and their measurements can be fed back into an iterative
design-manufacture-observe-improve loop.
… to Your Door
Once a product leaves the factory and is acquired by an end-user, the digital twin can begin to feed off real-world
data collected by the onboard sensors. This is perhaps where the concept of the digital twin reaches its full
potential. Sensors in the end item itself track key performance characteristics of the device as it operates in
real-world conditions. Comparing actual telemetry against the predictions of the digital twin model yields insights
that were only dreamt of until now. A virtuous loop results from this level of integration of the physical and the
virtual: not only can the digital twin be refined with real-world data, but future product iterations can also be
improved through a better understanding of how end users actually operate the device. In some cases, where changes
can be made through a firmware update, products that have already shipped can also benefit from lessons learned
through the digital twin.
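To make that feedback loop a bit more concrete, the short Python sketch below shows one way fielded telemetry might be compared against a twin model's predictions. The TwinModel class, its predict() method, and the telemetry fields are hypothetical placeholders invented for illustration, not part of any particular digital twin platform.

```python
# Hypothetical sketch: comparing fielded telemetry against what the digital twin
# predicts for the same point in the product's life. Names and values are invented.
from dataclasses import dataclass

@dataclass
class TelemetrySample:
    hours_in_service: float   # cumulative runtime reported by the device
    board_temp_c: float       # measured circuit board temperature
    motor_cycles: int         # observed stop-start cycles to date

class TwinModel:
    """Stand-in for the digital twin's predictive model of the same quantities."""
    def predict(self, hours_in_service: float) -> TelemetrySample:
        # A real twin would run a physics-based or data-driven simulation here;
        # fixed nominal values are returned purely for illustration.
        return TelemetrySample(hours_in_service, board_temp_c=55.0,
                               motor_cycles=int(hours_in_service * 12))

def divergence_report(twin: TwinModel, sample: TelemetrySample) -> dict:
    """Compare what the fielded unit reports with what the twin expected."""
    expected = twin.predict(sample.hours_in_service)
    return {
        "temp_delta_c": round(sample.board_temp_c - expected.board_temp_c, 1),
        "cycle_delta": sample.motor_cycles - expected.motor_cycles,
    }

if __name__ == "__main__":
    field_data = TelemetrySample(hours_in_service=1200.0, board_temp_c=71.5,
                                 motor_cycles=16_200)
    print(divergence_report(TwinModel(), field_data))
```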
In addition to product telemetry, the external operating environment—ambient temperature, relative humidity,
and so on—can also be captured by onboard sensors so that such factors can be accounted for in simulations. This
type of information is invaluable when debugging errant device behavior because it provides operational context that
would otherwise be unavailable. For example, if two products are used and maintained in similar fashion but one keeps
failing, it might matter a great deal to the engineers that the failing unit is operating at very high elevation.
Getting that feedback to the company automatically, and feeding it into the digital twin to influence future design
iterations, is far more valuable than relying on a customer to call a help desk. Better still, the company could
proactively email the customer with steps to minimize the failures, without the customer ever having to place a call
or send an email in the first place.
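As a simple illustration of that kind of environmental cross-check, the sketch below groups reported failures by the elevation of each installation site to surface exactly the sort of pattern described above. The records and field names are made up for the example.

```python
# Hypothetical sketch: correlating failure reports with an environmental factor
# (elevation) captured by onboard sensors. Records and field names are invented.
from collections import defaultdict

fleet_records = [
    {"unit_id": "A-01", "elevation_m": 120,  "failures": 0},
    {"unit_id": "A-02", "elevation_m": 150,  "failures": 1},
    {"unit_id": "A-03", "elevation_m": 2600, "failures": 4},
    {"unit_id": "A-04", "elevation_m": 2750, "failures": 5},
]

def failures_by_elevation_band(records, band_size_m=1000):
    """Average failure count per elevation band, to highlight outlier environments."""
    totals = defaultdict(lambda: [0, 0])   # band index -> [failure sum, unit count]
    for rec in records:
        band = rec["elevation_m"] // band_size_m
        totals[band][0] += rec["failures"]
        totals[band][1] += 1
    return {f"{band * band_size_m}-{(band + 1) * band_size_m} m": round(total / count, 2)
            for band, (total, count) in sorted(totals.items())}

if __name__ == "__main__":
    print(failures_by_elevation_band(fleet_records))
    # Prints: {'0-1000 m': 0.5, '2000-3000 m': 4.5}
```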
All of this data—both performance data and external factors—can be communicated in real time back to
the equipment manufacturers to improve the digital twin model and its simulation parameters. The digital twin could
then analyze the operational data and predict failures when it sees data points outside of prescribed tolerances. For
example, a circuit board might be running at higher-than-expected operating temperatures, or a motor might be
experiencing an unusually high number of stop-start cycles. The digital twin could determine, with some level of
confidence, that the part will fail shortly and take a series of approved actions, such as placing an order with the
company responsible for manufacturing the failing part and alerting a technician to brush up on the process
for replacing the component. As a result, any downtime is minimal and relatively predictable.
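A minimal sketch of that tolerance-and-action logic might look like the following. The threshold values and the "approved actions" are invented stand-ins; a real deployment would tie into actual ordering and ticketing systems.

```python
# Hypothetical sketch: flag telemetry values outside prescribed tolerances and
# trigger pre-approved actions. Thresholds and actions are invented for the example.
TOLERANCES = {
    "board_temp_c": 70.0,              # maximum expected operating temperature
    "motor_stop_starts_per_day": 400,  # maximum expected daily stop-start cycles
}

def check_telemetry(telemetry: dict) -> list:
    """Return the names of any monitored values that exceed their tolerance."""
    return [name for name, limit in TOLERANCES.items() if telemetry.get(name, 0) > limit]

def take_approved_actions(unit_id: str, violations: list) -> None:
    """Placeholder actions: order the suspect part and alert a technician."""
    for name in violations:
        print(f"[{unit_id}] {name} out of tolerance -> ordering replacement part")
        print(f"[{unit_id}] alerting technician to review the replacement procedure")

if __name__ == "__main__":
    sample = {"board_temp_c": 78.2, "motor_stop_starts_per_day": 120}
    violations = check_telemetry(sample)
    if violations:
        take_approved_actions("unit-0042", violations)
```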
Beyond the obvious use of these rich datasets in maintenance prognostics, digital twins could have profound impacts
on the design and engineering of subsequent product iterations. Understanding how a product is actually being used,
in an objective and data-driven manner, will lead to faster development cycles and greatly reduce the time needed to
detect product defects or identify useful tweaks. That, in turn, reduces waste by allowing manufacturers to make
real-time improvements to products still coming off the assembly line, which can translate into huge savings by
avoiding costly rework.
Digital twins are not limited to assessing tweaks to the physical properties of a design; they can also make it
easier to study the impact that software and firmware revisions have on performance. Various configurations and
settings can be rapidly tested and assessed to determine which will deliver optimum performance. Firmware updates
could then be pushed out seamlessly to all the devices, leveraging the same Internet connection that initially sent
the data used to identify the improvements.
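One way to picture that kind of rapid assessment is to sweep candidate firmware settings through the twin's simulation and rank them by a performance score, as in the sketch below. The simulate() function and the setting names are purely illustrative stand-ins for whatever a real twin would expose.

```python
# Hypothetical sketch: evaluate candidate firmware settings against the twin's
# simulation and pick the best performer. simulate() and the setting names are invented.
from itertools import product

def simulate(settings: dict) -> float:
    """Toy scoring function standing in for a full twin simulation run."""
    # Illustrative only: prefer a control gain near 0.6 and a low duty cycle.
    return 100 - abs(settings["control_gain"] - 0.6) * 50 - settings["duty_cycle"] * 20

candidates = [{"control_gain": g, "duty_cycle": d}
              for g, d in product([0.4, 0.6, 0.8], [0.5, 0.7, 0.9])]

best = max(candidates, key=simulate)
print("Best-scoring configuration:", best, "score:", round(simulate(best), 1))
```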
Furthermore, complex systems such as wind turbine farms could also benefit from the application of digital twins
from a system-of-systems perspective. Having multiple instances of a single product, each with its own digital twin
that communicates with all the other digital twins, means that products can begin to learn from one another. The
aggregate knowledge that a digital twin represents can augment the capabilities of trained human operators in ways
that allow them to be more efficient and effective, without having to manually collect and crunch the data before
making major decisions. Digital twins thus allow technology and humans to work together while letting each focus on
what it does best: technology can continuously monitor, collect data, and conduct analysis, while humans can keep
their attention on higher-level work such as exploring the implications of complex courses of action and making
informed decisions.
The Future: More AI, Big Data Interaction
Embedded platforms, with their computational horsepower, energy-efficient sensors, and reliable communications
hardware, are critical to the collection and dissemination of telemetry data. That data is what makes digital twins
smart enough to be worthwhile. All of it can then be pumped into databases and rapidly analyzed using Big Data
techniques. Throw in the possibilities of a Watson-like Artificial Intelligence (AI) system that analyzes the results
and recommends improvements, and it’s possible that products could improve over time without any human intervention.
The result is the ultimate in technology self-help! Digital twins might very well prove to be the long-sought-after
use case that finally makes adoption of the IoT mainstream. The
implications of a more cost-effective, rapidly moving, and increasingly intelligent product development lifecycle
would seem to make any investment well worth it.
For some, the value of the digital twin goes well beyond parts and products. They see a future where every aspect of
our lives—from our cars and homes to entire cities and even the human body—will be given a digital twin as a
way to encourage experimentation and see what tweaks can be made to improve the quality of our lives. The success or
failure of all these potential digital twin candidates will come down to the ease with which the lifeblood of a
model—the data—can be collected, aggregated, and disseminated. This data and the associated dataflow,
the so-called digital thread, will undoubtedly be fed to digital twins via the IIoT. Perhaps the digital twin is the
killer app that the IIoT has been waiting for all these years. Or so my digital twin tells me.
Michael Parks, P.E.
is a contributing writer for Mouser Electronics and the owner of Green Shoe Garage, a custom electronics design
studio and technology consultancy located in Southern Maryland. He produces the S.T.E.A.M. Power Podcast to help
raise public awareness of technical and scientific matters. Michael is also a licensed Professional Engineer in the
state of Maryland and holds a Master's degree in systems engineering from Johns Hopkins University.