Is 'digital twin' anything other than a buzzword? Is there anything that distinguishes one from a model or a simulation? I had the misfortune of being introduced to the term by an executive who repeated it until it lost all meaning, so I still bristle whenever I hear it. I've read the Wikipedia article, but it only increased my skepticism with blustery paragraphs like:
> Healthcare is recognized as an industry being disrupted by the digital twin technology.[45][34] The concept of digital twin in the healthcare industry was originally proposed and first used in product or equipment prognostics.[34] With a digital twin, lives can be improved in terms of medical health, sports and education by taking a more data-driven approach to healthcare.
The one distinguishing feature (in theory) of a digital twin is that it is supposed to be such a hyper-accurate model that it can be used to predict absolutely anything about the system in question. No changing the model setup; it's a "perfect" representation.
The downside is that everything explodes exponentially: setup time, mesh count, solve time. And we usually get worse results than more focused simulations, because we can't squeeze enough detail in across the board.
It generally starts because some manager hears that we've created 8 different specialized models of something due to different areas of interest, and has the bright idea of "let's just create a single super-accurate model we can use for everything". I've been fighting against these my entire career, although 10 years ago they were called "virtual mockups".
The next buzzword in the pipeline seems to be "virtual lab", which I can't figure out either. I've been simulating laboratory tests for over a decade, and no one can explain to me why that isn't exactly what we're already doing.
None of this is to say that this team isn't doing great work, but somewhere along the way it got wrapped up in some marketing nonsense.
Edit: Restructured my reply to better address OP's question.
Yes, it's fine if people want to create a new buzzword for some special case. But people act like it's a revolutionary idea that will finally let them address unsolved problems (and therefore deserves funding).
"Light fields" is one that always annoyed me. People who are apparently unaware of centuries of knowledge and methods in electromagnetism, developing "new" ways to solve problems crudely. That's great if they can make some cool new imaging system, but is it research deserving of long-term high-risk funding? It's just something that anyone skilled in optics can work out if they thought to build it.
Though it's a buzzword now, the idea behind 'digital twins' was that you not only have a detailed, faithful model (of an item, process, system, network, etc.) whose granularity matches the level of detail that interests you about the real thing, but you also have bi-directional movement of data between the 'real' thing and its model.
So sensor and measurement data from the real thing is streamed to the model in (ideally) real time, you can make decisions off the state of the model, and those decisions are sent back out into the real world to make a change happen.
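To make that loop concrete, here's a minimal sketch in Python. Everything in it (the class, the sensor fields, the thresholds) is invented for illustration; it's just the real-to-digital-to-real flow in code.

```python
class PumpTwin:
    """Toy digital twin of a single physical asset (here, an imaginary pump)."""
    def __init__(self):
        self.bearing_temp_c = None
        self.vibration_rms = 0.0

    def ingest(self, reading: dict):
        # Real -> digital: update the model's state from streamed sensor data.
        self.bearing_temp_c = reading.get("bearing_temp_c", self.bearing_temp_c)
        self.vibration_rms = reading.get("vibration_rms", self.vibration_rms)

    def decide(self) -> list[str]:
        # Decisions are made off the model's state, not the raw feed.
        actions = []
        if self.bearing_temp_c is not None and self.bearing_temp_c > 80:
            actions.append("reduce_speed")
        if self.vibration_rms > 5.0:
            actions.append("schedule_inspection")
        return actions


def control_loop(twin, sensor_stream, actuate):
    for reading in sensor_stream:   # ideally streamed in (near) real time
        twin.ingest(reading)        # real world -> model
        for action in twin.decide():
            actuate(action)         # model -> real world
```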
The specific term 'digital twin' originated in a report discussing innovations in manufacturing, but I find that railway systems and operations make for some of the best examples to explain the concept, because they manage a diverse set of physical assets over which they have partial direct control, and apply conceptual processes on top of them.
Here are three assorted writings [1][2][3] that explain how railways would benefit from this.
In my experience, complex sites like railways and airports tend to have a lot of nominally digital data already: topographic surveys, engineering drawings, and so on. But usually these are more "drawing" than useful data products: massive directory structures full of random CAD files with obtuse layering, produced for small, limited contracts, often using a local coordinate system. For a long time there have been efforts to improve data quality under the BIM banner, and now perhaps under digital twins.
I work as a Computational Researcher at Stanford Med. My work is quite literally translating 3D scans of the eyes (read: MRI) into "digital twins" (read: FEA models).
I think the subtlety that differentiates a digital twin from a model/simulation is intent. Our intent is quite literally to figure out how to use the digital twin itself, NOT the scan it is based on, as a way to replace more invasive diagnostics.
Of course, in the process, we also learn more about diagnosing medical problems from the scans themselves.
It is a shame they are using these buzzwords. It's basically an Earth system model, something we've had for many, many years. They will probably improve the code base; I don't know if they want to build everything from the ground up or use some physics-aware hybrid machine-learning approach for sub-grid parameterization, but really nothing seems novel except trying to improve the resolution of the current models we have.
Usually Digital Twin = Model = Simulation; different industries just have different words for it. Some draw a distinction between a Digital Twin/Model and a Simulation, where the simulation is the result of some external input being applied to the Digital Twin.
Either way, I wouldn't think too much about it. Tech is full of these things. I worked with AI and neural networks for years before it was called Machine Learning; now I'm forced to use the term ML to sound relevant even though it's the same thing.
I am familiar with it in the aerospace industry. "Digital twin" implies a higher degree of fidelity, in terms of importing data from sensors and modeling the physics, than "model" or "simulation" might imply, even though it is still a model and a simulation.
For GE's digital twins of their jet engines, they build a high-fidelity representation of each individual engine based on its as-built parts, and then simulate every flight based on the accelerometers, force sensors, humidity sensors, and temperature and pressure sensors they have placed in the engine. This is different from a general model or simulation, which would be built from CAD, run through a series of expected flight simulations, and used to predict the life of the engine.
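A crude way to show the difference in approach (this is purely illustrative; it does not reflect GE's actual methods, and every number and field name is made up):

```python
DESIGN_LIFE_CYCLES = 20_000  # assumed design life from the generic, CAD-based model

def generic_remaining_life(cycles_flown: int) -> float:
    # Generic model: every engine is assumed to age like the design analysis says.
    return DESIGN_LIFE_CYCLES - cycles_flown

def twin_remaining_life(flight_records: list[dict]) -> float:
    # Per-engine twin: damage per flight is estimated from that engine's own
    # sensor data, so two engines with the same cycle count can end up with
    # very different remaining life.
    consumed = 0.0
    for rec in flight_records:
        severity = 1.0
        severity *= 1.0 + 0.002 * max(0, rec["max_egt_c"] - 900)   # hotter flights age faster
        severity *= 1.0 + 0.05 * rec.get("vibration_rms", 0.0)      # rough flights age faster
        consumed += severity
    return DESIGN_LIFE_CYCLES - consumed
```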
It is getting buzzwordy, probably by virtue of being in proximity to the Big Buzzword, "Industry 4.0". But it's a real thing.
I work with digital twins in chemical manufacturing, and there the term is directly coupled with Model Predictive Control. The basic idea is that you build a model of the system (e.g. a chemical plant) you want to control, use that model to optimize controller behavior, apply the results to real controllers in the real system, and then sample the system to reground the model. Rinse, repeat. Such a model is called the "digital twin" of the real system - the idea is that it exists next to the system and is continuously updated to match the real world.
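Here's a minimal sketch of that loop in Python. The "plant", the twin's first-order model, and the brute-force optimizer are all invented stand-ins; a real MPC setup uses a much richer model and a proper solver, but the structure (optimize on the twin, act on the plant, sample, reground the twin) is the same.

```python
import numpy as np

class PlantTwin:
    """Digital twin: toy first-order model x' = a*x + b*u with fitted parameters."""
    def __init__(self, a=-0.5, b=1.0):
        self.a, self.b = a, b

    def predict(self, x, u, dt=1.0):
        return x + dt * (self.a * x + self.b * u)

    def reground(self, x_prev, u_prev, x_measured, dt=1.0, lr=0.01):
        # Nudge the twin's parameters so its prediction matches the measurement.
        err = x_measured - self.predict(x_prev, u_prev, dt)
        self.a += lr * err * x_prev * dt
        self.b += lr * err * u_prev * dt

def optimize_control(twin, x, setpoint, horizon=10):
    # Pick the constant input that minimizes predicted error over the horizon.
    candidates = np.linspace(-2, 2, 81)
    def cost(u):
        xs, c = x, 0.0
        for _ in range(horizon):
            xs = twin.predict(xs, u)
            c += (xs - setpoint) ** 2
        return c
    return min(candidates, key=cost)

def run(plant_step, x0, setpoint, steps=50):
    twin, x = PlantTwin(a=-0.3, b=0.8), x0   # deliberately mis-specified twin
    for _ in range(steps):
        u = optimize_control(twin, x, setpoint)  # optimize against the twin
        x_new = plant_step(x, u)                 # apply to the real system
        twin.reground(x, u, x_new)               # sample and update the twin
        x = x_new
    return x
```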