To succeed in their ultimate goal of building perfectly effective and efficient objects, engineers have relied on digital simulations. Only a few years ago, the best play was to overbuild, just in case; more recently, finite element simulations made it possible to accurately predict the behavior of a model. The next step in this trend: Digital Twins.
As the name implies, Digital Twins attempt to be trustworthy representations of physical elements: same features (physical properties, state of the materials, etc.) and same context (ground features, weather, state of usage, etc.). In a way, they are just simulations with better information, but that extra accuracy enables better decision making and prediction.
As Kamyab Zandi explained in his lecture Digital Twin of Civil Structures, the ideal outcome for digital twins is a seamless integration of data gathering, damage detection, and performance prediction. With a better understanding of the state of the infrastructure, we could make better decisions, leading to lower costs. Kamyab focused mainly on bridges in his lecture because of their high cost and the risk they can pose. In such expensive infrastructure, it would be optimal to design the structure to withstand just what is needed, and to perform only the maintenance required to avoid accidents. This is what digital twins could offer.
To be most useful, digital twins need to keep up with their physical sibling. This means that if the infrastructure ages (e.g. suffers some corrosion or gets cracks on its surface) and its properties change, the digital twin should take these changes into account and update the resistance of the structure. Updating the digital model in real time would enable the best decision making, but this requires planning before construction and selecting which parameters are relevant for an accurate simulation. For example, you might need to know the state of deformation, for which you could place elongation sensors; or maybe the weather is relevant to predict corrosion, in which case you could have other sets of sensors for that purpose. However, it is not always possible to have sensors that automatically update the digital twin. This was the case in Kamyab's study, conducted together with Chalmers University of Technology and Stanford University, in which he and his team quickly created a 3D model of a physical concrete beam with a surface crack using electronic tools (a camera-equipped drone for the model and AI code to detect the crack).
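To make the updating idea concrete, here is a minimal sketch of how a sensor or image-derived measurement could feed back into a twin's estimate of structural capacity. All names, numbers, and the crack-update rule are hypothetical illustrations, not the actual method used in Kamyab's study; a real twin would use calibrated finite element models rather than a textbook beam formula.

```python
# Hypothetical sketch: a beam digital twin whose load capacity is
# re-estimated when a crack measurement arrives (e.g. from drone
# imagery processed by AI). Not the actual pipeline from the lecture.
from dataclasses import dataclass

@dataclass
class BeamTwin:
    width_m: float       # cross-section width
    height_m: float      # cross-section height
    strength_pa: float   # assumed material strength

    def section_modulus(self) -> float:
        # Elastic section modulus of a rectangular section: W = b * h^2 / 6
        return self.width_m * self.height_m ** 2 / 6

    def moment_capacity(self) -> float:
        # Simplified elastic bending capacity: M = W * strength
        return self.section_modulus() * self.strength_pa

    def apply_crack_measurement(self, crack_depth_m: float) -> None:
        # Crude update rule (illustrative only): treat the cracked
        # surface layer as lost, reducing the effective section height.
        self.height_m = max(self.height_m - crack_depth_m, 0.0)

twin = BeamTwin(width_m=0.3, height_m=0.5, strength_pa=30e6)
before = twin.moment_capacity()
twin.apply_crack_measurement(crack_depth_m=0.05)  # detected crack depth
after = twin.moment_capacity()
assert after < before  # the twin now reports a reduced capacity
```

The point of the sketch is the feedback loop: a measurement of the physical structure changes a parameter of the model, which in turn changes what the model predicts it can withstand.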
In the planning phase it is important to keep one thing in mind: more information does not necessarily lead to better outcomes. According to the other lecturers, who talked about DemoVirPEN, this was the main struggle during development. The presentation was given by Vasilis Naserentin, a developer of DemoVirPEN; Jens Forssén, working with acoustics; and Fabio Latino, working with the aesthetics of the model. They shared a bit of what it was like to develop this tool, which visualizes noise levels in streets.
Noise pollution is the second largest environmental cause of health problems, second only to air pollution, but perhaps because it is invisible, it is often forgotten. The question while developing DemoVirPEN was "how do you plot the invisible?".
Digital twins are mainly meant as decision making tools; therefore, they need to be streamlined for that purpose. As the lecturers explained, this means knowing the audience (e.g. what they expect, what they find relevant, what they are familiar with) and knowing the ultimate goal (e.g. what is relevant, what simplifies decision making). This is not an easy task. Sometimes the amount of information can conflict with the understanding of the model.
According to them, the development of the tool involved finding the sweet spot in terms of level of detail (LOD). It was a struggle to settle on the right amount of detail for building facades, the textures of urban elements, and the number of urban elements, among other things. Too much detail and you struggle to communicate what's important; too little and you lose a sense of scale. The risk of flooding users with information is why it is important to understand what is relevant and how users expect to use the tool.
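The LOD trade-off described above is often handled in visualization tools by tying detail to viewing distance. The following is a minimal sketch of that idea; the thresholds and level names are invented for illustration and are not taken from DemoVirPEN.

```python
# Hypothetical sketch: distance-based level-of-detail selection for an
# urban scene, so nearby buildings get textured facades while distant
# ones are reduced to simple massing. Thresholds are arbitrary examples.
def select_lod(distance_m: float) -> str:
    if distance_m < 100:
        return "high"    # textured facades, full geometric detail
    if distance_m < 500:
        return "medium"  # simplified facades, no fine textures
    return "low"         # block-level massing only

assert select_lod(50) == "high"
assert select_lod(300) == "medium"
assert select_lod(2000) == "low"
```

A scheme like this addresses both sides of the struggle: detail is spent where the user is looking, while the surrounding context stays visible enough to preserve a sense of scale.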