"The future of digital twins lies in cross-plant and cross-company deployment."

 

 

Interview with Jonas van Thiel, Partner, ROI-EFESO

 

I: Mr. van Thiel, digital twins now exist not only for products but also for complex systems, such as supply chains and factories. Where are we in the evolution of the concept, and what do you think is coming next?


JvT: Three main variants of the digital twin concept are used in industry today. The most frequently encountered ones are for digital processes, i.e. Digital Process Twins. Here, a physical process (e.g. a production or assembly process) is mapped in terms of data technology. This results in a process simulation that comes with data such as waiting times, downtimes, cycles, etc. This helps to measure and optimize process performance.
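To make this concrete, a Digital Process Twin ultimately keeps a live, data-side record of exactly such events. A minimal sketch in Python (the cycle times, downtime markers and planned cycle time below are all invented for illustration; a real twin would ingest this data from the line's sensors and MES):

```python
from statistics import mean

# Hypothetical cycle log for one assembly station: each entry is a
# measured cycle time in seconds; None marks a recorded downtime event.
cycle_log = [42.0, 41.5, None, 44.2, 43.1, 42.8, None, 45.0, 41.9]

planned_cycle_time = 42.0  # seconds; an assumed planning target

cycles = [t for t in cycle_log if t is not None]
downtime_events = cycle_log.count(None)

avg_cycle = mean(cycles)
# Simple performance ratio: planned vs. actual average cycle time.
performance = planned_cycle_time / avg_cycle

print(f"cycles measured: {len(cycles)}")
print(f"downtime events: {downtime_events}")
print(f"avg cycle time:  {avg_cycle:.1f}s")
print(f"performance:     {performance:.0%}")
```

Even this toy version shows the point made above: once the process is mapped in data, performance becomes something you can measure continuously rather than estimate.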

The next variant is the Digital Product Twin, or product simulation. This is most suitable for companies that already develop their products in a simulation environment. Purchasing such an environment specifically to run individual process or production simulations (e.g. for a collision test or an ergonomics test) would rarely make economic sense, especially for small companies. But if you already have access to a development environment, such as a Siemens Teamcenter, creating product twins and production simulations is a logical next step.

Going one step further, we come to large-scale companies, such as automobile manufacturers, which regularly employ real-time simulation platforms. These companies often have strategic collaborations with prominent manufacturers from the gaming sector, NVIDIA being a popular partner. Using their technology, products and processes are mapped and rendered in real time and are immediately available for simulation.

Automobile manufacturers often use such techniques to simulate entire factories, for example, in order to streamline the production planning process. The results speak for themselves; at the factory planning level, cost savings of up to 30% can sometimes be achieved. In addition, products and processes can be fully tested and employees – and even robots! – can be completely trained in the virtual world.

For typical medium-sized companies, this is still a way off from being an affordable exercise. And it’s true that if you have simple processes, you don't really need a real-time simulation platform. But the opportunities to use digital twins at both a small and large scale are certainly there today.

 

I: What are the main implementation hurdles or success parameters for the introduction of Digital Twins and what changes can be expected?


JvT: The first main technical hurdle is to determine whether the investment in such new technologies is cost-effective. The planning effort involved and the size of the planning team that’s needed must be in proportion to the cost of the new system. If you have multiple production lines or representative lines and can achieve multiple benefits, it's going to be worth it. A process twin is almost always worthwhile in such circumstances because it is the only way to measure performance continuously. And as we all know from the world of lean management, we can only improve what we can measure…

But there are also some hurdles on the organizational side. It's no use, for instance, going to the trouble of creating a digital twin but then failing to use it. This happens more often than you think.

It is also crucial that employees are actively encouraged to use the new technologies and to help optimize them. To do this, they need to be given a certain degree of autonomy. If, for example, management sees that one production line is running much better than the others, that in itself tells them nothing about how to improve the others.

But if the employees involved themselves see that other lines are better and ask themselves why time is always lost in, say, cycle seven, then there’s a good chance they can solve the problem. As I say, however, in order to develop this potential you need an organizational framework that recognizes the benefits of giving employees the appropriate autonomy.

 

I: Can you enhance digital twins with technologies like augmented reality?

JvT: Absolutely. Once you have planned and simulated a line digitally, you can, with the right AR or VR equipment, place yourself directly in the resulting environment. You can then take components in your hand and create motion sequences for training. These solutions are already being used very successfully and offer considerable advantages, not least because of the massive, ongoing general shortage of employees.

Remember, it can take months to initially find a qualified employee. But then, in order to train them in a factory, a good, experienced employee has to be seconded to the task, a move that actually reduces performance. In the digital world, however, you can fully train someone ‘virtually’ in their future work environment. Learning cycles are dramatically shortened and efficiency increases.

Aside from anything else, most employees really enjoy being given such opportunities to learn. Today, it's vitally important to be innovative and show employees that they’re working for a company that’s at the cutting edge.

 

I: Does gamification play a role in this context?


JvT: Even though many gaming-related technologies are employed by these applications, such as processors primarily developed for the graphics industry, they really have little to do with gamification. The main thing these simulations and the gaming world have in common is that both are centered around virtual environments that are built and used in real time.

 

I: How can relatively smaller companies with, say, around EUR 1 billion in sales also take advantage of these opportunities?


JvT: That totally depends on the application. The factory twin is very exciting for smaller medium-sized companies, especially when it comes to improving and demonstrating sustainability, as it gives you an overview of the factory as a whole. Companies are increasingly having to report on their activities and risks in environmental, social and governance (ESG) areas. If you simulate the entire factory, and so are fully aware of your power consumption, CO2 emissions, water use, waste products, etc., you’re always going to be up to speed in terms of knowing your resource consumption and emissions, in real time.

And because measurements are taken on an ongoing basis, companies using factory twins are inherently well-positioned when it comes to complying with governance requirements. But there are also new optimization opportunities. While many companies may have already optimized their operations when it comes to using compressed air, for instance, they may not yet have addressed the issue of electricity consumption in their processes.

I’ll give you an example. One aircraft manufacturer we worked with had a major recall due to a faulty powder-coating process. This process was - and is - very energy-intensive, so the real-time level of energy consumption provided an indication of whether the process was running correctly. With holistic factory monitoring, we could see that the power consumption was deviating and that something was wrong, much as Predictive Maintenance and Predictive Quality would alert us to possible future issues.
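The monitoring logic behind this kind of early warning can be as simple as checking each new power reading against the recent baseline. A hedged sketch in Python (the kW readings, window size and 3-sigma threshold are all invented for illustration; real factory monitoring would be considerably more sophisticated):

```python
from statistics import mean, stdev

def power_deviates(readings, window=20, threshold=3.0):
    """Flag the latest power reading if it deviates by more than
    `threshold` standard deviations from the recent window."""
    history, latest = readings[-window - 1:-1], readings[-1]
    mu, sigma = mean(history), stdev(history)
    return abs(latest - mu) > threshold * sigma

# Hypothetical kW readings from a coating line: stable around 120 kW,
# then a sudden jump that would warrant investigation.
readings = [120.0, 119.5, 120.4, 121.0, 119.8, 120.2, 120.6, 119.9,
            120.1, 120.3, 120.0, 119.7, 120.5, 120.2, 119.6, 120.4,
            120.0, 120.1, 119.9, 120.3,
            134.0]  # the deviating reading

print(power_deviates(readings))  # prints: True
```

The same comparison run continuously at the factory level, rather than per machine, is what turns a power meter into an early indicator of process and quality problems.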

In previous years, the focus has been on specifics such as processes, machines, plants and cycle times. Now, we look more at general areas and wider contexts, e.g. how temperatures and humidity levels change and whether there are discernible correlations with output and quality. Another important driver is the increasing trend toward improving sustainability. So this approach not only helps to optimize processes, but also aids in complying with any new requirements and guidelines.

 

I: So with the digital factory twin, we can see the context, the connections and the indirect consequences of the process flows?


JvT: Yes, all plants and processes are integrated to give a full overview. You can see output and quality, whether machines are at a standstill, or have broken down, as well as temperatures, power consumption and other influencing factors for the entire factory. This makes it possible to swiftly identify problems and draw holistic conclusions about processes and product quality.

And if you also choose to integrate artificial intelligence, you gain new opportunities to better understand processes and to decide when and when not to intervene. These are deep-learning algorithms that work at the factory level. Until now, you measured the power in an individual drive in a machine, but you didn't check whether the energy consumption in the hall as a whole was correct. Now, with AI, we can look at what’s happening remotely, knowing that the detail needed for analysis is being gathered automatically.

 

I: When it comes to deep learning algorithms and AI, we can already see increasing standardization in processes, as well as moves towards modularity and dramatic price reductions. Are there similar shifts happening when it comes to digital twins?


JvT: You can think of digital twins as an IT architecture, and every IT architecture is individual. Of course there are standards, e.g. how you connect individual plants or how you get from the factory level to the platform, or cloud level. But there is no set standard for interaction in an individual context, which data is needed, how it is implemented and so on. That's why so-called best practices are of little use in this context. It's more about the holistic approach, related to the specific case, and about asking questions like: What insights do I want? What do I expect as a result? Which data and calculations do I use? Developing and implementing this type and level of investigation is not a straightforward exercise.

 

I: Are there any new, possibly surprising developments that you’ve seen in this area?

JvT: Well, I’ve certainly noticed that there are some very exciting digital twins being created outside of the industrial setting, for example in the agricultural sector. And there’s definitely a lot of talk about the rise of the ‘metaverse’ and purely digital assets. Many OEMs are looking into these areas, with some starting to market their brands and products as non-fungible tokens (NFTs) in the metaverse, for example. To be frank, this has absolutely nothing to do with digital twins; the connection to industrial operations is not there yet. But even if it looks like a gimmick right now, I don’t think the long-term importance of the metaverse should be underestimated.

 

I: Do digital twins only exist within the factory gates or is there also a cross-company perspective?


JvT: The future for digital twins is absolutely cross-plant and cross-company. If you want to implement sustainability, traceability and the circular economy in a future-proof way, and need transparent product data, you have to know the relevant origin data. When a product is used, usage data is generated, and then it might eventually be returned for remanufacturing, recycling or refurbishing. If you pass the acquired data on to the next link in the ‘chain’, you will know the condition of the product, and you can correctly classify it, before initiating appropriate processes. In the ‘old world’, you would have had to physically disassemble and analyze it - at great expense - to get the same information.

That’s just one simple example. If we continue to think about the future circular economy, traceability, and successive product build-up, the effort required with old world methodologies is exponentially higher again. Together with suppliers and customers, we need to work cooperatively, and in order to do that, we need to share the data we collect. It's not about giving away critical information associated with intellectual property (IP), for example, but it is about sharing the data that is relevant for the next stage of value creation or use. And for that, you need the digital twin.

 

I: Is the industry already thinking along these lines? Is it also a question of defining suitable interfaces and governance mechanisms?


JvT: We’re still at an early stage. In industry, we talk about the trend toward digital sustainability. Here, it's all about product usage data, everything that is relevant for circular processes and compliance with emissions guidelines. It’s obvious, though, that we can only solve the issue of sustainability by working together. There are exciting examples, e.g. automotive companies are working together with plastics manufacturers, whose production processes are typically very energy-intensive. There, decentralized blockchains are being used to track what’s being processed, as well as where and how much energy is being consumed, from the creation of the granulate all the way to the final product delivered to the customer. This is a perfect example of how it’s possible to collaborate and share data securely across industries and sectors. So far, these examples are still in the minority, but we can see that such alliances can work in reality, as well as in principle.
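The blockchain element described here comes down to chaining records so that any later tampering is detectable. A minimal, self-contained sketch (the step names and energy figures are invented for illustration; a production system would use a distributed ledger rather than a local list):

```python
import hashlib
import json

def add_record(chain, data):
    """Append a tamper-evident record: each entry stores the hash of
    its predecessor, so altering any earlier step breaks the chain."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    chain.append({
        "data": data,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })

def verify(chain):
    """Recompute every link; returns False if any record was altered."""
    prev = "0" * 64
    for rec in chain:
        payload = json.dumps({"data": rec["data"], "prev": prev}, sort_keys=True)
        if rec["hash"] != hashlib.sha256(payload.encode()).hexdigest() or rec["prev"] != prev:
            return False
        prev = rec["hash"]
    return True

# Hypothetical steps from granulate to delivered part, with energy data.
chain = []
add_record(chain, {"step": "granulate produced", "kwh": 310})
add_record(chain, {"step": "injection moulding", "kwh": 95})
add_record(chain, {"step": "delivered to customer", "kwh": 12})

print(verify(chain))          # prints: True
chain[1]["data"]["kwh"] = 10  # tamper with an intermediate record...
print(verify(chain))          # prints: False
```

Because each party only appends records and can verify everyone else's, energy and processing data can be shared across the chain without any single company controlling, or silently rewriting, the history.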

 

I: Do you think that initiatives like Gaia-X or Catena-X will also play a role in this context?


JvT: These are also ecosystems for data exchange that need to be fed, but just on a different technological basis. Basically, there is the ERP-based approach and the open-source approach. The problem with the ERP-based approach is that, in principle, everyone who wants to exchange data needs to be using the same ERP system, or the corresponding interface. In my opinion, that poses some difficulties.

While such an approach works in the industrialized world, it may not do so elsewhere. If we want to connect suppliers from less industrialized regions, from Africa or Southeast Asia, for example, then there are often no compatible systems in place simply because they are not needed. In that situation, you have to track individually what has happened and where. If, across the entire chain from raw material to end product, only one third of the companies have the system, it simply won’t work. That's why I find open-source initiatives more exciting – that’s where there’s greater potential.