Digital Twins with Game Engines

A digital twin is a virtual model of a product, process, or service that can be used to visualize the operation of the object and its related information. The Digital Twins for Leveraging Renewable Energy (DUKE) project, funded by Lapin Liitto (the Regional Council of Lapland) and executed by Lapland University of Applied Sciences, has completed its first functional pilot at the Lapland Education Centre REDU’s educational heating plant on Jänkätie, Rovaniemi. The pilot is implemented with the Unity 3D game engine.

The game engine is powerful

The strength of game engines lies in the efficient and fast implementation of visual environments; they are particularly well suited to 3D modelling. In addition to the visual model, a digital twin contains functionalities that can also be found in its real-world counterpart. The most straightforward functionalities to model are details related to the mechanical properties of the system. In the visual model of a heating plant, such mechanical objects include building doors, inspection hatches, manually adjustable valve handles and electrical switches. Modeling a door or an inspection hatch involves a visual effect that the observer sees once some input triggering the opening has been provided. The input can be a virtual opening action by the user, or the user moving the cursor into the vicinity of a suitable visual key element.
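The trigger logic described above can be sketched as a small state machine. This is an illustrative Python sketch, not the DUKE project's Unity C# code; the class and input names are hypothetical.

```python
class Door:
    """A door (or inspection hatch) whose visual state toggles on a trigger input."""

    def __init__(self):
        self.is_open = False

    def trigger(self, user_input):
        # The input can be a direct open action by the user, or the cursor
        # entering the vicinity of the door's visual key element.
        if user_input in ("open_action", "cursor_near_handle"):
            self.is_open = not self.is_open
        return self.is_open

door = Door()
door.trigger("open_action")        # door opens
door.trigger("cursor_near_handle") # door closes again
```

In the actual game engine, the state change would additionally drive the visual effect, such as the door-opening animation and the view into the space behind it.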

The functional model can be implemented with a game engine

There are numerous options for implementing such a key element or action. Behind the door, a view opens to another space, and through the inspection hatch a view opens, for example, into a boiler. Depending on the operating mode of the system, the boiler can be passive or produce heating power. When heating is active, the visual effect after opening the hatch can be, for example, a visible combustion reaction, i.e. flames. In the real world, opening the inspection hatch may also trigger an emergency alert or a similar process-related action. A real-world cross-effect, such as a malfunction of the vacuum that maintains the pressure difference in the furnace, can also cause smoke to puff out of the inspection hatch. Together, the functional scenarios of the digital model form a description of the process, which in a game engine has to be modeled and programmed to the accuracy and extent the model requires.

In addition to the visual effect, sufficient realism may also require creating sounds that correspond to reality. The sound of the door opening itself, and the soundscape carried from the other space, enhance the immersion of the model, i.e. the sense of presence or realism felt by the user. The concept of immersion comes from the gaming industry and especially from experiencing virtual reality.

Producing a functional model requires modeling the process

Producing a functional model of a complex process requires an understanding of the basics of process modeling. Systems that combine mechanical, thermodynamic, hydraulic, and electrical subsystems have typically been modeled by describing the subsystems individually or with tools suited to process modeling, such as MATLAB Simulink, LabVIEW, or Ptolemy II. These simulation programs make it possible to implement the time model that such processes require.

The functional model of the process describes the progress of the process in real time

In the time model, the time of the process progresses in real time or even faster than the actual rate of occurrence. The subprocesses take place in order, mimicking the chain of events of the actual process so that the dependencies, i.e. the causality, are realistic. For example, the fuel entering the boiler burns, and as a result of the combustion, thermal energy is transferred to the liquid circulating in the system. In a district heating plant the circulating liquid is water; in ground source heat pumps, for example, it is a water-ethanol mixture. The simplest way to model time and process progression is to consider a stable operating point for the system, at which all the modeled functional elements have settled into a steady state.

For example, in the modelling of a heating plant, the steady state is one in which the temperature of the circulating liquid, the volume flow of the circulating water, and the pressure differences in the piping and boiler systems are all stable. The heating plant model has two separate water circuits connected by a heat exchanger. Through the heat exchanger, the thermal energy of the primary water, i.e. the water heated in the boiler circuit, is transferred to the secondary circuit, i.e. the district heating network.
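The energy balance across the heat exchanger can be written out as a simple formula: the power transferred from the primary circuit determines the secondary-side temperature rise for a given flow. This is a hedged sketch with illustrative values, not the project's actual model; only the specific heat of water is a physical constant.

```python
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def outlet_temperature(power_w, mass_flow_kg_s, inlet_temp_c):
    """Secondary-side outlet temperature from the balance Q = m_dot * cp * (T_out - T_in)."""
    return inlet_temp_c + power_w / (mass_flow_kg_s * CP_WATER)

# 200 kW transferred into a 2 kg/s district heating flow entering at 45 degrees C
# raises the water by roughly 24 degrees.
t_out = outlet_temperature(200_000.0, 2.0, 45.0)
```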

The time progression of the process is worth keeping simple

The functionality of the sample heating plant was modeled so that at the start of the simulation all variables, i.e. temperatures, pressure differences, and flows, have zero initial values. The calculation of the stable operating mode starts from the pump of the primary (boiler) circuit: the pressure difference produced by the pump is first calculated on the basis of the power set on it. The calculation starts from the numerical value zero, and the pressure differences of all the elements in the circuit are computed in small steps; on each cycle the pressure is raised until the pressure difference across the pump and the set pump power theoretically correspond to each other. On each pass, the pressure information produced by the pump is transferred to the next element in the primary circuit, a pipe. In the simplest model the pipes pass the pressure difference forward to the next element as is, so there are no pressure losses in the pipes. A pressure drop for the pipes can of course be set as a parameter, but in the simplest model it is not necessary.
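The stepping scheme described above can be sketched as follows. This is an illustrative Python sketch, not the DUKE Unity C# implementation; the toy relation flow = dp / R (so pump power P = dp * flow) is an assumption made here only to give the loop something concrete to converge on.

```python
def solve_pump_pressure(set_power, circuit_resistance, step=0.01):
    """Raise the pump pressure difference from zero in small steps until the
    power it implies matches the power set on the pump."""
    dp = 0.0
    # Assumed toy model: flow = dp / R, so implied pump power = dp * (dp / R).
    while dp * (dp / circuit_resistance) < set_power:
        dp += step
    return dp

# With set_power = 4.0 and unit resistance, the loop settles near dp = 2.0,
# since 2.0 * (2.0 / 1.0) = 4.0.
dp = solve_pump_pressure(4.0, 1.0)
```

In the real model, each iteration would also propagate the pressure information onward to the next circuit element, as described above.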

First, the static state of the process is determined

The calculation proceeds element by element, and the pressure difference moves forward through each circuit element. The elements form a system in which, viewed in the direction of circulation, each element has a pressure difference as input and the pressure difference after the element as output. If the element is a valve, the numerical value of the valve position is read as input, the pressure change produced by the valve is calculated from it, and the result is passed to the next element as input. Once the entire circuit has been traversed element by element, the calculation proceeds to the second stage, where the flow and the temperature losses of the water flowing in the circuit are calculated from the pressure differences, if they have been defined for that circuit element. The temperature of the circulating liquid is determined from the boiler power and the volume flow of the water. The pressure differences, flows and temperatures of the elements of the secondary circuit are calculated in a similar way; as in the primary circuit, the calculation starts from the pump producing the pressure difference. There is no heating boiler in the secondary (district heating) circuit; its counterpart is the output connection of the heat exchanger, which delivers the heating power of the primary circuit to the secondary circuit.
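The element-by-element pass can be sketched as a chain of functions, each taking the pressure difference before it as input and producing the pressure difference after it as output. This is a hypothetical Python sketch (the element names and parameter values are illustrative, not from the project code).

```python
def pipe(dp_in, params):
    # In the simplest model a pipe passes the pressure difference on unchanged.
    return dp_in

def valve(dp_in, params):
    # The valve position (0.0 = closed .. 1.0 = fully open) determines its drop.
    return dp_in - params["max_drop"] * (1.0 - params["position"])

def propagate(dp_pump, circuit):
    """Carry the pump's pressure difference through every element in order."""
    dp = dp_pump
    for element_fn, params in circuit:
        dp = element_fn(dp, params)
    return dp

circuit = [
    (pipe, {}),
    (valve, {"position": 0.5, "max_drop": 0.4}),
    (pipe, {}),
]
# A pump pressure difference of 1.0 loses 0.2 in the half-open valve.
final_dp = propagate(1.0, circuit)
```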

The static model is sufficient for the simplest modeling tasks. A more realistic model also includes the dynamic properties of the system, i.e. the change phenomena that occur in it. Such dynamic properties include, for example, increases and decreases in boiler power and flow changes in the secondary (district heating) circuit. Modeling the dynamic properties leads to solving the differential equations that describe the system. The simplest such model is the first-order differential equation, whose mathematical solution is an exponentially rising or falling curve. For example, a boiler power increase can be modeled with a first-order differential equation if, as a result of the increase, the circulating water temperature rises to a new value without behaving unstably; temperature oscillation is one example of instability. A first-order differential equation is often a sufficient model for the dynamic properties of a system.
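The exponential step response of a first-order system can be written out directly. This is the standard textbook solution, sketched here in Python with illustrative temperatures and time constant (not plant measurements).

```python
import math

def first_order_response(t, t_initial, t_final, tau):
    """Temperature at time t after a step change, for a first-order system:
    the temperature approaches t_final exponentially with time constant tau."""
    return t_final + (t_initial - t_final) * math.exp(-t / tau)

# Rising from 60 degrees C toward 80 degrees C with tau = 120 s: after one
# time constant, about 63 % of the total change has taken place.
temp = first_order_response(120.0, 60.0, 80.0, 120.0)
```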

The phenomena of change, i.e. dynamic properties, are modeled with differential equations

The behavior of a simple dynamic model can be determined, for example, by measuring the time over which the temperature rise occurs, i.e. the time constant of the temperature rise, by means of a step response test. By determining the time constant, the temperature rise corresponding to the power increase, and the delay in the system, i.e. the so-called dead time, a dynamic model sufficient for most practical cases can be implemented. The dynamic properties of the sample heating plant were modeled by performing test runs of the heating plant on 25.5. – 27.5.2020. During the test runs, stepwise boiler circuit power increases and district heating circuit flow changes, among other things, were made. During these changes, the temperatures and flows of the primary (boiler) circuit and the secondary (district heating) circuit, as well as the pressure differences, were measured. Based on the test measurements, the dynamic properties of the heating plant were determined using regression analysis methods. The models of the dynamic properties calculated from the test runs and measurements are transferred into the functional model by programming them into the model’s code in the game engine. The programming language is C#, the language of the Unity 3D game engine.
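To make the step response test concrete, a time constant can be read from measured data as the time at which about 63.2 % of the total rise has occurred. The project itself used regression analysis; this simpler rule-of-thumb sketch, run here on synthetic (not real plant) data, only illustrates the idea.

```python
import math

def estimate_time_constant(times, temps):
    """Return the time at which 63.2 % of the total temperature rise has occurred,
    a rough estimate of the first-order time constant."""
    t0, t_final = temps[0], temps[-1]
    target = t0 + 0.632 * (t_final - t0)
    for t, temp in zip(times, temps):
        if temp >= target:
            return t
    return None

# Synthetic first-order step response data with a true time constant of 100 s.
times = [float(t) for t in range(0, 1001, 10)]
temps = [80.0 - 20.0 * math.exp(-t / 100.0) for t in times]
tau = estimate_time_constant(times, temps)
```

A regression fit over the whole curve, as used in the project, is less sensitive to measurement noise than reading off a single crossing point.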

The visual and functional model combines into a digital twin

The final digital twin combines the visual model and the functional model describing the heating plant process into a single entity. The goal is to produce the desired functionalities feature by feature; each added functionality develops the system in a more usable and more realistic direction. In its more advanced form, the digital twin can be in real-time communication with its physical counterpart, so that operations performed through the digital twin can even control the real-world counterpart, or, alternatively, a functional change in the real-world thermal plant is realistically reflected in the digital twin, for example so that a puff of smoke in the real world is heard and seen in the virtual world. The smell of smoke in the virtual model is still a fictitious feature for the time being, but in the future, perhaps that too can be implemented.

Official name of the project: Digital twins for leveraging renewable energy
Project timetable: 01/01/2020 – 31/12/2022
Total budget: 761 732 €
Funding: EAKR 2014-2020
Contact person: Tauno Tepsa (+358 40 821 6865)



Tauno Tepsa, Senior Lecturer

Tauno’s work is divided into teaching and working as a Project Manager. He works in the DUKE project as a Project Manager. Tauno’s special areas are Electronics, Embedded Software, Automation and IoT.

DI, MSc(Eng.)

What is sustainable development and what can software development bring to it?

As a software lab, FrostBit is not a traditional software development unit, although we have a web and mobile team in addition to the XR game team. These teams support each other in most projects in one way or another, so the lab can be considered a multidisciplinary operating environment.

Many projects in the fields of health, mining and forestry highlight the multidisciplinary work in the lab, and one addition to these is the currently ongoing project “Towards Sustainable Tendering”. Through the implementation of a practical tool, the project seeks solutions at the level of the province of Lapland for minimizing the carbon footprint of municipalities so that new ways of operating are also regionally profitable. To make this possible, a new way of thinking and a new procurement policy are needed among municipal decision-makers and entrepreneurs.

The municipality of Ii’s example of sustainable low-carbon procurement has even attracted the interest of foreign media, including the BBC. The municipality has invested in geothermal, solar and wind power, and its practices start as early as primary school level: through consumption savings, primary school children can raise money for jointly decided purchases. Such a new way of thinking must therefore be instilled in other municipalities, decision-makers and citizens. Often, however, the large price tag of such purchases seems to be a cost item that is difficult for a municipality to cover. At this point, the payback period must be considered: new jobs, tax revenues and production margins convert large expenditures into income in the future. In addition, a sustainable society is increasingly important to people today, and such municipalities appear as pioneers in creating this “new image”.

The list of municipal decision-makers’ responsibilities is not small, so many factors must be taken into consideration: choices are made in areas such as energy production, heating solutions, central kitchen operations and the food industry, sustainable construction, and logistics solutions, which are among the largest sources of carbon emissions in municipalities. Many projects in Lapland and throughout Finland aim to improve some of these areas, and many different partners, with their own needs and results, are contributing to them. This raises the question of how all these areas and project results can be used by decision-makers in their development work towards goals that are also regulated at the EU level.

The Towards Sustainable Tendering project aims to provide a tool for demonstrating the impact of procurement at all scales, from small businesses to provincial decision-makers. Considerable work has been invested in the tool’s design, and even internationally recognized mathematicians are involved in the project. In addition to gathering background material for the tool, collaborators from various fields have been recruited, e.g. from energy, construction, transport optimization and local-food-focused projects. The tool also needs a lot of users, which at times is challenging to arrange from the entrepreneurs’ point of view because of the coronavirus pandemic. Therefore, there is a strong focus on communication and marketing about the project through many different channels. Thus, in addition to technological know-how, the implementation of the tool requires expertise from, e.g., economic, agrological and energy points of view. A successful outcome will provide a tool that can be taken to other parts of Finland and to the international level, where Lapland University of Applied Sciences is strongly increasing its project activities and gaining a variety of partners. The driving forces behind the project are the goals of Lapland University of Applied Sciences and the Regional Council of Lapland for developing the Arctic region and green solutions, and the strong desire of the municipality of Kemijärvi to raise the profile of Eastern Lapland and the whole province.

The project’s website contains more detailed information about the project and upcoming events in the event section, as well as a project webinar that explains the issue in more detail.



Mika Uitto, Project Manager

Mika works as a Project Manager at Lapland UAS and is currently leading the EU Green Deal related Towards Sustainable Tendering project. He plans multidisciplinary international and national projects in, e.g., healthcare, forestry, space weather, cultural heritage, environmental matters and industry, including XR content development, gamification and various kinds of software development with a touch of UX design. Mika participates in the lab’s marketing, event planning and communications, aiming to advance the sustainable development goals and actions of Lapland UAS.

Master of Engineering

Machine and deep learning – a tech trend or a tool of the future?

Everyone who has followed software technology trends within the past few years has surely encountered the concepts of machine learning, deep learning and artificial intelligence, whether in articles or other contexts. Nowadays these terms are also widely used in the marketing of various software solutions.

The FrostBit Software Laboratory has also studied machine and deep learning methods in recent times. Our laboratory is especially interested in the wide array of possibilities these technologies open up: which software features do they let us create easily that would otherwise be extremely difficult or outright impossible to engineer?

From the technological point of view, most machine learning applications are created using the Python language, because Python offers a stunning selection of tools and libraries crafted especially for machine learning applications and use cases.

The basic concept of machine learning is the following: we provide our machine learning application with data, which is used to create the desired predictions for new data. The data is split into two parts: training data and test data. The training data is used to teach the machine learning algorithm the features of the data as well as the complex correlations that lie within it. The test data is used later on to verify that the algorithm is capable of creating predictions within acceptable error margins.
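The split described above can be illustrated with a few lines of standard-library Python (real projects would typically use a library function such as scikit-learn’s `train_test_split`; this sketch only shows the idea).

```python
import random

def train_test_split(data, test_ratio=0.2, seed=42):
    """Shuffle the data reproducibly and split it into training and test sets."""
    shuffled = data[:]
    random.Random(seed).shuffle(shuffled)
    n_test = int(len(shuffled) * test_ratio)
    return shuffled[n_test:], shuffled[:n_test]

samples = list(range(100))
train, test = train_test_split(samples)
# 80 samples go to teaching the algorithm, 20 are held back for checking
# its predictions afterwards.
```

Shuffling before splitting matters: if the data is ordered (e.g. by date), a naive split would give the algorithm an unrepresentative view of it.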

Machine learning can be utilized in many applications. Some traditional use cases:

  • supporting decision-making based on earlier decisions
  • determining the market value based on sales history (e.g. real estate prices)
  • natural language processing (e.g. spam e-mail detection)
  • finding complex correlations in given data (e.g. what are the features of a typical web store customer during a certain ad campaign)
  • recommender systems (e.g. web store product recommendation features)
  • data classification (e.g. determining whether a tumor is benign or malignant based on earlier measurements)
  • etc.

Deep learning is a subcategory of machine learning. The basic difference between the two concepts lies in the way they create predictions from the given data. While traditional machine learning algorithms create their predictions based on a single operation, deep learning algorithms utilize so-called neural networks. Neural networks are collections of layers and nodes through which the training data is processed to build the learning model; the finalized model is then used to create predictions for new data. Since the training phase is distributed among multiple layers and nodes, a certain amount of natural chaos, or humanity, enters the data processing phase.
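To make “layers and nodes” concrete, here is a toy forward pass through one hidden layer, in plain Python. The weights are fixed and illustrative; in a real network, training would adjust them so that the predictions match the data.

```python
import math

def sigmoid(x):
    """Squash a value into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # Each node sums its weighted inputs, adds a bias, and applies an
    # activation function.
    return [
        sigmoid(sum(w * x for w, x in zip(node_w, inputs)) + b)
        for node_w, b in zip(weights, biases)
    ]

def forward(inputs):
    # Two inputs -> a hidden layer of two nodes -> one output node.
    hidden = layer(inputs, [[0.5, -0.2], [0.1, 0.8]], [0.0, 0.1])
    output = layer(hidden, [[1.0, -1.0]], [0.0])
    return output[0]

prediction = forward([1.0, 0.5])  # a value between 0 and 1
```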

Because of this, the deep learning process is a much more organic way to create predictions than traditional machine learning methods. This organic character also means that it is virtually impossible to get exactly the same results every time, even when using exactly the same data and algorithm in the training phase.

The possibilities of machine and deep learning are amazing, but as is often the case with trending technologies, the realistic use cases are easily forgotten. Too often machine and deep learning are discussed as all-solving silver bullets that can easily solve any given IT problem. The truth, however, is much closer to this: machine and deep learning are extremely demanding tools that can potentially provide useful information for challenging problems. How well machine and deep learning algorithms work in practice always depends on the context, and especially on how much time and willpower the software developers and domain experts can provide.

The quality of the training data also plays a significant role in machine learning. If there is not enough data, or it is not versatile or relevant enough, the predictions based on it will most likely not be usable. Machine and deep learning require a lot of time and effort, and the work resembles both statistics and quantitative research analysis at the same time. From the software developer’s point of view, the challenge is to find the correct analysis method for the correct problem, while processing the data in the correct way.

Machine learning also requires the developer to spend a great deal of time exploring the data and deciding which of its features are relevant and which are not; all irrelevant data undermines the precision of the application’s predictions. As an additional challenge, machine learning involves a vast amount of theory that has to be applied while developing an application. The actual programming phase, however, is not that difficult; deciding which methods to use, and how, is the real challenge. Because of this, machine learning software developers always need to be accompanied by experts in the domain of the data as well as experts in different research and analysis methods.

We are eager to work more with all these technologies in future projects!



Tuomas Valtanen, Project Manager

Tuomas works as the web/mobile team leader, a software engineer and a part-time teacher in FrostBit at Lapland UAS. His tasks include project management, project planning as well as software engineering and expertise in web, mobile and AI applications.

Master of Engineering