Tube 2016: FA 03: Industry 4.0 – automation technology of the future

The term Industry 4.0 has been a frequent buzzword for several years now. It is generally used as a synonym for the fourth industrial revolution, in which the real and virtual worlds are growing together into an “internet of things”. It should largely be seen against the background of changes in industrial production, with an increasing individualisation of products and highly flexible manufacturing processes. Moreover, according to the German Federal Ministry of Education and Research, there has been a “far-reaching integration of customers and business partners into value-added and business processes, while the link between production and high-quality services has led to so-called hybrid products.”

Industry 4.0 manifests itself especially clearly in the increased automation of the various processes within an industrial enterprise. It works only if intelligent, autonomous monitoring and decision-making processes are developed, so that the relevant routines can be controlled and optimised in real time. Two concepts closely connected with Industry 4.0 are the smart factory and smart production. The focus of a smart factory is on developing intelligent production systems and processes and on realising distributed and networked production sites – or, to use IT jargon: on the integration of adaptive cyber-physical systems in production. Smart production includes, among other things, cross-company production logistics and man-machine interaction.

In order to implement Industry 4.0 in practice, large volumes of data (“big data”) are required. Although such data are available in many companies, they still tend to be rather isolated and disconnected. To set up genuinely efficient routines within a business, it is important to analyse, process and intelligently connect all data. Prof. Katharina Morik from the Department of Artificial Intelligence at TU Dortmund University says: “Unless large collections of data are analysed, they can degenerate into data cemeteries.” This analysis is done with the tools of Artificial Intelligence (AI), which allow machine learning, i.e. the automatic acquisition of rules based on data. To “dig” for knowledge among the available data – a process known as “data mining” – the company RapidMiner has developed a tool of the same name which is now extremely widespread and requires no programming.
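What “automatic acquisition of rules based on data” can look like is easiest to see in a small sketch. The following example does not reproduce RapidMiner or the Dortmund tooling; it merely uses the open-source scikit-learn library and entirely synthetic measurements to learn a set of human-readable decision rules:

# Minimal sketch: learning rules from synthetic process data (illustration only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # three made-up process measurements
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # synthetic "in spec / out of spec" label

model = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(model, feature_names=["temperature", "pressure", "flow"]))

The printed tree is a nested set of if-then rules over the input variables, which is precisely the kind of knowledge that data mining aims to extract from historical records.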

Big data and cyber-physical systems are among the areas studied by SFB 876, a research unit within the IT department of TU Dortmund University. One of its projects is the development of data stream algorithms which allow the analysis of incoming data streams in real time. The research unit has developed a tool called Streams for the convenient configuration, parallel arrangement and distributed execution of online processes. The theoretical foundation created by SFB 876 has been put into practice together with SMS Siemag AG and a working group of Dillinger Hütte in a real-time forecasting project at a steel mill. The innovative system is adaptive, i.e. it is able to learn from the data it receives from the manufacturing process and thereby to fine-tune and improve the production process.
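The Streams tool itself is not reproduced here, but the principle behind a data stream algorithm can be illustrated with a short, hedged sketch in Python: the model is updated record by record as data arrive, so memory use stays constant no matter how long the stream runs (all names and readings below are hypothetical):

class OnlineMeanVariance:
    """Welford's algorithm: running mean and variance of a sensor stream."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0

    def update(self, x: float) -> None:
        # Incorporate one new reading without storing the history.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        return self.m2 / (self.n - 1) if self.n > 1 else 0.0

# Feed values as they arrive from the stream.
stats = OnlineMeanVariance()
for reading in [1652.0, 1649.5, 1655.2, 1650.8]:   # hypothetical temperature readings
    stats.update(reading)
print(stats.mean, stats.variance)

The same update-as-you-go pattern underlies far more elaborate streaming learners; the point is that each record is processed once, in real time, and then discarded.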

Dillinger Hüttenwerke, according to Dr. Dominik Schöne, is “Europe’s leading heavy plate manufacturer”, with an output of around 1.8m tonnes. Its products are used, among other things, for the production of large-diameter pipes. The central furnace at the smelting plant in Dillingen is a BOF converter (Basic Oxygen Furnace), into which pig iron and scrap steel are fed and to which slag-forming agents such as lime are then added. Using a blowing lance, oxygen is subsequently blown into the molten mass at supersonic speed, burning off undesirable elements (such as carbon, phosphorus and sulphur) and removing them in the form of slag and waste gas. The purpose of the BOF process is to obtain molten steel with certain defined properties at the end of the oxygen blowing process (i.e. the blowing end point). The target variables are the tapping temperature, the carbon content and the phosphorus content of the molten mass as well as the iron content of the slag.
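Seen from the data side, each heat thus ends with one record of four target values. A minimal sketch of such a record in Python (the field names and units are assumptions made for illustration, not the plant’s actual data model):

from dataclasses import dataclass

@dataclass
class BlowingEndPoint:
    """Target variables of one BOF heat at the blowing end point (illustrative)."""
    tapping_temperature_c: float   # tapping temperature of the melt, in °C
    carbon_content_pct: float      # carbon content of the melt, in %
    phosphorus_content_pct: float  # phosphorus content of the melt, in %
    slag_iron_content_pct: float   # iron content of the slag, in %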

The data-driven forecasting model for the BOF converter was developed with the aim of improving the predictability of the four target variables at the blowing end point. To record the process data, a computer capable of capturing 90 static process variables was integrated into the process automation. To increase predictive accuracy even further, 36 dynamic process variables were collected using additional sensors focused on vibration, sound and optical properties. In all, the data-driven forecasting model can deal with 126 process variables.
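In other words, each heat is described by one feature vector that concatenates the static and the dynamic variables. A hedged sketch of that bookkeeping (the counts come from the text above; the variable names and contents are placeholders):

import numpy as np

N_STATIC, N_DYNAMIC = 90, 36      # counts as given in the text

def build_feature_vector(static_vars, dynamic_vars):
    """Concatenate static and dynamic process variables into one model input."""
    assert len(static_vars) == N_STATIC and len(dynamic_vars) == N_DYNAMIC
    return np.concatenate([static_vars, dynamic_vars])

x = build_feature_vector(np.zeros(N_STATIC), np.zeros(N_DYNAMIC))
print(x.shape)                    # (126,), i.e. all 126 process variables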

Not only can the newly developed forecasting model learn independently from large data volumes and make real-time predictions; it can also control the blowing process by identifying suggested corrections, again in real time. A comparison with the forecast target values of a conventional metallurgical model shows that the data-driven model is far more accurate in predicting the temperature at the blowing end point. Moreover, unlike the conventional method, the new model can also predict all the other target variables.
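How such a suggested correction might be derived from a real-time forecast can be sketched in a few lines. This is purely illustrative and not the SMS Siemag/Dillinger implementation; the set-point, tolerance and suggested actions are hypothetical:

TARGET_TAPPING_TEMPERATURE_C = 1650.0    # hypothetical set-point
TOLERANCE_C = 5.0                        # hypothetical accuracy band

def suggest_correction(predicted_temperature_c: float) -> str:
    """Turn a real-time temperature forecast into a coarse blowing suggestion."""
    deviation = predicted_temperature_c - TARGET_TAPPING_TEMPERATURE_C
    if abs(deviation) <= TOLERANCE_C:
        return "no correction needed"
    if deviation < 0:
        return "melt too cold: suggest extending the blowing process"
    return "melt too hot: suggest an earlier blowing end point"

print(suggest_correction(1641.0))        # -> "melt too cold: ..."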

The data-driven forecasting model has a wide range of economic benefits: steel output is increased through a reduction of the after-blowing and over-blowing rates, while process costs and the costs of input materials are reduced. In addition, the refractory lining of the converter is subject to less wear and tear, the converter produces more steel, and personnel expenses are lower. For the 190-tonne BOF converter in Dillingen, which delivers an annual volume of 2m tonnes, the reduction in heating fuel and the lower after-blowing rate lead to potential savings of around EUR 500,000 per year. This assumes an improved accuracy of 5°C for the tapping temperature.
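The order of magnitude can be checked with the figures quoted above; the following back-of-envelope calculation uses only those numbers and simply expresses the annual saving per heat:

annual_output_t = 2_000_000        # tonnes of steel per year (from the text)
heat_size_t = 190                  # tonnes per converter heat (from the text)
annual_saving_eur = 500_000        # quoted potential saving per year

heats_per_year = annual_output_t / heat_size_t        # roughly 10,500 heats
saving_per_heat = annual_saving_eur / heats_per_year  # roughly EUR 47 per heat
print(round(heats_per_year), round(saving_per_heat, 2))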

One major benefit of a data-driven forecasting model is its flexibility. Such models can also be transferred to other applications with relatively few adjustments. This is true for other converters and also other furnaces.

But the new automation options of Industry 4.0 offer even more benefits to system manufacturers in metallurgical and rolling mill engineering. Such systems, which are usually large, complex and technologically advanced, cover the entire portfolio of power supply, electrical and automation engineering. The consistently tailor-made solutions mostly comprise fully customised technical processes together with the relevant automation systems. “This is why, prior to the actual commissioning, we conduct comprehensive tests on the relevant software of all our systems,” says Hubertus Schauerte from SMS Siemag AG in Düsseldorf, “so that we can ensure the highest quality standards and so that the commissioning periods are as short as possible.” Compared with the world of models mapped in Industry 4.0, the engineers even go one step further in their system tests, replacing the real physical world with a virtual one. To test the software engineered for a customer, the relevant system is simulated in real time.

To do so, they map the dynamic behaviour of the control systems, all the functional connections and the engineering of the processes in the form of models. These are then implemented on server clusters, where the simulation can be executed in real time. The resulting network of automation solutions and system simulation components, says Schauerte, is heterogeneous in structure, so that it comes very close to an internet of things and thus to the very basis of Industry 4.0.
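Stripped of all detail, such a test is a closed loop in which the automation software under test acts on a simulated plant instead of the real one. A deliberately simple, hedged sketch of the principle (a toy first-order plant and a proportional controller standing in for the real models and software):

import time

def plant_model(state: float, control: float, dt: float) -> float:
    """Toy first-order plant: the 'virtual physical world' the software acts on."""
    return state + dt * (control - 0.1 * state)

def controller(state: float, setpoint: float) -> float:
    """Stand-in for the automation software under test (simple proportional control)."""
    return 2.0 * (setpoint - state)

state, setpoint, dt = 0.0, 10.0, 0.1
for step in range(50):                  # one model step per control cycle
    u = controller(state, setpoint)     # the software computes its control action
    state = plant_model(state, u, dt)   # the simulated plant replaces the real system
    time.sleep(dt)                      # pace the loop to (soft) real time
print(round(state, 2))                  # the controlled variable approaches its setpoint

In the real test environment the plant models are of course far richer, but the structure is the same: the software cannot tell whether it is talking to the physical system or to its real-time simulation.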