Mr Lantermann, what in your view is concealed behind the multitude of terms that production specialists are currently confronted with?
All over the world there are promising approaches for making production significantly faster, more flexible, more productive and more efficient. There is also a drive to improve and reliably uphold the quality of the manufactured products. Depending on the national mentality and historical background, a variety of ways of viewing the challenges have evolved. And these have given rise to the numerous approaches and terminologies for the required methods, all of which in fact pursue the same goal. In the USA the experts come together in the Industrial Internet Consortium (IIC), in Japan there’s the Robot Revolution, the British are concerned with Catapult and the French with “Say oui to France”, while the Chinese have conjured up the slogan “Made in China 2025”.
So you’ve now added further terms to the discussion. But what does this multitude of schemes and terms mean in practice?
Basically, they are all concerned with creating links between all the ‘participants’ in production by means of modern data processing and data communication. This means, for example, that a workpiece, from entering production as a blank through to dispatch after assembly, repeatedly exchanges the relevant data with the surrounding participants. The participants, including machines, handling devices and the like, communicate in turn with the workpiece and with one another, i.e. with neighbouring machines and devices. To this end, all participants have to be equipped with sensors and a balanced measure of ‘computer intelligence’. Only then can they capture, generate, filter and pass on the required data.
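The idea of a workpiece carrying its own data through production can be sketched in miniature. The following Python fragment is purely illustrative: the class names, station names and record fields are assumptions for the sake of the example, not part of any real system described in the interview.

```python
from dataclasses import dataclass, field

@dataclass
class Workpiece:
    """A blank that carries its own process data through production."""
    part_id: str
    history: list = field(default_factory=list)

@dataclass
class Machine:
    """A participant with just enough 'computer intelligence' to read
    the workpiece data, act on it, and write a result back."""
    name: str

    def process(self, wp: Workpiece) -> None:
        # In reality the result would come from sensors and a process
        # plan; here we simply record that this participant handled
        # the part without any fault.
        wp.history.append({"station": self.name, "status": "ok"})

wp = Workpiece(part_id="BLANK-0001")
for station in (Machine("lathe"), Machine("mill"), Machine("assembly")):
    station.process(wp)

print([h["station"] for h in wp.history])
```

Each station reads and extends the same record, so by the time the part reaches dispatch its complete production history travels with it.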
This initially means acquiring a huge quantity of data. But what’s the point of gathering all these data? What are the benefits of this?
We have to bear in mind at all times that the overriding goal is to make production more flexible, productive and efficient; that is the goal pursued with data acquisition and data communication. An example from production illustrates this: if a clamping device reports that a blank cannot be clamped firmly enough for the envisaged machining operation – because of chips deposited in the jaws, for instance – machining isn’t simply blocked. Instead, the participants decide autonomously, via data communication, to swivel an air nozzle into position and clean the device. The data captured by the affected participants – machine, clamping device and air nozzle – are additionally sent as production and machine data to the higher-level control, the ERP system. From these data it is possible to read out the identities of the affected machines, the causes of the malfunction, the unproductive downtime and the measures taken automatically to remedy it. If such a fault recurs several times, solutions can be devised to prevent it from occurring again. All of this collectively ensures that production runs as smoothly as possible, i.e. becomes more productive, more flexible and faster, while the quality of the manufactured products is enhanced and upheld. Taken to its logical extreme, production even optimises itself. However, this calls for ingenious algorithms that enable the interlinked participants to find the optimum solution in each situation automatically. This can also entail a phone call to a service technician.
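The clamping example can be condensed into a short decision routine. Everything specific below – the machine name, the force threshold, the escalation rule after three recurrences – is an invented assumption to make the logic concrete; only the overall pattern (remedy locally, log to the ERP system, escalate when a fault recurs) follows the scenario described above.

```python
# Hypothetical threshold and escalation rule, chosen for illustration only.
MIN_CLAMP_FORCE = 5.0  # kN
ESCALATE_AFTER = 3     # recurring faults before a technician is called

erp_log = []  # stands in for production/machine data sent to the ERP system

def handle_clamping(machine: str, measured_force: float, fault_count: int) -> str:
    """Decide autonomously what to do when clamping force is reported."""
    if measured_force >= MIN_CLAMP_FORCE:
        return "machining"
    # Remedy the fault locally first: blow the chips out of the jaws,
    # and report the event with cause and measure to the ERP system.
    erp_log.append({"machine": machine,
                    "cause": "insufficient clamping force",
                    "measure": "air nozzle cleaning"})
    if fault_count + 1 >= ESCALATE_AFTER:
        return "call service technician"
    return "clean and retry"

print(handle_clamping("MC-07", 6.2, 0))  # force sufficient: machining
print(handle_clamping("MC-07", 3.1, 0))  # first fault: clean and retry
print(handle_clamping("MC-07", 3.1, 2))  # recurring fault: escalate
```

The ERP log then contains exactly the information the interview mentions: which machine was affected, the cause of the malfunction and the measure taken automatically.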
Are there any solutions that already optimise production in the way you suggest?
Yes, in some limited areas almost all manufacturers of production and automation machinery already offer practicable solutions. So far this has been mainly confined to self-contained systems. These can be machining centres, for example, with the associated loading and unloading systems along with buffer stores. Where it is a question of sending the collected data to a higher-level data network so that they can be suitably logged and evaluated, such systems often still run up against their limits.
What are the reasons for this?
Open interfaces have so far been insufficiently standardised. A multitude of parameters have to be defined in these interfaces in order to unify data interchange and set the required scope of such interchange. This starts with basic parameters such as the digital data format and extends to conventions governing the security and data-protection rules that the transmitted data are subject to. In practice this means that not all participants are yet able to exchange the required data among themselves.
You work on the standardisation committees. What feedback have you had from your work?
In a first step, we drafted the OPC UA standard, which defines open interfaces for data communication. This data protocol facilitates open data communication between almost any participants in a company-wide data network and even encompasses the planning-related and commercial MES and ERP systems. Anyone investing today in automation and production equipment should make sure that the control equipment already has the OPC UA open interface or is at least prepared for it.
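What such an open interface standardises can be illustrated with a deliberately simplified sketch. It mimics the core idea of an OPC UA address space – every participant exposes its data under stable, agreed node identifiers – in plain Python. The node IDs and variable names below are invented for illustration and are not real OPC UA definitions; a real client would use an OPC UA library and connect to a server instead of reading a local dictionary.

```python
# Minimal model of an address space: each participant exposes its data
# under agreed node identifiers so any client can read them, regardless
# of which vendor's controller holds the value.
address_space = {
    "ns=2;s=MC-07.SpindleSpeed": 4500,    # rpm (illustrative node)
    "ns=2;s=MC-07.State": "machining",
    "ns=2;s=Clamp-03.Force": 6.2,         # kN (illustrative node)
}

def read_node(node_id: str):
    """What a client read boils down to: look up a value by its node ID."""
    if node_id not in address_space:
        raise KeyError(f"unknown node: {node_id}")
    return address_space[node_id]

print(read_node("ns=2;s=Clamp-03.Force"))
print(read_node("ns=2;s=MC-07.State"))
```

The point of standardisation is precisely that the identifier scheme and data format are agreed in advance, so an MES or ERP system can read machine data without vendor-specific adapters.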
Will comprehensive data networking in future affect all companies, from small and medium-size companies to large corporations?
It depends. Every company can decide for itself what scale makes sense for its production environment. Most importantly, it’s essential to always concentrate on the goal of being more efficient, more flexible and faster while improving and maintaining quality. Even the smallest businesses will have no option but to concern themselves with the coming demand for data communication. For instance, suppliers to larger companies that have optimised their entire production will soon have to make available a minimum volume of data and data communication on the state of their own production. These will be demanding trends for everyone, but in my view a challenge that is worth tackling.
Mr Lantermann, many thanks for this detailed information on the currently controversially debated concepts of digitisation, the smart factory, Internet of Things and Industry 4.0.