New open-source communication standards are bursting onto the scene
One of the most important challenges in the factory of the future is the interconnection of its elements and the high rate of information exchange, and many new industrial communication protocols are being proposed for the connected factory. Industrial Ethernet (Ethernet applied to an industrial environment), the IoT and IIoT communication protocols and, in the not too distant future, 5G will try to unify in a simple way all the information coming from connected elements, in order to achieve better determinism and data transmission rates much higher than the current ones. From the point of view of monitoring industrial processes and networks, open-source communication protocols are much better adapted to these needs than proprietary ones, as the information is far more accessible and better structured. At the Hannover Messe we were able to observe that the following standards, among others, are the trends the market will follow in the coming years.
- OPC-UA: OPC is the interoperability standard for the secure and reliable exchange of data in industrial automation environments. The industry's demand for direct communication between field devices and the various monitoring applications used in industrial processes (ERP, SCADA, MES...) led to the creation of OPC, a framework for real-time communications with multiple benefits: it abstracts away the hardware and allows simple integration of elements and easy adaptation to change.
- EtherCAT: EtherCAT (Ethernet for Control Automation Technology) is an open, high-performance fieldbus protocol created and developed by Beckhoff Automation. EtherCAT is well suited to industrial and control environments since it can operate with or without switches. It is an open standard that has been published as an IEC specification based on input from the EtherCAT Technology Group.
- DDS: Data Distribution Service (DDS) is a machine-to-machine (M2M) middleware standard promoted by the Object Management Group (OMG) whose objective is to enable scalable, real-time, reliable, high-performance and interoperable data exchange between publishers and subscribers.
- MQTT / AMQP: These are typical communication protocols for IoT devices and are widely used in manufacturing processes that work with decentralized sensors.
- 5G: The communication technology par excellence of the future, with data transmission speeds up to 1,000 times higher than those of LTE networks. Today's 4G networks are not robust enough to control tomorrow's products and services. Beyond raw speed, 5G networks will offer lower latency, greater reliability, better connectivity from more locations and higher capacity, allowing more devices to be connected at the same time.
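To make the MQTT point above concrete: sensors publish to hierarchical, `/`-separated topics, and subscribers register filters that may use the `+` (single-level) and `#` (multi-level) wildcards defined by the MQTT specification. A minimal, self-contained sketch of those matching rules in Python follows; the function and topic names are our own illustration, not part of any MQTT library:

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Check an MQTT topic against a subscription filter.

    '+' matches exactly one topic level; '#' matches all remaining levels.
    """
    f_levels = filter_.split("/")
    t_levels = topic.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":               # multi-level wildcard: match the rest
            return True
        if i >= len(t_levels):     # filter is longer than the topic
            return False
        if f != "+" and f != t_levels[i]:
            return False
    return len(f_levels) == len(t_levels)

# A sensor publishing to 'plant1/line3/temperature' would reach
# subscribers of the first two filters but not the third:
print(topic_matches("plant1/+/temperature", "plant1/line3/temperature"))  # True
print(topic_matches("plant1/#", "plant1/line3/temperature"))              # True
print(topic_matches("plant1/+/pressure", "plant1/line3/temperature"))     # False
```

This level-by-level routing is what lets decentralized sensors publish independently while monitoring applications subscribe to whole plants or single lines with one filter.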
Predictive analytics
Once the data is centralized, the next step is to work with it, since information is useless without analysis. The market trend, logically enough, is to predict failures and problems before they occur. This prediction facilitates maintenance work and generates cost savings, both by reducing the number of elements that break down and by increasing the efficiency (OEE) of production lines, since there is much less machine downtime. Predictive analysis, which can be performed using artificial intelligence (AI) and machine learning tools and algorithms, has a direct and positive impact on the company's bottom line at the end of the month.
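The simplest form of this idea can be sketched without any AI framework at all: compare each new sensor reading against the statistics of its recent baseline and raise a flag when it deviates strongly, which is often an early hint of wear before an outright breakdown. The following is a toy sketch in plain Python, not a production model; the signal values, window size and threshold are invented for illustration:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=5, z_threshold=3.0):
    """Flag readings that deviate strongly from the recent baseline.

    Returns the indices where a value lies more than z_threshold
    standard deviations from the mean of the preceding `window` readings.
    """
    flagged = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Simulated vibration signal: stable readings, then a spike at index 6
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 5.0, 1.0]
print(flag_anomalies(signal))  # → [6]
```

Real predictive-maintenance systems replace this rolling z-score with trained models over many correlated signals, but the principle (learn normal behaviour, alert on deviation, act before the failure) is the same.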
We are no longer talking only about Big Data but about Edge Computing. Edge Computing significantly reduces the volume of data to be handled by pre-processing it before ingestion, which eases the compute and storage requirements of the equipment and therefore reduces the cost of the hardware and software elements in charge of computing and processing this data.
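A common form of this pre-processing is summarization at the edge device: instead of shipping every raw sample to the cloud, each block of readings is collapsed into a few summary statistics. A minimal sketch of the idea in Python, with invented names and block size:

```python
def edge_summarize(samples, block=10):
    """Pre-process raw sensor samples at the edge: collapse each block
    of `block` readings into a (min, mean, max) tuple before cloud
    ingestion, shrinking the payload by roughly a factor of `block`."""
    summaries = []
    for i in range(0, len(samples) - block + 1, block):
        chunk = samples[i:i + block]
        summaries.append((min(chunk), sum(chunk) / block, max(chunk)))
    return summaries

raw = [float(i % 7) for i in range(100)]   # 100 raw readings
summary = edge_summarize(raw)
print(len(raw), "->", len(summary))        # 100 -> 10
```

In practice the edge node would also filter noise and only forward blocks whose statistics look interesting, which is exactly where the bandwidth, compute and storage savings come from.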
Data in the cloud
And where do we store all this information? Increasingly, the trend is to send everything to the cloud, although industrial companies remain more reluctant, citing network security and the risk of cyberattacks to justify keeping data on-premises. The leading platforms, AWS, Microsoft Azure and Google Cloud, are very reliable as far as security is concerned and provide great flexibility: companies do not need to host the infrastructure themselves, since it is a managed service; they can scale up and down very simply; and, above all, they pay only for what is actually used, avoiding over- or under-dimensioning the data center infrastructure.