Author: Klaus Hübschle
Created on: August 2015
Version: 1.0

Most IT experts predict a promising future for cloud computing in the automation industry as well. But for which applications does the use of this technology really make sense, and how can it be implemented? So far, only very few real cloud solutions have been deployed in the automation industry. Real cloud solutions are based on public Platform-as-a-Service (PaaS), meaning they are characterized by an infrastructure and application logic that is shared among the users.

While small and medium-sized enterprises (SMEs) in particular are still largely sceptical about the cloud approach, there are plenty of potential application scenarios that are ideally suited for such software solutions:

A machine tool manufacturer wants to permanently monitor the performance and condition data of his machines installed all over the globe in order to consolidate and manage this data centrally. By means of systematic data analysis, he intends to guarantee his customers maximum availability of his machine tools. Last but not least, the machine tool manufacturer wants to use the globally collected data to derive new ideas for the development of future products.

A manufacturer of field devices has similar intentions. In order to offer global status monitoring for the supplied devices, he wants the continuously collected device data to be maintained centrally. To offer added value to his customers, he also wants to make the status monitoring results directly available online, independent of platform.

The process engineer of a globally operating manufacturing company wants to systematically acquire data for certain process steps at distributed production sites, to analyse it later for systematic interdependencies using data mining, and to derive potential process improvements from it.
The operator of a small manufacturing company wants to be able to monitor the status of the different production facilities of his plant even while travelling. In extremely critical situations, he wants to receive alarm messages on his smartphone and to remotely check the status of his facility.

The amount of data to be collected may start at just a few megabytes but may well reach the terabyte range. Common relational database systems may therefore be overburdened or require massive hardware resources, making it difficult for SMEs to start using this technology. This is where cloud solutions begin to make sense: compared to traditional in-house solutions, they offer resources that can be scaled exactly to current requirements and billed accordingly.
For efficient monitoring, the data have to be visualized in an appropriate way. The easiest way to do this is to use browser-based solutions developed in HTML5, which can then be made available for mobile devices, too. Real benefit can be achieved in combination with mobile cloud services.

[Figure: Cloud-based service platform — the relevant data reach the cloud via data collection agents, where they are collected and processed.]

The visualization for the monitoring is largely device-independent. This is due to cross-platform services offering standard interfaces to the different device platforms (iOS, Android, etc.), e.g. for push messages, or services solving common problems such as the synchronisation of offline data.

The raw data alone, such as those collected during monitoring, are not of much use. It is therefore necessary to analyse and aggregate the data according to certain criteria, in parallel to the real-time monitoring, in order to calculate meaningful key performance indicators (KPIs). These can then be used to trigger automatic actions or notifications. Appropriate solutions are complex and not easy to design; cloud service providers therefore offer pre-configured services which can be linked to existing data.

The analysis of historical mass data may lead to important conclusions, and new big data methods are gaining more and more importance here. In addition to short-term computation performance, considerable amounts of storage are required for data archiving. Considering the multitude of existing devices (sensors, control systems, etc.) able to transmit data at short intervals, conventional systems quickly reach their limits.

Devices can easily be connected to the provided services since cloud services usually offer a REST interface. REST services are based on the HTTP protocol and can be addressed from almost any environment and programming language, even across firewall boundaries.
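As a sketch of how a device might use such a REST interface, consider the following Node.js snippet. The endpoint URL, payload fields, and bearer-token authentication are assumptions for illustration only; the article does not specify an actual API.

```javascript
// Hypothetical example: a device pushing one measurement to a cloud REST API.
// Endpoint, field names and auth scheme are illustrative assumptions.
const ENDPOINT = "https://example-cloud-platform.invalid/api/v1/measurements";

// Build the JSON payload for a single measurement sample.
function buildMeasurement(deviceId, tag, value) {
  return {
    deviceId: deviceId,
    tag: tag,                          // e.g. "spindle.temperature"
    value: value,
    timestamp: new Date().toISOString()
  };
}

// Transmit the payload over HTTPS (Node.js 18+ provides a global fetch).
async function pushMeasurement(token, measurement) {
  const response = await fetch(ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${token}`  // hypothetical auth scheme
    },
    body: JSON.stringify(measurement)
  });
  return response.status;
}

const sample = buildMeasurement("machine-042", "spindle.temperature", 71.3);
console.log(JSON.stringify(sample));
```

Because the interface is plain HTTPS and JSON, the same call can be issued from a PLC gateway, an embedded controller, or a PC-based agent, which is exactly the firewall-friendliness the text describes.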
In addition, the providers usually offer additional frameworks for embedded devices, allowing for easy interfacing to the cloud (e.g. Microsoft with the .NET Micro Framework).

The collected data are usually very sensitive and therefore need to be effectively protected against unauthorized access. Precisely because data security is so important, it may be better to entrust it to a highly professional operator of large data centres rather than to an internal, potentially under-staffed IT department or, even worse, a so-called shadow IT run by the internal development and production departments. Certificates such as ISO 27001, SAS 70/SSAE 16, FISMA, HIPAA, or similar, which are recognized as test labels for IT security and compliance, can be helpful for the selection and assessment of cloud providers.

Microsoft Azure: an ideal platform

M&M Software is currently developing a new cloud-based service platform for the collection, storage, analysis, and visualisation of data, based on the Microsoft Azure platform. This platform can be imagined as a web-based SCADA system with some MES aspects which is fully hosted in the public cloud as software-as-a-service.

[Figure: The dxpert platform with a core runtime system is the common basis for different OEM solutions and specialized SaaS offerings.]

dxpert combines the functions of a subscription-based real-time process data interface and a query-based historical process database. This is made possible by linking Azure SQL for fast storage with Azure Table Storage for long-term storage.

So-called agents are used for data collection. These are locally installed components which collect local data in the plant and transmit them to the cloud via secure Internet connections. The collected data or process values are stored and archived as so-called time series. On this basis, different application modules are used, from process visualization to live trends and process-dependent alarms and messages. In the future, sophisticated real-time data, business intelligence, big data, or even machine learning analyses may well be possible. The modular design will also allow for additional user-specific application modules and the integration of third-party systems.

For example, the new platform uses the Azure Service Bus integration service, which was designed as a messaging system interconnecting different applications, services, or even devices, independent of where they are located. While this may create a relatively strong dependency on the provider, it allows the potential of cloud computing in terms of scalability and availability to be leveraged in an ideal way. This is usually not the case if a software application designed for intranet usage is simply hosted in a virtual machine in the cloud.

The platform core system is essentially a process data server with an integrated process database optimized for the storage of time series data. Redundant components at all system levels ensure that the system remains available and operational even in case of outages, maintenance, or platform updates.

Universal core system

The locally installed data collection agents can be implemented in multiple ways.
This can be an extension of a PC-based machine control system which transmits service-related data depending on the current status. Alternatively, a generic, configurable OPC agent implemented as a Windows service can be used. The OPC agent reads from existing OPC UA and OPC DA servers and transmits the data to the cloud platform. Another possibility is a data collection agent integrated as a component into the firmware of a fieldbus gateway, or implemented as a function block in a PLC.

In standard cases, the data collection agents all use the same HTTPS-based programming interface (API), which is also used by the library made available by M&M for the implementation of agents. The data collection agents can be configured either locally or via application-specific websites in the web portal. The data transmission interval is individually selectable, down to once per second. At higher sample rates, the data collection agent has to act as a buffer and transmit several groups of measurement data in one package.

The number of agents is expected to grow fast. Processing these huge amounts of data, arriving almost simultaneously and in near real time, is therefore a big challenge, particularly in combination with the high availability to be guaranteed. Azure Service Bus Event Hub, a component developed by Microsoft for IoT applications, can provide a remedy: it can process millions of events per second. The event hub can also be used as a powerful buffer for unexpected data peaks, making sure that no data are lost even in case of overload. The cloud-based platform becomes even more powerful through integration with the analysis services of the Azure platform and links to established business intelligence tools.

Hybrid data storage

The core system stores the received data in a historical process database. Thanks to the combination of a fast Azure SQL database with efficient Azure Table Storage, requirements for constant write and read speeds can be met even when data volumes grow continuously into the terabyte range. While purely SQL-based databases are subject to limitations when dealing with large data volumes, they offer clear performance benefits when handling queries.
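The agent-side buffering described above, where samples are collected faster than they are transmitted and then sent as one package, can be sketched as follows. This is a minimal illustration, not M&M's actual agent implementation; the class name and batch format are assumptions.

```javascript
// Sketch of a data collection agent that samples faster than it transmits:
// readings are buffered locally and flushed to the cloud as one batch.
class BufferingAgent {
  constructor(flushIntervalMs, transmit) {
    this.buffer = [];
    this.transmit = transmit;           // callback receiving an array of samples
    this.flushIntervalMs = flushIntervalMs;
  }

  // Called at the (possibly high) local sample rate.
  record(tag, value, timestampMs) {
    this.buffer.push({ tag, value, timestampMs });
  }

  // Called once per transmission interval (e.g. every second):
  // sends everything collected so far as a single package.
  flush() {
    if (this.buffer.length === 0) return 0;
    const batch = this.buffer;
    this.buffer = [];
    this.transmit(batch);
    return batch.length;
  }
}

// Example: collect three samples locally, then transmit them as one batch.
const sent = [];
const agent = new BufferingAgent(1000, batch => sent.push(batch));
agent.record("pressure", 2.1, 1000);
agent.record("pressure", 2.2, 1100);
agent.record("pressure", 2.3, 1200);
agent.flush();
console.log(sent[0].length); // 3 samples in one package
```

Batching like this reduces the number of HTTPS round trips while preserving every individual timestamped sample for the time-series store.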
These query-performance benefits have been the main reason for the two-stage storage design, because the most recent data are normally used very frequently, e.g. to calculate KPIs or aggregates. Data are therefore first stored in the SQL database before being transferred to the table storage for long-term archiving.

Microsoft has recently introduced Azure Key Vault, a service for the management and administration of cryptographic keys. Key Vault encrypts the stored keys using keys that are kept in strictly controlled and monitored hardware security modules. This ensures that a key is not visible to anyone and cannot be extracted, not even by Microsoft.

Aggregations and scripts

Additional calculations can be triggered either time-based or event-driven, e.g. by incoming data. These can be simple standard aggregations, such as the average or maximum value over a certain time period, or highly individual calculation algorithms. To achieve maximum flexibility, all such functions are implemented in JavaScript. This not only allows for individual calculations, but also for customization and upgrades during live operation. Technically, this has been realized with Node.js, a framework for developing server applications in JavaScript, which makes it possible to use the originally client-side browser language on the server side as well. The scripts are executed in a security sandbox in the cloud in order to isolate them from other scripts, users, projects, and their data. Via the core system APIs, the scripts can also access all project data in the historical database, where the results of the calculations are stored as well.

Dashboards and reports

All web interfaces are implemented in HTML5 and follow current web standards. A subscription model similar to OPC UA is used to access the real-time data held in the core system. The historical data can also be requested via a secure web API and processed further. M&M proprietary tools or, if required, external third-party tools are used for the development of dashboards and reports.
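The JavaScript aggregation scripts described earlier might look like the following minimal sketch. Only the calculation itself is shown; the surrounding sandbox API, the data format, and the function name are assumptions, not the platform's actual interface.

```javascript
// Hypothetical aggregation script as it might run in a Node.js sandbox:
// compute count, average and maximum of a time series over a time window.
// Each point is assumed to be { t: timestampMs, v: value }.
function aggregateWindow(points, fromMs, toMs) {
  const inWindow = points.filter(p => p.t >= fromMs && p.t < toMs);
  if (inWindow.length === 0) return { count: 0, avg: null, max: null };
  const sum = inWindow.reduce((acc, p) => acc + p.v, 0);
  const max = inWindow.reduce((acc, p) => Math.max(acc, p.v), -Infinity);
  return { count: inWindow.length, avg: sum / inWindow.length, max: max };
}

// Example: four samples, aggregate only the first three (t in [0, 3000)).
const series = [
  { t: 0,    v: 10 },
  { t: 1000, v: 14 },
  { t: 2000, v: 12 },
  { t: 3000, v: 99 }  // outside the window, ignored
];
console.log(aggregateWindow(series, 0, 3000)); // { count: 3, avg: 12, max: 14 }
```

Because such a function is plain JavaScript, it can be uploaded or replaced during live operation and executed in the sandbox against data fetched from the historical database.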
These third-party tools are mainly used for complex analyses and evaluations such as data mining, big data, or machine learning. Based on the new service platform, M&M Software is currently implementing initial pilot projects with different partners and pilot users.

Klaus Hübschle, Technical Director, M&M Software
M&M Company Profile

M&M Software is a globally acting technology and consulting company. Our service portfolio includes management and technology consulting, software development and maintenance, as well as quality assurance and IT services. From our locations in Germany and China, we provide unique, premium software solutions to our customers. M&M stands for innovation, problem-solving competence, and quality.

For almost 30 years, M&M has earned a reputation as a reliable partner for countless established companies from all parts of the globe. Our extensive industry know-how is reflected in a multitude of innovative and unique software solutions which we develop for, and also in collaboration with, our customers from various industries.

The company headquarters is located in St. Georgen (Black Forest, Germany). With our subsidiary in Suzhou near Shanghai (China), we not only serve the up-and-coming Asian markets, but also offer offshore development services with proven M&M quality, at significant price benefits, to our customers in Europe and America.

Your contact person for this press release:
Kenan Sengün
Tel.: +49 7724 9415-42
Fax: +49 7724 9415-23
press@mm-software.com

M&M Software GmbH
Industriestr. 5
78112 St. Georgen
GERMANY
Tel.: +49 (0)7724 9415-0
Fax: +49 (0)7724 9415-23
Email: info@mm-software.com
Internet: www.mm-software.com

Registered Office: St. Georgen, Germany
Registry Court: Freiburg HRB 602021
Directors: Erwin Mueller, Klaus Huebschle, Andreas Boerngen