The advent of Big Data and Industry 4.0 has led companies to drive enterprise-wide digitalisation strategies. Engineers are now being tasked with implementing machine learning and deep learning in their projects to make their systems or applications ‘smart’. These challenges are mostly embraced, and the allure of working with or developing Artificial Intelligence applications captures the imagination of those with a passion for technology. Exploring these new possibilities and dipping into the details can be engrossing (and satisfying), but it can also lead to longer-term project pitfalls.
I would like to step away from the very appealing topic of machine learning model development and instead elaborate on what is required to operationalise an Industry 4.0 ‘type’ application or system. Once intelligent algorithms have been created, they need to be integrated into an IT environment such as a server or the cloud; they can also be embedded directly into equipment. Traditionally, both processes require re-coding the algorithms in a low-level language before they can be integrated. This approach is extremely time-consuming and error-prone, and the deployment and integration of algorithms can significantly delay project implementation. Methods of taking a system to operation should therefore be considered when deciding on the model development tools.
There is a misconception that MATLAB is a research and development tool only. In fact, MATLAB enables you to integrate algorithms directly into production on servers and on embedded real-time hardware without recoding anything, and there are several ways this can be done.
Let’s look at how you can go from developing a ‘smart’ Industry 4.0 ‘type’ application in MATLAB to sharing your real-time analytics on a dashboard:
Why a dashboard? It is probably the most common and most widely recognised form of deployed analytics.
In the image below, we can see the process of developing a model or algorithm and then taking it into production. The MATLAB desktop instance can access data in remote databases or warehouses and can leverage the additional processing power of the MATLAB Distributed Computing Server (MDCS). Once a predictive model is developed, it is packaged using MATLAB Compiler SDK. This package is then placed on the MATLAB Production Server, where it extracts data from databases, runs the models and pushes computations to the MDCS in real time. The beauty of operationalising an algorithm in this way lies not only in its simplicity but also in its scalability, because development takes place in an environment that mimics the production environment.
The key difference between the development and production environments is the way in which you interact with the model. On your desktop you run scripts and commands, whereas in production the model can run continually or periodically in the background and push its results to a dashboard. The dashboard view (e.g. an HTML page hosted on a web server) interacts with the model via a wrapper. The dashboard can be developed in any web development language, so your interface is limited only by your developers’ creativity and is completely independent of MATLAB.
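To make this interaction concrete, here is a minimal client-side sketch of how a dashboard back end might call a function deployed on MATLAB Production Server, which exposes deployed functions over a RESTful JSON API. The archive name `predictiveModel`, the function name `predictRUL`, the input values and the host/port are all hypothetical placeholders, not part of the original article:

```python
# Hedged sketch: calling a MATLAB function deployed on MATLAB Production
# Server via its RESTful API. A synchronous request is a POST to
# /<archiveName>/<functionName> with a JSON body {"nargout": n, "rhs": [...]};
# the server replies with the results in the "lhs" field.
# "predictiveModel", "predictRUL" and the URL are placeholder examples.
import json
import urllib.request

MPS_URL = "http://localhost:9910"  # 9910 is the default MPS port; adjust as needed


def build_request(archive, function, args, nargout=1):
    """Build the endpoint URL and JSON body for a synchronous MPS request."""
    url = f"{MPS_URL}/{archive}/{function}"
    body = json.dumps({"nargout": nargout, "rhs": list(args)}).encode("utf-8")
    return url, body


def call_model(archive, function, *args):
    """POST the request to the server and return the 'lhs' result list."""
    url, body = build_request(archive, function, args)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["lhs"]


# Example (requires a running server with the archive deployed):
# prediction = call_model("predictiveModel", "predictRUL", [72.1, 68.4, 75.0])
```

Because the wrapper is just HTTP and JSON, the same call could equally be made from JavaScript in the browser or from any other web development language, which is what keeps the dashboard layer independent of MATLAB.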
This was just a quick look at one way an algorithm can be operationalised. Below we can see some of the most recognised ‘Big Data’ infrastructure and how MATLAB can integrate with it. Despite the large variety of platforms MATLAB can integrate with, the process of taking an algorithm to a production environment remains the same. The simplicity of this process not only speeds up the implementation of an algorithm (the success of your Industry 4.0 or digitalisation initiatives hinges on this), but also substantially reduces the time and cost of maintaining it.