
AI in Industry 4.0: Neural Networks for Time Series Modelling in MATLAB

Froth flotation is an essential step in the mining process: it removes impurities from minerals, such as silica from iron ore, and ultimately determines the quality of the final product. In this article we focus on how neural networks can be applied to the mineral extraction process to help ensure a higher-quality product.

The example deals with predicting the silica concentrate in iron ore at the end of the froth flotation process. The iron ore concentrate dataset, which can be downloaded from Kaggle, consists of measurements of the iron ore concentration before it is fed into the flotation pump, as well as the air flow and level of the flotation columns. The data spans approximately 6 months, with most process measurements recorded at 20-second intervals, while the silica content was recorded every hour. Typically, lab measurements of the silica concentrate take time to reach the process engineers. By using a neural network to forecast the silica concentrate, engineers can take corrective action as the process changes in real time.
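Loading and aligning the two sampling rates can be sketched as follows. This is a minimal sketch, not the article's exact script: the CSV filename is the one used on Kaggle, and the parsing details (in particular, the dataset's comma decimal separators) may need adjusting for your MATLAB version.

```matlab
% Read the Kaggle iron ore flotation dataset (filename as published on Kaggle)
T = readtable("MiningProcess_Flotation_Plant_Database.csv");

% Convert to a timetable keyed on the timestamp column
T.date = datetime(T.date);
TT = table2timetable(T, "RowTimes", "date");

% The process variables are sampled every 20 seconds, but the silica
% concentrate is only recorded hourly; aggregate everything to an
% hourly rate so the predictors and target share one time base.
TT = retime(TT, "hourly", "mean");
```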

The model used in this example is a Long Short-Term Memory (LSTM) network. This is a type of recurrent neural network (RNN) that learns long-term dependencies between steps of sequential data, making it well suited to time series problems.

The data used in the example has already been cleaned and prepared. Visualising the dataset helps determine what can be learnt from it; the silica concentrate typically lies between 0.5% and 5.5%.
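A quick visual check of the target variable might look like the following. Here `TT` is assumed to be a timetable holding the prepared data, and `SilicaConcentrate` is an illustrative variable name — the actual name depends on how the dataset's column headers were sanitised on import.

```matlab
% Plot the target variable to inspect its range and behaviour over time
figure
plot(TT.Properties.RowTimes, TT.SilicaConcentrate)
xlabel("Time")
ylabel("% Silica Concentrate")
title("Silica concentrate over the measurement period")
```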

The data is split into training and test sets (85% and 15% respectively), with the latter used to evaluate the model. The LSTM architecture is defined with 50 hidden units, and the training options are set for the model. Once the model has been trained on the training data, predictions are made on the test set. The figure below shows the training-progress plot, which displays the Root Mean Square Error (RMSE) and the loss during training.
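The split, architecture, and training steps described above can be sketched as follows. This is an illustrative outline rather than the downloadable script: the variable names, epoch count, and learning rate are assumptions, and `X` is assumed to hold the (standardised) predictors as a numFeatures-by-numTimeSteps matrix with `Y` as the matching 1-by-numTimeSteps silica response.

```matlab
% 85% / 15% chronological split into training and test data
numTimeSteps = size(X, 2);
idx    = floor(0.85 * numTimeSteps);
XTrain = X(:, 1:idx);       YTrain = Y(1:idx);
XTest  = X(:, idx+1:end);   YTest  = Y(idx+1:end);

% Sequence-to-sequence regression LSTM with 50 hidden units
numFeatures = size(X, 1);
layers = [
    sequenceInputLayer(numFeatures)
    lstmLayer(50)
    fullyConnectedLayer(1)
    regressionLayer];

% Training options; 'training-progress' shows the RMSE and loss plots
options = trainingOptions("adam", ...
    "MaxEpochs", 100, ...
    "InitialLearnRate", 0.005, ...
    "Plots", "training-progress", ...
    "Verbose", false);

% Train the network and predict on the held-out test sequence
net   = trainNetwork(XTrain, YTrain, layers, options);
YPred = predict(net, XTest);
```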


To evaluate the model's performance on the test dataset, the RMSE is used as the evaluation metric. The RMSE calculated for the test data is 0.7603. The figure below shows the distribution of the prediction errors.
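The evaluation step can be sketched as below, assuming `YPred` and `YTest` are the predicted and actual silica values from the trained network (names are illustrative).

```matlab
% Root Mean Square Error on the test set
rmse = sqrt(mean((YPred - YTest).^2));

% Distribution of the prediction errors
figure
histogram(YPred - YTest)
xlabel("Prediction error")
ylabel("Frequency")
```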

This example illustrates a basic approach to applying LSTM neural networks to time series forecasting. Additional feature engineering, combined with tuning the LSTM hyperparameters for the application domain, can further improve forecast accuracy.

You can download the MATLAB script for this example here.

What Can I Do Next?