CIMPLO

Cross-Industry Predictive Maintenance Optimization Platform


More than meets the eye…

March 5, 2020

When we are asked to recall a memory, most of us will not go through the events describing it step by step. If it is a good memory, we will usually associate it with feelings of happiness and, for example, mental images of smiles, laughter, or perhaps bright colors. If it is a bad memory, we associate it with feelings of sadness and a tendency to flee, coupled perhaps with mental pictures of stress, tears, and darker shades.


It seems that our brains tend to compress experiences so that only the essentials are kept: essentials with respect to us. These “encoded” memories provide a mental hook that lets us “reconstruct” the experience we had step by step.


In the fields of Mathematics and Computer Science, neural networks (NNs) provide an in-silico brain that has been around in theory since the 1940s and in practice (in simple forms) since the 1950s, starting with Frank Rosenblatt’s perceptron. NNs have seen an explosion in research and applications, especially since the early 2000s, with variants suited to different tasks.


One of these variants, the auto-encoder (AE), works similarly. As its name suggests, it encodes its own input, which in NN terms translates to an unsupervised (or self-supervised) learning method. In this method, the inputs are the same as the outputs (there are no labels), so the network is conditioned to learn, in its hidden layers, deeper and more important representations of the inputs, in such a way that these representations are sufficient to reconstruct the input [1]. Thus, these representations (or embeddings) retain characteristic information about the initial (raw) data that is not necessarily obvious but is nevertheless important. Depending on the dimensionality of the hidden layers, this latent representation can be used for dimensionality reduction, feature learning, and generative data models, to name a few. See Figure 1 for a simple visual interpretation.

Figure 1. Visual representation of auto-encoder. From Michela Massi (own work), CC BY-SA 4.0 (source: https://en.wikipedia.org/wiki/Autoencoder)
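The idea can be sketched in a few lines of NumPy. The following is an illustrative toy, not the CIMPLO implementation: a single sigmoid hidden layer squeezes 4-dimensional inputs through a 2-dimensional bottleneck, and plain gradient descent minimizes the reconstruction error, with the input serving as its own target.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 200 samples lying near a 2-D subspace of R^4, so a
# 2-D bottleneck can capture most of the structure.
latent_true = rng.normal(size=(200, 2))
mix = rng.normal(size=(2, 4))
X = latent_true @ mix + 0.05 * rng.normal(size=(200, 4))

# Parameters: encoder (4 -> 2) and decoder (2 -> 4).
W_enc = rng.normal(scale=0.5, size=(4, 2))
b_enc = np.zeros(2)
W_dec = rng.normal(scale=0.5, size=(2, 4))
b_dec = np.zeros(4)

lr = 0.05
for epoch in range(500):
    H = sigmoid(X @ W_enc + b_enc)        # embeddings (the "code")
    X_hat = H @ W_dec + b_dec             # reconstruction
    err = X_hat - X                       # gradient of MSE w.r.t. X_hat (up to a constant)
    # Backpropagation through the two layers.
    grad_W_dec = H.T @ err / len(X)
    grad_b_dec = err.mean(axis=0)
    dH = (err @ W_dec.T) * H * (1 - H)    # through the sigmoid
    grad_W_enc = X.T @ dH / len(X)
    grad_b_enc = dH.mean(axis=0)
    W_dec -= lr * grad_W_dec; b_dec -= lr * grad_b_dec
    W_enc -= lr * grad_W_enc; b_enc -= lr * grad_b_enc

mse_final = np.mean((X - (sigmoid(X @ W_enc + b_enc) @ W_dec + b_dec)) ** 2)
mse_naive = np.mean((X - X.mean(axis=0)) ** 2)  # predict-the-mean baseline
print(mse_final, mse_naive)
```

After training, the reconstruction error falls below the naive predict-the-mean baseline, and the 2-dimensional `H` is exactly the latent representation discussed above.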

In the CIMPLO project, we are currently investigating methods that use sequential data, such as time series of sensor recordings, in order to estimate what we call the remaining useful lifetime (RUL) of machinery. The latter is defined as the useful lifetime left, given the current operating state of machinery such as cars or airplane engines. The way that an AE helps us in this task is two-fold:

First, as a dimensionality reduction technique that transforms the data into a lower-dimensional space. This helps in visualizing the data, and it mitigates the computational burden of any subsequent machine learning (ML) or AI technique. The interesting aspect is that the dimensionality can be reduced in a non-linear manner, which allows for uncovering and constructing a lower-dimensional space that takes into account the non-linear relationships of the initial (input) space.

Secondly, depending on the type of layers and cells used (RNN, LSTM, GRU), it helps in summarizing the time steps up to and including a particular time. This latter embedding can be used to see how sensor readings change between time windows. This is the part that can be associated with the RUL: embeddings of healthy machines behave differently from those of unhealthy machines, and this can provide insight into how much useful lifetime is left.
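The second use can be illustrated with a small hypothetical sketch. The window embeddings below are synthetic stand-ins for what a sequence AE would produce (early windows cluster near a “healthy” point, later windows drift away), and the health indicator is simply the distance of each window’s embedding from an early healthy baseline; a rising indicator flags degradation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for AE window embeddings: 50 time windows,
# 8-dimensional embeddings, drifting away from a healthy center.
n_windows, dim = 50, 8
healthy_center = rng.normal(size=dim)
drift = np.linspace(0.0, 3.0, n_windows)[:, None] * rng.normal(size=(1, dim))
embeddings = healthy_center + drift + 0.1 * rng.normal(size=(n_windows, dim))

# Baseline: mean embedding over the first few (assumed healthy) windows.
baseline = embeddings[:5].mean(axis=0)

# Health indicator: Euclidean distance of each window's embedding
# from the healthy baseline.
indicator = np.linalg.norm(embeddings - baseline, axis=1)

print(indicator[0], indicator[-1])
```

In a real setting, the trend of such an indicator over time, rather than the synthetic drift injected here, is what would be related to the RUL.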


The importance of estimating the RUL lies in the fact that knowledge of a pending equipment failure allows sufficient time for the necessary maintenance decisions to be made and organized. In this way the machinery is maintained when required (predictive maintenance) and not prematurely, as in preventive maintenance, or after a failure has occurred (corrective maintenance). Estimating the RUL lies at the heart of prognostics and health management, a more general methodology that aims at minimizing maintenance costs and predicting when a failure could occur through the assessment, prognosis, diagnosis, and health management of engineered systems [2].


References:

[1] Goodfellow, I., Bengio, Y., Courville, A.: Deep Learning. MIT Press, 2016. https://www.deeplearningbook.org/

[2] Nguyen, V.D., Kefalas, M., Yang, K., Apostolidis, A., Olhofer, M., Limmer, S.: A Review: Prognostics and Health Management in Automotive and Aerospace. International Journal of Prognostics and Health Management, Vol. 10(2), Article 023, 35 pages, 2019.

