Andrea Ronchi’s speech at the Smart Manufacturing Summit 2022.
On May 5th, Andrea Ronchi, Principal of 3rdPlace, AI Models Company of Datrix, participated in a roundtable at the Smart Manufacturing Summit 2022 in Milan.
The third edition of this event, promoted by The Innovation Group, aimed to enhance dialogue on the digital factory, involving stakeholders from the main production and technology companies.
After two years of pandemic, the most enlightened entrepreneurs and managers are driving a real digital-transformation breakthrough in their companies. Italian manufacturing was already among the most competitive in the world, and today it can restart, even more efficiently, with new challenges and opportunities for growth.
In the name of technological evolution from a smart perspective, through Industrial IoT, cloud infrastructures, cybersecurity and data-driven processes, this is a reset of the industry in every respect.
How we can help companies to be really data-driven
“Data-driven” is one of those semantic miracles introduced by the majors in these revolutionary years of technological progress. All companies believe they have always used data to support their decisions. Our task is to help companies to consider even richer data sets and to relate more variables that contribute to describing a physical or immaterial phenomenon.
I believe this is an endless search – to describe and understand the world – which has engaged man since he first started counting on his fingers.
The computing power and connectivity of our times have made a kind of mathematics known for almost a century much more useful, and we have tagged it as Artificial Intelligence.
I am the Team Principal of 3rdPlace, Artificial Intelligence Models Company of Datrix, listed on Euronext Growth Milan. I work with over 130 colleagues who are genuine data science geeks.
What can we do with collected data?
The answer is “Nothing”. Deliberately provocative, though.
We can do nothing unless we start from the other end of the line, from business needs.
First of all, we should ask ourselves: ‘What phenomenon do I want to describe or understand better?’ and ‘What is the objective function that I want to pursue?’.
The answers can be many: an increase in efficiency or in OEE (Overall Equipment Effectiveness), a reduction in climate-altering emissions, an increase in sales, the correct sizing of the warehouse, etc.
In this sense, our role is to help companies bring out the right questions and to support the different departments involved in the process, so that information flows correctly from domain experts (our customers) to data management experts.
The next step is to achieve mature business data management and start the right investigation, to reach a higher level of understanding.
A data-driven strategy based on Artificial Intelligence basically consists of three phases:
First of all, data will be used to describe the process by analyzing the level of correlation between the different variables: a Descriptive Analysis of the phenomenon, in which we try to model it numerically.
In our opinion, this is the best way to understand all the relationships between the variables – even the most subtle and marginal ones – which are often neglected due to a lack of time and the limited ability of human beings to consider many variables simultaneously.
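As a toy illustration of this Descriptive Analysis step, the sketch below computes a Pearson correlation matrix over a few hypothetical process variables (the sensor names and readings are invented for the example, not taken from a real project):

```python
# Descriptive Analysis sketch: pairwise Pearson correlations across a
# few hypothetical process variables, in pure Python.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical readings: spindle temperature, vibration, scrap rate.
data = {
    "temperature": [61, 63, 66, 70, 75, 81],
    "vibration":   [0.9, 1.0, 1.1, 1.4, 1.7, 2.1],
    "scrap_rate":  [0.02, 0.02, 0.03, 0.05, 0.07, 0.10],
}

names = list(data)
matrix = {a: {b: round(pearson(data[a], data[b]), 2) for b in names}
          for a in names}
for a in names:
    print(a, matrix[a])
```

Even on such a small table, the matrix surfaces relationships (here, temperature tracking vibration and scrap rate) that would be easy to overlook when many more variables are in play.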
At this stage, we realize that some data we need are not really available, perhaps because there are no sensors to detect them or because they depend on external elements.
Here a specific human factor makes the difference: the ability to listen to domain experts and the experience of data scientists in searching for sources outside the company.
The success of the Datrix Group, and in particular of 3rdPlace, lies in this specific expertise, regardless of the application area.
We have shown that we are able to listen very carefully to the digital traces left by people online, and we have become familiar with many third-party data sources (always compliant with all data protection and privacy standards). This allows us to learn a lot about our customers, obtaining information that would be impossible to get directly from the companies themselves.
If you can correctly describe the behavior of the variables, you can move on to the Prediction phase: you will be able to understand what will happen to the entire system when one or more of these variables change.
Ultimately, once accurate forecasts have been obtained, we can even create Prescriptions, to help companies make decisions or even automate some actions based on the outputs of our models.
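The three phases can be sketched on a toy example. Assuming (purely for illustration) that the descriptive phase found a roughly linear link between a controllable variable (line speed) and an outcome (defect rate), a least-squares fit gives the Prediction, and inverting the fitted model gives the Prescription:

```python
# Toy Prediction/Prescription sketch on an assumed linear relationship.
def fit_line(xs, ys):
    """Least-squares slope and intercept for y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

speeds = [40, 50, 60, 70, 80]        # units/hour (hypothetical)
defects = [1.0, 1.5, 2.0, 2.5, 3.0]  # % defective (hypothetical)

a, b = fit_line(speeds, defects)

def predict(speed):
    """Prediction: expected defect rate at a given line speed."""
    return a * speed + b

def prescribe(target_defect_rate):
    """Prescription: the line speed that should hit a target defect rate."""
    return (target_defect_rate - b) / a

print(predict(65), prescribe(2.0))
```

Real engagements involve many variables and nonlinear models, but the shape is the same: describe, then predict, then invert the model into an actionable recommendation.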
Predictive models: a few highlights on success stories and best practices
Predictive control: Many industrial processes are managed in closed-loop feedback; for example, the set points of the DCS or the PLCs are adjusted once a workflow produces a measurement.
However, two kinds of problems arise:
- the variation typically has a latency before producing a result
- the change made will affect the next workflow, which is slightly different from the previous one. In this way, you keep chasing an excellence you can never reach.
The example of the old shower
Think of a shower without a mixer: if the temperature is too low, you increase the flow of hot water with the knob, risking scalding yourself a few seconds later, and then proceed by further attempts, wasting time, water and comfort before finding an acceptable mix.
Imagine a process where you have a few hundred “knobs”, and where some of the circumstances do not entirely depend on you, as in the shower.
Within the metaphor, you can correctly model the response times of the boiler and provide an optimal mixing recipe for the two variables (hot and cold water); but if you analyze the process more closely and talk to people, you will discover a variable you cannot control directly: someone else in the same building taking a shower and changing the boiler’s response. Here, exogenous data, appropriately historicized, can help you:
- Predicting the flows of people living in the building
- Recommending the best time to shower in order to waste less water and energy
- Suggesting the optimal set-point of your knobs to have the optimal temperature as soon as possible and take a relaxed shower
Now, let’s get out of the shower at home and apply the same approach to the air conditioning system of a supermarket, to the setup and assortment of a store, or to any scenario more familiar to you. The process is the same.
What to do, when and in what sequence
Here we enter the world of operations research: beyond the scheduling software for the machines’ efficiency, we want to suggest how to keep the factory or, more generally, a production process efficient.
If you have N production lines and Y product codes that can be made on all lines, what is produced, where and when?
Of course, it depends on the orders loaded in the management system, but the algorithms continually recommend the optimal allocation, considering both the usual machine constraints and the new constraints that arise from production choices as they are implemented, plus the new orders loaded in the ERP systems. Predicting demand and actual customer withdrawals reduces inefficiencies.
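A minimal sketch of the allocation idea, with invented line names, product codes and durations (a greedy heuristic for illustration, not a production scheduler): each order goes to the compatible line that frees up first.

```python
# Greedy line-allocation sketch: N lines, orders for Y product codes.
def allocate(orders, lines):
    """orders: list of (product_code, hours).
    lines: {line_name: set of product codes it can run}.
    Returns ({line: [codes...]}, makespan in hours)."""
    busy_until = {line: 0.0 for line in lines}
    plan = {line: [] for line in lines}
    for code, hours in orders:
        compatible = [l for l, codes in lines.items() if code in codes]
        best = min(compatible, key=busy_until.get)  # earliest free line
        plan[best].append(code)
        busy_until[best] += hours
    return plan, max(busy_until.values())

# Hypothetical plant: three lines, each able to run two product codes.
lines = {"L1": {"A", "B"}, "L2": {"B", "C"}, "L3": {"A", "C"}}
orders = [("A", 4), ("B", 3), ("C", 5), ("A", 2), ("B", 1), ("C", 2)]

plan, makespan = allocate(orders, lines)
print(plan, makespan)
```

A real system would re-run an optimizer of this kind every time a new order lands in the ERP or a constraint changes, rather than assigning once.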
We leave the factory and enter the world of Sales: how do you organize the visit agenda of your sales force? You can use routing logic to optimize the number of visits, taking into account factors such as the probability of road congestion, but you can also score customers with a numerical / quantitative approach in order to obtain a “temperature level” of your leads and act accordingly.
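The “temperature level” idea can be sketched as a weighted score over a few engagement signals. The signal names, weights and caps below are invented for the example; a real model would learn them from historical conversion data.

```python
# Lead "temperature" sketch: weighted, capped signals mapped to 0-100.
WEIGHTS = {"recent_orders": 0.5, "site_visits": 0.3, "quote_requests": 0.2}

def temperature(signals, max_values):
    """signals, max_values: dicts keyed like WEIGHTS. Returns 0-100."""
    score = sum(
        WEIGHTS[k] * min(signals.get(k, 0) / max_values[k], 1.0)
        for k in WEIGHTS
    )
    return round(100 * score)

max_values = {"recent_orders": 10, "site_visits": 50, "quote_requests": 5}
warm_lead = {"recent_orders": 6, "site_visits": 40, "quote_requests": 2}
print(temperature(warm_lead, max_values))
```

Sorting the visit agenda by this score, combined with the routing optimization, lets the sales force spend its time on the hottest leads first.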
From Anomaly Detection to the Prioritization of Interventions
The expression Predictive Maintenance is a buzzword, but it covers too many facets. I prefer to tackle the issue by breaking it down into narrower areas, starting with:
- Anomaly Detection
- Anomaly Prediction
- Priorities of Interventions
Anomaly Detection: the identification of anomalies that traditional systems could not detect soon enough. Remaining in the field of Prediction, a very effective method is the comparison between Nowcasting and punctual measurements.
Nowcasting is the ability to predict what should be happening right now, as the result of a correct description of the phenomenon. If the actual measurement differs too much from the nowcasting forecast, it is an alarm signal to be monitored carefully, and the approach is valid in the most disparate contexts: from wind turbine generators to gas meters, up to data traffic.
It is crucial to infer missing data from exogenous sources: precise meteorological data, state of deterioration of buildings, road traffic flows, etc.
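The nowcast-versus-measurement comparison can be sketched very simply. Here the “nowcast” is just a rolling mean of recent readings (a deliberately naive stand-in for a real forecasting model), and a reading is flagged when it deviates from the nowcast by more than a fixed tolerance; the turbine numbers are invented:

```python
# Nowcasting-based anomaly detection sketch: flag readings that deviate
# too far from a rolling-mean "nowcast" of recent trusted values.
from collections import deque

def detect_anomalies(readings, window=3, tolerance=2.0):
    """Return indices where |reading - nowcast| exceeds the tolerance."""
    recent = deque(maxlen=window)
    alarms = []
    for i, value in enumerate(readings):
        if len(recent) == window and abs(value - sum(recent) / window) > tolerance:
            alarms.append(i)
            continue  # keep anomalous readings out of the nowcast window
        recent.append(value)
    return alarms

# Hypothetical turbine output (kW): a sudden drop at index 5.
readings = [100.0, 101.0, 99.5, 100.5, 100.0, 92.0, 100.0, 100.5]
print(detect_anomalies(readings))  # prints [5]
```

In practice the nowcast would come from a model fed with those exogenous sources (weather, traffic, building state), so that a deviation signals a genuine anomaly rather than an unmodeled external effect.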
In many phenomena it is possible to carry out Anomaly Detection with the acquisition of visual data, entering the algorithmic branch of Computer Vision.
Among the different fields of application, we have already carried out frontier projects in biomedicine together with prestigious research centers on the analysis of spectroscopic images, using advanced Machine Learning techniques to identify and classify cancer cells, and we have applied our technologies to “document analysis” for the identification of falsified documents.
Anomaly Prediction: If the anomaly signals between nowcasting and detection follow specific patterns, then it is likely that certain failures can be predicted, allowing companies to issue ordinary or extraordinary maintenance recommendations in advance.
Priorities of interventions: One of the reasons why I prefer not to talk about Predictive Maintenance is that the expression carries a bias: if I carry out many interventions – even if not strictly necessary – the chances of breakage or anomaly clearly decrease.
But in the medium-to-long term, the real advantage is to obtain a correct prioritization of interventions, considering the budget constraints.