|Derek Ong, Electronic Industrial Software Solutions manager, Keysight Technologies |
Squeezing every last drop of productivity from installed manufacturing equipment on the factory floor was the goal, so much of the focus fell on downtime and throughput. Predictive maintenance and asset utilisation are important business outcomes of any successful Industry 4.0 implementation.
Then COVID-19 happened. Apart from the race to 5nm chips, 5G, and cloud computing, some sectors of the electronics manufacturing industry have seen a drastic drop in volume, leading to a surplus of production assets on the floor. For some, machines have sat idle. For others, COVID-19 has caused massive supply chain disruptions.
The necessary steps taken by governments around the globe to manage and halt the spread of the pandemic have restricted the movement of factory employees and consequently lowered productivity and output. The trade situation between the US and China has forced manufacturers to shuffle operations for business continuity. COVID-19 has produced lasting shifts in manufacturing paradigms, and the new “norm” demands a rethink of how Industry 4.0 technology enablers will be used to address the new challenges.
Quality over quantity
Before COVID-19, Industry 4.0 adoption mostly revolved around asset utilisation. In the current situation, it may be better to ensure that every single manufactured product is of the highest quality the process allows. Due to shortages of materials and parts, rising logistics costs, and restrictions on factory workforces, manufacturers will have to minimise Return Merchandise Authorizations (RMAs) even more than before. Better quality may also prove a compelling value differentiator to win more business.
Quality has always been one of the most important manufacturing performance metrics. But rather than following the usual narrative of adopting Industry 4.0 technologies such as big data analytics, AI, and the Industrial Internet of Things (IIoT) to maximise asset utilisation, manufacturers will need to pivot to focus more on improving the quality of the products being manufactured. Keeping machines up and running with minimal downtime yields less return on investment (ROI) if product recalls are happening or if assets are loaded only half the time most days.
|Manufacturing analytics is quickly rising to prominence |
Qualitative and quantitative data on products – usually from test and measurement equipment on the floor – are an important source of insights for any big data analytics implementation. These data allow engineers to maintain the process parameters that yield the highest quality, and they provide a real-time barometer of the gross reproducibility and repeatability of equipment and processes, which is essential for a predictable quality standard in the products.
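As an illustrative sketch (not any specific Keysight implementation), the kind of process-capability check such measurement data enables can be computed directly from a test station's readings. The specification limits and voltage readings below are hypothetical:

```python
import statistics

def process_capability(measurements, lsl, usl):
    """Estimate Cp and Cpk for a set of test measurements.

    lsl/usl are the lower/upper specification limits for the test step.
    Cp measures the spread of the process against the spec window;
    Cpk additionally penalises a process that drifts off-centre.
    """
    mean = statistics.fmean(measurements)
    sigma = statistics.stdev(measurements)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical voltage measurements from one production test step
readings = [3.29, 3.31, 3.30, 3.32, 3.28, 3.30, 3.31, 3.29]
cp, cpk = process_capability(readings, lsl=3.20, usl=3.40)
print(f"Cp={cp:.2f} Cpk={cpk:.2f}")
```

Tracking indices like these in real time, per test step, is one way measurement data becomes the "barometer" of process repeatability described above.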
This means that lowering the Cost of Poor Quality (COPQ) is something Industry 4.0 technology adoption has to address quickly.
Dangers of anomaly detection and things to look out for
Since the launch of Keysight’s PathWave Manufacturing Analytics in 2018, more manufacturers have been embracing the new “normal” and applying big data advanced analytics to the test and measurement data generated every second on the production floor.
A fundamental analytics insight from the platform is the ability to predict potential quality issues before they happen. The machine learning technique usually used for this is anomaly detection. We have seen many examples of factories investing in a generic big data platform and running publicly available open-source anomaly detection algorithms in production.
What eventually becomes evident is that these algorithms tend to be low in accuracy when dealing with test and measurement data, as opposed to continuous signals from sensors. This is what drove us to develop our own anomaly detection machine learning model at Keysight, tuned to provide the highest accuracy on test and measurement data from the floor.
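For context, the generic open-source approaches mentioned above are often as simple as a rolling z-score detector over a single measurement stream. The sketch below (window size, threshold, and data are all hypothetical, and this is not Keysight's model) shows the basic mechanic:

```python
import statistics

def zscore_anomalies(values, window=20, threshold=3.0):
    """Flag points whose z-score against the trailing window exceeds threshold."""
    anomalies = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mu = statistics.fmean(recent)
        sigma = statistics.stdev(recent)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Hypothetical measurement stream with one injected outlier at index 25
stream = [1.00, 1.01, 0.99, 1.02, 0.98] * 5 + [1.50] + [1.00] * 5
print(zscore_anomalies(stream, window=10))
```

A detector like this assumes a smooth, roughly stationary signal; discrete pass/fail test measurements with step changes between product variants violate that assumption, which is one reason generic algorithms underperform on factory test data.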
We also identified “alert fatigue” in manufacturing industries that use anomaly detection as a predictor. Hundreds of thousands of measurements are taken in real-time in production, and a large number of anomaly alerts reach operators or engineers every minute of the day. It is an impossible task for users to decide which anomaly is most important and what the most urgent actions are.
Ultimately, this fatigue leads users to ignore the alerts, and the slow but sure demise of the entire advanced analytics project begins. If the right actions to prevent losses cannot be taken, the ROI cannot be realised. This matters because, for any investment in big data advanced analytics in the factory to be worthwhile, it has to correlate directly with business outcomes.
Last year, we put together a team of data scientists and test and measurement experts at Keysight to develop an alert-scoring machine learning model that works seamlessly with our anomaly detection algorithms to score measurement anomaly alerts in real-time. We plan to release this new Alert Scoring feature in the upcoming PathWave Manufacturing Analytics 2.4.0, in the spring of 2021. Alerts are labelled and sorted by the machine learning model as high, medium, or low severity. Teaching the model to interpret severity required supervised learning, with labels that Keysight’s test and measurement experts were able to provide.
With this first-in-industry alert scoring model, we were able to reduce the number of alerts sent to users for disposition by 90 per cent in real-life testing. Instead of a hundred alerts, the engineer or operator receives only the ten most severe or important ones.
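As a hedged sketch of the triage idea only (the production model is a supervised learner, not this toy heuristic, and the `Alert` fields below are invented for illustration), scoring alerts and forwarding only the top fraction might look like:

```python
from dataclasses import dataclass

@dataclass
class Alert:
    test_name: str
    zscore: float           # how far the measurement strayed from normal
    margin_to_limit: float  # remaining headroom to the spec limit (smaller = riskier)

def score(alert):
    """Toy severity score: larger deviations and smaller margins rank higher."""
    return abs(alert.zscore) / (alert.margin_to_limit + 1e-9)

def triage(alerts, keep=0.1):
    """Sort alerts by score and forward only the top fraction (e.g. 10 per cent)."""
    ranked = sorted(alerts, key=score, reverse=True)
    return ranked[:max(1, int(len(ranked) * keep))]

# Hypothetical batch of 100 raw anomaly alerts
alerts = [Alert(f"test_{i}", zscore=(i % 7) + 0.5, margin_to_limit=0.1 + (i % 5))
          for i in range(100)]
top = triage(alerts)
print(len(top))  # only the 10 highest-scoring alerts reach the operator
```

The design choice that matters is not the scoring formula but the contract: whatever model produces the scores, users see a short, ranked list rather than a raw alert firehose.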
The ability to combine domain knowledge and data science sets companies such as Keysight apart from generic big data platform partners, and we look forward to helping manufacturers achieve more tangible business outcomes with our 2021 roadmap.