Quality Control – The Meeting Point of Big Data Analytics & AI
This article first appeared on The Manufacturer on April 26th, 2019
Automated manufacturing processes have been commonplace for many years. Quality control, however, is an area that has traditionally remained in the realm of human operations. That is until now.
With customers demanding ever more customization and product development cycles shortening and refreshing more frequently, effective quality control has become an increasing challenge for manufacturers.
Thankfully, automated data collection and IoT connectivity are helping to drive production efficiencies and improve production quality.
Mark Sullivan, business engagement and operations manager at Hewlett Packard Enterprise (HPE), took to the stage at this year’s lively Industrial Data Summit to explain more.
“People assume big data has only been around since the birth of Amazon, Facebook or the iPhone, but big data is nothing new,” Sullivan noted, “it’s been around since the 1950s. What has changed, however, is that technology and what it offers has evolved significantly.”
Manufacturers are increasingly looking to artificial intelligence (AI) and digital technologies to accelerate and enhance their decision-making for both existing and new processes. Yet, most manufacturers are only using 1% of the data they collect for decision-making.
“Manufacturing could generate the greatest value from data and AI. Deploying analytics within the plant or factory and acting on real-time data can dramatically improve decision speed, lower costs and increase worker safety; that’s where the edge comes in,” he continued.
Quality control looks to be the meeting point of big data, artificial intelligence and analytics.
The past 10 years have seen an explosion in the number of cameras – specifically surveillance systems – being deployed in factories and cities the world over.
Individual unit prices have plummeted, while image quality has risen dramatically. A high-resolution, high frame-rate camera now represents a very cost-effective proposition; but more importantly, it offers a tremendous opportunity to leverage the data each camera captures.
And yet, that’s easier said than done. The data packet from one 4K camera is sizable, the data packet from an entire network’s worth is monumental. Storing, let alone processing and analyzing, such a volume just isn’t feasible – at least not in the traditional sense. That’s why edge computing represents an absolute sea-change in capability and thinking.
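To put the scale in perspective, a quick back-of-envelope calculation shows just how much raw data a single camera can produce. The figures below are illustrative assumptions (uncompressed 3840×2160 video at 24-bit color and 30 frames per second), not numbers from the article:

```python
# Back-of-envelope: raw (uncompressed) data rate of one 4K camera.
# Assumptions: 3840x2160 resolution, 24-bit RGB color, 30 frames per second.
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 3          # 24-bit RGB
FPS = 30

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
bytes_per_second = bytes_per_frame * FPS
bytes_per_day = bytes_per_second * 60 * 60 * 24

print(f"Per frame:  {bytes_per_frame / 1e6:.1f} MB")    # ~24.9 MB
print(f"Per second: {bytes_per_second / 1e6:.1f} MB/s")  # ~746.5 MB/s
print(f"Per day:    {bytes_per_day / 1e12:.1f} TB")      # ~64.5 TB
```

Real deployments compress video heavily, but even at 100:1 compression a network of dozens of cameras quickly outgrows what can sensibly be shipped to a central data center – which is the case for processing at the edge.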
Life on the Edge
The Edge is increasingly becoming a centerpiece of the digital enterprise where things and people alike generate and act on massive amounts of data.
The Edge in action
As part of its own digital supply chain journey and increasing use of big data and analytics, Sullivan shared how HPE has partnered with Relimetrics to implement an optical quality assessment for the high-tech production of computer servers.
The system uses an innovative AI-powered image recognition process to check the configuration and material properties of manufactured components, enabling HPE to verify and improve the quality of its own products.
“Quality control has always been a human process, but the pace and variability of production means humans can no longer keep up,” concluded Sullivan. “Our partnership with Relimetrics and the resulting solution has reduced the inspection time for one of our servers from five minutes to 10 seconds, with 99.9% accuracy.”
Click and watch the short video below to see how the system works and the advantages it has over traditional computer vision: