Big Data, Edge Computing, and the Future of Manufacturing
The world of big data has undergone tectonic change over the past decade. Advances in machine learning and artificial intelligence have unlocked new insights and opportunities for process optimization.
In the manufacturing sector, this change is taking place in tandem with a shift in computing infrastructure. More and more computing is moving out of the data center to the edge, with data flowing to and from the cloud only as needed.
A Continuum of Edge to Cloud
We recently talked about these changes with Gerald Kleyn, Director of Edge Compute products at Hewlett Packard Enterprise. HPE started its Converged Edge Systems business unit in 2016 in response to enterprise customers who were using HPE technology in data centers, but wanted that same hardware in a smaller form factor that could be used on the edge of their networks.
According to Kleyn, the key to understanding this demand is awareness that customers want an edge-to-cloud continuum where they can work with their data. “There’s a continuum of how much processing or analytics you perform at the edge versus how much takes place in the cloud,” he said.
This flexibility will become ever more important as edge computing matures. “We believe that it will change over time,” said Kleyn. “There’s no set rule there. We’re building platforms that allow our customers to dial that in appropriately for their workload and their use case.”
OT and IT Are Converging
Kleyn said that what customers really want to do is run their familiar data center applications at the edge—and not bespoke or cut-down versions of those applications running on an appliance. “We’ve got a few key tenets that we stand on, and one is that we can run unmodified software from the data center down at the edge,” he explained. “It’s not different, it’s not a smaller version, it’s not a skinny version—which means it’s readily supportable by IT.”
He also said that he believes that OT (operational technology) and IT (information technology) are converging in several ways:
- Organizational convergence, where IT and OT staff work more closely together
- Software convergence, including virtualization and consolidation at the edge
- Hardware convergence between OT and IT
“There are lots of appliance-based systems out there that are relatively closed and unchangeable—and the data coming out of them isn’t coming out of them in the right form or fashion for modern analytics,” added Kleyn. “What we’ve done is actually converge OT and IT hardware together in some of our systems to create benefits for our customers through an industry-standard ecosystem that scales with the applications.”
How Can Workloads Be Handled More Effectively in a Converged World?
As this convergence happens, it offers the possibility of handling various workloads in new ways—bringing important changes to how smart factories work.
Consider the vast quantities of data generated by all manner of IoT devices, from sensors to video cameras to robots. Cloud-based analytics already allow manufacturers to gain valuable insights from this data. But moving that analysis to the edge gives manufacturers access to a far greater amount of data and lets them act on those insights immediately.
Kleyn cited as an example the work that HPE did to deploy such a solution at the manufacturing facility of Cupertino, California-based Seagate Technology. Seagate had been performing analytics using AI-based models that it had developed in its data center using traditional high-performance computing (HPC) systems.
These analytics parse scanning electron microscope images of silicon wafers to look for defects—thus improving the quality of Seagate products, and providing faster and earlier detection and correction of anomalies.
Being able to do that work at the edge makes a huge difference. Doing real-time analytics at the edge means that when something unusual shows up in the data, Seagate can take preventive measures before the operational anomalies lead to product defects. This helps reduce unanticipated maintenance downtime and saves the company money.
“With AI, you need to have a pretty high level of compute in order to infer across complex things like video data or—as in Seagate’s example—scanning electron microscope image data,” said Kleyn. “These are the kinds of things that require a lot of compute, storage, and bandwidth, which you typically have to send back to the data center. Performing these analytics at the edge and then sending only the results back to the data center—or just the interesting clips of video or images—clearly can save companies a lot.”
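The edge-triage pattern Kleyn describes—run inference locally, then forward only the summary results and the few “interesting” payloads—can be sketched in a few lines. This is an illustrative assumption-laden sketch, not HPE’s or Seagate’s actual pipeline: the `Frame` type, the `anomaly_score` field (standing in for the output of an edge inference model), and the 0.8 threshold are all hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Frame:
    """One unit of edge data, e.g. a camera frame or a wafer image (hypothetical)."""
    frame_id: int
    anomaly_score: float  # stand-in for the score an edge inference model would emit

def triage_at_edge(frames: List[Frame],
                   threshold: float = 0.8) -> Tuple[list, List[Frame]]:
    """Send lightweight summaries for everything, but forward the full payload
    only for frames whose score justifies the bandwidth back to the data center."""
    summaries = [(f.frame_id, f.anomaly_score) for f in frames]
    flagged = [f for f in frames if f.anomaly_score >= threshold]
    return summaries, flagged
```

In this sketch, `summaries` is the small result set that always goes back to the data center, while `flagged` holds the rare high-score frames—the “interesting clips”—that warrant shipping in full.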
Delivering Greater Compute Power to the Edge
At the heart of HPE’s work on edge computing is its technology partnership with Intel®, which Kleyn said has been vital in helping the company meet the growing demands of its manufacturing sector customers. He said the two companies are “pushing the envelope” with respect to performance, while also optimizing HPE’s edge computing devices for size, weight, and power.
“All the work that Intel does in features, performance, and miniaturization helps us get more value out of smaller and smaller node sizes—that is important for us,” he concluded. “We deliver those savings directly to our users.”
Just the Beginning
Edge computing has broad applicability across a range of manufacturing use cases, and the examples provided here are only a small sample of what HPE, Intel, and their customers aim to achieve. For more details on their work, visit the Solutions Directory.