

Unlock Customer-Facing Edge AI with Workload Consolidation


The way consumers and businesses interact today has changed. “In the post-pandemic era, there is an emphasis on minimizing physical contact and streamlining customer service,” explains Jarry Chang, General Manager of Product Center at DFI, a global leader in embedded motherboards and industrial computers.

As a result, there has been growing demand to integrate edge AI applications into the retail space. For instance, AI-powered self-service kiosks and check-in solutions can help reduce physical interactions and wait times by allowing customers to complete transactions on their own. These solutions can also analyze customer behavior and preferences in real time, allowing retailers to offer personalized experiences that enhance customer satisfaction and loyalty while driving up sales.

“These requirements are driving a shift towards edge AI, where processing occurs closer to the data source, reducing latency and enhancing privacy,” says Chang. “This change is driven by the need for real-time decision-making and the growing volume of data generated at the edge.”

Spurring AI Evolution at the Edge

But businesses often struggle to find the best approach to deploying edge AI applications alongside their existing infrastructure and processes.

While edge AI can dramatically reduce the load on networks and data centers, it can also create new burdens locally, where resources are already constrained. The question arises: How can edge AI be deployed without adding costs and complexity?

Workload consolidation is one way these challenges can be addressed—by enabling a single hardware platform to incorporate AI alongside other functionality. The result is multifunction edge devices “capable of running multiple concurrent workloads with limited resources through features such as resource partitioning, isolation, and remote management,” Chang explains.
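
To make the idea of resource partitioning concrete, here is a minimal sketch, in Python on Linux, of how two consolidated workloads might be pinned to separate CPU cores so that a heavy AI job cannot starve the customer-facing interface. The script names and core assignments are illustrative assumptions, not details of DFI's platform, which enforces this kind of isolation at the platform level.

```python
# Minimal sketch of process-level CPU partitioning on a consolidated
# edge device (Linux only). Script names and core assignments are
# hypothetical; they are not taken from DFI's platform.
import os
import subprocess

AI_CORES = {0, 1, 2, 3}   # cores reserved for AI inference
UI_CORES = {4, 5, 6, 7}   # cores reserved for the kiosk UI / signage stack

def launch_pinned(cmd, cores):
    """Start a workload and restrict it to a fixed set of CPU cores."""
    proc = subprocess.Popen(cmd)
    os.sched_setaffinity(proc.pid, cores)  # Linux CPU-affinity call
    return proc

if __name__ == "__main__":
    ai = launch_pinned(["python3", "ai_inference_service.py"], AI_CORES)
    ui = launch_pinned(["python3", "kiosk_frontend.py"], UI_CORES)
    ai.wait()
    ui.wait()
```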

DFI recently showcased the possibilities of workload consolidation at embedded world 2024 with a demo that combined an EV charger with an informational kiosk (Video 1). The kiosk element used biometrics, speech recognition, and an integrated chatbot to recommend nearby shopping and dining opportunities that drivers could enjoy while their vehicle recharged. Once the driver walked away, the screen switched to a digital signage mode, displaying enticing advertising for nearby businesses.

Video 1. DFI showcases the possibilities of workload consolidation at embedded world 2024. (Source: insight.tech)

The DFI RPS630 industrial motherboard leverages hardware virtualization support in 13th Gen Intel® Core™ processors to seamlessly consolidate AI functions alongside a content management system, EV charger controls, and payment processing. Meanwhile, an Intel® Arc™ GPU provides power- and cost-efficient acceleration for the AI components.
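
The article does not name DFI's hypervisor, but one common stack for this kind of consolidation is KVM managed through libvirt, with each function in its own guest and the discrete GPU handed to the guest that runs inference. The sketch below assumes that stack; the guest name, core pinning, memory size, and PCI address are placeholders.

```python
# Hedged sketch: defining the AI guest of a consolidated system with
# libvirt/KVM. Requires the libvirt-python bindings and a KVM host;
# all names and the GPU PCI address below are placeholder assumptions.
import libvirt

AI_GUEST_XML = """
<domain type='kvm'>
  <name>ai-workloads</name>
  <memory unit='GiB'>8</memory>
  <vcpu cpuset='0-3'>4</vcpu>  <!-- pin this guest to its own CPU cores -->
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <!-- Pass the discrete GPU through to the guest that runs inference.
         0000:03:00.0 is a placeholder PCI address. -->
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <source>
        <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
      </source>
    </hostdev>
    <!-- Disk and network devices omitted for brevity. -->
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")
domain = conn.defineXML(AI_GUEST_XML)  # register the guest definition
# domain.create() would boot it once disk and network devices are added.
conn.close()
```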

DFI also uses the Intel® OpenVINO™ toolkit for GPU optimization to reduce its AI memory footprint, allowing it to run complex large language models in less than 6 GB of memory. Moreover, by offloading complex AI tasks at the edge to the Intel Arc GPU, DFI was able to support multiple AI workloads while simultaneously reducing response time by 66%.
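
DFI has not published the exact models or quantization settings behind those numbers, but the basic pattern of targeting the Arc GPU through OpenVINO looks roughly like the sketch below. The model path is hypothetical, and fitting an LLM into less than 6 GB would additionally rely on techniques such as INT8/INT4 weight compression, which OpenVINO supports but which are omitted here.

```python
# Minimal OpenVINO sketch: compile a pre-converted model for the Intel GPU
# plugin and run one dummy inference. The model path is a placeholder and
# the details of DFI's actual pipeline are not public.
import numpy as np
import openvino as ov  # OpenVINO 2023.1+ Python namespace

core = ov.Core()
print("Available devices:", core.available_devices)  # e.g. ['CPU', 'GPU']

# Read an IR model exported ahead of time (e.g. via the OpenVINO model
# conversion API) and compile it for the GPU device.
model = core.read_model("models/kiosk_assistant.xml")
compiled = core.compile_model(model, device_name="GPU")

# Single inference with zeroed input, just to confirm the GPU path works.
input_shape = list(compiled.input(0).shape)
dummy = np.zeros(input_shape, dtype=np.float32)
result = compiled([dummy])[compiled.output(0)]
print("Output shape:", result.shape)
```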


Charging into the Future of Intelligent Systems

DFI’s workload consolidation technology extends well beyond EV charging applications. The company integrates its industrial-grade products with software and AI solutions from partners, targeting the global self-service industry for applications in retail, healthcare, transportation, smart factories, hospitality, and beyond.

Through the integration of a hypervisor, DFI consolidated all of a client’s workloads onto a single industrial PC. The system partitions resources so that multiple operating systems can run concurrently.
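
Under the same libvirt/KVM assumption as the earlier sketch, verifying that several guest operating systems really are running side by side on the single PC takes only a few lines:

```python
# List the guests running concurrently on the consolidated host
# (read-only connection; assumes the libvirt/KVM stack sketched above).
import libvirt

conn = libvirt.openReadOnly("qemu:///system")
for dom in conn.listAllDomains():
    state, _reason = dom.state()
    status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running"
    print(f"{dom.name()}: {status}")
conn.close()
```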

“These edge AI use cases will all require workload consolidation platforms to enable real-time processing of customer data and efficient operations,” says Chang. “And as more industries and organizations adopt the technology, we expect to see another evolution.”

“The integration of edge AI with workload consolidation platforms is crucial in the deeper development of edge computing,” he continues. “There is no doubt in my mind that as hardware, software, and other technology around edge AI continue to develop, workload consolidation will become more mainstream—ultimately unlocking the next generation of intelligent edge computing applications.”

The Value of Collaboration at the Edge

Edge AI represents an immense opportunity for many industries. Chang explains that so far, we have only begun to scratch the surface. By pairing efficient acceleration with the right workload consolidation platform, we can start to explore what the technology can really achieve.

DFI’s partnership with Intel offers insight into what it takes to support this continued advancement: collaboration. Modern edge AI applications demand a multidisciplinary approach that combines hardware, software, AI, and industry expertise.

“Embedded virtualization requires strong partnerships in hardware and software,” explains Chang. “Developing and deploying workload consolidation technology demands significant research and development resources. By partnering with other companies such as virtual integration software vendors, we can significantly reduce both development time and time-to-market.”

“And through strong partnerships such as what DFI has with Intel, we’re able to explore and develop new technologies that help define the future of edge computing,” he concludes. “We’re proud of what we’ve achieved together so far. And we’re enthusiastic at the prospect of further collaboration with Intel on workload consolidation, AI, and a great deal more.”
 

This article was edited by Christina Cardoza, Editorial Director for insight.tech.

About the Author

Brandon is a long-time contributor to insight.tech going back to its days as Embedded Innovator, with more than a decade of high-tech journalism and media experience in previous roles as Editor-in-Chief of electronics engineering publication Embedded Computing Design, co-host of the Embedded Insiders podcast, and co-chair of live and virtual events such as Industrial IoT University at Sensors Expo and the IoT Device Security Conference. Brandon currently serves as marketing officer for electronic hardware standards organization, PICMG, where he helps evangelize the use of open standards-based technology. Brandon’s coverage focuses on artificial intelligence and machine learning, the Internet of Things, cybersecurity, embedded processors, edge computing, prototyping kits, and safety-critical systems, but extends to any topic of interest to the electronic design community. Drop him a line at techielew@gmail.com, DM him on Twitter @techielew, or connect with him on LinkedIn.
