
PREDICTIVE MAINTENANCE

AI-Powered Manufacturing: Creating a Data-Driven Factory

Aji Anirudhan, Jonathan Weiss

Imagine being able to predict machine failures, detect defects, prevent costly downtime, and ensure working safety in real time. That’s exactly what AI-powered manufacturing aims to do.

It’s no longer just about efficiency. AI revolutionizes the factory floor, boosting product quality, reducing waste, and personalizing training for higher productivity. Is your factory ready to take on these changes?

Join us as we explore the opportunities and challenges of embracing and integrating AI in manufacturing. We address concerns, share success stories, and equip you with the knowledge to build a smarter, safer factory.

Listen Here

Apple Podcasts      Spotify      Amazon Music

Our Guests: AllGoVision and Eigen Innovations

Our guests this episode are Aji Anirudhan, Chief Sales & Marketing Officer at AllGoVision, an AI video analytics company; and Jonathan Weiss, Chief Revenue Officer at Eigen Innovations, an industrial machine vision provider.

Prior to AllGoVision, Aji was the Vice President of Sales and Marketing at AllGo Embedded Systems and Manager of Sales and Business Development for eInfochips. At AllGoVision, he focuses on the product strategy and growth of the company.

Before joining Eigen Innovations, Jonathan served as the Global GTM Leader in Industrial Manufacturing for AWS Strategic Industries and was the Vice President of Emerging Technologies at Software AG. As CRO for Eigen, Jon oversees revenue generation activities, and drives machine vision software and engineering sales.

Podcast Topics

Aji and Jon answer our questions about:

  • (3:26) Industry 4.0 challenges and pressures
  • (6:19) Safety risks for factory workers
  • (10:29) Creating a data-driven factory
  • (15:48) Ongoing factory floor transformations
  • (18:18) Data-driven factory strategies
  • (20:26) Industrial AI video analytic use cases
  • (25:31) Industrial machine vision examples
  • (30:07) Manufacturing opportunities up ahead

Related Content

To learn more about AI-powered manufacturing, read Transforming the Factory Floor with Real-Time Analytics and Machine Vision Solutions: Detect and Prevent Defects. For the latest innovations from AllGoVision and Eigen Innovations, follow them on Twitter/X at @AllGoVision and @EigenInnovation, and on LinkedIn at AllGoVision and Eigen Innovations Inc.

Transcript

Christina Cardoza: Hello and welcome to the IoT Chat, where we discuss the latest developments in the Internet of Things. I’m your host, Christina Cardoza, Editorial Director of insight.tech, and today we’re talking about AI and manufacturing—everything from defect detection and prevention to worker safety. We’re going to look at how AI continues to transform the factory floor.

And joining us today we have two expert guests from Eigen Innovations and AllGoVision Technologies. So, before we get started, let’s get to know our guests. Jon, I’ll start with you from Eigen. Please tell us more about yourself and what you do there.

Jon Weiss: Yeah, wonderful. Thanks, Christina. Great to be here, and thanks for having me. My name’s Jon Weiss. I’m based here in Greenville, South Carolina, and I work for a company called Eigen Innovations. I’m the Chief Revenue Officer, so essentially responsible for everything that’s customer facing, the front of the house here.

Eigen has a simple mission. It’s quite complex sometimes, but it’s really simple and straightforward. We want to help manufacturers all over the world not just detect defects but prevent defects, to ensure that they make the highest standard quality parts every single time they make them.

Christina Cardoza: Great. Great to have you here. And Aji from AllGoVision, please tell us more about yourself and the company.

Aji Anirudhan: Thank you, Christina, for inviting us to this podcast. So, my name is Aji Anirudhan. I have a global responsibility with the company AllGoVision. I head sales and marketing for the team, and I’m also part of the management looking at the product strategy and growth of the company. To give some background about the company: AllGoVision has been in business for 15 years, always focused on developing video analytics–based software products, and our implementations span different segments.

Those segments range from smart cities to critical infrastructure to retail to airports, and the implementations have been in different markets worldwide. Right now our focus is evolving, not moving away, from security- and people-based solutions to a more safety-focused solution, which is in the manufacturing sector.

That’s one focus area where we are building solutions and trying to enhance our solution support, especially for workplace safety and productivity: how we can enhance these for different customers worldwide in manufacturing, warehousing, or other industry segments.

Christina Cardoza: Great, yeah. I’m looking forward to hearing more about that, and especially how companies can leverage some of their existing solutions, like you said, like security solutions that they have and really transform it to provide even deeper insights and more benefits to the company.

But before we get there, Jon, I wanted to start off with something you said in your introduction. You’re not only helping with detection but also prevention of some of these quality issues. And so I’m wondering, obviously this is a driving factor of Industry 4.0: manufacturers are really under pressure to transform and evolve and take advantage of some of these intelligent capabilities out there.

So I’m wondering, from your perspective, what challenges have you seen manufacturers have to deal with because of Industry 4.0, and how do things like machine vision and AI, like you mentioned, help address some of those challenges?

Jon Weiss: Yeah, absolutely. So it’s important to understand one of the key challenges in what we do here at Eigen, and really in the industry as a whole, if you think about traditional vision systems. And by the way, that’s all we do: machine vision for quality inspection. We’re hyper-focused on manufacturing; that’s not just a vertical of ours. Industrial-manufacturing and process-manufacturing quality inspection is all we do.

Now historically, traditional vision systems really lend themselves to detect problems within the production footprint, right? So, if you’re making a product, traditional vision systems will tell you if the product is good or bad, generally speaking. You may wonder, well then how on earth are you able to help people prevent defects, right? Not just tell them that they have produced a defect.

And that’s where our software gets pretty unique in the sense that we don’t just leverage vision systems and cameras and different types of sensors, but we also interface directly with process data—so, historians, OPC UA servers, even direct connections to PLCs at the control-network level, meaning we don’t only show people what has been produced, we give them insights into the process variables and metrics that actually went into making the part. So we go a step further from just being able to say, “Hey, you have a surface defect.” Or, “You have some kind of visible or nonvisible defect.” But we’ll also show people what went wrong in the process, what kind of variation occurred that resulted in the defect.
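A minimal sketch of that idea, joining each inspected part to the process variables logged while it was made and reporting which ones drifted out of band on a bad part. All variable names, tolerance bands, and the data format here are hypothetical illustrations, not Eigen’s actual implementation:

```python
# Sketch: link a vision-system verdict to the process data recorded for the
# same part, then flag which variables fell outside tolerance on bad parts.
# Variable names and tolerance bands are made up for illustration.

TOLERANCES = {                       # acceptable band per process variable
    "weld_temp_c": (410.0, 450.0),
    "clamp_pressure_kpa": (95.0, 105.0),
}

def out_of_band(variables):
    """Return the process variables that fell outside their tolerance band."""
    return {
        name: value
        for name, value in variables.items()
        if name in TOLERANCES
        and not (TOLERANCES[name][0] <= value <= TOLERANCES[name][1])
    }

def explain_defect(part):
    """For a part flagged bad by the vision system, report likely process causes."""
    if part["inspection"] == "pass":
        return {}
    return out_of_band(part["process_data"])

part = {
    "part_id": "A-1042",
    "inspection": "fail",                    # verdict from the vision system
    "process_data": {                        # logged from historian / OPC UA
        "weld_temp_c": 472.5,
        "clamp_pressure_kpa": 99.1,
    },
}
print(explain_defect(part))   # {'weld_temp_c': 472.5}
```

The point of the pattern is the join itself: instead of reporting only “surface defect found,” the system can also surface the process variation that likely produced it.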

Now, to answer your question, how do AI and vision systems play a role? Well, naturally everything we do is vision-system based, but a lot of what we do is also AI and ML based. You see this a lot in our thermal applications; for example, we help metal and plastics companies inspect welds, for both metal and plastic processes, to determine with very high confidence whether or not they have a good or bad weld. We use AI and ML for a lot of that type of capability here.

Christina Cardoza: Great. And that’s obviously a big competitive advantage, that quality assurance aspect of it. And it’s great to see that these technologies like machine vision can be used not only to take the burden off of workers but to help them pinpoint some of the problems and really improve their operations. But I’m sure it also creates a lot of pressure for the manufacturers, for the people on the factory floor—making sure everything is as perfect as can be, and all of the processes and operations are as efficient as possible.

So, Aji, can you talk a little bit about how some of these challenges, to quality assurance or to improving the factory floor, put pressure on workers? What are the risks you see for workers in Industry 4.0, and how can the video analytics you were talking about address some of those?

Aji Anirudhan: So, Industry 4.0: the primary thing, as you say, is how do you enhance automation in this industry? How do you bring in more machines? How do you actually make the shop floor more effective? But as we all know, people are not going away from the industry and the factory, which basically means that there is going to be more interaction between people and machines within the plant.

To give you some context, the UN has data which says that workplace injuries and damages cost companies worldwide $2,680 billion annually. Which basically means that this cost is going to be a key concern for every manufacturer. So traditionally what they have done is look at different scenarios where there were accidents or hazard situations, come out with policies to make sure those don’t happen, and investigate whenever they do happen to make sure that the policies are updated.

What we are saying is that’s not enough to actually bring this cost down. There could be different, other reasons why these accidents are happening. So you have to have a mechanism for real-time detection, monitoring, and compliance with these policies to make sure that the accident never happens. That means if an employee on the shop floor is supposed to wear a hard hat and is not wearing one, even though an accident doesn’t happen, we make sure that it is identified and reported back, so that the frontline managers take care of it immediately and a potential accident can be avoided.
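To illustrate that hard-hat rule: once a detection model has produced labeled bounding boxes from the camera feed, the compliance check itself reduces to simple box logic. The labels, box format, and head heuristic below are assumptions for the sketch, not AllGoVision’s actual algorithm:

```python
# Sketch of a PPE-compliance rule: every detected person should have a
# detected hard hat overlapping the top of their bounding box. A real system
# would get these boxes from a trained object-detection model.

def overlaps(a, b):
    """Axis-aligned overlap test for (x1, y1, x2, y2) boxes."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def head_region(person_box):
    """Top quarter of the person box, a crude stand-in for head location."""
    x1, y1, x2, y2 = person_box
    return (x1, y1, x2, y1 + (y2 - y1) // 4)

def missing_hard_hats(detections):
    """Return the detected people with no hard hat near their head."""
    hats = [d["box"] for d in detections if d["label"] == "hard_hat"]
    return [
        d for d in detections
        if d["label"] == "person"
        and not any(overlaps(head_region(d["box"]), hat) for hat in hats)
    ]

frame = [
    {"label": "person",   "box": (10, 0, 50, 100)},
    {"label": "hard_hat", "box": (15, 0, 45, 20)},   # on the first person
    {"label": "person",   "box": (60, 0, 100, 100)}, # no hat: raise an alert
]
print(len(missing_hard_hats(frame)))   # 1
```

In practice an alert like this would be routed to frontline managers in real time, before any accident happens, which is exactly the policy-enforcement loop described above.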

So what we are looking at is any scenario where a worker is not following the policies, or a scenario which is otherwise not anticipated and which can create a potential accident. We continuously monitor, alert, and give data to the managers and the EHS safety inspectors to make sure that they’re addressing this in real time, updating the policies, and training the people. So the solution is not just replacing the existing policies but enhancing them, giving managers real-time insight so that they can generate enhanced safety policies.

So it helps in factory scenarios in two ways: it can enhance the policies to reduce accidents, and it can make the compliance which needs to be met by the safety inspectors much easier for them. And, again, the bottom line: reduced accidents mean reduced insurance costs, and that adds to the top line and bottom line for the companies. That’s what we are trying to bring. And it is based on different AI/ML-based algorithms, as well as some applications which are very specific to each industry.

Christina Cardoza: Yeah, absolutely. And, to your point, it’s very important that the people who are still on the manufacturing floor are following these safety procedures and making sure that everything is running smoothly on the floor so that it continues to run smoothly. And, to your point, this idea of Industry 4.0 involves more machines, more autonomous robots being integrated on the floor, so you really need to make sure that this is all working cohesively together and pinpoint any issues that you may find.

You both mentioned data: data is going to be really important, end to end, in adding these advanced capabilities: making sure that they are running smoothly, operating smoothly, that they’re picking up the quality assurance aspects and not missing anything, from things on the product line to people on the factory floor.

And so, Jon, I’m wondering, looking at that data aspect and creating this data-driven factory: how can manufacturers begin to set this up, and what value does a data-driven factory create from end to end?

Jon Weiss: Yeah, it’s a good question. Before I answer that, I’m just going to piggyback on one thing that Aji said, because it is really important where he started his explanation by saying that the people—people aren’t going away, right? So we need technology to keep people in this world safe, right? It’s: keep people safe and ensure that they can do their job effectively while protecting themselves.

So it’s interesting, because in our world, although we stay hyperfocused on what’s being made—looking at the quality of the product or the part that’s being made—there’s also the same idea that people aren’t going away, right? And I think that is a common misconception, especially these days, in a lot of artificial intelligence–type discussions, where that’s what makes up most of the headlines: AI is going to replace you; it’s going to take your job away, and all this kind of stuff.

And I think it’s important to talk about that for a second, because what we see in the world of quality is actually the exact opposite of that. We’ve had some amazing discussions with our customers in various types of factories, really all over the country and even the world. And what we find in the quality realm is that by bringing vision systems and software tools to the hands of these folks in factories, enabling them to inspect parts faster—oftentimes at second and sub-second intervals, whereas it used to take minutes or sometimes even longer per part—they’re able to produce more, which means they’re actually able to hire more people to produce more parts in a given shift.

And so it’s been really interesting to see that paradigm where I think there’s a lot of FUD and fear, if you will, around replacing people with this. But actually we see the opposite in our world where it’s actually empowering manufacturers to hire more people to produce more. So just a really interesting point on that that I wanted to mention.

That said, now I’ll answer the question around the data, its significance in an organization, and how to get started; I think that was the question. When you think about Industry 4.0 holistically, there’s a lot that goes into it. What Aji and I do, we’re kind of small pieces of a much larger puzzle; but there is one common thread in that puzzle, and it’s really data, right? It’s all powered and connected by data. That’s how we drive actionable insights or automation. That’s how we’re able to do a lot of what Industry 4.0 promises, and the way organizations typically are successful in large-scale digital transformations in Industry 4.0 is by really creating a single source of truth for all production data.

So, many years ago we called this “data lakes,” “data warehouses,” things like that. Then it kind of turned into different styles of architectures. And these days it’s really what’s been coined the unified namespace, right? UNS is what a lot of people talk about now. But, simply put, it’s a single place to put everything—quality data, process data, safety data, field-services-type data, customer data, warranty information, all of this kind of stuff. You put that all into a single place, and then you start to create bi-directional connections with various enterprise-grade applications so that ERP knows what quality is looking at and vice versa, right?

This is how you get into automated replenishment of consumables and inventory management, material flow, all this kind of stuff. I know it’s a lot and I’m going fast; that’s such a loaded question. Oh my goodness—we could spend a whole hour talking just about that. It really all starts with a single source of truth, and then having the right strategy and architecture to implement various types of software into that single source of truth for the entire industrial enterprise. Hopefully that makes sense.
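One way to picture that single-source-of-truth idea: many teams implement a UNS as an ISA-95-style topic hierarchy on an MQTT broker, where every system publishes to one shared tree. The hierarchy levels, names, and payload fields below are illustrative assumptions, not a standard:

```python
# Sketch of a unified-namespace (UNS) topic convention: quality, process, and
# safety data all land in one addressable enterprise/site/area/line/asset tree.
import json

def uns_topic(enterprise, site, area, line, asset, channel):
    """Build a UNS topic path for one asset's data channel."""
    return "/".join([enterprise, site, area, line, asset, channel])

def quality_event(part_id, verdict, defect=None):
    """Payload a vision system might publish for each inspected part."""
    return json.dumps({"part_id": part_id, "verdict": verdict, "defect": defect})

topic = uns_topic("acme", "greenville", "bodyshop", "line3", "welder7", "quality")
payload = quality_event("A-1042", "fail", defect="cold_weld")
print(topic)    # acme/greenville/bodyshop/line3/welder7/quality

# A real deployment would push this to the broker, e.g. with paho-mqtt:
#   client.publish(topic, payload)
# and ERP, MES, or safety applications would subscribe to the branches
# they care about, giving the bi-directional connections described above.
```

The value is that any consumer can subscribe by position in the tree (one line, one site, or the whole enterprise) without point-to-point integrations.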

Christina Cardoza: Absolutely. And we probably could have had our entire conversation just be around the data-driven culture of manufacturers. I agree with what you said earlier: people are often afraid to implement AI; they’re afraid it’s going to take their jobs. I would argue that in the industry sometimes there’s a lack of job skills available.

So AI really helps take over some of these mundane tasks that we don’t have enough people or labor to do. And then, for the people that we do still need on the manufacturing floor, it’s a bit of a better work-life experience. You’re able to focus on higher-priority, more important tasks rather than these mundane issues, and it’s a little less error prone, so hopefully less stressful for the worker who would otherwise have to find these incidents.

But going back to the data-driven idea of the factory—obviously we’ve been talking about Industry 4.0 for a couple of years now. Everybody knows the benefits of it, everybody wants all the data to gain that value and make those better-informed decisions. But do you feel like manufacturers are there today? Are they prepared to really take on this idea of the data-driven factory, Jon? Or is there still some education or learning that they need to do, still some more transformations that need to happen on the manufacturing floor?

Jon Weiss: Yeah. Well, quite frankly, I don’t think anybody’s an expert in all facets of Industry 4.0, whether it’s a manufacturer or a vendor, because it’s such a vast topic. I do think you have experts for certain portions of it. But it really is a wide topic.

Now, I’ll say manufacturers as a whole I think are on board generally speaking with the need to digitize, the need to automate. I think there’s no doubt about that. I do think there’s still a lot of education that has to take place on the right way to strategically go about large-scale initiatives—where to start; how to ensure its effectiveness, program effectiveness and success; and then how to scale that out beyond factories.

That’s still a problem for even some of the most mature industrial organizations on the planet. In my world it’s a vision system, right? So in my world it’s helping industrials overcome the challenges of camera systems being siloed, not communicating with other enterprise systems, and not being able to scale those AI models across lines, factories, or even just across machines. That’s where traditional camera systems fail. And we’ve cracked that nut. So it’s an exciting time to be a part of that journey, that’s for sure.

Christina Cardoza: And, like you said, no one is an expert in this space, and there’s a lot of different pieces going on. We have AllGoVision from an AI video-analytics perspective, Eigen from the machine-vision defect detection and prevention perspective. Obviously insight.tech and the IoT Chat as a whole—we are sponsored by Intel.

But I think to do some of this, to really create that data-driven factory and make some of these capabilities happen, it really is important to have partners like Intel to help drive this forward. I can imagine with AI-driven video analytics, that’s collecting a lot of the data that manufacturers do need. And I can see partners like Intel working with companies like Eigen and AllGoVision to get that data fast and make it valuable, so that people can actually read through it and find what’s important.

So, Aji, I’m curious from your perspective, what’s the importance of partnerships like Intel, and how is that helping bring some advantages of video analytics to manufacturing?

Aji Anirudhan: Well, definitely. As we said, the company has been in business for 15 years now, offering a video-analytics solution. Intel was one of our first partners to engage with to actually run our algorithms, and from there we have grown over a period of time. We were one of the first video-analytics vendors to embrace their OpenVINO architecture, because when we moved our algorithms to a deep learning–based model, it was very easy for us to port to different platforms, among which Intel was prime.

And over time we have been using Intel processors, right from the early versions up to Gen4 and Gen5 now. And what we’ve seen is a significant performance improvement for us. The number of cores which we require on Gen4 and Gen5 is much more optimized, lower than what we had used before. That is very advantageous. What Intel is doing in terms of making platforms available and suitable for running DL-based models is very good for us. It’s very important because we run different use cases simultaneously, which means we can’t have a lot of servers; we want to optimize the cost per channel. In that way Intel is a good partner for us.

And now some of the new enhancements they’re doing, especially for running deep learning algorithms, like their integrated GPUs or the new Arc GPUs which are coming in—we are excited to see how we can optimize between the processors and the GPU to make it more effective to run our algorithms. So, yes, Intel is a key partner with respect to our strategy currently and going forward, and we are very happy to engage with them across different customers, different products, and different use cases.

Christina Cardoza: Yeah, and talking about those different use cases, you’re going to need a lot of those servers, or the power from partners like Intel behind them, to make those happen. Can you talk about some of the advantages of AI-driven video analytics in manufacturing, in addition to worker safety? What are the use cases that video analytics can bring? And if you have any customer examples or use cases that you can provide, that would be great also.

Aji Anirudhan: So, as I’ve said, we have been engaged with different manufacturing setups even from our early stages, in which we started looking at worker safety. So there were different use cases—definitely the security and prevention side of safety requirements within the plant: things like access to the plant, restricted access, making sure that only the right people are coming in, and making sure that the location where the manufacturing happens is clear of obstacles.

There are a lot of use cases with respect to operational security, which was always there. Then we worked on things like what we call inventory management: once production happens and product goes back to inventory, how do we actually track that inventory with respect to vehicles, with respect to the size of the loading? Those are things beyond worker safety that we have been looking at.

And that is linked to the supply chain as well. Then one of the new use cases which is coming in is how to manage predictive maintenance of machines. This is an area which we are working on now, a use case coming from a customer. For example, there was a utility company, a very interesting use case, where they wanted to use our algorithms to monitor their big machines through thermal cameras, to make sure that the temperature profile of those machines didn’t change over a period of time. If it changes, it means something has to go in for predictive maintenance. So this is another area where we see a lot of applications coming in.
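The temperature-profile check described here can be sketched as a simple baseline-drift test on the readings a thermal camera produces over time. The window size and threshold below are illustrative assumptions, not the customer’s actual parameters:

```python
# Sketch: compare each new thermal reading of a machine against a rolling
# baseline and flag it for predictive maintenance when it drifts too far.
from statistics import mean, stdev

def drift_alert(history, reading, window=20, n_sigmas=3.0):
    """True if `reading` deviates from the recent baseline by > n_sigmas."""
    recent = history[-window:]
    if len(recent) < 2:
        return False                      # not enough data for a baseline
    baseline, spread = mean(recent), stdev(recent)
    return abs(reading - baseline) > n_sigmas * max(spread, 1e-6)

# Mean surface temperatures (degrees C) from recent thermal frames:
history = [71.8, 72.1, 71.9, 72.0, 72.2, 71.7, 72.0, 72.1]

print(drift_alert(history, 72.1))   # False: within normal variation
print(drift_alert(history, 78.5))   # True: profile shifted, schedule maintenance
```

A production system would track a baseline per machine (or per region of the thermal image) and feed alerts into the maintenance workflow rather than printing them.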

And worker safety definitely is evolving, because the worker-safety requirements we see for one specific customer—electric, or oil and gas—are different from what we see in a pharmaceutical company: the use cases, the equipment which they use, the protective gear they need to deploy, and the plant-safety requirements which they have. For example, we were working with a company in India where they have hot metal as part of their production line, and there are instances where it gets spilled. It’s heavily hazardous, both from a plant-safety as well as a people-safety point of view. They just want to make sure that this is continuously monitored and immediately reported if there is anything. It’s a huge cost, and it is a production loss if it happens. That’s one thing.

And then we work with multiple oil and gas companies where a couple of the requirements include early detection of fire or smoke within the plant. So we have a fire-and-smoke solution which we are continuously enhancing to make sure that we do that. They also want us to look at the color of the flame which comes out while something is burning, because that color indicates what chemicals are burning.

So some of these are experimental, and some are standard things which we can do. Use case–wise, a combination of different behavioral patterns of people, and interactions between people and machines or people and vehicles within the industrial-manufacturing segment, are bringing in new use cases for us. Some of them we have implemented; some we are working on in a consulting model with our customers, to bring in new algorithms or enhance and train our algorithms to address their use cases.

Christina Cardoza: And I’m just curious—because I know we talked in the beginning about leveraging the infrastructure and cameras that manufacturers already have, beyond the security systems—to get these use cases, are a lot of manufacturers leveraging the video cameras they already have on the manufacturing floor and then adding additional advanced capabilities on top?

Aji Anirudhan: Yes, yes. Most of the factories are now covered with CCTV cameras for their compliance and other requirements. We ride on top of that, because our requirements easily match the output coming from these cameras. Then we look at the positioning of the camera; maybe very specific use cases require a different camera, say a thermal camera, or a different camera position or lighting conditions.

Those are the things which get enhanced. But 80% of the time we can reuse existing infrastructure, ride on top of the incoming video feed, and then run these use cases for safety, security, or other people-based behavioral algorithms.

Christina Cardoza: That’s great to hear, because it sort of lowers the barrier of entry for some of these manufacturers to start adding and taking advantage of the intelligent capabilities and really building towards Industry 4.0.

I’m curious, Jon, from a machine-vision perspective, how can manufacturers start applying machine vision into their operations to improve their factory? And if you have any customer examples also or use cases of how Eigen is really coming in and transforming the manufacturing world.

Jon Weiss: Yeah. Well, holy cow, we have tons of use cases and success stories, similar to Aji. We’ve been around—we’re a little bit younger; I think it’s 15 years for you folks. We’ve been around 14 years, so we’re maybe the—

Aji Anirudhan: I’m the big brother here.

Jon Weiss: The big brother, that’s right. But, yeah, we’ve got all kinds of success stories in the verticals that we focus on. Like I mentioned, all we do is manufacturing, but we do focus on a few different verticals. Automotive makes up probably about 70% of our business, both with OEMs and with tier-one, tier-two, and even end-tier suppliers throughout the value chain. We also do a good bit in paper and packaging, as well as what we call industrials: metals, aluminum, steel. I’ll give you some success stories or use cases there that we’ve put into production environments.

But to answer the first part of the question: how do you get started? Well, it all starts by really—just like any other Industry 4.0 project, or really any project in general—you have to define the problem statement, right? And understand what is it that you’re trying to solve. I always recommend against adopting technology just for the sake of adopting technology. That’s how you get stuck in POC, or pilot purgatory as people call it, where you just—you end up with a science project, and it doesn’t scale out to production and it’s a waste of time for everybody involved.

So, start with a clear understanding of the business problem. What is your highest-value defect that occurs the most frequently that you would like to mitigate? Maybe you start there, but it all starts by understanding what it is that you’re trying to see in your process that is giving you problems today.
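That “highest-value, most-frequent defect” advice amounts to a small Pareto ranking: score each candidate defect by frequency times cost and start with the biggest loss. The defect names and numbers below are made up for illustration:

```python
# Sketch: rank candidate defects by expected cost per shift to pick the
# first machine-vision use case. All figures are hypothetical.

defects = [
    {"name": "cold_weld",       "per_shift": 12, "cost_per_unit": 85.0},
    {"name": "surface_scratch", "per_shift": 40, "cost_per_unit": 6.0},
    {"name": "misaligned_clip", "per_shift": 3,  "cost_per_unit": 140.0},
]

def ranked_by_impact(defects):
    """Sort defects by expected cost per shift, highest first."""
    return sorted(defects,
                  key=lambda d: d["per_shift"] * d["cost_per_unit"],
                  reverse=True)

top = ranked_by_impact(defects)[0]
print(top["name"])   # cold_weld: 12 * 85 = $1,020 per shift, the biggest loss
```

Working backwards from the top-ranked item keeps the first deployment tied to a measurable business problem rather than technology for its own sake.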

In the world of welding it’s oftentimes something that the human eye can’t see. That’s why vision systems become so important. You need infrared cameras in complex assembly processes, for example. Multiple perspectives become important, because a human eye cannot easily see all around the entire geometry of a part to understand if there’s a defect somewhere, or it is incredibly challenging to find it. The same goes for very low-tolerance, sub-millimeter-tolerance geometry verifications for parts. There are things that are quite difficult for the human eye to see.

And so I always recommend starting with something like that—finding a use case that’s going to bring you the most value, and then kind of working backwards from there. Once you do that, then it’s all about selecting technology, right? So I always encourage people to find technology that’s going to be adaptable and scalable, because if all goes well it’s probably not going to be the only vision system you deploy within the footprint of your plant.

So it’s really important you select technology that’s going to be adaptable, lets you use different types of sensors. You want to avoid, typically, something that’s going to require a whole new vision-system purchase for a different type of inspection. Meaning, if today you want to do a thermal inspection, tomorrow you want to do, I don’t know, an optical- or a laser- or a line-scan-type inspection—you don’t want to be in a situation where you have to buy a whole new system again, right? That becomes very expensive, both from OpEx and CapEx perspectives. So I think if you follow that recipe, find something that’s adaptable, agile, flexible, and work backwards from a defined problem statement, I think you’ll be set up for success.

Christina Cardoza: I love what you said: it’s not just adopting the technology to adopt the technology; you really should be adopting technology to solve a problem. And so it’s great to see partners like AllGoVision and Eigen—you’re developing these systems not just to develop them, but because you see a trend, you see a problem in the industry, and you want to fix it. And it’s great that these technologies you’re creating and deploying are, like you said, adaptable and interoperable, so that manufacturers going with an AllGoVision or an Eigen Innovations can be confident they’re really future-proofing their investments and will be able to continue to evolve as this space evolves.

And, with that said, I want to put on some forward-thinking hats a little bit. Obviously we’ve been talking around Industry 4.0—I think a lot of people in the industry are already looking towards Industry 5.0. We’re not there yet, but we’ll probably be there before we know it. So as this AI space continues to evolve, what opportunities do you think are still to come or that we can look forward to? So, Jon, I’ll start with you on that one.

Jon Weiss: Yeah, I can’t help but laugh, because the buzzwords in this industry are just absurd. So I think we should probably figure out Industry 4.0 before we start focusing on 5.0. That’s just me personally though. I think a lot of manufacturers are still really just at the beginning of their journey, or maybe some are closer to the middle stage. But, yeah, I think there’s still a lot of work to do before we get to Industry 5.0, personally.

I’ll say that, looking forward, I think what’s going to happen is technology is just going to become even more powerful, and the ways that we use it are going to become more versatile, right? There’s going to be a variety of things we can do. From my perspective, I see the democratization of a lot of these complex tools gaining traction. And so that’s one thing we do at Eigen. We build our software really from the ground up, with the intent of letting anybody within the production footprint, with any experience level, be able to build a vision system.

That’s really important to us, and it’s really important to our customers: giving folks who may not be data scientists or engineers the ability to build out a dashboard or a closed-loop automation system that’s going to actually do something in real time to prevent bad product from leaving the line. That’s incredibly powerful. So I can only see that getting more and more powerful as time evolves.

Now, I would be remiss if I didn’t answer part of the question that I didn’t answer before, which was use cases. I got too excited telling you about all the other stuff, I forgot to tell you some use cases. So what I’ll do is I’ll tie this to this answer as well, if that’s okay.

So when you think about what the future holds, I’ll phrase it like this: today we do a real variety of types of inspections, right? Just some examples: we do everything from inspecting at very high speeds to inspecting specialty paper and specialty coatings on paper to ensure that there’s no buildup on equipment. In this one example in particular, there’s a specialty piece of machinery that’s basically grading the paper as it goes under it. You only have eight seconds to catch a buildup of about two and a half to three millimeters or so. If you don’t catch it in eight seconds, it does about $150,000 worth of damage, okay?

And that can happen many, many times throughout the course of a year. It can even happen multiple times throughout the course of a shift, if you don’t catch it fast enough. And so when I think about what the future holds, we’re able to do that today: we have eight seconds to actually detect it and automate an action on the line to prevent the equipment failure. We do that in about one second, but it’s really exciting to think about when we do that in two-thirds of a second, half a second in the future—like the speed at which this stuff starts to execute, that’s exciting to me.
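The closed-loop reaction Jon describes can be pictured as a simple deadline check: measure, compare against a tolerance, and trigger an automated action well inside the available window. The sketch below is purely illustrative; the function names, thresholds, and actions are assumptions, not Eigen’s actual code.

```python
# Illustrative sketch of a deadline-bounded, closed-loop defect check:
# detect coating buildup above a tolerance and act inside the ~8 s window.
# All names and numbers here are hypothetical examples from the episode.

BUILDUP_LIMIT_MM = 2.5   # buildup beyond this risks equipment damage
DEADLINE_S = 8.0         # time available to react before damage occurs

def check_buildup(thickness_mm: float, elapsed_s: float) -> str:
    """Return the action for one inspection frame."""
    if thickness_mm >= BUILDUP_LIMIT_MM:
        if elapsed_s <= DEADLINE_S:
            return "stop_line"   # automated action prevents the failure
        return "alarm_late"      # past the window; flag for manual response
    return "ok"

# Simulated stream of (thickness_mm, elapsed_s) measurements
readings = [(0.8, 0.2), (1.4, 0.5), (2.7, 0.9)]
actions = [check_buildup(t, e) for t, e in readings]
print(actions)  # ['ok', 'ok', 'stop_line']
```

The point of the structure is that detection and response share one latency budget: shrinking the detection time from one second toward half a second, as Jon anticipates, simply widens the margin inside the same deadline.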

The other thing that’s exciting to me, when I think about the future of some of the sensor technologies, we also have use cases where we inspect fairly large surfaces. So, think about three-meter-wide surfaces that are getting welded, like big metal grates for example. And we’re inspecting every single cross section in real time as it’s welded. We use multiple cameras to do that, and then we stitch those images together, standardize them, and assess based on what we see.
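The multi-camera approach Jon outlines, stitching several views into one standardized image and then assessing it, can be sketched minimally as below. This is a toy illustration under stated assumptions (side-by-side frames of equal height, a simple intensity-deviation flag), not the actual Eigen pipeline.

```python
# Toy sketch: combine frames from multiple cameras covering a wide weld,
# then flag cross sections whose intensity deviates from the rest.
# The stitching and scoring logic here are illustrative assumptions.
import numpy as np

def stitch(frames: list) -> np.ndarray:
    """Concatenate same-height camera frames into one wide image."""
    return np.hstack(frames)

def flag_sections(image: np.ndarray, width: int, z_thresh: float = 2.0) -> list:
    """Return indices of vertical cross sections whose mean intensity is
    more than z_thresh standard deviations from the overall section mean."""
    sections = [image[:, i:i + width] for i in range(0, image.shape[1], width)]
    means = np.array([s.mean() for s in sections])
    z = np.abs(means - means.mean()) / (means.std() + 1e-9)
    return [i for i, score in enumerate(z) if score > z_thresh]

# Three 4x8 synthetic frames; the middle one has a bright hot spot
cams = [np.full((4, 8), 100.0) for _ in range(3)]
cams[1][:, 2:4] = 255.0
wide = stitch(cams)
print(flag_sections(wide, width=2))  # [5]
```

In a real deployment the stitched frames would also be geometrically aligned and normalized before scoring; the sketch only shows the stitch-then-assess shape of the idea.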

And so it’s interesting to me to think, you know, could we in the future, with more powerful technology, inspect the whole side of a cargo ship fast enough during some kind of fabrication, welding, or painting exercise—something like that? So, thinking about really large-scale assets, that’s kind of intriguing to me.

Christina Cardoza: Yeah, I love those use cases, because when you look at it from that perspective it really paints the picture of how valuable machine vision and AI can be in this space. You know, how much can go wrong, and simply working with partners like Eigen and adding these intelligent capabilities can really save you a world of hurt and pain and—

Jon Weiss: And it’s not just a nice-to-have, you know. Just to tie this all back to the human element before I kick it back to Aji, since we started on the human element: one thing that’s really interesting to understand in this world of quality, from what I’ve heard from a lot of my customers, is that some of the highest turnover in their plants is within the visual-inspection roles. In some instances it’s very monotonous; it can be an uncomfortable job if you’re standing on your feet for a 12-hour shift, staring at parts going past you with your head on a swivel for 12 hours straight. And so, as it turns out, it’s very difficult to actually retain people in those roles.

And so this becomes almost a vitamin versus a painkiller sort of need, right? It’s no longer a vitamin for these businesses; it’s becoming a painkiller, meaning we’re helping alleviate an organizational pain point that otherwise exists. So, interesting stuff.

Christina Cardoza: Absolutely. And I totally agree with what you said earlier. We love our buzzwords, but I think that’s why it’s so important to have conversations like the one we’re having now, so we can really see where the industry’s at and how we can make strides to move forward and what is available.

Unfortunately, we are running a little bit out of time. Before we go, Aji, I just want to throw it back to you one last time, if there’s anything you want to add, any additional opportunities as AI continues to evolve that you want to talk about that’s still to come.

Aji Anirudhan: Yeah, I agree with what Jon said. These technologies we’re talking about, especially for worker safety, are enhancing the existing workplace and environment—for example, establishing that a worker is not colliding with a vehicle, all of that. But what we are seeing is that these technologies need to be a little customized for each vertical or each manufacturing segment, because there is a huge scene where you have machines and you have people doing different things. So we have to detect all of this, make sure the right decision is made, and report back in real time.

So it definitely takes time for us to implement all the use cases for each vertical. But what is interesting is what’s happening in the AI world with generative AI. We are looking at how we can utilize some of those technologies to address these use cases. So rather than jumping to 5.0, we are defining new use cases ourselves and leveraging the enhancements happening in the AI world.

Take large vision models, which can basically explain complex scenes and complex scenarios. I’ll give an example: say there is an environment where vehicles are moving and a person is not allowed to walk—that path is for vehicle movement, not for pedestrians. But we were talking to a customer who said, “Yes, the worker can move through that same path if he’s carrying a trolley.” So how do you determine whether the person is with a trolley or without a trolley?
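The trolley distinction Aji raises can be thought of as a rule layered on top of object detections: a person in the vehicle-only path is permitted only when a trolley is detected with them. The snippet below is a purely hypothetical illustration; the labels, bounding-box format, and proximity rule are all assumptions, not AllGoVision’s implementation.

```python
# Hypothetical rule on top of detector output: a person in a vehicle-only
# path is allowed only if a trolley is detected near them. Labels, box
# format (x1, y1, x2, y2), and the margin are illustrative assumptions.

def boxes_overlap(a, b, margin=0):
    """Axis-aligned overlap test for (x1, y1, x2, y2) boxes, with margin."""
    return not (a[2] + margin < b[0] or b[2] + margin < a[0]
                or a[3] + margin < b[1] or b[3] + margin < a[1])

def person_allowed(person_box, detections, margin=20):
    """Permit a person in the vehicle path only if a trolley is nearby."""
    trolleys = [d["box"] for d in detections if d["label"] == "trolley"]
    return any(boxes_overlap(person_box, t, margin) for t in trolleys)

scene = [
    {"label": "person",  "box": (100, 100, 140, 200)},
    {"label": "trolley", "box": (135, 150, 190, 210)},
]
print(person_allowed((100, 100, 140, 200), scene))  # True
```

A large vision model could replace the hand-written proximity rule by answering the “is this person pushing a trolley?” question directly from the image, which is the kind of enhancement Aji goes on to describe.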

So we are looking at new enhancements in technology like the LVMs we talked about, which we will implement to bring out new use cases. In that way the technology that’s emerging—generative AI—is going to help us address these use cases in the factory in a much better way in the coming years. But to actually get to what Industry 4.0 requires, we still have a lot of catching up to do. We still have to look at each vertical and map and train things like behaviors and people-based activities, so that we can deliver 90%, 95% accuracy when detecting people’s activities in real time within a location.

So we are excited about the technology and about the implementations that are going on, and we look forward to much bigger business with customers worldwide.

Christina Cardoza: Absolutely. And I look forward to seeing where else this space is going to go. It sounds like there’s still more to come, but there’s also a lot we can improve and be doing today. So I invite all of our listeners to check out the Eigen Innovations and AllGoVision websites to see how you can partner with them and how they can help transform your operations.

In addition, on insight.tech we’ve done a number of articles on the two partners here today. So if you’d like to learn a little bit more about their various use cases, they’re available on the website. I just want to thank you both again for the insightful conversation, and thanks to our listeners for joining this episode today. Until next time, this has been the IoT Chat.

The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.

This transcript was edited by Erin Noble, copy editor.

About the Author

Christina Cardoza is an Editorial Director for insight.tech. Previously, she was the News Editor of the software development magazine SD Times and IT operations online publication ITOps Times. She received her bachelor’s degree in journalism from Stony Brook University, and has been writing about software development and technology throughout her entire career.

Profile Photo of Christina Cardoza