Product Defect Detection You Can Count On, with Mariner
The time is now for manufacturers to start moving toward a smart factory before being left behind by their competitors. But where do they begin? The challenge is that there’s no single right answer. To come up with workable solutions, manufacturers must clearly understand the operational problems they face.
For example, many manufacturers still conduct manual defect detection on the production line. This is not only expensive and time consuming, but often leads to inaccurate results. One way to address this issue is to apply machine vision and deep-learning models to smart cameras and automate the quality inspection process.
In this podcast, we will explore changes happening on the factory floor, what makes a smart factory run, and how machine vision and AI improve the product inspection process.
Listen Here
Our Guest: Mariner
Our guest this episode is David Dewhirst, Vice President of Marketing at Mariner, a provider of technology solutions that leverage IoT, AI, and deep learning. Prior to joining Mariner in January 2021, David cofounded marketing agency ThreeTwelve in 2011, where he worked for almost 10 years. At Mariner he leads strategic planning, execution, and oversight of all marketing initiatives.
Podcast Topics
David answers our questions about:
- (2:15) What is a smart factory?
- (3:56) How manufacturers have adapted to digital transformation
- (7:43) Getting started on the smart-factory journey
- (9:32) Computer vision vs. machine vision
- (13:47) Importance of AI for product defect detection
- (17:15) What to do when there is lack of IT support
- (19:20) Data processing in the cloud and at the edge
- (22:41) Working in a partner ecosystem
Related Content
To learn more about defect detection, read Getting the Smart Factory to 20/20 Machine Vision and A Guaranteed Model for Machine Learning. For the latest innovations from Mariner, follow it on Twitter at @MarinerLLC and LinkedIn at Mariner.
This podcast was edited by Georganne Benesch, Associate Editorial Director for insight.tech.
Transcript
Christina Cardoza: Hello, and welcome to the IoT Chat, where we explore the latest developments in the Internet of Things. I’m your host, Christina Cardoza, Associate Editorial Director of insight.tech. And today we’re talking about machine vision for quality assurance with David Dewhirst from Mariner. But before we jump into a conversation, let’s get to know David. David, welcome to the podcast. Thanks for joining us.
David Dewhirst: Thank you. I’m happy to be here.
Christina Cardoza: What can you tell us about Mariner and your role there?
David Dewhirst: Sure. I am Vice President of Marketing for Mariner. I’ve been in that role full time for about a year now; as a consultant, a little bit longer. And my background is in IoT marketing, so I moved from that into Mariner about a year ago. Mariner, we like to call a 20-year-old startup. Our roots are in writing software for data-warehousing applications, and we’d been doing that for decades, really. About three years ago, we made a big pivot. We saw the opportunity in AI, and in particular how AI connects to camera systems or machine vision systems on the factory floor, and how we can use that to help manufacturers. So about three years ago we made that big pivot away from being a data-warehouse software provider to really leveraging our expertise in data science and AI to provide applications for manufacturers that are aimed at reducing their cost of quality.
Christina Cardoza: I love that pivot—looking at manufacturers and how they can use their camera systems to digitally transform, because at insight.tech that’s been a big topic—is, going down this digital transformation, the manufacturing industry has been pressured to change, and one of the things they talk about is becoming a smart factory or becoming smart manufacturers. So I’d love to kick the conversation off there and talk about, what do we mean when we say “smart factory”? What are they trying to achieve, and what goes into that?
David Dewhirst: Well, the way I look at smart factories, and if you ask 100 practitioners you will probably get 100 different answers, which makes for a great podcast, right? So when I talk about a smart factory, the way I think of a smart factory or a connected factory—they’re analogous to me, or Industry 4.0, you’ll hear that—is really how we take all of the information that’s inherent on the factory floor. How do we connect everything together? And how do we get useful results out of that data? In other words, I like to draw a distinction between data and information, because there is a distinction there. Data is all around us. It’s everywhere. Data is just the inputs, right? We need to somehow transform that into information so that we can do useful things with that information. So that’s how I look at a smart factory: gathering that data, processing the data that you’re collecting, and then doing something useful with the results of that data that you’ve processed into information. That’s how I think of a smart factory or a connected factory: availing ourselves of sensors and technology that we haven’t had before to really advance the state of the art in manufacturing.
Christina Cardoza: Yeah, I love that point. Sort of, we know where we want to go, but how to get there and how to connect all these pieces together isn’t as clear as the end goal. So over the last couple of years, manufacturers have obviously been under a lot of pressure to move faster and streamline their operations. How do you think they’ve handled these changes? How well are they doing on this journey to becoming that smart factory?
David Dewhirst: It can be hard, and there is a high project-failure rate in this space. From my observations—and I believe I’ve seen data that backs this up as well—what happens many times is that everybody kind of knows that becoming a smart factory or connected factory, these Industry 4.0 initiatives, are pretty soon going to be table stakes. You’re going to have to do this because all of your competitors are doing it. And if all of your competitors are doing it and you’re not, you’re going to be left behind. So pretty soon smart factory initiatives, these kinds of projects, are really going to become table stakes, and people sense that. People know that this is just the arc of technology in manufacturing history. We’re always progressing through different technological innovations, and this is really just another evolution of that. This is Evolution 4.0, if you want to call it that—Industry 4.0.
So people know that and they want to get on board with it, but they don’t quite know what to do a lot of times. So when you see the projects fail, in my observation, it’s because they haven’t thought through actually what they’re trying to do. They know they should do this cool thing, and so they just go out and they find something to do that may be cool, but it may not necessarily be solving a problem.
So our solution is very pointedly aimed at improving defect detection in the factory. That’s one kind of use case that you can find. Like, “Oh, we’re having quality-cost problems.” So then you go find an appropriate solution to ameliorate that problem. That’s how I think any smart factory initiative should proceed. If you’re charged with digital transformation in your factory, find the use case that may not be the coolest thing you can do, but solves the biggest problem.
Because one of the things you’re going to have to do as the digital transformation guy is get the people below you on board—the engineers who are impacted by it. You’re also going to need to get the decision makers who cut the checks on board with it. If you’re stuck in the middle, you need a clear use case so that you can say, “Folks down here, this is what we’re doing. It’s going to make your life better. Folks up here, who write the checks, this is going to make your life better because we’re going to save you a lot of money.” And ultimately, the guy at the top is concerned with shareholders and stakeholders. It all comes down to money for somebody, somewhere at the top.
So find those use cases where you can sell your project above and below. And then you’ll be on a clear middle path towards smart factory. And from there, maybe you’ll identify other use cases that you can tackle, but start with the big, hairy problem and see if you can figure it out. And that has not only the benefit of helping you sell, but it also should help you find the expert to solve that particular problem. So if it’s a defect-detection problem, you can go looking for companies, like Mariner, who do that. Or if it’s some other kind of problem—this machine X keeps breaking down, we don’t know why, but we think we can generate analytics—then you go find the expert who does that kind of thing. So clearly identifying your use case will help you sell it, and will also help you solve it.
Christina Cardoza: Now, clearly, every factory floor is different from one another, and every manufacturer is building different things, doing things differently. But are there any best places to get started or to identify those use cases? Any areas within the manufacturing floor that you think would provide the biggest benefit to some of these changes?
David Dewhirst: Talk to the people who have the problems. I mean, if you want to identify the problem, talk to the people who are likely to have those problems. So talk to the people above you, say, “What is costing us money?” Figure out what that is, and then figure out what kind of a solution. If they’re telling you their pain point ahead of time, that lets you go find some kind of a solution for that pain point. You can also talk to the people on the factory floor, like the engineers, boots on the ground, will often be aware of day-to-day problems. And they are just fixing those things or they’re taking care of them, they’re ameliorating somehow, but maybe not even fixing them, and it’s just part of their job, it’s what they do, they show up every day, they’re very good at it, and they make stuff work. They may be suppressing problems that they would love to have a solution for if you just asked them. So a fine first step, if you’re the digital transformation person, is to ask the people above and below you if there is a particular problem that they know about.
Christina Cardoza: Now, I want to dig a little deeper into this product-detection, defect-detection use case that you’ve mentioned a couple of times. I know this is big in manufacturing. They sort of want to find the problem before it happens or before it becomes a bigger problem, and one of the ways that manufacturers have been doing this along the digital transformation is applying advanced technologies like artificial intelligence and computer vision to it. But at the same time, we also talk about machine vision being a big part of this area. So can you talk a little bit about the distinction between computer vision and machine vision? What we are talking about in regards to product-defect detection?
David Dewhirst: Sure. When I’m talking about it, or when Mariner generally is talking about it, because we have a very specific use case for defect detection in products on the line in your factory, I tend to use the terms interchangeably: machine vision system, camera system. But when I’m using them in those ways, I’m not talking about machine vision in autonomous self-driving cars, for example. That’s a very different use case. When we’re talking about machine vision or camera systems or computer vision in the factory setting, those are typically fixed cameras of a fixed type in a fixed position. They are selected typically by a vision integrator, or maybe your engineers if you’re rolling your own, but they’re very bespoke to the production line. So they will be fixed; their placement, their lighting, their setup, all the rest of it, is very targeted at the specific product on that production line. And that’s what I’m talking about when I talk about machine vision or camera systems in the factory setting: those very focused, very engineered camera solutions that look at products on the line for defect detection.
Christina Cardoza: From your perspective, what is the importance of machine vision, as manufacturers head towards the smart factory and digitally transform?
David Dewhirst: The importance is in the ability to improve the quality-control process. So, there is the concept of total cost of quality, right? You are either going to spend money on your factory floor to have good quality that goes out the door, or, if you don’t do that, you are going to have a lot of returns, you’re going to have warranty claims. If you are, for instance, a tier-one supplier to an OEM auto manufacturer, your contract is in danger. They do not tolerate defects in things that you’re shipping to them. Not spending money on the quality costs on the factory floor means you’re still going to spend money on quality costs. It’s just going to be in canceled contracts and bad brand association. You’re going to get a reputation for bad quality, and that’s costly. You’re going to have returns and warranty claims. All those things are costly. So either way you look at it, you’re going to have to control costs.
So the better way to do that, rather than paying the cost of your damaged brand or returns or other things like that—the cheapest, highest-ROI way—is to do your quality work on the factory floor. And this isn’t a new concept; we’ve had quality inspectors forever. Ever since the first assembly line in Dearborn, Michigan, you’ve had guys at the end of the line looking at stuff for quality control. It’s not new. Machine vision systems or camera systems to help do that have also been around for decades, literally decades. They are useful because they can present a very consistent look, piece to piece, part to part. It always looks the same because the camera, as I was explaining, is very fixed and situated. So the quality guys can just look at an image now, instead of standing there turning a thing over. Although in a lot of factory settings they’re still doing that: they will pick up a piece, turn it all different ways, hold it up to the light, and do a bunch of other stuff. That does still go on. But machine vision has made a lot of strides over the past few decades to eliminate that.
What they haven’t done is use AI. Because AI has just in the past handful of years become an actual useful thing that you can apply to real-world situations, like machine vision problems. So that’s kind of where using machine vision for quality comes from. It comes from the need for it. Like most things in manufacturing, the need was found, the need was addressed, but there are situations where it just doesn’t work quite right, which is where the AI part comes in.
Christina Cardoza: Great. Yeah, that was going to be my next question. So you mentioned we’ve had machine, we’ve had these cameras that have been able to support the quality assurance, quality inspection in manufacturing for years now. But how does AI take it to the next level, elevate it, and help streamline operations, or like you were mentioning, reduce some of that manual work?
David Dewhirst: So it may be useful to back up a little bit. For the past several decades, machine vision systems have used traditional programming, and they’re very good at solving binary problems. Let’s say: Is there a hole or is there not a hole in this piece? That’s a binary thing, yes or no. Did we put the sticker on the engine block? That’s a binary thing; we did or we didn’t. Very easy using traditional programming, which relies on those true/false tests to come up with a true/false answer.
But what happens when your problem isn’t binary? What happens when, instead of just asking hole or not a hole, you’re looking at, for example, is this an oil stain on fabric or is it a piece of lint? They’re both kind of fuzzy, right? So you can’t just say, is there a blotch or not? Because a piece of lint will present as a blotch in that machine vision system, and you can’t just draw a line. Maybe the stain is a little bit fuzzier and the lint is less fuzzy, so let’s draw a line—but that’s still arbitrary, right? You’re drawing some arbitrary line between the fuzziness levels. What happens if the lint is a little bit fuzzier than where you drew the line? That gets called a defect. What happens if the stain is a little less fuzzy than you thought? That will escape, because you might think it’s lint. It’s very hard to tackle those fuzzy problems using traditional programming languages, and that’s where AI comes in.
With machine learning and deep-learning techniques, you don’t need to draw an arbitrary line for a true/false answer. You can train the AI with enough samples of stains and lint, and the AI will learn on its own what the difference is. So you don’t need to apply traditional programming techniques to these fuzzy problems; the AI can come in and solve the kinds of challenges that weren’t really solvable before. That’s the value of AI, and in particular AI applied to machine vision problems—traditional approaches have just never worked in these kinds of fuzzy situations. So with AI you can oftentimes get your machine vision system, your camera system, to do what you hired it to do, when it has never done a good job at it because it hasn’t had the capability. That’s kind of the value add of AI: getting your machine vision to do what you thought it would.
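The stain-versus-lint idea David describes can be sketched with a toy learned classifier. The code below is only an illustration, not Mariner’s actual model: the "fuzziness" and "darkness" features and their distributions are invented, and a minimal logistic regression stands in for a deep-learning model. The point it demonstrates is the one from the conversation: the decision boundary is learned from labeled samples rather than drawn by hand.

```python
import math
import random

# Toy "fuzziness"/"darkness" features for two look-alike classes.
# These distributions are made up for illustration: stains trend
# darker and less fuzzy than lint, but the ranges overlap, so no
# single hand-drawn fuzziness cutoff separates them cleanly.
random.seed(0)

def sample(label):
    # label 1 = stain, label 0 = lint (hypothetical feature model)
    if label:
        return (random.gauss(0.4, 0.15), random.gauss(0.7, 0.1))
    return (random.gauss(0.6, 0.15), random.gauss(0.3, 0.1))

train = [(sample(y), y) for y in [0, 1] * 200]

# Minimal logistic regression trained by per-sample gradient descent:
# the boundary is fitted from labeled examples instead of being
# picked by a programmer.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(300):
    for (x1, x2), y in train:
        p = 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        g = p - y  # gradient of log loss w.r.t. the pre-sigmoid score
        w[0] -= lr * g * x1
        w[1] -= lr * g * x2
        b -= lr * g

def is_stain(fuzziness, darkness):
    score = w[0] * fuzziness + w[1] * darkness + b
    return 1.0 / (1.0 + math.exp(-score)) > 0.5

print(is_stain(0.35, 0.75))  # a dark, crisp blotch: classified as a stain
print(is_stain(0.65, 0.25))  # a light, fuzzy blotch: classified as lint
```

A production system would of course train a deep network on labeled images rather than two hand-picked features, but the workflow is the same shape: quality engineers label examples, and the model learns the distinction.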
Christina Cardoza: That’s a great point. And I think the benefits of AI have been proven. So it’s no longer a hurdle, like you mentioned at the beginning, trying to get the top leadership to go on board on applying or implementing AI to these machine vision cameras. I think the next hurdle has been, do we have the in-house expertise and support to be able to do this? So manufacturing floors typically don’t have data scientists on hand or advanced AI developers, so it can be intimidating trying to apply this to machine vision cameras or on the factory floor. How can manufacturers do this without sort of the IT support available?
David Dewhirst: In our particular case, with Mariner, the only thing that we ask is that—obviously you’re going to need the image-formation system with pictures of your products—your quality guys take all the images you have of your product that show defects, upload them to our tool, and draw a little box around each defect. So that lets your quality guys do what they’re good at. They know how to do it; they’ve been doing it for decades, as we’ve talked about, looking at these images and pointing out defects. This is something they’re trained at, something they’re skilled at.
We can take advantage of that, and let us do the part that we’re good at, which is the data science. So the quality engineers will point out defects or not, and then our data scientists will build that AI model. So for our particular customers you don’t need data science guys on the factory floor. We’ve got you on that. And what will be delivered to you is the solution with the AI model built by our data science guys. So you don’t need to have data science at all. You have to have the quality guys who can point out the difference between defect and not defect. That’s our particular answer to that.
Other companies, with other solutions in other spaces, will sometimes ship prebuilt models. Those may or may not work, depending on how closely the preshipped models match your particular situation on the factory floor. If what ships with the camera aligns with what is seen on your factory floor, you might not need us at all. You’ll be golden, and you still won’t need a data scientist. Where the prebuilt models don’t align very closely with what your actual product is, that’s going to be a good use case for us to provide the data science as part of our solution.
Christina Cardoza: So where is data collection and all the data processing happening on the factory floor? And with a solution like Mariner, is this being processed to the cloud? Or, I know a lot of manufacturers have started looking at the edge.
David Dewhirst: It depends. So just in general terms, a lot of things happen in the cloud, and that’s because big data is a hot topic. And if you have 10,000 sensors all over your factory and you’re generating terabytes of information, you are probably not going to have a data farm on your factory floor, right? You’re going to have to do that in the cloud. So you have all those sensors collecting all that information, and it gets pushed up to the cloud.
In machine vision there’s a little bit less reliance on the cloud. Let me use Mariner and our Spyglass Visual Inspection solution as a useful talking point—you’ll hear me refer to it as SVI. SVI is actually a hybrid solution, and that’s because, for the real-time defect-detection work, we don’t have time to make a round trip to the cloud. So to keep up with the production line, that’s all happening on a server in the factory. In those mission-critical applications we also think carefully about doing everything in the cloud, because it does happen—and I have literally seen it happen with my own eyes—a backhoe cuts the T1 cable outside the factory, and you’re not on the cloud anymore. You’re just sitting on a piece of dirt. So think carefully about doing mission-critical stuff completely in the cloud. That’s why we’re doing our actual defect detection and the AI-inference work on the factory floor: even if you lose your internet connection, your production isn’t shut down. Your factory isn’t shut down. We can continue to do that inference.
Now we do also make use of the cloud. So SVI is designed to run headless without anybody standing around, but engineers can go back and review the decisions that the AI has made. If the AI got something wrong, the engineers can correct it. That will go up to the cloud. So we are accumulating some stuff in the cloud. And eventually, if the AI model needs to be retrained, we can do that in the cloud because it doesn’t require real-time connectivity to do that. And then it can get pushed back down to the edge. So SVI in particular is a hybrid, edge/cloud solution.
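The hybrid edge/cloud flow described here can be sketched in a few lines. This is a hedged illustration with invented names (`EdgeInspector`, `flush`, and the stub model are all hypothetical, not part of SVI): inference stays on the local server so the line keeps moving offline, while engineer corrections queue up and reach the cloud only when a connection exists.

```python
import queue

class EdgeInspector:
    """Toy sketch of a hybrid edge/cloud inspection pattern
    (hypothetical names; not Mariner's SVI implementation):
    inference runs locally so the line keeps moving even offline,
    and engineer corrections are buffered for the cloud."""

    def __init__(self, model):
        self.model = model            # local inference fn: image -> bool
        self.outbox = queue.Queue()   # corrections awaiting cloud upload

    def inspect(self, image):
        # Real-time path: no cloud round trip, keeps working offline.
        return self.model(image)

    def correct(self, image, true_label):
        # An engineer overrides an AI decision; queue it as future
        # retraining data rather than uploading synchronously.
        self.outbox.put((image, true_label))

    def flush(self, cloud_is_up, upload):
        # Background path: drain queued corrections only when connected.
        sent = 0
        while cloud_is_up and not self.outbox.empty():
            upload(self.outbox.get())
            sent += 1
        return sent

# Usage with a stub model and a list standing in for cloud storage.
inspector = EdgeInspector(model=lambda pixel: pixel > 0.5)
print(inspector.inspect(0.9))                # flagged as a defect
inspector.correct(0.9, False)                # engineer: actually lint
cloud = []
print(inspector.flush(True, cloud.append))   # one correction uploaded
```

The design choice mirrors the conversation: the latency-critical path never depends on connectivity, while the computationally expensive work (accumulating labels, retraining) happens in the cloud and flows back down to the edge.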
And I think that’s a good way to go for any application that you have that has a component that needs to be done real time and quickly. The edge is really where you’re going to want to do stuff. The cloud becomes a great place to do, well, data storage obviously, but all of the more computationally expensive things that take a while to run—that can all be done in the cloud, and that’s fine. So yeah, I would encourage anyone just to look at a hybrid scheme instead of entirely cloud—or entirely edge of course.
Christina Cardoza: So, cloud/edge computing, AI—these are all huge fields with lots of players in this ecosystem. So I’m wondering, with Mariner’s SVI solution how you guys work with other partners in this ecosystem to make it all come together.
David Dewhirst: Sure. In terms of our particular solution, start to finish, we have a partner ecosystem that we work with, let’s call it. Number one, we don’t sell cameras; we are an AI software-as-a-service solution. If you need cameras, we work with a vision integrator who will get you the right camera. But by and large we don’t care what the camera is. We can make use of any camera that you already have, or work with you. So, up front, we work with a vision integrator if we have to. That’s partner number one.
Partner number two, we work very closely with Intel® and Nvidia, both on the factory floor. And that’s because we’re doing this real-time inference work with AI on the floor, so we need some powerful processing capabilities. Our AI software as a service ironically arrives at your site on a server box, which is included as part of the solution. We do that because we can build those server boxes to do what we want. So we have Intel® Xeon® chips in there for really muscular, beefy processing, and we have Nvidia cards in there for extra GPU power. We partner in that sense with those hardware people: What are the best GPUs we can get, what are the best CPUs we can get, to handle this workload? Those are partners on the factory floor.
We also partner on the cloud with Microsoft. So all the cloud stuff that we’re doing is typically in Azure. And that gives us—there’s a lot of prebuilt services and a lot of other capabilities in Azure that we can make use of and be certain about security and speed and all the other stuff that you might worry about in the cloud. So from front to back or top to bottom, however you’re going to architect that in your mind, we do have a complete array of partners that we are going to market with, for sure.
Christina Cardoza: Great. And I should mention the IoT Chat and the insight.tech program as a whole are owned by Intel. So it’s great to see how technology is being used with companies like Mariner and on the factory floor. You mentioned GPUs. I know with Intel’s latest release of OpenVINO™, their AI toolkit, they’ve sort of made it easier to work with GPUs or to not work with GPUs. Are you guys leveraging OpenVINO in this solution at all as well?
David Dewhirst: That’s a question for the folks down in Prod/Dev. And let me say, I don’t know, but I would be surprised if they weren’t, because they’re always looking at the next best thing that will let us scale up capabilities for customers. And I know OpenVINO and tools like that will help with that. So I’d be surprised if they weren’t—let me just put it that way.
Christina Cardoza: Absolutely. So unfortunately, we’re nearing the end of our time, and I know this is just such a big topic, we could probably go on for another 30 minutes to an hour, but before we go, David, I want to make sure we got to cover at least everything as it relates to this conversation. So is there anything else you wanted to add, or anything you feel like we didn’t get a chance to go over yet?
David Dewhirst: No. I just—I will encourage people—again, you may not need Mariner’s solution, and you may or may not need AI, given your use case, but you are going to need to move forward with industrial IoT of some kind. It’s just too big and too out there, and your competitors are doing it. So I encourage people to think about the use cases and the situations that are right for them. Find that hook, get in, and don’t be the last guy. Find the area that will give you value and move forward with that, because ultimately you’ll be happy that you did, on a lot of fronts, for sure.
Christina Cardoza: Great, great final takeaway there for our listeners. With that, I just want to thank you again for joining the podcast today.
David Dewhirst: Yeah, my pleasure. Thank you for having me.
Christina Cardoza: And thanks to our listeners for tuning in. If you like this episode, please like, subscribe, rate, review, all of the above on your favorite streaming platform. Until next time, this has been the IoT Chat.
The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.
This transcript was edited by Erin Noble, copy editor.