

Democratizing AI for All with Plainsight and Intel®

Elizabeth Spears & Bridget Martin

When you think about AI, you don’t typically think about agriculture. But imagine how much easier farmers’ lives would be if they could use computer vision to track livestock or detect pests in their fields.

Just one problem: How can an enterprise leverage AI if they don’t already have a team of data scientists? This is a pressing question not only in agriculture but also in a wide range of industrial businesses, such as manufacturing and logistics. After all, data scientists are in short supply!

In this podcast, we explore how companies can deploy computer vision with their existing staff—no expensive hiring or extensive training required. We explain how to democratize AI so non-experts can use it, the possibilities that come from making AI more accessible, and unexpected ways AI transforms a range of industries.

Our Guests: Plainsight and Intel®

Our guests this episode are Elizabeth Spears, Co-Founder and Chief Product Officer for Plainsight, a machine learning lifecycle management provider for AIoT platforms, and Bridget Martin, Director of Industrial AI & Analytics of the Internet of Things Group at Intel®.

In her current role, Elizabeth focuses on innovating Plainsight’s end-to-end, no-code computer vision platform. She spends most of her time on Plainsight’s product strategy: deciding which new products to build, in what order, and why they are needed.

Bridget focuses on building up knowledge and understanding of the pain points that arise when adopting AI, especially in the industrial space. Whether it is manufacturing or critical infrastructure, Bridget and her team at Intel® work to develop solutions that address the challenges of incorporating AI into an industrial ecosystem.

Podcast Topics

Elizabeth and Bridget answer our questions about:

  • (2:19) Plainsight’s rebranding and evolution from Sixgill
  • (7:32) The rapid evolution of AI and computer vision
  • (10:08) The unexpected use cases coming from advancements of AI
  • (13:33) How companies can help make AI more accessible
  • (16:07) The biggest challenges industries face when adopting AI
  • (18:31) How to get organizations to start thinking differently about AI
  • (21:30) The benefits of democratizing AI and computer vision
  • (23:50) How organizations can best get started with AI

Related Content

To learn more about the future of democratizing AI, read Democratizing AI: It’s Not Just for Big Tech Anymore and Build ML Models with a No-Code Platform. For the latest innovations from Plainsight, follow them on Twitter at @PlainsightAI and on LinkedIn at Plainsight.

 

Transcript edited by Christina Cardoza, Senior Editor for insight.tech.

 


Transcript

Kenton Williston: Welcome to the IoT Chat, where we explore the trends that matter for consultants, systems integrators, and enterprises. I’m Kenton Williston, the Editor-in-Chief of insight.tech. Every episode we talk to leading experts about the latest developments in the Internet of Things. Today I’m discussing the democratization of AI with Elizabeth Spears, Co-Founder and Chief Product Officer at Plainsight, and Bridget Martin, Director of Industrial AI and Analytics of the Internet of Things Group at Intel®.

AI already has a solid track record in manufacturing. But, as the technology constantly advances, it’s turning up in all kinds of rough-and-ready use cases. For example, AI is now being used to count cows! But AI is useless if no one understands how to use it, right? And it’s not very often you find data scientists on a farm.

So, in this podcast I want to explore the possibilities for AI in all kinds of rugged use cases—not just in agriculture, but across the industrial sector. We’ll discuss the importance of making AI more accessible, and the new and exciting use cases that come from its democratization. But before we get started, let me introduce our guests. Elizabeth, I’ll start with you. Welcome to the show.

Elizabeth Spears: Hi, thank you for having me. I’m excited to chat with you today.

Kenton Williston: Likewise. And can you tell me about Plainsight, and your role there?

Elizabeth Spears: Sure. So, here at Plainsight we have an end-to-end, no-code computer vision platform. It allows both large and small organizations to go from data organization, to data annotation, to training a machine learning model or a computer vision model, and deploying. So, deploying it on-prem, on the edge, or almost anywhere in between, and then being able to monitor all of your computer vision deployments in a single pane of glass. My role is Co-Founder and Chief Product Officer. So, basically everything around what we build, in what order, and why, is really where I spend most of my time—along with my amazing team.

Kenton Williston: I am really looking forward to hearing about all the details there. That sounds very, very interesting. And one thing I’m curious about upfront, though, is I had known your company as Sixgill, and I’m wondering why it’s been rebranded to Plainsight, and what that has to do with the company’s evolution.

Elizabeth Spears: Yeah, good question. So, like a true product-focused company, we listened to what our customers wanted and needed. And we basically took a transformational turn from an IoT platform to a computer vision platform. So, what we kept hearing from our customers was that they wanted more and more AI, and then, specifically, more computer vision. So we took this foundation that we had of a platform—an IoT platform that was used for high-throughput enterprise situations—and we made it specialized for both large and small companies to be able to build and manage their computer vision solutions, really 10x faster than most of the other available solutions out there. So, we’re talking about kind of implementing the same use case with even higher accuracy in sort of hours instead of months. And that’s really been our focus.

So, the name—the rebrand for the name Plainsight—really came from this “aha” moment that we have with our customers, where they often have thousands of hours of video or image data that’s really this untapped resource in the enterprise. And when we start talking to them about how the platform works, and all the big and small ways that data can provide value to them, they all of a sudden kind of get it. It’s almost like everything that I can see—if I sat there and watched it without blinking—all of that could actually just be identified and analyzed automatically. So they have this “aha” moment that we talk about as sort of the elephant in the room, which is—the elephant is our icon—where you start to understand how computer vision works, and you just can’t unsee all the places that it can be applied. So we’re bringing all of that value into Plainsight for our customers, and that’s where the name came from. Our icon, like I said, is that elephant that we’ve all really bonded to named Seymour, and he’s named that because he can “see more.” He can help see more in all that visual data.

Kenton Williston: Oh boy. So, I have to say, I have a well-earned reputation for being the dad-jokes guy, and I think I would fit right in.

Elizabeth Spears: Yeah. We were very pleased with that one internally.

Kenton Williston: Yeah. So, the evolution—that’s a really great story, and I think is reflective of where so much technology is going right now, and how central AI and computer vision in particular have become just everywhere. And I’m really excited to hear more from your perspective, as well as from Bridget’s perspective. So, Bridget, I’d like to say welcome to you as well.

Bridget Martin: Yeah. Thank you for having me. Super excited to be here.

Kenton Williston: So, tell me a little bit more about your role at Intel.

Bridget Martin: Well, so at Intel, obviously everybody really knows us for manufacturing chips, right? That is absolutely Intel’s bread and butter, but what I loved hearing Elizabeth talk about just now is the real need to be connected to and understand the ultimate consumers of these solutions, and ultimately of this technology. And so the main function of my team is really to have and build up that knowledge and understanding of the pain points that are occurring in the process of adopting AI technology in the industrial space—whether it’s manufacturing or critical infrastructure—and really working with the ecosystem to develop solutions that help address those pain points. Ultimately, top of mind for me is the complexity of deploying these AI solutions. Like Elizabeth was saying, there’s such great opportunity for capabilities like computer vision in these spaces, but it’s still a really complex technology. And so, again, partnering with the ecosystem—whether it’s systems integrators or software vendors—to help deploy into the end-manufacturer space, so that they can ultimately take advantage of this exciting technology.

Kenton Williston: Yeah. And I want to come back to some of those pain points, because I think they’re really important. I think what both your organizations are doing is really valuable to solving those challenges. And I should also mention, before we get further into the conversation, that the insight.tech program as a whole and this IoT chat podcast are Intel publications. So that’s why we’ve gathered you here today. But, in any case, while those challenges are very much something I want to talk to you about, I think it’s worth doing some framing of where this is all going by talking about what’s happening with the applications. Because, like Elizabeth was just saying, we’re at a point where, if you can see it—just about anything you can see, especially in an industrial context—there’s something you can do with that data from an AI–computer vision point of view. And, Bridget, I’m interested in hearing what you are seeing in terms of new applications that you couldn’t do five years ago, a year ago, six months ago. Everything’s moving so fast. Where do things stand right now?

Bridget Martin: Yeah. Well, let’s kind of baseline on where we’re ultimately trying to go, right? Which is the concept of Industry 4.0, which is essentially this idea around being able to have flexible and autonomous manufacturing capabilities. And so, if we rewind five, ten years ago, you have some manufacturers running what we would consider more mature manufacturing applications. And so those are scenarios where you already see some automated compute machines existing on the factory floor—which are going to be, again, automating processes but also, most critically when we’re talking about AI, outputting data—whether it’s the metadata of the sensors, or the processes that that automated tool is performing. But then you also have a significant portion of the world that is still doing a lot of manual manufacturing applications.

And so we really have to look at it from these two different perspectives. Where the more mature manufacturing applications that have some automation in pockets, or in individual processes within the manufacturing floor space—they’re really looking to take advantage of that data that’s already being generated. And this is where we’re seeing an increase in predictive maintenance-type applications and usages—where they’re wanting to be able to access that data and predict and avoid unplanned downtime for those automated tools. But then when we’re looking at those less mature markets, they’re wanting to skip some of these automation phases—going from an Industry 2.0 level and skipping right into Industry 3.0 and 4.0 by leveraging computer vision, and now enabling their factory to start to have some of the same capabilities that we humans do, and where they’re, again, deploying these cameras to identify opportunities to improve their overall factory production and the workflow of the widgets going through the supply chain within their factory.

Kenton Williston: Yeah. I think that’s very, very true, everything you’ve said. And I think one of the things that’s been interesting to me is just seeing that it’s not just the proliferation of this technology, but it’s going into completely new applications. The use cases are just so much more varied now, right? It’s not just inspecting parts for defects, but, like Elizabeth was saying, basically anything that you could point a camera at, there’s something you can do with that data now. And so, Elizabeth, I’d love to hear some more examples of what you were seeing there. And is it really just the manufacturing space? Or is it a wider sphere of applications in the rugged industrial space where you’re seeing all kinds of new things crop up?

Elizabeth Spears: It’s really horizontal across industries. We see a lot of cases in a lot of different verticals, so I’ll go through some of the fun examples and then some of my favorites. So, one of the ones that is really cool, that’s only just become possible, is a method called super resolution. And one of the places it’s being used, or they’re researching using it, is for less radiation in CT scans. So, basically what this method does is, if you think of all of those FBI investigation movies, where they’re looking for a suspect and there’s some grainy image of a license plate or a person’s face, and the investigator says, “Enhance that image.” Right? And so then all of a sudden it’s made into this sharp image and they know who did the crime, or whatever it is. That technology absolutely did not exist most of the time that those types of things were being shown. And so now it really does. So that’s one cool one.
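For readers who want a concrete sense of how super resolution works under the hood, the sketch below follows the classic SRCNN recipe: upscale a low-resolution image with plain bicubic interpolation, then let a small convolutional network learn to sharpen the result. This is an illustrative, untrained PyTorch example; the TinySRCNN class, its layer sizes, and the 2x scale factor are assumptions for the sketch, not the specific method Plainsight or the CT-scan researchers use.

# Minimal super resolution sketch in the style of SRCNN (Dong et al., 2014).
# Illustrative only; the weights here are untrained.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRCNN(nn.Module):
    """Refines a bicubically upscaled image with three learned conv layers."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.extract = nn.Conv2d(channels, 64, kernel_size=9, padding=4)
        self.map = nn.Conv2d(64, 32, kernel_size=1)
        self.reconstruct = nn.Conv2d(32, channels, kernel_size=5, padding=2)

    def forward(self, low_res: torch.Tensor, scale: int = 2) -> torch.Tensor:
        # Step 1: cheap bicubic interpolation to the target size.
        upscaled = F.interpolate(low_res, scale_factor=scale, mode="bicubic",
                                 align_corners=False)
        # Step 2: learned layers recover detail that the interpolation smoothed away.
        x = F.relu(self.extract(upscaled))
        x = F.relu(self.map(x))
        return self.reconstruct(x)

if __name__ == "__main__":
    model = TinySRCNN()
    grainy = torch.rand(1, 1, 64, 64)   # stand-in for a low-resolution frame or scan
    sharp = model(grainy)               # shape (1, 1, 128, 128) after 2x upscaling
    print(sharp.shape)

In practice a model like this is trained on pairs of deliberately downsampled and original images, so at inference time it can reconstruct plausible detail from a genuinely low-resolution input.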

Another one is simulated environments for training. So, there’s cases where the data itself is hard to get, right? So, things like rare events, like car crashes. Or if you think about gun detection, you want your models around these things to be really accurate, but it’s hard to get data to train your models with. So just like in a video game, where you have a simulated environment, you can do the same thing to create data. And people like Tesla are using this for crash detection, like I mentioned, and we’re using it as well for projects internally. My favorite cases are just the really practical cases that give an organization quick wins around computer vision, and they can be small cases that provide really high value. So, one that we’ve worked on is just counting cattle accurately, and that represents tens of millions of dollars in savings for a company that we’re working with. And then there’s more in agriculture—where you can monitor pests. And so you can see if you have a pest situation in your fields and what you can do about it. Or even looking at bruising in your fruit—things like that. So, it’s really across industries, and there’s so much, well, low-hanging fruit, as we were talking about agriculture, where you can really build on quick wins in an organization.
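To make the simulated-environment idea concrete, here is a toy synthetic-data generator in Python: it renders randomized scenes and writes bounding-box labels next to each image, which is the basic pattern simulators use to manufacture training data for rare events. The render_scene function, the rectangle "objects," and the JSON label format are hypothetical choices for illustration, not Plainsight's or Tesla's actual pipelines, which rely on far richer, game-engine-quality rendering.

# Toy synthetic-data generator: every rendered image comes with free,
# perfectly accurate labels, since the generator knows what it drew.
import json
import random
import numpy as np
import cv2

def render_scene(width=320, height=240, max_objects=3):
    """Draw random rectangles on a noisy background and return the image
    plus bounding-box labels, mimicking auto-labeled simulator output."""
    image = np.random.randint(0, 40, (height, width, 3), dtype=np.uint8)
    labels = []
    for _ in range(random.randint(1, max_objects)):
        w, h = random.randint(20, 80), random.randint(20, 80)
        x = random.randint(0, width - w)
        y = random.randint(0, height - h)
        color = tuple(int(c) for c in np.random.randint(80, 255, 3))
        cv2.rectangle(image, (x, y), (x + w, y + h), color, thickness=-1)
        labels.append({"class": "object", "bbox": [x, y, w, h]})
    return image, labels

if __name__ == "__main__":
    # Generate a small batch of image/label pairs ready for a training pipeline.
    for i in range(5):
        img, boxes = render_scene()
        cv2.imwrite(f"synthetic_{i}.png", img)
        with open(f"synthetic_{i}.json", "w") as f:
            json.dump(boxes, f)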

Kenton Williston: It’s just all over the place, right? Anything that you can think of that might fall into that category of an industrial, rugged kind of application, there’s all kinds of interesting new use cases cropping up. And one of the things that I think is really noteworthy here is a lot of these emerging applications, like in the agricultural sector, are in places where you don’t traditionally think of there being organizations with data science teams or anything like that. Now, I will say a little aside here, that sometimes people think of farming as being low tech, but really it’s not. People have been using a lot of technology in a lot of ways for a long time, but nonetheless, this is still an industry that’s not typically thought of as being a super high-tech industry, and certainly not one where you would expect to find data scientists. Which leads me to the question of how can organizations like this, first of all, realize that they have use cases for computer vision? And, second of all, actually do something to take advantage of those opportunities. So, Elizabeth, I’ll toss that over to you first.

Elizabeth Spears: Yeah. So this is kind of why we built the platform the way we did. First, hiring machine learning and data science talent is really difficult right now. And then, even if you do have those big teams, building out an end-to-end platform to be able to build these models, train them, monitor them, deploy them, and keep them up to date, and kind of the continuous training that many of these models require to stay accurate—it requires a lot of different types of engineers, right? You need the site-reliability guys. You need the big data guys. You need a big team there. So it’s a huge undertaking if you don’t have a tool for it. So that’s why we built this platform end-to-end, so that it would make it more accessible and simpler for organizations to just be able to adopt it. And, like I was saying, I feel like often we talk about AI as: the organization has to go through a huge AI transformation, and it has to be this gigantic investment, and time, and money. But what we find is that when you can implement solutions in weeks, you get these quick wins, and then that is really what starts to build value.

Kenton Williston: Yeah, that’s really interesting. And I think the general trend here is toward making the awareness of what computer vision can do for an organization so much more widespread, and getting people thinking about things differently. And then I think where a lot of folks are running into trouble is that, “Okay, we’ve got an idea. How do we actually do something with that?” And I think tools like Plainsight are a critical, critical part of that. But I know Intel’s also doing a lot of work to democratize AI. And, Bridget, I’d love to hear from your point of view what some of the biggest challenges are, and what Intel’s doing to address those challenges and make these capabilities more broadly available.

Bridget Martin: Yeah. I mean, like I was saying toward the beginning, complexity is absolutely the biggest barrier to adoption when we’re talking about AI in any sort of industrial application and scenario. And a lot of that goes to some of the points that you and Elizabeth were making around the fact that data scientists are few and far between. They’re extremely expensive in most cases. And in order to really unleash the power of this technology, this concept of democratizing it and enabling those farmers themselves to be able to create these AI-training pipelines and models, and do that workflow that Elizabeth was describing as far as deploying them and retraining and keeping them up to date—that’s going to be the ultimate holy grail, I think, for this technology, and really puts it in that position where we’re going to start seeing some significant, world-changing capabilities here.

And so of course that’s, again, top of mind for me as we’re trying to enable this concept of Industry 4.0. And so Intel is doing a multitude of things in this space. Whether it’s through our efforts like Edge Insights for Industrial, where we’re trying to help stitch together this end-to-end pipeline and really give that blueprint to the ecosystem of how they can create these solutions. Or it’s even down to configuration-deployment tools, where we’re trying to aid systems integrators on how they can more easily install a camera, determine what resolution it needs to run at, help fine-tune the lighting conditions—because these are all factors that greatly impact the training pipeline and the models that ultimately get produced. And so being able to enable deployment into those unique scenarios and lowering the complexity that it takes to deploy them—that’s ultimately what we’re trying to achieve.

Kenton Williston: Yeah, absolutely. One thing that strikes me here is that there is a bit of a shift in mindset that I think is required, right? So, what I’m thinking about here is that I think in large part—because of the complexity that has traditionally been associated with AI and computer vision, and when organizations are thinking about what they can do with their data—I think oftentimes there’s kind of a top-down, “let’s look for some big thing that we can attack, because this is going to require a lot of effort and a lot of investment for us to do anything with this technology.” And I think there are certainly going to be cases where that approach makes sense. But I think there are a lot of other cases, like we’ve been talking about, and you’ve got all these very niche, specialized scenarios, where really the way that makes sense to do it is to just solve these small, low-hanging fruit problems one at a time, and build up toward more of an organization-wide adoption of computer vision. So, Elizabeth, I’d like to hear how you’re approaching that with your customers—what kind of story, how you’re bringing them this kind of “aha” moment, and what gets them to think a little bit differently about how they can deploy this technology.

Elizabeth Spears: Yeah. And I want to take a second just to really agree with Bridget there on how challenging and interesting some of the on-the-ground, real-world things that come up with these deployments are, right? So, it’s like putting up those cameras and the lighting, like Bridget was saying, but then things come up—like all of a sudden there’s snow, and no one trained for snow. Or there’s flies, or kind of all of these things that will come up in the real world. So, anyway, that was just an aside of what makes these deployments fun and keeps you on your toes. It’s really about expanding AI through accessibility, for us. AI isn’t just for the top five largest companies in the world, right? We want to make it accessible not just through simplified tools, but also simplified best practices, right? So, when you can bake some of those best practices into the platform itself, companies and different departments within companies have a lot more confidence using the technology. So, like you’re saying, we do a lot of education in our conversations, and we talk to a lot of different departments. So we’re not just talking to data scientists. We like to really dig into what our customers need, and then be able to talk through how the technology can be applied.

Kenton Williston: To me, a lot of what I’m hearing here is you’ve actually got a very different set of tools today, and it requires a different way of thinking about your operations. Because you’ve got all these new tools and because they’re available to such a wide array of users, there are a lot of different ways you can go after the business challenges that you’ve got. And, Bridget, this brings me to a question—kind of a big-picture question: what do you see as the benefits of democratizing AI and computer vision in this way, and making these sorts of capabilities available to folks who are experts in their areas of work, but not necessarily experts in machine learning and computer vision and all the rest?

Bridget Martin: Oh my goodness, it’s going to be huge. When we’re talking about what I would call a subject-matter expert, and really putting these tools in their hands to get us out of this cycle where it used to have to be, again—taking that quality-inspection use case—something that we can all kind of baseline on: you have a factory operator who would typically be sitting there manually inspecting each of the parts going through. And when you’re in the process of automating that type of scenario, that factory operator needs to be in constant communication with the data scientist who is developing the model so that that data scientist can ensure that the data that they’re using to train their model is labeled correctly. So now think if you’re able to take out multiple steps in that process, and you’re able to enable that factory operator or that subject-matter expert with the ability to label that data themselves—the ability to create a training pipeline themselves. These all sound like crazy ideas—enabling non–data scientists to have that function—but that’s exactly the kind of tooling that we need in order to actually properly democratize AI.

And we’re going to start to see use cases that myself or Elizabeth or the plethora of data scientists that are out there have never thought about before. Because when you start to put these tools in the hands of people and they start to think of new creative ways to apply those tools to build new things—this is what I was talking about earlier—this is when we’re really going to see a significant increase, and really an explosion of AI technologies, and the power that we’re going to be able to see from it.

Kenton Williston: Yeah. I agree. And it’s really exciting even just to see how far things have come. Like I said, you don’t have to go back very far—six months, a year—and things are really, really different already. I can barely even picture where things might go next. Just, everything is happening so fast, and it’s very, very exciting. But this does lead me to, I think, a big question. Which is, well, where do organizations get started, right? This is so fast moving that it can seem, I’m sure, overwhelming to a lot of organizations to even know where to begin their journey. So, Elizabeth, where do you recommend companies start?

Elizabeth Spears: Yeah. So, I mean, there are so many great resources out there on the internet now, and courses, and a lot of companies doing webinars and things like that. Here at Plainsight we have a whole learning section on our website that has an events page. And so we do a lot of intro-to-computer-vision-type events for beginners, but we also have events for experts, so they can see how to use the platform and how they can speed up their process and have more reliable deployments. We really like being partners with our customers, right? So we research what they’re working on. We find other products that might apply as well. And we like kind of going hand in hand and really taking them from idea, all the way to a solution that’s production ready and really works for their organization.

Kenton Williston: That makes a lot of sense. And I know, Bridget, that was a lot of what you were talking about in terms of how Intel is working through its ecosystem. Sounds like there’s a lot of work you’re doing to enable your partners and, I imagine, even some of your end users and customers. Can you tell me a little bit more about the way that that looks in practice?

Bridget Martin: Yeah, absolutely. So, one of my favorite ways of approaching this sounds very similar to Elizabeth’s: really partnering with that end customer—understanding what they’re ultimately trying to achieve, and then working your way backward through that. So, this is where we pull in our ecosystem partners to help fill those individual gaps between where the company is today and where they’re wanting to go. And this is one of the great things about AI—it’s what I like to call a bolt-on workload—where you’re not having to take down your entire manufacturing process in order to start dabbling or playing with AI. And it’s starting to discover the potential benefit that it can have for your company and your ultimate operations. It’s relatively non-invasive to deploy a camera and some lighting and point it at a tool or a process—versus having to bring down an entire tool and replace it with a brand new, very large piece of equipment. And so that really is going to be one of the best ways to get started. And we of course have all kinds of ecosystem partners and players that we can recommend to those end customers, who really specialize in the different areas that they’re either wanting to get to or that they’re experiencing some pain points in.

Kenton Williston: So you’re raising a number of really interesting points here. One is, I love this idea of the additive workload, and very much agree with that, right? I think that’s one of the things that makes this whole field of AI—but particularly computer vision—so incredibly powerful. And the other thing that I think is really interesting about all of this is because there are so many point use cases where you can easily add value by just inserting a camera and some lighting somewhere into whatever process you’re doing, I think it makes this a sort of uniquely easy opportunity to do sort of proofs of concept—demonstrate the value, even on a fairly limited use case, and then scale up. But this does lead me to a question about that scaling, right? While it’s great to solve a bunch of little point use cases, at some point you’re going to want to tie things together, level things up. And so I’d be interested in hearing, Elizabeth, how Plainsight views this scaling problem. And I’m also going to be interested in hearing about how Intel technology impacts the scalability of these solutions.

Elizabeth Spears: We’re looking at scale from the start, because, really, the customers that we started with have big use cases with a lot of data. And then the other way that you can look at scale is spreading it through the organization. And I think that really comes back to educating more people in the organization that they can really do this, right? Especially in things like agriculture—someone who’s in charge of a specific field or site or something like that may or may not know all the places that they can use computer vision. And so what we’ve done a lot of is we’ll talk to specific departments within a company. And then they say, “Oh, I have a colleague in this other department that has another problem. Would it work for that?” And then it kind of spreads that way, and we can talk through how those things work. So I think there’s a lot of education in getting this to scale for organizations.

Kenton Williston: And how is Intel technology, and your relationship with Intel more broadly, helping you bring all these solutions to all these different applications?

Elizabeth Spears: They’re really amazing with their partners, and bringing their partners together to give enterprises really great solutions. And not only with their hardware—but definitely their hardware is one of the places that we utilize them, because we’re just a software solution, right? And so we really need those partners to be able to provide the rest of the full package, to be able to get a customer to their complete solution.

Kenton Williston: Makes sense. We’re getting close to the end of our time together, so I want to spend a little bit of time here just kind of looking forward and thinking about where things are going to go from here. Bridget, where do you see some of the most exciting opportunities emerging for computer vision?

Bridget Martin: Elizabeth was just touching on this at the end, and when we’re talking about this concept of scalability, it’s not just scaling to different use cases, but we also need to be enabling the ability to scale to different hardware. There’s no realistic scenario where there is just one type of compute device in a particular deployment. It’s always going to be heterogeneous. And so this concept—and one of the big initiatives that Intel is driving around oneAPI and “Write once. Deploy anywhere”—I think is going to be extremely influential and help really transform the different industries that are going to be leveraging AI. But then, also, I think what’s really exciting coming down the line is this move, again, more toward democratization of AI, and enabling that subject-matter expert with either low-code or no-code tooling—really enabling people who don’t necessarily have a PhD or specialized education in AI or machine learning to still take advantage of that technology.

Kenton Williston: Yeah, absolutely. So, Elizabeth, what kind of last thoughts would you like to leave with our audience about the present and future of machine vision, and how they should be thinking about it differently?

Elizabeth Spears: I think I’m going to agree with Bridget here, and then add a little bit. I think it’s really about getting accessible tools into the hands of subject-matter experts and the end users, making it really simple to implement solutions quickly, and then being able to expand on that. And so, again, I think it’s less about really big AI transformations, and more about identifying all of these smaller use cases or building blocks that you can start doing really quickly, and over time make a really big difference in a business.

Kenton Williston: Fabulous. Well, I look forward very much to seeing how this all evolves. And with that, I just want to say, thank you, Elizabeth, for joining us today.

Elizabeth Spears: Yeah. Thank you so much for having me.

Kenton Williston: And Bridget, you as well. Really appreciate your time.

Bridget Martin: Of course. Pleasure to be here.

Kenton Williston: And thanks to our listeners for joining us. To keep up with the latest from Plainsight, follow them on Twitter at @PlainsightAI, and on LinkedIn at Plainsight.

If you enjoyed listening, please support us by subscribing and rating us on your favorite podcast app. This has been the IoT Chat. We’ll be back next time with more ideas from industry leaders at the forefront of IoT design.

The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.

About the Author

Kenton Williston is an Editorial Consultant to insight.tech and previously served as the Editor-in-Chief of the publication as well as the editor of its predecessor publication, the Embedded Innovator magazine. Kenton received his B.S. in Electrical Engineering in 2000 and has been writing about embedded computing and IoT ever since.
