SAFETY AND SECURITY

The Future of Video Analytics Is Virtualized

Cameras are all over modern life—and we’re not just talking selfies. They’re ubiquitous in security, manufacturing, and traffic regulation. And with lots of cameras come lots of data: data to be collected, data to be accessed, and data to be analyzed, as it becomes more and more important for businesses to extract its value. But after it’s been collected and before it’s been analyzed, that data has to be stored. And that, it turns out, can be a real problem.

Virtualization might just be the solution to that problem, and systems integrators are rapidly figuring that out, says Darren Giacomini, Director of Business Development at BCD, a video storage solution provider. Here he discusses this transition to virtualization, and the benefits it can bring to both businesses and the general public—from a foil to cybercrime to an empty parking space (Video 1).

Video 1. BCD's Darren Giacomini discusses why businesses should start making the switch to virtualization. (Source: insight.tech)

What are some trends in the way we benefit from video camera systems?

Video camera systems and analytics are becoming incredibly powerful. Cameras can pull in a lot of analytical data and metadata to be analyzed, and we’re seeing a lot of applications for cameras specifically in IoT devices that reside at the edge of networks. People are building in analytics to do things like count objects in manufacturing, but cameras are also starting to be used as IoT devices at the edge in smart cities that can, for example, look at parking lots and determine which spaces are available. You can now open your smartphone and check for a spot, instead of endlessly driving around looking for a place to park.
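
To make the edge-analytics idea concrete, here is a minimal sketch of parking-occupancy detection in Python with OpenCV. The interview doesn’t describe any specific implementation; the camera source, ROI coordinates, and threshold below are hypothetical illustrations of the technique.

```python
# Minimal sketch of edge-side parking-occupancy detection (hypothetical values).
import cv2

# Hypothetical pixel rectangles (x, y, w, h) marking three parking spaces.
SPACES = {"A1": (40, 120, 80, 160), "A2": (140, 120, 80, 160), "A3": (240, 120, 80, 160)}
OCCUPIED_RATIO = 0.25  # fraction of foreground pixels that counts as "occupied"

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

def occupancy(frame):
    """Return {space_id: True if occupied} for one video frame."""
    mask = subtractor.apply(frame)  # foreground mask from background subtraction
    status = {}
    for space_id, (x, y, w, h) in SPACES.items():
        roi = mask[y:y + h, x:x + w]
        status[space_id] = cv2.countNonZero(roi) / float(w * h) > OCCUPIED_RATIO
    return status

cap = cv2.VideoCapture(0)  # stand-in for an edge camera stream
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    print(occupancy(frame))  # in practice, publish only state changes upstream
cap.release()
```

The point of running this at the edge is that only a few bytes of occupancy state need to leave the camera, not the full video stream.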

What challenges do businesses face with their camera systems?

You’re seeing a trend where people are starting to expand their retention periods—how long you have to keep the video at full frame rate. Some correctional facilities want to keep it for two years for its evidentiary value.

When you start talking about holding high-quality video for that time frame, you’re talking about an enormous amount of storage—petabytes and petabytes of storage. When you look at smart cities that may have thousands of cameras throughout their environs, storing all the data from all those cameras all the time can become not only incredibly expensive but difficult to maintain properly.
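The scale he describes is easy to sanity-check with back-of-the-envelope arithmetic. In this sketch the bitrate and camera count are illustrative assumptions, not figures from the interview:

```python
# Back-of-the-envelope retention-storage estimate (all figures illustrative).
CAMERAS = 2_000        # a mid-size smart-city deployment
BITRATE_MBPS = 4       # a common H.264 rate for 1080p at full frame rate
RETENTION_DAYS = 730   # two-year evidentiary retention

bytes_total = CAMERAS * (BITRATE_MBPS * 1_000_000 / 8) * 86_400 * RETENTION_DAYS
print(f"{bytes_total / 1e15:.1f} PB")  # -> 63.1 PB
```

Two thousand modest cameras kept for two years already lands in the tens of petabytes, before any redundancy or second-tier copies.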

So a lot of what you’re seeing in the movement to 5G networking and IoT devices at the edge is about people trying to push the decision-making out to the edge in order to determine what’s important for video and what’s not. A little-known fact is that, in most cases, maybe only 5% to 10% of the video that’s recorded is ever used. The rest of it is just being stored.

For instance, you can run a search over a two-year period of data that says: I want all white trucks that went this direction, during this time frame, on any day. And you can pull that video back and see it, based on the analytics. But the idea of doing that at the edge with 5G is that, if you can determine what’s important and what’s not, then you don’t have to store everything else. I think analytics is going to play a huge, huge part in scaling back the massive amount of data and resources that we’re currently seeing.
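
A search like that works because the analytics write searchable metadata alongside the stored video. Here is a minimal sketch of that kind of filter; the record fields and clip paths are hypothetical:

```python
# Sketch of an analytics-metadata search; field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Detection:
    timestamp: datetime
    object_type: str
    color: str
    heading: str    # e.g., "north"
    clip_uri: str   # pointer to the stored video segment

def search(records, object_type, color, heading, start, end):
    """Matching detections whose time of day falls in [start, end), on any date."""
    return [r for r in records
            if (r.object_type, r.color, r.heading) == (object_type, color, heading)
            and start <= r.timestamp.time() < end]

records = [
    Detection(datetime(2023, 3, 14, 7, 42), "truck", "white", "north", "clips/0314-0742.mp4"),
    Detection(datetime(2023, 6, 2, 22, 10), "truck", "white", "south", "clips/0602-2210.mp4"),
]
for hit in search(records, "truck", "white", "north", time(6, 0), time(9, 0)):
    print(hit.clip_uri)  # only these clips need to be pulled back and watched
```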

“It’s really about utilizing your resources more efficiently. Also, when you talk about #virtualization you’re talking about the ability to create recovery points and snapshots.” – Darren Giacomini, @BCDVideo via @insightdottech

And I think the whole approach is going to change. Today the idea is: Keep everything for two years. And over the years we’ve seen that people have changed the rules a little bit. Everything has to be kept at 30 or 60 frames per second for maybe six months, and then it drops down to 15 frames per second. But what we can’t do is drop it below the threshold needed for evidentiary value in municipal courts, so you can actually identify what you’re looking at.
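
That stepped-down policy translates directly into capacity savings, since storage scales roughly with frame rate at a fixed quality (real codecs don’t scale perfectly linearly, so treat this as a rough sketch with illustrative numbers):

```python
# Storage under a tiered retention policy; all figures illustrative.
FULL_RATE_GB_PER_DAY = 43.2  # one camera at 30 fps, about 4 Mbps

def tiered_storage_gb(tiers):
    """tiers: list of (days, frames_per_second) steps."""
    return sum(days * FULL_RATE_GB_PER_DAY * (fps / 30) for days, fps in tiers)

flat = tiered_storage_gb([(730, 30)])               # 30 fps for two full years
tiered = tiered_storage_gb([(180, 30), (550, 15)])  # 30 fps for 6 months, then 15 fps
print(f"flat: {flat / 1000:.1f} TB, tiered: {tiered / 1000:.1f} TB")  # 31.5 vs. 19.7
```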

Why is data storage such a problem for businesses in the video space?

Standard production servers more often than not cater to the IT environment. In an IT world you have a lot of data that’s stored in a repository or data center, and the data’s going outward to a few selected individuals who request it at a given time.

But with physical security, for example, you have hundreds or thousands of cameras and IoT devices simultaneously bringing data inward. And what we do at BCD is specialize in redesigning and re-implementing those particular devices to make sure they’re optimized for that particular type of application.

What’s the role of virtualization in easing some of this storage congestion?

It’s the utilization of resources. In a typical physical security environment, you’re going to have cameras that reside at the edge, or IoT devices or sensors that are bringing data back. You’re going to have a network infrastructure—whether it’s wireless or a hardwired network infrastructure—that’s going to bring it all back to a centralized or decentralized point of recording where it’s stored. In some cases you may also have second-tier storage behind that.

And then you have people who are actually watching the cameras, and who need to see a particular event. You’ve got a car accident; you need to be able to pull up the appropriate video in real time and actually see what’s going on. That requires taking the data and either bringing it directly from the camera or redirecting it through the server out to the workstation. All of that utilizes resources.

But the most important segment in there is where the video is stored. Servers have finite resources: You have CPU, you have memory, you have network resources. When you’re taking a bare-metal server approach and you’re not virtualizing, you may be leaving 40% or 45% of the CPU cycles in that server’s cores unutilized. And it has nothing to do with the server’s capability in itself; it may have to do with the fact that you’re running on Windows Server 2019, or whatever, and you can only load one instance of that software application.

So virtualization allows you to add an abstraction layer—like VMware ESXi, Hyper-V, or Nutanix—to the bare-metal server as an archiver. You virtualize maybe the directory servers or the access control into a flat-file structure that can be stored at a common share point. Then you have the ability to create more than one instance of the application on that machine. So instead of running just one instance of Windows Server 2019 on a server, maybe you run five, and you divide the resources up. Then you can take that CPU and memory that wouldn’t traditionally be utilized and get more production out of what you bought.
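
The arithmetic behind “maybe you run five” is simple capacity planning: divide the host’s cores and memory by what one recorder instance needs. A toy sketch, where the host and per-VM figures are hypothetical:

```python
# Toy capacity planner: how many recorder VMs fit on one host? (hypothetical figures)
HOST_CORES, HOST_RAM_GB = 32, 256
VM_CORES, VM_RAM_GB = 6, 48  # one recording-server instance

instances = min(HOST_CORES // VM_CORES, HOST_RAM_GB // VM_RAM_GB)
print(f"{instances} instances per host")                     # -> 5
print(f"cores in use: {instances * VM_CORES}/{HOST_CORES}")  # 30/32, vs. 6/32 bare metal
```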

Naturally you’d think—for a company like BCD, where we sell high-performance servers—that this would be something we wouldn’t want to happen, but it’s going to happen regardless. Virtualization has been in the IT field for a very long time; there’s nothing you can do about it. You have to embrace the fact that people want to do more with less.

How can businesses move to virtualization successfully?

It’s very similar to what’s already happened with the shift from analog to digital or IP. I think there are going to be a lot of the same growing pains, where people who have a particular skill set that they’re used to today are going to have to modify their approach. But if you don’t modify your approach, where you’re really going to feel it is in your pocketbook.

Because the fact of the matter is, if you’re quoting eight servers and I can do the same thing with three, I’m going to outbid you—even if we’re close, or even if I’m a little bit more. Then I go back to the people who are making the decisions on the financial side and tell them: “Take a look at the total cost of ownership of running eight servers versus three. This is going to be more efficient; this is going to take up less real estate in your data center; this is going to take less power to keep it cool.” All these things come into play.
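
That pitch to the financial side is, in the end, arithmetic. A rough comparison sketch; every per-server figure here is a hypothetical placeholder, not a BCD number:

```python
# Rough five-year cost comparison: 8 bare-metal servers vs. 3 virtualized hosts.
def five_year_cost(servers, unit_price, watts, kwh_rate=0.12, cooling_overhead=0.5):
    """Hardware plus energy (with a cooling multiplier) over five years."""
    energy_kwh = servers * (watts / 1000) * 24 * 365 * 5 * (1 + cooling_overhead)
    return servers * unit_price + energy_kwh * kwh_rate

print(f"8 bare-metal:  ${five_year_cost(8, 9_000, 450):,.0f}")   # ~ $100,400
print(f"3 virtualized: ${five_year_cost(3, 14_000, 600):,.0f}")  # ~ $56,200
```

Even with beefier (and pricier) virtualization hosts, fewer boxes tend to win on hardware, power, cooling, and rack space combined.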

And I think there’s going to be a little bit of a struggle, but I don’t think it’s going to be as big as with analog to digital or IP. Everybody’s used to working with Windows now; everybody’s used to working with servers. It’s just that next step of learning that, instead of plugging into the front of a server, you’re going to get into a webpage that’s going to represent that server, and that’s the virtual machine that you’re going to work off of. It will be an educational experience, and people are probably going to have to send their employees out for some training to come up to speed on it.

Do you have any use cases you can share?

There’s a city on the East Coast where the entire municipality is running off of our Revolv platform, a hybrid hyperconverged approach based on virtualization that provides high availability. It literally cut down on getting people up at 2:00 or 3:00 in the morning to deal with some server that had lost a power supply, because the platform’s automated-recovery feature set handles that for you. And we gave them a much, much smaller footprint—4 or 5 servers to run everything in a virtualized environment versus 20 or 25.

What role does Intel technology play in making that happen?

One of the things that sets BCD apart is that we don’t just build servers; we really do holistic, end-to-end solution sets. And that also plays a part in this virtualization that we’re all headed toward. That’s where the partnership with Intel comes in. Intel has been really good about providing resources, like these 100-gig NICs—which are not cheap—and other resources that we need to do analytics and things of that nature to help push the envelope.

Beyond the physical aspects, where do you see virtualization going?

It’s really about utilizing your resources more efficiently. Also, when you talk about virtualization you’re talking about the ability to create recovery points and snapshots. We partner with another company called Tiger Technology that is, in my opinion, absolutely outstanding at next-generation hybrid-cloud storage. That means you have an on-prem presence with the ability to get hooks into NTFS, the actual file structure inside of Windows, and make it an extension of the platform. So at any given time you can take backups, or multiple instances of backups, and push those out to the cloud for recovery. You really can’t do that type of thing in a bare-metal environment.

If you can take a snapshot and create a repository of snapshots, you have options when something goes wrong. Because what is your disaster-recovery plan? What is your business-continuity plan if you get hit? And the fact of the matter is that everybody is going to get hit. I don’t care how careful you are, there are zero-day exploits out there that are going to hit you at some point.

Mostly what you’re seeing today is ransomware. So when you’re in a virtualized environment and taking regular snapshots you can actually say, “This is an acceptable loss. Roll the snapshot back one week. Let’s take the one-week loss rather than paying.”
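
In code, that rollback decision reduces to picking the newest snapshot known to predate the compromise. A minimal sketch with a hypothetical weekly snapshot schedule:

```python
# Pick the newest clean snapshot after a ransomware hit (hypothetical schedule).
from datetime import datetime, timedelta

snapshots = [datetime(2023, 9, 1) + timedelta(weeks=i) for i in range(6)]  # weekly
compromise = datetime(2023, 10, 4, 3, 15)  # ideally the intrusion time, not detection

clean = [s for s in snapshots if s < compromise]
rollback_to = max(clean) if clean else None
print(f"roll back to {rollback_to:%Y-%m-%d}; "
      f"accepted loss: {(compromise - rollback_to).days} days")
```

The safe anchor is the intrusion time rather than the detection time, which is why keeping a deep repository of snapshots matters.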

Is there anything else about this topic you think we should know?

It just really comes down to the fact that if you’re an integrator out there and you deal in the physical security market, you can ill afford to ignore the fact that virtualization is coming. I predict that in the next three to five years the large majority of the market is going to move to mainstream virtualization. And you either need to get on board with that, or you’re going to find yourself in a situation where it’s going to be very, very difficult to be competitive in the market.

Related Content

To learn more about virtualization within the video analytics space, listen to The Impact of Virtualization on Video Analytics: With BCD. For the latest innovations from BCD, follow them on Twitter at @BCDvideo and on LinkedIn.

This article was edited by Christina Cardoza, Editorial Director for insight.tech.

About the Author

Erin has been an editor in San Francisco since the days when there was a print publishing industry there—and still has the red pencils to prove it. As an editor, technical writer, copy editor, and proofreader she has a sharp eye for consistency, an aptitude for clear and precise English, and proficiency with Chicago style. She specializes in converting technical or industry-heavy language to intelligible prose for the layperson or new reader, with as light or heavy a touch as the circumstances require. Experience with consumer-tech communications, business and banking publications, advertising, and corporate journalism.
