Edge Video Analysis (EVA) Pulls the Smarts from the Cloud

There are good reasons why it’s better to perform your calculations at the Edge rather than in the Cloud. EVA makes that happen.

Artificial intelligence, aka AI, is all around us. It’s used to analyze images and videos, and from that analysis, it can detect and identify objects and people, and derive actionable information. While the benefits of this technology are many, and we’re just scratching the surface of what can be accomplished, it’s a fairly complex technology that requires a tremendous amount of computational resources.

Artificial intelligence (AI) is here. The more AI we can perform at the Edge, the better (and faster) analytics we can expect. (image source: builtin.com)

For this reason, most of the computations associated with AI have traditionally been handled in the Cloud, where the compute-heavy resources reside. However, Cloud computing comes with some baggage, not the least of which is the latency associated with having to move data from the Edge to the Cloud. Such delays are unacceptable for a large percentage of mission-critical and business-critical applications.

A second downside to Cloud computing is that transmitting the data can get very expensive very quickly. Video data grows large very fast, meaning that the cost to transmit it grows accordingly.

Finally, there’s the security issue. While the Cloud itself may be secure, it’s hard to ensure that every node along the way is also locked down.

EVA Improves Visibility, Safety

Thanks to the latest advances in computing technology and AI algorithms, it’s now possible to perform edge video analysis (EVA), a technology that performs video analysis in real time at the Edge, where the data originates.

This is possible because many AI algorithms, like those involving matrix operations, benefit from parallel processing and today’s extremely powerful microprocessor units (MPUs) can be dramatically boosted when combined with graphics processing units (GPUs). The latest GPUs are designed with thousands of small processors, each with its own local memory. In the EVA application, the GPUs execute the video analysis AI algorithms in a massively parallel fashion.
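As a simplified illustration of why these workloads parallelize so well (this is not ADLINK’s actual pipeline, and the sizes below are invented), the core of many vision models reduces to large matrix multiplications. A single matrix multiply covers an entire batch of frame features, and on a GPU each output element can be computed by an independent thread:

```python
import numpy as np

# Hypothetical sizes: a batch of 8 video frames, each reduced to a
# 1024-element feature vector, passed through a 1024x256 dense layer.
rng = np.random.default_rng(0)
batch = rng.random((8, 1024), dtype=np.float32)    # per-frame features
weights = rng.random((1024, 256), dtype=np.float32)

# One matrix multiply processes the whole batch at once; on a GPU,
# each of the 8*256 output elements maps to an independent thread.
activations = np.maximum(batch @ weights, 0.0)     # ReLU non-linearity

print(activations.shape)  # (8, 256)
```

The same structure repeats layer after layer in a neural network, which is why thousands of small GPU cores with local memory can dramatically outpace a general-purpose processor on this class of algorithm.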

Don’t be surprised if EVA use becomes relatively ubiquitous in the near future. Here are three real-world examples that demonstrate the capabilities of EVA, all using ADLINK edge computing platforms. First is a system that operates in conjunction with an offshore drilling rig, an extremely hostile environment. Systems deployed there must endure shocks, vibration, noisy power supplies, wide-ranging temperature swings, high humidity, and saltwater. In addition, Internet connections can be quite unreliable.

The safety and security of an off-shore oil rig can be greatly improved thanks to ADLINK-powered EVA. (source: imeche.org)

In this example, high-resolution cameras augmented with ADLINK edge computing systems monitor the main drill assembly. The system observes the speed and positioning of the clamps as they attach to the drilling apparatus, and immediately warns the operators if something is not right. A second benefit, because the system can recognize humans, is that it can issue a warning should a person venture somewhere they don’t belong.
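The restricted-zone warning described above comes down to a simple geometric test once the AI has produced labeled bounding boxes. Here is a minimal sketch, with invented coordinates and function names, of how such a check might look:

```python
# Hypothetical restricted zone, as (x1, y1, x2, y2) pixel coordinates.
RESTRICTED_ZONE = (400, 200, 800, 600)

def boxes_overlap(a, b):
    """True if two (x1, y1, x2, y2) rectangles intersect."""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

def check_detection(label, box):
    """Return a warning string if a person is inside the zone, else None."""
    if label == "person" and boxes_overlap(box, RESTRICTED_ZONE):
        return f"WARNING: person detected in restricted zone at {box}"
    return None

print(check_detection("person", (500, 300, 560, 420)))  # inside: warning
print(check_detection("person", (10, 10, 60, 120)))     # outside: None
```

In practice the detector runs on every frame, so the check would be applied to each labeled box the model emits, with debouncing to avoid repeated alarms for the same person.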

Next is the example of high-speed trains. While offering such benefits as reduced congestion and higher productivity, the trains can be dangerous simply because they are traveling at such high speeds. There’s less time for an operator to respond to an obstruction on the track, such as animals or people, or to deformations in the tracks.

High-resolution cameras boosted by ADLINK edge computing systems can detect a problem on the track that could be a kilometer away. Bear in mind that systems housed on trains must be able to withstand shocks, vibration, and noisy power supplies per regulatory requirements.

The next example is the modern airport. It’s no secret that passenger traffic continues to rise. In fact, the world’s busiest airports see a plane taking off or landing every 35 to 45 seconds and about 200,000 to 300,000 passengers every day. With so many people, planes, other forms of transport, etc., moving around, the potential for problems is high.

Many aspects of airport use can be enhanced using ADLINK-powered EVA. That includes any moving transport, such as planes and trams. (source: airport-technology.com)

One airport in Asia arrived at a solution, thanks to ADLINK’s edge computing system. There, the edge computing system constantly monitors the runways, taxiways, and terminals to detect and identify potential problems. The live video feeds come from ten cameras mounted around the top of the control tower. Each camera has a 4K resolution, and the images are stitched together in real time to provide a seamless 360-degree panoramic view. The EVA’s artificial intelligence system observes the movements of taxiing planes, people, and other activities. If something is not as it should be, an alarm is signaled to an operator.

In addition, the edge computing system is connected directly to the scheduling system, so it knows which planes have been instructed to land, take off, and taxi, and where they should be at any given time. If a plane is not doing what it is supposed to, the alarm sounds. While not enabled today, it’s possible for the edge computing system to take over control of the situation and issue instructions to the humans operating the machines.
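Cross-checking observations against the schedule is, at its core, a comparison between where each aircraft is supposed to be and where the cameras say it is. The following sketch uses invented flight IDs, zone names, and a deliberately simplified data model to show the idea:

```python
# Hypothetical schedule: flight ID -> zone the aircraft is cleared for.
schedule = {
    "FL123": "runway_09L",   # cleared for takeoff
    "FL456": "taxiway_B",    # instructed to taxi
}

def check_movements(observed):
    """Compare camera-observed positions against the schedule.

    `observed` maps flight ID -> zone where EVA currently sees the
    aircraft. Returns a list of alarm strings for any mismatch.
    """
    alarms = []
    for flight, zone in observed.items():
        expected = schedule.get(flight)
        if expected is None:
            alarms.append(f"{flight}: no clearance on record (seen in {zone})")
        elif zone != expected:
            alarms.append(f"{flight}: expected in {expected}, observed in {zone}")
    return alarms

# FL123 is where it should be; FL456 has wandered onto the wrong taxiway.
print(check_movements({"FL123": "runway_09L", "FL456": "taxiway_C"}))
```

A production system would of course track positions continuously and tolerate transitions between zones, but the schedule-versus-observation comparison is the essential step that lets the system sound an alarm when a plane is not doing what it is supposed to.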

ADLINK Makes EVA a Reality

ADLINK Technology designs and manufactures a wide range of products for edge computing. The company’s edge computing solutions include GPU-accelerated board-, system-, and server-level products, enabling system architects to construct and optimize system architecture for individual EVA applications.

As we’ve seen, edge computing systems often have to be deployed in harsh environments, and they also have to exhibit the utmost reliability. ADLINK’s edge computing solutions are up to the task, as their ruggedized units are fully certified for such target environments.

Most engineers are familiar with NVIDIA’s off-the-shelf GPU cards with their integrated cooling fans. Unfortunately, while powerful, these cards aren’t always suitable for EVA applications, partly because the off-the-shelf cards typically have a shorter commercial lifespan (1.5 years). Add to that the fact that a system’s cooling fans are potential points of failure. If the fans stop working, the system stops working; and if the system stops working, everything stops working.

For the edge computing GPU subsystems that we discussed here, and others like them, ADLINK’s engineering team has taken NVIDIA’s GPUs and designed them onto MXM modules, a compact form factor smaller than traditional graphics cards. But MXM GPU modules don’t sacrifice performance. In fact, these MXM GPU modules provide equivalent processing capability while consuming less power (and producing less heat). And they have a longer commercial lifespan than conventional graphics subsystems.

Moving forward, it’s clear that systems employing edge video analytics will be deployed in diverse locations, performing a wide range of applications, all designed to make our lives easier and safer. And rest assured that ADLINK will remain at the forefront of this burgeoning technology.

Author: Zane Tsai

Director of Platform Product Center, Embedded Platforms & Modules, ADLINK Technology