How AI is reshaping the edge computing landscape

How much computing power is needed at the edge? How much memory and storage are enough for AI at the edge? Minimum requirements are growing as AI opens the door to innovative applications that need more and faster processing, storage, and memory. How can today’s memory and storage technologies meet the stringent requirements of these challenging new edge applications?


What do we mean by “the edge”?

The edge includes any distributed application in which some of the processing happens away from the central server, even if the data is eventually sent to a data center. The big idea is to avoid sending all the data over the internet for processing on a server and instead process data closer to where it's collected, avoiding the latency of long data round trips and enabling near real-time response on site.
The edge is roughly divided by the distance between the data center and the endpoint. The so-called near edge includes applications close to the data center, perhaps even within the same building, while the far edge sits at the other extreme, in applications such as autonomous vehicles. What they share is that the edge system processes data that would traditionally have been sent to a data center. This has practical applications in many industries.

 


Data latency and bandwidth at the industrial edge

In industrial applications, edge computers are typically designed to take inputs from sensors or other devices and act on them accordingly. For example, preventive maintenance takes acoustic, vibration, temperature, or pressure sensor readings and analyzes them to identify anomalies that indicate slight faults in machines. Machines can then be taken offline immediately, or scheduled for service, so maintenance happens ahead of a catastrophic failure. Reaction times must be quick, but data volumes are low. However, AI is putting a new strain on these edge systems.
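Before AI enters the picture, that traditional workload is light enough to sketch in a few lines. The example below flags vibration readings that drift outside a rolling baseline; the window size, threshold, and sensor interface are illustrative assumptions, not a particular product's code.

```python
# Minimal sketch of edge anomaly detection on a stream of vibration readings.
# WINDOW and THRESHOLD are illustrative values, not tuned recommendations.
from collections import deque
import statistics

WINDOW = 200        # number of recent samples used as the baseline
THRESHOLD = 4.0     # deviations (in standard deviations) that count as an anomaly

def monitor(sensor_readings):
    """Yield (index, value) for readings that deviate sharply from recent history."""
    history = deque(maxlen=WINDOW)
    for i, value in enumerate(sensor_readings):
        if len(history) == WINDOW:
            mean = statistics.fmean(history)
            stdev = statistics.pstdev(history) or 1e-9   # avoid division-free false alarms on a flat signal
            if abs(value - mean) > THRESHOLD * stdev:
                yield i, value   # candidate fault: alert the operator or schedule maintenance
        history.append(value)
```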


The impact of AI on edge processing loads

AI places a different kind of load on computer systems. AI workloads require faster processors, more memory, and powerful GPUs. Automated optical inspection (AOI), for example, has seen widespread adoption for PCB inspection, using video input from high-speed cameras to identify missing components and quality defects. Similar visual inspection technology is being adopted in industries as diverse as agriculture, where it can identify defects and discoloration in produce.
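As a simplified illustration of that workload, an edge inspection loop might look like the sketch below. The model file, input size, and class meanings are placeholder assumptions, and a production AOI system would be considerably more elaborate.

```python
# Hypothetical edge vision-inspection loop: grab frames from a camera and run a
# pre-trained defect-detection model locally. "defect_detector.onnx", the 224x224
# input size, and the class labels are assumptions for illustration only.
import cv2                      # OpenCV for camera capture and preprocessing
import numpy as np
import onnxruntime as ort       # lightweight inference runtime; uses the GPU when available

session = ort.InferenceSession(
    "defect_detector.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture(0)       # an industrial or PoE camera would be addressed similarly
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalize to the model's assumed NCHW float32 input
    blob = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    blob = np.transpose(blob, (2, 0, 1))[np.newaxis, ...]
    scores = session.run(None, {input_name: blob})[0]
    if scores.argmax() != 0:    # assume class 0 means "no defect"
        print("defect detected: divert part for rework")
cap.release()
```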
Running these complex algorithms on video inputs requires the parallel processing capabilities of power-hungry GPU cards, more memory for efficient and accurate AI inference, and more storage space for the additional data. But don't these resources already exist in data centers?


Bringing a sliver of data center power to the edge

Essentially, to tackle AI tasks at the edge, we're bridging the gap between the edge and the data center. Servers tucked away in temperature-controlled data centers have terabytes of memory and vast amounts of storage on hand to handle demanding, high-capacity loads and keep systems responsive. But when inference happens far from the data center, it's a different story. Edge computers don't enjoy such idyllic settings and must be built to withstand harsh environments. The edge needs hardware that strives for maximum performance while accounting for less-than-ideal conditions.


Hardware for the edge

Adding AI at the industrial edge requires hardware suited to the task. An industrial computer that can handle extreme temperatures, vibration, and space constraints is a must. In particular, vision systems, the most prolific AI application to date, need three things: memory to support efficient AI inference, storage for the incoming data, and Power over Ethernet (PoE) to support the addition of cameras.
Getting more memory into a smaller space can be accomplished with the latest DDR5. With twice the speed and four times the capacity of DDR4 in the same footprint, it provides more memory capacity at the edge at higher speeds and makes more efficient use of available space and resources.
Extended storage capacity is also needed, because the data must either travel to the server or stay at the edge for some time, so SSDs serve as interim storage. The shift from SATA to NVMe has opened the door to greater speed and performance, and the NVMe PCIe Gen4 x4 SSD, available soon, is the latest SSD in Cervoz's pipeline, delivering the industrial-grade performance these applications demand.
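A rough sketch of that interim-storage, store-and-forward pattern follows; the buffer location, file naming, and upload hook are placeholders rather than part of any particular product.

```python
# Hypothetical store-and-forward sketch: persist results to the local SSD first,
# then upload and prune when the link to the server is available. The mount point,
# file format, and the behavior of the upload hook are illustrative assumptions.
import json
import time
import uuid
from pathlib import Path

BUFFER_DIR = Path("/mnt/nvme/edge-buffer")   # assumed mount point of the local NVMe SSD
BUFFER_DIR.mkdir(parents=True, exist_ok=True)

def record(result: dict) -> Path:
    """Write one inference result to local storage before any network hop."""
    path = BUFFER_DIR / f"{time.time():.0f}-{uuid.uuid4().hex}.json"
    path.write_text(json.dumps(result))
    return path

def flush(upload) -> None:
    """Send buffered results upstream; keep them on disk if the upload fails."""
    for path in sorted(BUFFER_DIR.glob("*.json")):
        try:
            upload(json.loads(path.read_text()))
            path.unlink()                     # free space only after a confirmed upload
        except OSError:                       # assume the hook raises OSError when the link is down
            break                             # retry on the next cycle
```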
Vision systems need cameras. PoE+ is the simplest and most efficient way to add high-speed cameras to the system, providing both power and data transmission through a single cable. Cervoz's PoE Ethernet Modular PCIe Expansion Card adds this capability as a compact add-on card.


Get a head start for AI at the edge

For enterprises looking to get an edge, the combination of industrial computers with industrial-strength memory and storage provides the reliability to withstand harsh edge environments and the power needed to enable next-generation AI technologies at the network edge.

About Cervoz
Headquartered in Taiwan, Cervoz Technology supplies embedded components for the industrial PC market. The company has nearly two decades of experience designing and developing high-performing memory and storage solutions for industrial applications.

 
