Edge or cloud? The five factors that determine where to put workloads

Should you send your data to computers or bring your computing assets to the data?

This is a major question in IoT. A few years ago you might have said “everything goes to the cloud,” but the sheer size and scope of IoT data often make a smart edge inevitable. IDC estimates that 40% of IoT data will be captured, processed and stored close to where it is created, while Gartner estimates that the share of data residing outside the cloud or enterprise data centres will grow from 10% today to 55% by 2022.

So how do you figure out what goes where?

Who needs it?

IoT will generate staggering quantities of data across all industries. Manufacturers and utilities already track millions of data streams and generate terabytes a day. Machine data can also arrive at blistering speed, with vibration monitoring systems churning out over 100,000 signals a second, delivered in a bewildering variety of formats.

Machines, however, aren’t good conversationalists. They mostly provide status reports: temperature, pressure, speed, pH and so on. It’s like watching an EKG machine: companies want the data, and in many cases are legally required to keep it, but only a few people ever need to see the whole picture.
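One practical consequence: routine status reports can stay at the edge, with only exceptions forwarded upstream. The sketch below illustrates the idea in Python; the metric names, operating ranges, and data layout are illustrative assumptions, not drawn from any particular platform.

```python
# Hypothetical edge-side filter: routine readings stay local,
# only out-of-range readings are worth forwarding to the cloud.
NORMAL_RANGES = {                     # assumed operating envelopes
    "temperature_c": (10.0, 85.0),
    "pressure_kpa": (90.0, 110.0),
}

def filter_readings(readings):
    """Split readings into routine (kept at the edge) and
    exceptions (candidates for sending upstream)."""
    routine, exceptions = [], []
    for r in readings:
        lo, hi = NORMAL_RANGES[r["metric"]]
        (routine if lo <= r["value"] <= hi else exceptions).append(r)
    return routine, exceptions

readings = [
    {"metric": "temperature_c", "value": 72.0},   # normal
    {"metric": "pressure_kpa", "value": 130.0},   # out of range
]
routine, exceptions = filter_readings(readings)
```

In practice the thresholds would come from the equipment vendor or from historical baselines, but the shape of the decision is the same: most of the EKG trace never needs to leave the site.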

The best bet: look at the use case first. Chances are, every workload will require both cloud and edge technologies, but the size of the edge might be larger than anticipated.

How urgently do they need it?

We’ve all become accustomed to the Netflix wheel telling us a movie is only 17% loaded. But imagine coming home to lights stuck at 17% brightness. Utilities, manufacturers and other industrial companies operate in real time – even modest network latency can become an urgent problem.

Peak Reliability, for instance, manages the western U.S. grid, serving 80 million people spread over 1.8 million square miles and monitoring over 440,000 live data streams. During the great eclipse of 2017, it received updates every ten seconds.

Rule of thumb: if interruptions can’t be shrugged off, stay on the edge or a self-contained network.
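That rule of thumb can be written down as a simple placement check. This is a minimal sketch, and the function name and latency figures are illustrative assumptions rather than measured values.

```python
def place_workload(max_tolerable_latency_ms: float,
                   cloud_round_trip_ms: float,
                   interruption_tolerant: bool) -> str:
    """Apply the rule of thumb: if interruptions can't be shrugged
    off, or the cloud round trip blows the latency budget, the
    workload stays at the edge (or on a self-contained network)."""
    if not interruption_tolerant:
        return "edge"
    if cloud_round_trip_ms > max_tolerable_latency_ms:
        return "edge"
    return "cloud"

# A grid-protection control loop that must react within 10 ms and
# cannot tolerate an outage stays local; a reporting job that can
# wait seconds is a fine cloud candidate.
print(place_workload(10, 80, False))    # edge
print(place_workload(5000, 80, True))   # cloud
```

The two inputs that matter most – the latency budget and the tolerance for interruptions – are exactly the two questions this section asks.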

Is anyone’s life on the line?

When IT managers think about security, they think firewalls and viruses. Engineers on factory floors and other operational technology (“OT”) employees, who will be some of the biggest consumers and users of IoT, think about security in terms of fires, explosions and razor wire. On an offshore drilling rig, for example, the risk of a communications disruption far outweighs whatever savings could be gained by moving the necessary computing assets off the platform. Make a risk-reward assessment.

What are the costs?

So if the data isn’t urgent, won’t impact safety, and more than a local group of engineers will need it, do you send it to the cloud? That depends on the cost. Too many companies have treated the cloud like a teenager in 2003 with their first smart phone: everything seems fine until the bill comes.

In the physical world, no one sends shipments from L.A. to San Francisco via New York unless there is a good reason to go through New York. Distance costs money. Sending data to the cloud that could just as effectively be stored or analyzed at the edge is the digital equivalent. Getting the right balance of edge and cloud is the key to managing the overall total cost of ownership (TCO).
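The freight analogy translates directly into a back-of-envelope TCO comparison: shipping raw telemetry to the cloud versus rolling it up at the edge first. Every figure below – volumes, per-GB prices – is a made-up placeholder for illustration; substitute your own provider's rates.

```python
# Hypothetical monthly cost comparison (all figures are placeholders,
# not any real provider's pricing).
RAW_GB_PER_DAY = 500               # raw telemetry from one site
AGGREGATED_GB_PER_DAY = 5          # after edge-side rollups
TRANSFER_COST_PER_GB = 0.09        # assumed network egress, $/GB
STORAGE_COST_PER_GB_MONTH = 0.02   # assumed cloud storage, $/GB-month

def monthly_cloud_cost(gb_per_day: float) -> float:
    """Rough monthly bill for moving and keeping this much data."""
    gb_per_month = gb_per_day * 30
    return gb_per_month * (TRANSFER_COST_PER_GB + STORAGE_COST_PER_GB_MONTH)

raw = monthly_cloud_cost(RAW_GB_PER_DAY)
aggregated = monthly_cloud_cost(AGGREGATED_GB_PER_DAY)
print(f"raw: ${raw:,.2f}/month, aggregated: ${aggregated:,.2f}/month")
```

The exact numbers matter less than the ratio: if edge processing shrinks the stream a hundredfold, the cloud bill shrinks with it, and that difference compounds across sites and years.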

How complex is the problem?

This is the most important and challenging factor. Are you examining a few data streams to solve an immediate problem such as optimizing a conveyor belt, or comparing thousands of lines across multiple facilities? Are you looking at a patient’s vital signs to determine a course of treatment, or studying millions of protein folds to develop a new drug?

Companies often use the cloud to crack a problem, and then replicate the solution locally at the edge. Projects that yield millions in savings aren’t produced by a magical algorithm in the cloud – instead, people study a few data streams and figure it out themselves.

Another way to think about it: the cloud is the R and the edge is the D in R&D.
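Pulling the five factors together, the placement decision reads like a checklist in which safety and urgency veto the cloud outright, and audience, cost, and analytical complexity settle the rest. The sketch below is a toy illustration of that checklist; the factor ordering and the function itself are my assumptions, not a published methodology.

```python
def recommend_placement(audience_is_local: bool,
                        latency_sensitive: bool,
                        safety_critical: bool,
                        cloud_cost_prohibitive: bool,
                        needs_large_scale_analysis: bool) -> str:
    """Toy checklist over the article's five factors: who needs it,
    how urgently, whether lives are on the line, what it costs, and
    how complex the problem is."""
    # Safety and urgency veto the cloud outright.
    if safety_critical or latency_sensitive:
        return "edge"
    # Cloud-scale analysis wins only when the bill is bearable.
    if needs_large_scale_analysis and not cloud_cost_prohibitive:
        return "cloud"
    # Otherwise a local audience or a prohibitive bill keeps it local.
    if audience_is_local or cloud_cost_prohibitive:
        return "edge"
    return "cloud"

# Drug-discovery-style workload: broad audience, tolerant of delay,
# needs massive comparative analysis.
print(recommend_placement(False, False, False, False, True))   # cloud
# Conveyor-belt optimization: local team, real-time constraints.
print(recommend_placement(True, True, False, False, False))    # edge
```

Real deployments will land somewhere in between – the R in the cloud, the D at the edge – but making the factors explicit keeps the decision from defaulting to habit.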