When to Move to the Edge: A Simple Equation to Help Decide


Stewart McGrath

Stewart McGrath is the CEO and co-founder of Section, an edge compute platform that helps engineers deliver better web applications. With extensive experience leading companies in the technology space, Stewart’s passion for building teams focused on bringing technologies to market drove him to co-found and lead Section. As applications continue to evolve and end-user demands for performance, security and functionality increase, he envisions a world where developers are unencumbered by infrastructure and a better internet is powered by the edge.

Using edge compute over cloud compute means improved performance, reduced data backhaul, earlier threat detection and mitigation, and other benefits such as improved reliability and scalability. It is undeniable that running an application closer to the end user can provide an improved digital experience. In short, all things being equal, it’s better to run applications at the edge instead of in a centralized cloud or data center environment.

Given that fact, why, up until now, have we embraced the centralized cloud compute paradigm rather than a distributed hosting platform?

The answer is in that “all things being equal” statement. Historically, deploying applications at the edge has added technical and operational complexity — not to mention unfamiliar tools and requirements — that can, for many, outweigh its benefits. These challenges of running distributed versus centralized applications have been with us for some time. We have seen pushes toward grid compute paradigms and, of course, the completely decentralized blockchain application approach, each of which represented a fundamental shift away from current application security, architecture, and dev and ops lifecycles.

However, modern edge compute paradigms are a much more natural, and thus more mainstream, extension of the current practices of a centralized cloud hosting model.

Is this enough to achieve “all things being equal” parity? How do you judge? It turns out we can demonstrate the natural extension that a distributed deployment brings to the benefits of cloud by expressing cloud vs. edge as an equation.

Setting the Equation

First, let’s restate that:

            The Benefits of Distributed Edge Hosting > The Benefits of Centralized Cloud Hosting

What is missing from this statement is the accessibility of these benefits for either hosting paradigm. So, for purposes of creating an equation, we need to determine how to express an ability to access or enjoy the benefits of edge vs. cloud hosting.

The key is to recognize that the edge side of the compute equation is not dissimilar to the cloud portion, with one significant differentiator: distribution and how we handle that distribution.     

Common Variables

Given this, what is it that we need to solve for in order to calculate the benefits of edge computing? What are the variables?

A number of relevant variables are common between both edge and cloud and are dictated by the DevOps lifecycle. How does an organization develop, deploy and operate an application workload in either a distributed edge or centralized cloud environment?

Variables Common to Cloud and Edge Include

A = Ability to Architect the Hosted Application/s

To what extent can an application architect build the desired structure for their application requirements? Can they leverage containerized workloads, VMs and/or serverless functions? Can they add security, performance or other augmenting services from an open marketplace, or are they constrained to the provider platform’s proprietary services? Do they need to rewrite their application to meet the constraints of the target hosting platform? Does the architect have access to solutions on the target platform for all parts of the application, including core app logic, persistent data stores, load balancing, delivery and acceleration capabilities, caching and so on?

D = Developer Experience Quality and Simplicity

To what extent can an application developer continue to work within their existing application development and deployment lifecycle in order to deploy their application changes to the production environment?

O = Ability to Operate the Hosted Application/s

When it comes to Day 2 operations, can operations teams easily access everything they need to monitor, debug and optimize the application running on the hosting platform?

S = Security Plane Quality and Simplicity

Is there a high-quality and consistent security plane available to protect the application’s needs? Does it consider network and application layer aspects? Is it simple to implement and manage and is it available consistently across the entire hosting platform footprint?

C = Sensitivity to Cost

To what extent is the team sensitive to increases or reductions in application hosting cost? If the cost were significantly greater to obtain a very small increase in application performance, security and the like, would they accept it? Or are they very sensitive to any increase in service cost regardless of potential benefits?

For the cloud portion of the compute equation, this is all we need. Thus, while the benefits of edge as compared with cloud are undeniable, up until now the centralized cloud equation has been significantly simpler to solve. Without the complexity of distribution, the ability to enjoy cloud compute can be written thus:

          (A + D + O + S) / C  =  Ability to Enjoy the Benefits of Cloud Hosting (X)
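
The cloud side of the equation can be sketched in a few lines of Python. The variable names follow the article’s definitions; the 1–10 scoring scale and the sample values are illustrative assumptions, not part of the model.

```python
# A minimal sketch of the cloud-side score X = (A + D + O + S) / C.
# A, D, O, S: architecture, developer experience, operations, security
# (assumed here to be rated on a 1-10 scale); C: cost sensitivity.

def cloud_score(A, D, O, S, C):
    """X = (A + D + O + S) / C"""
    return (A + D + O + S) / C

# Example: a strong DevOps fit with moderate cost sensitivity.
X = cloud_score(A=8, D=9, O=8, S=7, C=4)
print(round(X, 2))  # 8.0
```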

Edge Variables

While there are shared considerations for cloud and edge, there are also variables that are unique to the distributed edge compute paradigm, such as:

L = Number of Locations

To how many discrete locations will the application be deployed at any one time?

P = Number of Providers

How many providers of underlying compute are involved in delivering the Number of Locations (L) to which the application is being deployed?

E = Ability to Orchestrate Distributed Compute and Networking

To what extent can the edge locations and demand for the application at those locations be cost-effectively orchestrated? Is the application always running in all locations at all times or is the edge management such that the application only needs to run when and where user demand requires in order to deliver desired performance outcomes? 

This creates a more complex edge portion of the equation that can be written as follows:

          (A + D + O + S) / (C × (log (L × P) / E))  =  Ability to Enjoy the Benefits of Edge Hosting (Y)

Note: We would suggest 10 as a reasonable base for the logarithm, since it is arguably about as difficult to scale a hosting solution from one location to 10 as it is from 10 to 100, or from 100 to 1,000, and so on. However, arguments could be made for alternative mathematical constructs for this part of the equation.
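
The edge side can be sketched the same way, assuming the base-10 logarithm suggested above. Variable names follow the article (L locations, P providers, E orchestration ability); the sample values are illustrative assumptions.

```python
import math

# A sketch of the edge-side score:
#   Y = (A + D + O + S) / (C * (log10(L * P) / E))
# Note that L * P must be greater than 1 for the penalty term to be positive.

def edge_score(A, D, O, S, C, L, P, E):
    """Y = (A + D + O + S) / (C * (log10(L * P) / E))"""
    return (A + D + O + S) / (C * (math.log10(L * P) / E))

# Example: 100 locations from one provider, orchestration ability E = 2.
# Here log10(100) = 2 = E, so the distribution penalty cancels out.
Y = edge_score(A=8, D=9, O=8, S=7, C=4, L=100, P=1, E=2)
print(round(Y, 2))  # 8.0 -- parity with the cloud example
```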

The Complete Equation

Now that we have both sides of the equation, we can start on the math. While the Benefits of Edge are greater than the Benefits of Cloud, unless the Ability to Enjoy the Benefits of Edge Hosting (Y) is at least equal to the Ability to Enjoy the Benefits of Cloud Hosting (X), developers, architects and operations teams will still not be inclined to move their applications to the edge. It would simply be too hard versus the potential benefit.

If the above equations hold, then finding the equality point between X and Y becomes a simplification exercise. Setting X equal to Y, our equation looks like this:

          (A + D + O + S) / C  =  (A + D + O + S) / (C × (log (L × P) / E))

This equation makes two things readily apparent. The first is that if A, D, O, S and C are in fact common variables, then to the extent these elements are equivalent between edge and cloud, they cease to be a consideration in solving the equation:

          1  =  1 / (log (L × P) / E)

That is to say, if the ability to architect the applications, the developer experience, the operational simplicity and the security plane are consistent in quality across platforms, while the overall cost sensitivity is the same, then these variables cancel out.

This then leaves us to solve the second part of the equation:

     

          log (L × P) / E  =  1

and therefore:

          log (L × P)  =  E   ; or

          L × P  =  b^E

          (where b is the base of the logarithm: 2, 10, e, …)
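
The break-even condition above can be checked numerically. This is a sketch under the article’s assumption of a base-10 logarithm; `breakeven_footprint` is a hypothetical helper name, not part of the article’s model.

```python
# Break-even condition L * P = b**E: the largest deployment footprint
# (locations times providers) at which edge parity holds for a given
# orchestration ability E, assuming base b = 10.

def breakeven_footprint(E, base=10):
    """Return the L * P value at which edge and cloud scores are equal."""
    return base ** E

# Each unit of orchestration ability supports a 10x larger footprint.
print(breakeven_footprint(1))  # 10
print(breakeven_footprint(2))  # 100
print(breakeven_footprint(3))  # 1000
```

In other words, to stay at parity, orchestration ability E only needs to grow with the logarithm of the footprint, not linearly with it.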

What does this part of the equation tell us? Overall, the organizational impact of edge compute compared to centralized cloud is a function of:

  1. (L × P) – Edge Distribution
     - The number of locations to which the app is deployed and operated
     - The diversity of underlying suppliers of the edge locations and infrastructure

versus

  2. E – The ability to cost-effectively manage the application across the distributed edge infrastructure.

Solving the Edge Equation

Distilling this down provides two insights in considering distributed edge vs. centralized cloud hosting. 

The first is that a centralized cloud is a default choice simply due to familiarity and inertia. To the extent that an edge solution can replicate and deliver the same overall DevOps experience, there remains little if any reason to elect a centralized cloud deployment. 

The second is that the more distributed and diverse an edge hosting environment is, while still being cost-effective, the greater its advantage over a centralized paradigm.

Organizations would thus be wise to consider their deployment options and weigh advantages and disadvantages carefully, rather than simply letting inertia carry the day. 


