Understanding how and where edge computing fits into an organization’s overall IT road map goes a long way toward formulating a business case for it within your organization.

Andrew Froehlich, President & Lead Network Architect, West Gate Networks

March 29, 2021


It wasn’t that long ago that edge computing was considered a futuristic, forward-looking concept that was interesting to contemplate -- yet offered no real-world benefit to the enterprise. That is no longer the case. As end users access networked services from anywhere and real-time applications crop up almost daily, increased flexibility is needed at the edge of the network to ensure the highest levels of security and performance.

IT leaders must learn to visualize the true benefits of edge computing from a long-term, strategic perspective. Understanding how and where edge computing fits into an organization’s overall IT road map goes a long way toward formulating a business case for it within your organization. This is especially true as more mobile users and new applications drive the need for greater agility. Here’s what you need to know to start building your business case for the edge.

Edge computing is a necessary architectural shift in the enterprise

Like traditional cloud architectures before it, edge computing has become an ambiguous term that means different things to different people. As a result, moving to a more dynamic edge architecture can be met with lukewarm support. The consensus is often that major architectural network changes are high-risk propositions, both from an IT architecture and a business investment perspective. However, it doesn’t take a great deal of research to see that valid use cases for edge computing already exist -- and more are being created each day.

Today’s IT world is defined by real-time communications, data collection and AI-backed analytics across many different business functions. These highly desired services require a network architecture that delivers more flexible levels of performance to an increasingly distributed world. Edge computing bridges the gap between on-premises and cloud-delivered applications. Deploying apps and services at one or more metro edge locations combines the low-latency network benefits of on-premises deployments with the managed infrastructure benefits of public cloud service providers. For those already operating this style of hybrid infrastructure, adding a metro edge deployment option simply makes sense for applications that require low-latency network services in a managed-services model.

Looking a bit further down the road, there will soon be a time when a single application is deployed at all three locations (private data center, public cloud and metro edge) and the network intelligently routes users to the service location that makes the most sense from a performance, security and cost perspective. This architectural concept is being referred to as the “edgeless enterprise” and is poised to transform the application and service delivery models of the future. Of course, a key component of an edgeless enterprise will be an IT department’s ability to serve apps from multiple service edge locations.
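To make that routing idea concrete, here is a minimal, hypothetical sketch of how a policy engine might choose among the three deployment locations. The class, weights and numbers below are invented for illustration; they are not part of any vendor’s product or the author’s design.

```python
# Hypothetical sketch: scoring candidate service locations for one application
# deployed to a private data center, a public cloud region and a metro edge
# site. All weights and figures are illustrative placeholders.

from dataclasses import dataclass

@dataclass
class ServiceLocation:
    name: str
    latency_ms: float          # measured round-trip time from the user
    cost_per_hour: float       # blended infrastructure cost estimate
    meets_security_policy: bool

def pick_location(candidates, latency_weight=0.7, cost_weight=0.3):
    """Return the eligible location with the best weighted score (lower is better)."""
    eligible = [c for c in candidates if c.meets_security_policy]
    if not eligible:
        raise ValueError("No candidate location satisfies the security policy")
    return min(
        eligible,
        key=lambda c: latency_weight * c.latency_ms + cost_weight * c.cost_per_hour,
    )

if __name__ == "__main__":
    candidates = [
        ServiceLocation("private-dc", latency_ms=18.0, cost_per_hour=4.00, meets_security_policy=True),
        ServiceLocation("public-cloud", latency_ms=45.0, cost_per_hour=2.50, meets_security_policy=True),
        ServiceLocation("metro-edge", latency_ms=8.0, cost_per_hour=3.25, meets_security_policy=True),
    ]
    best = pick_location(candidates)
    print(f"Route user traffic to: {best.name}")
```

In a real edgeless-enterprise deployment this decision would be made continuously by the network itself, but the same trade-off -- performance versus cost, constrained by security policy -- is what the business case ultimately rests on.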

Network technology vendors are currently developing ways to accomplish this. For example, AWS and Microsoft already offer paths to bring their cloud offerings closer to customers through their AWS Outposts and Azure Edge Zones services. Additionally, major telecom carriers like AT&T and Verizon are beginning to launch metro edge solutions in select US cities. Finally, 5G network innovators like Celona are seeking to streamline how users and data are intelligently routed to various edge service delivery locations by bringing SLA-backed 5G microslicing technologies to the corporate LAN and WAN. This promises to provide a unified edge framework that is dynamic by definition. The momentum is clearly moving toward enterprises shifting away from a two-pronged hybrid architecture to one that’s far more flexible at the corporate edge.

Determine which pain points can be solved today

To build a solid business case, it’s important to detail how edge computing solves existing pain points. While architectural conversations about the future of edge computing and application delivery are useful, there must be an immediate business need behind investing in the edge today as opposed to next month or next year.

Despite various vendor claims, the ultimate purpose of edge computing is to bring compute, storage and network services closer to endpoints and end users to improve overall application performance. With this in mind, IT architects must identify and document instances where edge computing can address existing network performance problems. While existing pain points don’t have to be the primary reason for your business case, documenting them helps solidify why budget dollars should be allocated sooner rather than later.
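Documenting those pain points starts with measurement. The sketch below is one simple, hedged way to capture baseline latency from a user site to the endpoints an application is served from today; the hostnames are placeholders, and a real assessment would measure from many locations over time.

```python
# Illustrative sketch: measure TCP connect latency from a user site to the
# endpoints an application is currently served from, to document where
# performance pain points exist. Hostnames below are placeholders.

import socket
import statistics
import time

def tcp_connect_rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Median time (ms) to complete a TCP handshake with host:port."""
    results = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass
        results.append((time.perf_counter() - start) * 1000)
    return statistics.median(results)

if __name__ == "__main__":
    endpoints = {
        "on-prem-app.example.internal": 443,   # placeholder: private data center
        "app.cloud-region.example.com": 443,   # placeholder: public cloud region
    }
    for host, port in endpoints.items():
        try:
            print(f"{host}: {tcp_connect_rtt_ms(host, port):.1f} ms median TCP connect")
        except OSError as err:
            print(f"{host}: unreachable ({err})")
```

Numbers like these, gathered before any edge investment, give the business case a baseline to compare against once a metro edge location is in play.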

Calculating the value of edge computing

No technology business case is complete without showing the value of the investment. Just consider the recent IT spend that could have been avoided if the new technology had already been in place. When it comes to the need for more flexible and higher-performing networks built on edge computing, we need only look as far as the COVID-19 pandemic, which forced large numbers of employees to work from home. As the shift to remote work began, IT architects had to scramble to figure out how to deliver the required application performance to a large, distributed workforce. In some cases, major network modifications and upgrades were required to bring remote application performance back into balance. Combine this with the fact that new applications are placing ever-greater demands on network performance, and one can begin to see the value of investing in a flexible edge architecture versus sitting on the sidelines.
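A back-of-the-envelope calculation can make the avoided-cost argument tangible. The figures below are invented placeholders, not benchmarks; the point is the structure of the comparison, not the numbers.

```python
# Back-of-the-envelope sketch of the avoided-cost argument for an edge
# investment. All figures are invented placeholders; substitute your own.

def avoided_cost_estimate(
    emergency_upgrade_cost: float,    # one-time spend forced by a sudden shift (e.g., remote work)
    annual_performance_loss: float,   # estimated yearly cost of degraded application performance
    edge_annual_cost: float,          # yearly cost of a metro edge deployment
    years: int = 3,
) -> float:
    """Net value over the period: costs avoided minus the edge investment."""
    avoided = emergency_upgrade_cost + annual_performance_loss * years
    invested = edge_annual_cost * years
    return avoided - invested

if __name__ == "__main__":
    net = avoided_cost_estimate(
        emergency_upgrade_cost=250_000,
        annual_performance_loss=120_000,
        edge_annual_cost=150_000,
        years=3,
    )
    print(f"Estimated net value over 3 years: ${net:,.0f}")
```

Even a rough model like this forces the conversation onto concrete numbers, which is what turns an architectural preference into a fundable business case.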

An architecture built on flexibility and scalability

Ultimately, architecting a more agile and dynamic edge brings with it massive economies of scale from a compute, operational and deployment-flexibility perspective. This is because network computing resources become inherently dynamic and software-defined. When these network functions are physically deployed closer to applications and users, performance efficiencies can be achieved far more easily.

While other aspects of enterprise IT have evolved over the years to embrace flexibility and scalability, the network has largely remained unchanged. Understandable as that is from a risk perspective, edge computing is poised to be an intrinsic architectural shift with the potential to revolutionize how corporate networks operate for the foreseeable future, saving companies both time and money. A business case that successfully reflects these notions is one that IT leaders must strive to achieve.

Related Content:

10 Trends Accelerating Edge Computing

The Inevitable Rise of Intelligence in the Edge Ecosystem

Entering a New Chapter for Tackling IoT and ‘The Edge’


About the Author(s)

Andrew Froehlich

President & Lead Network Architect, West Gate Networks

Andrew has well over a decade of enterprise networking experience under his belt through his consulting practice, which specializes in enterprise network architectures and data center build-outs, as well as prior experience at organizations such as State Farm Insurance, United Airlines and the University of Chicago Medical Center. Having lived and worked in Southeast Asia for nearly three years, Andrew possesses a unique international business and technology perspective. When he's not consulting, Andrew enjoys writing technical blogs and is the author of two Cisco certification study guides published by Sybex.
