The growth of edge computing and IoT will require rearchitecting IT infrastructures. Here are some options to consider before you get started.

Mary E. Shacklett, President of Transworld Data

March 15, 2021

7 Min Read

Edge computing is so named because it is literally located at the edges of enterprises -- in the areas where people work -- and away from the central IT data center.

Organizations implement edge computing primarily when they adopt IoT technologies. IoT devices produce information from moving trucks, from machines on assembly lines, from drones in the field, or from telecommunications towers that are many miles away.

It doesn't make sense to collect data directly from thousands of distributed IoT devices, and then transmit all of this data over bandwidth-stretched, highly expensive communication channels in real time to a central data center. It also doesn't make sense to just start deploying IoT without an architectural plan for how you’re going to administer your data, applications, and security.

What are the options for defining and deploying an IT architecture for IoT? Here are three areas to consider before you get started:

1. Cloud

By using a cloud service as a centralizing agent, enterprises can route their IoT data to the cloud, process it, and then export analytic results. In this sense, the cloud serves a centralized function because it gathers incoming edge IoT data at a single point. The cloud does not replace the central corporate data center, though; it acts as an additional centralizing agent.

Once in the cloud, IoT processing works like this: Raw data is sent to the cloud from different edge locations in the company; the data is processed in the cloud; and the output of IoT analytics is then sent from the cloud back to the company's users.
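This round trip can be illustrated with a minimal sketch. The ingestion endpoint, device ID, and payload fields below are hypothetical; a real deployment would use your cloud provider's IoT ingestion service and proper authentication.

    # A minimal sketch of the edge-to-cloud flow described above.
    # The endpoint URL, device ID, and payload fields are hypothetical.
    import json
    import time
    import urllib.request

    CLOUD_INGEST_URL = "https://example-cloud-endpoint/iot/ingest"  # hypothetical

    def send_reading(device_id: str, reading: dict) -> None:
        """Package one raw edge reading and post it to the cloud for processing."""
        payload = {
            "device_id": device_id,
            "timestamp": time.time(),
            "reading": reading,
        }
        req = urllib.request.Request(
            CLOUD_INGEST_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            resp.read()  # analytic results flow back to users through a separate channel

    # Example: a truck-mounted sensor reporting location and engine temperature
    send_reading("truck-042", {"lat": 47.61, "lon": -122.33, "engine_temp_c": 88.5})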

In a setup such as this, IT must do the following:  

  • Define security and governance rules in the cloud for your IoT data.

  • Decide which IoT data you want to eliminate (e.g., jitter, irrelevant data, or other noise) and define rules for what to exclude.

  • If data needs to be transformed so it can work with data from other systems, define these transformation rules in the cloud (a sketch of both filtering and transformation rules follows this list).

  • Perform any other required cloud configurations.
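The filtering and transformation rules in the bullets above boil down to simple predicates and mappings that IT defines once and the cloud applies to every incoming record. A minimal sketch, assuming hypothetical field names, thresholds, and a target schema:

    # Hypothetical filtering and transformation rules for incoming IoT records.
    # The field names, thresholds, and unit conversion are illustrative only,
    # not a specific cloud product's rule syntax.

    def exclude_noise(record: dict) -> bool:
        """Return True if the record should be kept, False if it is noise."""
        temp = record.get("engine_temp_c")
        if temp is None:                  # drop records missing the field entirely
            return False
        if not (-40.0 <= temp <= 200.0):  # drop physically implausible jitter
            return False
        return True

    def transform(record: dict) -> dict:
        """Map an edge record into the schema other systems expect."""
        return {
            "asset_id": record["device_id"],                  # rename to match the target schema
            "temp_f": record["engine_temp_c"] * 9 / 5 + 32,   # convert units
            "ts": record["timestamp"],
        }

    raw = {"device_id": "truck-042", "engine_temp_c": 88.5, "timestamp": 1615800000}
    if exclude_noise(raw):
        clean = transform(raw)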

The goal is to synchronize IoT data and processing business rules in the cloud with the rules that your own data center uses. This will force IT to replicate some of the data administration in the cloud that it does in the data center -- but the advantage is that you’re offloading processing to the cloud and also limiting long-haul communications bandwidth costs to your primary data center.

2. Zero-trust networks

A zero-trust network grants specific users clearance to access specific types of IoT data and applications. If IT uses zero-trust networks throughout the enterprise, it gains visibility into any new IT assets that might be added (or subtracted) at the edge, along with who is accessing which IoT data, when, and where.
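At its core, this means every request for IoT data is checked against an explicit policy rather than trusted because of where it originates. A minimal sketch of that deny-by-default decision, with hypothetical roles, data classes, and locations:

    # A minimal sketch of the per-request access decision a zero-trust model implies.
    # The roles, data classes, and policy table are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class AccessRequest:
        user: str
        role: str
        data_class: str   # e.g., "production_telemetry", "inventory"
        location: str     # e.g., "plant-floor", "warehouse", "remote"

    # Which roles may read which classes of IoT data, and from where.
    POLICY = {
        ("line_supervisor", "production_telemetry"): {"plant-floor"},
        ("warehouse_clerk", "inventory"): {"warehouse"},
    }

    def is_allowed(req: AccessRequest) -> bool:
        """Deny by default; allow only explicit (role, data class, location) matches."""
        allowed_locations = POLICY.get((req.role, req.data_class), set())
        return req.location in allowed_locations

    # A supervisor on the plant floor can see line telemetry...
    print(is_allowed(AccessRequest("alice", "line_supervisor", "production_telemetry", "plant-floor")))  # True
    # ...but the same request from an unrecognized location is denied.
    print(is_allowed(AccessRequest("alice", "line_supervisor", "production_telemetry", "remote")))       # False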

Zero-trust uses internal networks to carry out centralized IT policies. Zero-trust networks also enable IT to exercise centralized control over IoT communications and assets, wherever assets might be.

In a zero-trust network environment with distributed IoT processing, a manufacturing unit could have a separate server that processes production data in real time and outputs information to supervisors about how a production line is functioning. A warehouse function could have a localized server for tracking and checking inventory in and out. Both examples illustrate distributed processing at the edge, away from the central data center. Periodically, data from these distributed IoT platforms could be shipped to the central data center for processing and compilation with data from central systems.

In a setup such as this, IT must do the following: 

  • Apply security standards and governance uniformly to data at all points.

  • Define the business rules for IoT data processing.

  • If IoT data is to be merged with other types of data from other systems, define the data mappings and transformations (a sketch follows this list).
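For the third item, once the mappings are defined, merging edge readings with records from another system is a straightforward join. A minimal sketch, with a hypothetical maintenance database and illustrative field names:

    # Merging edge IoT readings with records from another system once the
    # mapping rule (IoT "asset_id" == the maintenance system's key) is defined.
    # All data and field names here are hypothetical.

    iot_readings = [
        {"asset_id": "press-07", "vibration_mm_s": 4.2},
        {"asset_id": "press-09", "vibration_mm_s": 9.8},
    ]
    maintenance_records = {
        "press-07": {"last_service": "2021-01-12"},
        "press-09": {"last_service": "2020-06-30"},
    }

    merged = [
        {**reading, **maintenance_records.get(reading["asset_id"], {})}
        for reading in iot_readings
    ]
    # merged now pairs each vibration reading with the asset's last service date.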

From a bandwidth standpoint, most of the communications from IoT application areas like warehousing and manufacturing will occur over the local wired network using standard TCP/IP, so the burden on internet-based communications (and their costs) is significantly lower.

3. Micro data centers

Surveying, construction, scientific, oil and gas, and mining companies have all recognized that an important part of their edge computing is conducted in the field. This “field” is frequently in remote, difficult-to-access locations where IoT runs on unmanned craft such as drones. The drones perform reconnaissance over large tracts of land or sea and collect data on topographic features, as well as on company assets and activities in the field. The data is then forwarded for processing and the derivation of analytic insights.

Due to the constraints of sending large troves of unsifted data across the internet, the decision in most of these cases has been for the drone to collect the data itself on solid-state drives, and then for those drives to be offloaded onto servers in field offices where the data is processed and stored. At the site of these “micro data centers” in the field, data is cleaned, organized, and trimmed down so only the data that is relevant to the mission is retained.
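The clean-and-trim step at such a field micro data center might look like the following sketch, which keeps only readings taken inside a mission's survey area. The paths, fields, and relevance test are hypothetical.

    # A sketch of the clean-and-trim step at a field "micro data center":
    # raw files offloaded from a drone's solid-state drive are filtered so only
    # mission-relevant records are kept for storage and later shipment.
    # Paths, fields, and the relevance test are hypothetical.
    import csv
    from pathlib import Path

    RAW_DIR = Path("/data/drone_offload")      # where the SSD contents are copied
    KEEP_DIR = Path("/data/mission_relevant")  # trimmed data staged for shipment

    SURVEY_AREA = {"lat": (46.0, 47.0), "lon": (-120.0, -119.0)}  # mission bounding box

    def is_relevant(row: dict) -> bool:
        """Keep only readings taken inside the mission's survey area."""
        lat, lon = float(row["lat"]), float(row["lon"])
        return (SURVEY_AREA["lat"][0] <= lat <= SURVEY_AREA["lat"][1]
                and SURVEY_AREA["lon"][0] <= lon <= SURVEY_AREA["lon"][1])

    KEEP_DIR.mkdir(parents=True, exist_ok=True)
    for raw_file in RAW_DIR.glob("*.csv"):
        with raw_file.open(newline="") as src, (KEEP_DIR / raw_file.name).open("w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
            writer.writeheader()
            writer.writerows(row for row in reader if is_relevant(row))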

There is still a need for a central data repository, located in the central data center, to gain access to this data -- so enterprises ship the data to the central data center when internet transfer rates are lowest and line traffic is lightest.

The use of micro data centers dates to the early days of distributed computing, when different departments in the company used servers to process their own data. At regular intervals, this data was collected and sent in batches to a mainframe in the central data center. Using micro data centers in the field, and then shipping bundled data, is the latest iteration of the technique.
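Shipping the bundled data during an off-peak window can be as simple as the sketch below. The window, paths, and rsync destination are hypothetical; a real setup would rely on the enterprise's own scheduler and transfer tooling.

    # A sketch of the "ship in batches when traffic is lightest" idea:
    # bundle the trimmed field data and transfer it during an off-peak window.
    # The window, paths, and destination are hypothetical.
    import datetime
    import shutil
    import subprocess
    from pathlib import Path

    KEEP_DIR = Path("/data/mission_relevant")
    OFFPEAK_START, OFFPEAK_END = 1, 5           # 01:00-05:00 local time
    CENTRAL_TARGET = "centraldc:/ingest/field"  # hypothetical rsync destination

    def ship_if_offpeak() -> None:
        now = datetime.datetime.now()
        if not (OFFPEAK_START <= now.hour < OFFPEAK_END):
            return  # outside the quiet, low-cost window; try again later
        bundle = shutil.make_archive(f"/tmp/field_{now:%Y%m%d}", "gztar", KEEP_DIR)
        # Push the bundle to the central data center for compilation with central systems.
        subprocess.run(["rsync", "-av", bundle, CENTRAL_TARGET], check=True)

    ship_if_offpeak()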

What IT must do: 

  • Train field-office employees -- the end users and stewards of this data -- in the techniques and standards for enforcing physical and logical security and for data safekeeping.

  • Routinely inspect and maintain all drones and in-field IoT devices; it is advisable for IT to participate in this process.

  • Set IT product standards for the field IoT that end-user departments could potentially budget for and acquire.

  • Inspect and install all IoT security settings to ensure they meet enterprise standards before the IoT is deployed.

  • Write failover procedures for field-deployed IoT and micro data centers into the corporate disaster recovery plan.

  • Define standards and designs for field-based micro data centers.

  • Define security locator and lockdown procedures for any IoT device (e.g., a drone) that is lost on a mission.

Bringing it all together

The growth of edge computing and IoT will require a rearchitecting of IT infrastructure. This rearchitecting must address not only data, but security, processing, failover, and compliance. In the most complicated of these architectures, an enterprise could conceivably have a central data center, a number of micro data centers deployed in the field, zero-trust networks that run within the walls of the enterprise, and a complement of cloud-based analytics computing services that offload some of the IoT processing from the central data center. To accommodate these different implementations of IoT, an IT architecture for IoT is needed that can span all points, while still enforcing the same levels of security and governance that enterprise stakeholders expect. This isn’t an easy task, but IT already knows the different technologies, deployments, guidance, etc., to make it happen. Now it’s a matter of getting the job done.

Related content:

10 Trends Accelerating Edge Computing

The Inevitable Rise of Intelligence in the Edge Ecosystem

Deloitte on Cloud, the Edge, and Enterprise Expectations

Exploring Edge Computing as a Complement to the Cloud

 

About the Author(s)

Mary E. Shacklett

President of Transworld Data

Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturer in the semiconductor industry.

Mary has business experience in Europe, Japan, and the Pacific Rim. She has a BS degree from the University of Wisconsin and an MA from the University of Southern California, where she taught for several years. She is listed in Who's Who Worldwide and in Who's Who in the Computer Industry.
