Across industries, big data has joined traditional, structured data as a mission-critical element. Here’s some advice for CIOs and big data leaders on how to get started.

Mary E. Shacklett, President of Transworld Data

November 17, 2020


Data centers and disaster recovery are still structured around traditional transaction systems, but more companies are beginning to use big data in mission-critical ways. It's time for IT to define disaster recovery (DR) and "mission-critical" status for big data. What are the critical ingredients that support both, and what should companies be doing now?

First, let’s look at some examples of how big data became mission-critical data.

At the Hospital for Sick Children in Toronto, Project Artemis centers on a real-time data collection and analysis system that monitors babies' heart rates and alerts nurses when a heartbeat pattern suggests possible neonatal sepsis, a condition that can be fatal. Because nurses can intervene immediately, the technique reduces the likelihood of infant deaths.
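Conceptually, the alerting layer in a system like this evaluates a rule against each new reading as it streams in. The sketch below is illustrative only: the threshold rule, names, and values are hypothetical and stand in for the far more sophisticated analytics the real system uses.

```python
# Illustrative only: a minimal streaming alert check on heart-rate readings.
# The threshold rule, names, and values are hypothetical, not Project Artemis code.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Reading:
    patient_id: str
    heart_rate_bpm: int  # beats per minute

def check_reading(reading: Reading, low: int = 100, high: int = 180) -> Optional[str]:
    """Return an alert message if the reading falls outside an assumed safe band."""
    if not low <= reading.heart_rate_bpm <= high:
        return f"ALERT: patient {reading.patient_id} at {reading.heart_rate_bpm} bpm"
    return None

if __name__ == "__main__":
    stream = [Reading("A-101", 142), Reading("A-102", 92), Reading("A-103", 188)]
    for r in stream:
        alert = check_reading(r)
        if alert:
            print(alert)  # in production this would page a nurse, not print
```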

In manufacturing, real-time IoT data reported from production lines immediately tells manufacturers when an assembly line or a piece of equipment is in danger of failing. The alerts trigger maintenance so that production lines keep running 24/7. The average cost of a single downtime incident in manufacturing is $17,000, and a single manufacturer can experience as many as 800 downtime incidents per year. From cost and competitive standpoints, big data IoT reporting in manufacturing is mission critical.
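Putting those two figures together shows the scale of the exposure. A minimal sketch of the arithmetic, using only the numbers cited above:

```python
# Back-of-the-envelope downtime exposure, using the figures cited in this article.
cost_per_incident = 17_000    # average cost of one downtime incident (USD)
incidents_per_year = 800      # upper bound cited for a single manufacturer

annual_exposure = cost_per_incident * incidents_per_year
print(f"Worst-case annual downtime exposure: ${annual_exposure:,}")  # $13,600,000
```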

Across industries, whether it is healthcare, financial services, agriculture, logistics or life sciences, big data has joined traditional structured data as a mission-critical element.

Strategies for mission-critical big data

Because organizations are only now beginning to classify big data applications as mission critical, most are in the early stages of developing IT strategies to support them.

As companies work through these issues, here are some major questions that IT strategies around mission-critical big data should address:

1. What are the major mission-critical big data apps in the business?

Do you depend on streamed IoT data to inform management about the environmental safety and locations of sensitive cargo you are transporting? Or do you use big data for in-field utility and equipment inspections conducted by drones?

If the company depends on these functions to replace daily operational tasks that were formerly performed manually, or if revenue, cost, or safety is affected, the big data applications should be identified as mission critical and presented as such. Upper management and the board of directors also need to understand the importance of keeping these systems running.

Mission-critical systems must stay operational no matter what, and there must be system backups, IT staff, and management/board support to sustain them.

2. Are there backups for data and operations?

Some big data platforms, such as Hadoop, provide built-in data replication and processing failover, but in other cases it may be necessary to establish big data backup methods in the data center or in the cloud.
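As one illustration of the second option, a scheduled job can copy critical big data extracts to cloud object storage. The sketch below assumes AWS S3 via boto3, with a hypothetical bucket name and export directory; it is a starting point, not a complete failover design.

```python
# Illustrative backup job: copy critical big data extracts to cloud object storage.
# Bucket name and export directory are hypothetical; assumes AWS credentials are
# already configured in the environment.
import pathlib

import boto3

BACKUP_BUCKET = "example-bigdata-backups"    # hypothetical bucket name
EXPORT_DIR = pathlib.Path("/data/exports")   # hypothetical local export directory

def backup_exports() -> None:
    s3 = boto3.client("s3")
    for path in EXPORT_DIR.glob("*.parquet"):
        # The key keeps the file name so a restore job can locate each extract.
        s3.upload_file(str(path), BACKUP_BUCKET, f"nightly/{path.name}")
        print(f"Backed up {path.name} to s3://{BACKUP_BUCKET}/nightly/{path.name}")

if __name__ == "__main__":
    backup_exports()
```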

For instance, if the logistics tracking system you depend on for your truck fleet suddenly fails, what happens? Are you using a vendor transportation management system in which the vendor provides failover? Or do you have backup data and processing capability on premises or in the cloud that can take over?

What about your personnel? If a key contributor on your big data staff is unavailable, do you have a backup option for that person?

These are questions that have long been answered for traditional transaction systems, and that now must be answered for big data.

3. How robust is your security?

Big data systems that rely on IoT sensors, devices, and appliances deployed far from headquarters or the data center are often overseen by end users who are less versed in security best practices than IT teams are.

There is always the potential for bad actors to compromise big data systems in the same way they attempt to breach standard systems of record. Additionally, out-of-the-box IoT appliances may come with vendor security presets that are too loose to meet your security requirements.
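One simple control is to audit the device inventory for factory-default settings before those devices feed a mission-critical pipeline. The inventory structure, credential list, and field names below are hypothetical; this is a sketch of the idea, not a substitute for a proper security review.

```python
# Illustrative audit: flag IoT devices still running with factory-default settings.
# The inventory structure, credentials, and field names are hypothetical examples.
KNOWN_DEFAULT_CREDENTIALS = {("admin", "admin"), ("admin", "password"), ("root", "root")}

inventory = [
    {"device_id": "sensor-001", "username": "admin", "password": "admin", "telnet_open": True},
    {"device_id": "sensor-002", "username": "ops", "password": "Xk9!v2q8", "telnet_open": False},
]

def audit(devices):
    findings = []
    for device in devices:
        if (device["username"], device["password"]) in KNOWN_DEFAULT_CREDENTIALS:
            findings.append(f"{device['device_id']}: factory-default credentials still in use")
        if device.get("telnet_open"):
            findings.append(f"{device['device_id']}: insecure management port (telnet) left enabled")
    return findings

for finding in audit(inventory):
    print(finding)
```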

A security evaluation should be made for any big data system that you classify as mission critical. Often, an initial review by an outside audit firm can help you identify potential security holes.

4. Can you trust your data and your algorithms?

Many of us remember Google Flu Trends, which overestimated the peak of the 2013 flu season by 140%.

Google is hardly alone in big data “misses.” Many companies inaccurately design their data models and the algorithms that operate on big data. There are also cases in which data comes into systems without being adequately “cleaned” (i.e., screened for accuracy and relevancy, and normalized so it can be aggregated with data from other systems).
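As a minimal illustration of what "cleaning" can mean in practice, the sketch below assumes pandas and a hypothetical sensor extract: it screens out unusable or implausible records and normalizes units so the data can be aggregated with other systems.

```python
# Illustrative cleaning step for a hypothetical sensor extract, using pandas.
import pandas as pd

raw = pd.DataFrame({
    "sensor_id": ["s1", "s1", "s2", None],
    "temp_f":    [68.0, 68.0, 400.0, 70.2],   # Fahrenheit readings; 400 is implausible
})

clean = (
    raw.dropna(subset=["sensor_id"])             # screen out records with no source
       .drop_duplicates()                        # remove repeated readings
       .query("temp_f > -40 and temp_f < 150")   # screen for plausibility
       .assign(temp_c=lambda df: (df["temp_f"] - 32) * 5 / 9)  # normalize units
)

print(clean)
```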

Because big data uses iterative data models that can change from day to day, it's also important to ensure that the latest versions of data, data models, and algorithms are in production and available to users.

To support these functions in mission-critical big data systems, IT should develop policies and procedures for tracking data, data model, and algorithm versions, and for ensuring that the latest version of each is in place.
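A lightweight starting point is a version manifest that records the approved artifacts and their checksums, verified at deployment time. The manifest format and file names in the sketch below are hypothetical; dedicated data-catalog or model-registry tooling would do the same job more completely.

```python
# Illustrative version check: confirm the artifacts running in production match
# the versions IT has approved. Manifest format and file names are hypothetical.
import hashlib
import json
import pathlib

def sha256(path: pathlib.Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify(manifest_path: str = "production_manifest.json") -> None:
    manifest = json.loads(pathlib.Path(manifest_path).read_text())
    # Each entry looks like {"name": "churn_model.pkl", "sha256": "..."} in this sketch.
    for item in manifest["artifacts"]:
        actual = sha256(pathlib.Path(item["name"]))
        status = "OK" if actual == item["sha256"] else "STALE OR MODIFIED"
        print(f"{item['name']}: {status}")

if __name__ == "__main__":
    verify()
```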

5. Are big data systems in your disaster recovery and failover plan?

Many organizations have yet to incorporate mission-critical big data systems into their formal disaster recovery and business continuity plans. They should if these systems are being used to run and manage critical operations in the company.

This is a good time for CIOs and big data leaders to review DR plans and fill in sections that may be missing for big data systems listed as mission critical. It's equally important to communicate the addition of these systems to management and the board, and to arrange for periodic DR and failover testing.

 

Follow up with these data strategies articles:

Top 10 Data and Analytics Trends for 2021

Ready or Not, Big Data is Bringing Big Changes

Is Augmented Analytics Making the Difference It Advertises?

 

About the Author(s)

Mary E. Shacklett

President of Transworld Data

Mary E. Shacklett is an internationally recognized technology commentator and President of Transworld Data, a marketing and technology services firm. Prior to founding her own company, she was Vice President of Product Research and Software Development for Summit Information Systems, a computer software company; and Vice President of Strategic Planning and Technology at FSI International, a multinational manufacturer in the semiconductor industry.

Mary has business experience in Europe, Japan, and the Pacific Rim. She has a BS degree from the University of Wisconsin and an MA from the University of Southern California, where she taught for several years. She is listed in Who's Who Worldwide and in Who's Who in the Computer Industry.

