Feds Face 'Big Data' Storage Challenge

Federal data centers are filling up with data as terabytes accumulate into petabytes. Agencies must adapt their storage strategies to keep pace.

Michael Biddick, CEO, Fusion PPT

May 24, 2012

4 Min Read


Download the entire June 2012 InformationWeek Government supplement, distributed in an all-digital format as part of our Green Initiative
(Registration required.)


Big Storage

At $78.9 billion, the federal government's proposed IT budget for fiscal year 2013 is 0.7% less than the current budget. That's four years in a row that the federal IT budget has been flat, but there's been no letup in the growth of data or the need to store it. During this same period, the feds' data storage requirements have been growing 30% to 40% a year, gobbling up scarce IT funding.

We hear a lot about the imperative to reduce costs and make the federal government more efficient--data center consolidation and "cloud first" are two prominent examples--but less has been said about a key technology: storage.

The move to cloud computing and the growing importance of big data and information sharing are creating a sense of urgency around data storage. Networked storage systems such as network-attached storage and storage area networks are part of the answer, but they're not enough.

Most agencies have adopted a tiered storage architecture that involves different technology components for different functions and data sensitivity. This architecture includes various physical media (NAS, SAN, and others), as well as policies and services that govern functions within the storage environment.

Some data requires real-time or near-real-time access, which the faster tiers of the architecture accommodate. With NAS, data files are stored and accessed using standard file systems and protocols, including Network File System (NFS) for Linux and Unix clients and Common Internet File System (CIFS) for Windows clients. Agencies may choose to spend more on SANs, where data files are stored on highly available devices. Magnetic hard disk drives are typically grouped into RAID arrays for high availability. Solid-state drives, which are extremely costly, are reserved for data that requires very fast access; for budget-strapped agencies, solid state is largely out of reach.
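To make the tiering tradeoff concrete, here is a minimal Python sketch of how an agency might model its tiers and pick the cheapest one that still meets an access-time requirement. The tier names, latency figures, and per-gigabyte costs are illustrative assumptions, not vendor or agency numbers.

```python
from dataclasses import dataclass

@dataclass
class StorageTier:
    """One rung of a tiered storage architecture (illustrative values only)."""
    name: str                  # e.g., "ssd", "san", "nas", "tape"
    media: str                 # underlying physical media
    access_latency_ms: float   # rough order-of-magnitude access time
    cost_per_gb_usd: float     # hypothetical budgeting figure

# Hypothetical tier catalog -- real latencies and costs vary by vendor and year.
TIERS = [
    StorageTier("ssd",  "solid-state drive",         0.2, 3.00),
    StorageTier("san",  "RAID hard disk (SAN)",      5.0, 1.00),
    StorageTier("nas",  "RAID hard disk (NAS)",     10.0, 0.50),
    StorageTier("tape", "magnetic tape library", 60000.0, 0.05),
]

def cheapest_tier_meeting(latency_budget_ms: float) -> StorageTier:
    """Pick the lowest-cost tier that still meets an access-time requirement."""
    candidates = [t for t in TIERS if t.access_latency_ms <= latency_budget_ms]
    return min(candidates, key=lambda t: t.cost_per_gb_usd)

print(cheapest_tier_meeting(50).name)   # -> "nas"
```

The point of the sketch is the decision rule, not the numbers: data with a tight access-time requirement lands on expensive media, and everything else drops to the cheapest tier that still satisfies the policy.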

For slower data access, at significantly lower costs, magnetic tape drives in robotic libraries can handle archived data or data that's protected as part of a backup strategy. They're also used to archive files no longer needed in operational environments but that must be retained in accordance with the federal electronic records retention policy.

Your archiving requirements should govern the movement of data files from fast-access storage devices to slow-access devices. This will help reduce the growth of the faster, but more expensive, storage systems. As needed, data files moved to a tape archive can always be copied back to fast-access data storage using a data recovery service.
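As a rough illustration, a nightly archiving job might select migration candidates by age along the following lines. The 180-day threshold and the /srv/fast-tier path are hypothetical; a real policy would follow the agency's records schedule and retention rules.

```python
import time
from pathlib import Path

# Hypothetical policy: anything untouched for more than ARCHIVE_AFTER_DAYS
# is a candidate to move from fast-access disk to the tape archive.
ARCHIVE_AFTER_DAYS = 180

def archive_candidates(fast_storage_root: str):
    """Yield files on fast-access storage whose last access exceeds the policy age."""
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
    for path in Path(fast_storage_root).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path

# Example: list what a nightly archiving job would hand off to the tape library.
for candidate in archive_candidates("/srv/fast-tier"):
    print(f"archive: {candidate}")
```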

A backup service copies data to an indexed storage location on slow-access devices such as magnetic tape drives. These copies are stored for data protection purposes. If the files are needed because of accidental deletion or corruption, they can be recovered.
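The essential ingredient of a backup service is the index that ties each copy back to its source, so a lost or corrupted file can be located and restored. A minimal sketch of such a catalog, with a hypothetical tape label and catalog file name, might look like this:

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def index_backup(files, tape_label: str, catalog_path: str = "backup_catalog.json"):
    """Record where each backed-up file lives so it can be recovered later."""
    catalog = []
    for f in files:
        data = Path(f).read_bytes()
        catalog.append({
            "path": str(f),
            "sha256": hashlib.sha256(data).hexdigest(),  # detects later corruption
            "tape": tape_label,                          # which cartridge holds the copy
            "backed_up_at": datetime.now(timezone.utc).isoformat(),
        })
    Path(catalog_path).write_text(json.dumps(catalog, indent=2))
    return catalog

# Restoring a deleted or corrupted file starts with a catalog lookup:
# find the entry by path, load that tape, and copy the file back.
```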

For agencies that need continuous data protection, a replication service copies all changes to data and files from one storage subsystem to a secondary subsystem. This service can provide a zero recovery point objective to ensure that agencies never miss a beat. A recovery service copies files from the archive or the backup storage location to the operational system with fast-access storage. Recovery also enables data logged during replication to be retrieved from the logs and copied to the operational system.
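To show how replication logging and recovery fit together, here is a toy, in-memory sketch. A production system replicates between storage subsystems rather than Python dictionaries, and the class and method names here are invented for illustration only.

```python
from collections import deque

class ReplicationJournal:
    """Toy continuous-data-protection journal: every write is logged and
    applied to a secondary copy, so no acknowledged change is lost."""

    def __init__(self):
        self.primary = {}     # operational storage (file name -> contents)
        self.secondary = {}   # replica at the secondary site
        self.log = deque()    # ordered change log, kept for recovery

    def write(self, name: str, contents: bytes):
        self.primary[name] = contents
        self.log.append((name, contents))   # journal the change...
        self.secondary[name] = contents     # ...and replicate it synchronously

    def recover(self, name: str):
        """Copy the latest logged version of a file back to the operational system."""
        for logged_name, contents in reversed(self.log):
            if logged_name == name:
                self.primary[name] = contents
                return contents
        raise KeyError(f"{name} not found in replication log")

j = ReplicationJournal()
j.write("report.docx", b"v1")
del j.primary["report.docx"]      # simulate accidental deletion
print(j.recover("report.docx"))   # -> b'v1'
```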

All of these services require storage media and data governance policies. With the booming growth of data across government, anything less than a well-conceived plan will drain an agency's IT budget.

To read the rest of the article, download the June 2012 InformationWeek Government supplement.



About the Author(s)

Michael Biddick

CEO, Fusion PPT

As CEO of Fusion PPT, Michael Biddick is responsible for overall quality and innovation. Over the past 15 years, Michael has worked with hundreds of government and international commercial organizations, leveraging his blend of deep technology experience and business and information management acumen to help clients reduce costs, increase transparency, and speed efficient decision-making while maintaining quality. Prior to joining Fusion PPT, Michael spent 10 years with a boutique consulting firm and Booz Allen Hamilton, developing enterprise management solutions. He previously served on the academic staff of the University of Wisconsin Law School as the Director of Information Technology. Michael earned a Master of Science degree from Johns Hopkins University and a dual bachelor's degree in political science and history from the University of Wisconsin-Madison. Michael is also a contributing editor at InformationWeek Magazine and Network Computing Magazine and has published over 50 recent articles on cloud computing, federal CIO strategy, PMOs, and application performance optimization. He holds multiple vendor technical certifications and is a certified ITIL v3 Expert.

