Big Data Fuels Rise of Containerized Data Centers

Big data growth forces more companies to adopt containerized data centers and other solutions to meet storage demands.

Kevin Fogarty, Technology Writer

September 28, 2012

Except for balloons and perhaps some reputations, few things expand exponentially when they're filled with something insubstantial. Data centers appear to be another exception.

Corporate data centers have been "growing exponentially for years," according to Data Center Journal, to keep up with the demands of virtualization, cloud computing, and the growth of e-business. This year data center growth took another big leap, driven largely by the need to store ever-more-vast amounts of digital data.

In fact, growth has been so fast that half of data center managers polled for a June Data Center Journal report said their companies were suffering moderate to severe "data center sprawl"--a term describing unregulated growth that forces data centers to expand capacity by adding new servers or storage units piece by piece. (The preferred method, of course, is to build a new data center that's capable of handling greater capacity demands to avoid overloading the electrical or cooling systems of the current center.)

[ For more on how big data is squeezing IT budgets, see Big Data Squeezes Legacy IT Spending: IDC. ]

Most data center growth is driven by cloud computing, virtualization, mobile computing, and other trends within corporate IT--trends that may justify their own costs but still place greater demands on data center capacity.

Underlying all this growth within the data center is, of course, one common element: data. Virtual servers, software-as-a-service, cloud computing projects, and even mobile computing drive more data into the data center in the form of stored virtual-machine images, replicated copies of cloud-based data, and copies of documents and applications for use by mobile workers on smartphones.

The push toward big data analytics, and the collection of data that was either ignored or discarded in past years, has driven data growth rates even higher--as much as 25% higher, according to a quarter of those surveyed.

More than half of senior-level IT executives (54%) predicted that within the next two years, they'd have to expand corporate network bandwidth by 50% or more to keep up with the growing demands of big data, according to a survey of 1,549 senior IT managers in the U.S. and Europe conducted by network infrastructure provider Emulex. These managers also told Emulex they'd have to expand storage capacity by at least 50% during the same period.

Demand is growing so quickly that data center construction projects are getting backed up or delayed at unprecedented rates, according to Data Center Journal. As a result, record numbers of large companies are now opting for "containerized data centers"--prefabricated structures containing a preselected mix of storage and computing equipment, along with the power and cooling hookups needed to keep them running.

The first containerized data centers were introduced in 2005 and were unpopular for several years, according to IMS Research. That has changed in the past two years--almost exactly the time big data became a clear trend within IT.

In fact, sales of containerized data centers nearly doubled in 2011 and are expected to increase by another 40% in 2012, according to a survey of data center managers released this week by IMS Research.

Containerized data center design is a cost-efficient, controllable way to expand data center capacity. A disadvantage, however, is that product selection is generic, limited to equipment designed to be "generator-ready," according to Jun Yang and Patrick Kenny, senior consultants at Infrastructure Factor Consulting, Inc.

IT managers responding to Data Center Journal's survey said they were near the breaking point in both data center capacity and their ability to expand it.

They're far from alone. The Ricoh Document Governance Index 2012--an annual study of the factors affecting the storage and management of paper or electronic documents in the enterprise--anticipates increasing demand for big data storage and compute power to run complex analytics.

Out of 1,000 C-level executives Ricoh interviewed in Europe, 91% cited the lack of good data and clear answers drawn from it as the single biggest factor preventing them from running their businesses more efficiently and profitably. During the past three years, big data has changed the priorities of the majority of European businesses, which now focus on taking and managing business risks rather than cutting costs, increasing efficiency, or decreasing their environmental impact. (In 2009, cutting costs was the top goal of 67% of European business managers, according to the study, compared with 43% today.)

About the Author

Kevin Fogarty

Technology Writer

Kevin Fogarty is a freelance writer covering networking, security, virtualization, cloud computing, big data and IT innovation. His byline has appeared in The New York Times, The Boston Globe, CNN.com, CIO, Computerworld, Network World and other leading IT publications.
