Big Data Drives Big IT Spending

IT spending will hit $34 billion in 2013 as companies upgrade and adapt existing infrastructures to meet the demands of big data, Gartner research predicts.

Kevin Fogarty, Technology Writer

October 18, 2012

4 Min Read

Big data is a big deal. Marketers and corporate strategists hope it can provide insights on customers, while IT managers struggle with how to manage all that data within the parameters of their budget and staff.

What IT professionals will do, according to a Gartner study published today, is spend nearly half of all IT resources over the next few years adapting large, complex IT infrastructures to the demands of big data projects.

The result is a misleadingly small market for big data projects, which will account for about $4.3 billion worth of corporate IT spending worldwide during 2012, according to the report, titled "Big Data Drives Rapid Changes in Infrastructure."

[ For more on how machine-to-machine devices will work alongside big data analytics, read Where M2M And Big Data Are Headed. ]

Direct spending is only a fraction of the total, however.

Few companies plan to add big data capability by ripping out and replacing existing products, Gartner found. Instead, most companies will add a few new products while beefing up their storage, databases, servers, and other IT resources to handle the rigors of huge databases--databases that are updated constantly and that include information so complex that most data specialists have traditionally viewed it as impossible to parse or analyze effectively.

The result: a chain reaction of upgrades and adaptations that will account for a total of $28 billion in IT spending worldwide during 2012, and $34 billion in 2013, Gartner's report predicts.

Much IT spending now is focused on gathering and analyzing business transactions, data from server logs, and email among employees and with business partners, according to a study from IBM and the University of Oxford, which was also published today. That spending reflects the need to integrate and analyze both unstructured textual data and machine-to-machine data, key challenges of early big data projects.

Still to come for most corporations is the greater challenge of incorporating data from external sources, which include overwhelming volumes of audio and video as well as unstructured text. According to IBM's study, 43 percent of external data comes from social networks, while audio makes up 38 percent and photo or video another 43 percent.

So far, however, only 28 percent of global organizations have pilot or production-quality big data projects underway; 47 percent are still in the planning stages, and 24 percent have no big data projects at all, according to IBM/Oxford's survey of 1,144 business and IT executives in 95 countries.

What do you want to know? How can you find out?

New insight into customer behavior is the Holy Grail of big data, but few organizations have data management and business intelligence systems that can scale far enough or change quickly enough to handle its demands, according to IBM/Oxford's report.

Bringing existing infrastructures up to speed with expanded definitions of what counts as data will drive not only huge spending increases--a total of $232 billion by 2016--but also a new way of thinking about data and analytics, according to Mark Beyer, a research vice president at Gartner and lead author of the report.

IT spending driven by the demands of big data will continue through 2018. By about 2020, Beyer predicts, expectations about the size, composition, and potential of corporate data will have risen to the point that "big data" becomes simply "data."

The idea of big data was new enough in 2011 that it created an entirely separate set of reasons to spend money on data management products. But it didn't take long for most IT managers to realize big data required more space, more power, and more flexibility in analytics, storage, and data management--not a completely new set of capabilities, according to Beyer.

The primary goal of big data is to gain insight from previously inaccessible data. Increases in computing power, storage capacity, and data mining capability now make it possible to analyze information about customers culled from petabytes of social network chatter, web server logs, and other peripheral data sources.
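To make the web server log example concrete, here is a minimal Python sketch that counts the most-requested URLs in an access log--one of the "peripheral data sources" described above. It is purely illustrative: the Apache-style log format, the access.log file name, and the top_pages helper are assumptions for this example, not anything drawn from Gartner's or IBM's work.

    import re
    from collections import Counter

    # Minimal parser for the common Apache/NGINX access-log format
    # (an assumption for this sketch), e.g.:
    # 203.0.113.7 - - [18/Oct/2012:10:15:32 +0000] "GET /products HTTP/1.1" 200 512
    LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

    def top_pages(path, n=10):
        """Count requests per URL and return the n most popular."""
        hits = Counter()
        with open(path) as log:
            for line in log:
                match = LOG_LINE.match(line)
                if match:
                    hits[match.group(2)] += 1
        return hits.most_common(n)

    # "access.log" is a hypothetical file name for the example.
    for url, count in top_pages("access.log"):
        print(count, url)

At petabyte scale the single counter gives way to distributed aggregation, but the shape of the job--parse, extract, count--stays the same, which is consistent with Beyer's point that big data demands more scale and flexibility rather than entirely new capabilities.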

"Big data requirements will gradually evolve from differentiation to 'table stakes' in information management practices and technology," Beyer wrote. "By 2020, big data features and functionality will be non-differentiating and routinely expected from traditional enterprise vendors and part of their product offerings."

About the Author

Kevin Fogarty

Technology Writer

Kevin Fogarty is a freelance writer covering networking, security, virtualization, cloud computing, big data and IT innovation. His byline has appeared in The New York Times, The Boston Globe, CNN.com, CIO, Computerworld, Network World and other leading IT publications.
