Big Data's Holy Grail: Better, Faster Decisions
Smart companies are cashing in on early opportunities to improve decision-making -- increasing the speed at which diverse information sets can be analyzed to derive critical insights.
We've all been hearing a lot about the promise of big data during the past two years. Now many Fortune 1000 companies are asking, "Where's the beef?" The organizations that are succeeding are using big data as a fast route to better decisions.
In late 2012, NewVantage Partners conducted a big data survey of C-level business and technology executives from 50 Fortune 1000 companies. The survey concentrated on financial services firms with a history of heavy data and analytics usage. Participants included Bank of America, JP Morgan, Wells Fargo Bank, American Express and Fidelity Investments, as well as non-financial firms such as General Electric and government agencies such as the Department of Defense.
One of the primary findings of the survey is that large organizations are looking to use advanced analytics and big data to make "better, fact-based decisions." In fact, this was the most-cited potential benefit of big data initiatives, chosen by 22% of executives surveyed.
As we met with executives to share the survey results and explore their responses, we wanted to understand precisely how big data could most directly improve their ability to make better decisions.
In a meeting with the Department of Defense, for example, we learned about a program called "Data to Decisions." The program came about because the intelligence analyst community was being overwhelmed by data from diverse sources -- sensor data, social chatter, facts and figures. The challenge for the defense community was to organize, analyze and gain insight from all this data in a timely fashion. Its focus was on getting to answers and critical insights more quickly.
A leading national credit card firm faced a similar challenge: How could it accelerate the speed at which it creates new marketing offers and campaigns? Its goal was to shorten the cycle time in which it could analyze new information, test new offers and repeat the process again and again until it arrived at an optimized match of offers to customer needs.
In both of these examples, the critical need was to increase the speed of capturing, organizing and analyzing vast amounts of diverse information. Each organization saw big data as a potential solution.
The Big Data Breakthrough
The breakthrough that big data technology platforms provide is the ability to defer data preparation, which typically consumes 80% of the time required for data analysis in a conventional data warehousing environment. This shortcut is made possible by the "load and go" ability to gather all available data and then rapidly detect patterns and correlations. The requirements of data standardization, normalization and transformation -- what we call "data engineering" -- do not disappear, but they can be deferred until after the critical data elements and patterns have been detected.
Rather than developing a hypothesis in advance, as would be required in a traditional data warehousing environment, big data technologies enable organizations to load and analyze the data first, understand where it leads and then clean it up to make it suitable for ongoing production work. We call this process the ability to accelerate time-to-answer (TTA), with TTA being the business metric that organizations use to measure how quickly they can get to better, fact-based decision-making.
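To make the load-and-go idea concrete, here is a minimal sketch in Python, assuming raw, mixed-schema event records in a JSON Lines file. The file name and field names are hypothetical illustrations, not details from the survey participants:

```python
import json
from collections import Counter

# "Load and go": read raw, un-engineered records as-is -- no upfront
# schema design, standardization or normalization.
def load_raw(path):
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            try:
                rec = json.loads(line)
            except json.JSONDecodeError:
                continue  # defer cleanup; skip unparseable records for now
            if isinstance(rec, dict):
                yield rec  # keep whatever fields each record happens to have

# Quick pattern detection: which fields actually appear, and how often?
field_counts = Counter()
channel_by_response = Counter()
for rec in load_raw("raw_events.jsonl"):  # hypothetical file
    field_counts.update(rec.keys())
    # Explore a candidate correlation before doing any data engineering:
    channel_by_response[(rec.get("channel"), rec.get("responded"))] += 1

print(field_counts.most_common(10))  # which data elements matter?
print(channel_by_response)           # rough signal: offer channel vs. response
```

Only after a pass like this, once the critical fields and patterns are known, would the standardization and transformation work be applied -- and only to the data that proved to matter.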
More than any other big data use case or business need, this ability to accelerate TTA is the single greatest value that firms can derive from big data. Fortune 1000 companies and organizations such as the Department of Defense are using big data platforms to answer questions in seconds rather than days, and in days rather than months.
Accelerating TTA can enable organizations to answer questions that have stubbornly resisted analysis. For example, credit card issuers are using big data approaches to develop new test-and-learn processes that they quickly adapt to the market based on the ability to iterate through results faster than they could on conventional platforms.
One executive described the approach as "failing faster." The point is that these firms can analyze larger, more diverse volumes of information than previously possible. What's more, they can optimize and automate their complex workflows based on what they learn through each iteration of analysis. In many instances, these firms can conduct 50 or more analyses in the time that it previously took to perform a single analysis.
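As a rough illustration of this kind of iterative test-and-learn loop -- a simulated sketch, not the credit card firm's actual process -- consider the following Python example, in which a single offer parameter is perturbed each round and only the winning variant is kept:

```python
import random

random.seed(42)

# Hypothetical test-and-learn loop: each round tests a perturbed offer
# variant against a customer sample, keeps the better performer and
# repeats -- "failing faster" on variants that underperform.
customers = [{"threshold": random.uniform(0.0, 0.3)} for _ in range(10_000)]

def offer_value(discount, sample):
    """Net value: a customer responds if the discount clears their
    threshold; each response is worth the remaining margin (1 - discount)."""
    responses = sum(1 for c in sample if c["threshold"] <= discount)
    return responses * (1.0 - discount)

best = 0.05  # starting discount level
for round_num in range(50):  # 50 cheap analyses instead of one slow one
    candidate = min(0.5, max(0.0, best + random.uniform(-0.02, 0.02)))
    if offer_value(candidate, customers) > offer_value(best, customers):
        best = candidate  # keep the winner, discard the loser

print(f"optimized discount after 50 rounds: {best:.3f}")
```

The point of the sketch is the cycle time: when each round is cheap, running 50 of them becomes practical, which is exactly the workflow that slow data preparation used to rule out.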
Deriving Business Value
One of the nation's largest issuers of consumer and business credit cards has reengineered its conventional data and analytics environment using Hadoop and R-based analytics to establish a new test-and-learn environment that supports faster TTA. Hadoop gives the firm's data scientists rapid access to fresh data without the data-preparation delays required in traditional data-warehousing environments.
Big data platforms, like those used in the examples above, allow data scientists to focus on just the data they need while eliminating the engineering effort around data that does not deliver value. The credit card issuer has been able to dramatically reduce the time it takes to ask and answer critical business questions, such as which customers are most likely to respond to which offers.
Why is faster TTA possible now? Beyond the reengineering of the underlying data-management platform -- employing Hadoop and massively parallel processing capabilities such as MapReduce -- the comparatively low cost and high capacity of big data platforms make it cost-effective for organizations to load all available data for analysis.
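For readers less familiar with the MapReduce model mentioned above, the following single-process Python sketch shows its two phases; Hadoop's contribution is running this same pattern in parallel across a cluster. The record format here is hypothetical:

```python
from itertools import groupby
from operator import itemgetter

# Minimal single-process illustration of the MapReduce model that Hadoop
# parallelizes across many machines: map emits (key, value) pairs, the
# framework groups pairs by key, and reduce aggregates each group.
records = [("card_123", 40.0), ("card_456", 12.5), ("card_123", 7.25)]

def mapper(record):
    card_id, amount = record
    yield card_id, amount  # emit spend keyed by card

def reducer(card_id, amounts):
    return card_id, sum(amounts)  # total spend per card

mapped = sorted((kv for r in records for kv in mapper(r)), key=itemgetter(0))
for key, group in groupby(mapped, key=itemgetter(0)):
    print(reducer(key, (v for _, v in group)))
```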
In recent years, firms have also been able to accelerate query processing and reduce costs using database appliances that are optimized for analytical queries, like Netezza or Greenplum. However, the costs shrink even more dramatically using a big data platform. Whereas a traditional relational database platform costs $37,000 per terabyte, costs shrink to $5,000 per terabyte when using a database appliance and they shrink further, to $2,000 per terabyte, when using a big data platform such as Hadoop. These cost savings support an order-of-magnitude greater processing capacity that can be applied to data analysis and discovery.
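A quick back-of-envelope calculation with the per-terabyte figures cited above shows how a fixed budget (the $1 million figure here is hypothetical) translates into capacity on each platform:

```python
# Back-of-envelope check of the per-terabyte figures cited above:
# how much data a fixed analytics budget can keep online on each platform.
cost_per_tb = {
    "traditional RDBMS": 37_000,
    "database appliance": 5_000,
    "Hadoop big data platform": 2_000,
}

budget = 1_000_000  # hypothetical $1M platform budget
for platform, cost in cost_per_tb.items():
    print(f"{platform:>26}: {budget / cost:,.0f} TB for ${budget:,}")

# Relative capacity per dollar vs. the traditional platform:
base = cost_per_tb["traditional RDBMS"]
for platform, cost in cost_per_tb.items():
    print(f"{platform:>26}: {base / cost:.1f}x the capacity per dollar")
```

At these figures, a Hadoop platform delivers roughly 18 times the capacity per dollar of a traditional relational platform -- consistent with the order-of-magnitude claim.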
Organizations are just beginning to derive business value from big data initiatives, but there's an immediate opportunity to improve decision-making by increasing the speed at which diverse information sets can be analyzed to derive critical insights. Accelerated TTA is made possible by big data platforms such as Hadoop that let organizations defer the time-consuming data engineering until the critical questions have been refined. This approach lets organizations be fast and nimble while affordably analyzing more data by taking advantage of raw processing power. We're in the early stages, but big data is delivering concrete business value to organizations that are dipping their toes in the water.
About the Author

Randy Bean is managing partner at NewVantage Partners, a company he co-founded in 2001. He has decades of experience as a line-of-business executive and general manager, a strategic planning and IT professional, and a leader of marketing and business development functions.