
BI Vendor Selection: Smarter the Second Time

With 2005 in their sights, organizations are planning to upgrade their business intelligence (BI) capabilities. Find out how you can overcome obstacles to achieving "second-generation" success -- including power politics played by entrenched vendors, business users, and IT.

IT managers are frequently reluctant to abandon existing expertise. To switch from a current toolset, their organization will have to acquire new skills, rewrite otherwise usable code, and retrain users and help-desk personnel. In some cases, the introduction of new tools requires changes to the underlying DBMS and/or OS platforms, which can compound the overall disruption and increase the risk.

There are other situations, however, where organizations favor change for its own sake. This powerful bias occurs, for example, when there's a change in IT management. New managers arrive as new buyers, bringing their own product preferences and looking immediately for opportunities to exert influence. Coming into a situation where there's already dissatisfaction with an application, some new IT managers may feel a need to make a change solely to appease a disgruntled user base.

A third source of IT bias is the tendency of some technical specialists to build their resumes by gaining experience with a currently popular product, which creates a strong incentive to favor change.

Bias: The User View

Even when strong impetus exists to replace a BI solution, users may show significant resistance to changing vendors. Opinion leaders in the user community are often the power users who've developed expertise with current tools, along with the status and job security that come with that expertise. Going to a new toolset endangers their positions. In view of this situation, user management will consider carefully the cost and disruption of a major retraining effort.

Of course, politics often play a key role in the decision process. Given the opportunity, users who didn't play a major role in the development of the original BI application will try to assume more control over its replacement. They'll do this either by attempting to take ownership of the technology selection process or by choosing a product that they feel they can implement with a minimum of IT support. Such decisions are sometimes made regardless of functional or technical fit.

As is the case with other technologies, BI users are also influenced by their peers in other organizations and will tend to follow their lead on vendor preferences. At times, a single highly visible feature or feature set becomes a focal point out of all proportion to its true importance. A common example is the way BI tools work with Microsoft Office. Smooth Excel integration can provide a basis for compelling vendor demonstrations, but isn't, in and of itself, a decisive reason to buy one product over another, especially if the application is to be deployed primarily over the Web.

To have success with a second-generation selection process, a fresh look is essential. BI vendor selection methodologies are generally aimed at "clean slate" scenarios, where buyers have little or no experience with the technology. Clearly, the presence of incumbent technology injects new factors into the process. The following sections discuss ways that we've learned to adapt vendor selection best practices to second-generation BI initiatives.

Technical Requirements

No matter how thoroughly your organization analyzed data volumes, response time, and other technical requirements when the original tools were selected, it's essential to completely reassess those requirements to establish a more current baseline. In the world of data warehousing, some reasons are obvious: You might have new data sources and an expanding user community. More subtle reasons could include the following:

  • Demand for deeper drill-down analysis
  • The need for longer-duration trend analysis, which requires additional historical information (a requirement sketched in the example after this list)
  • Expansion of BI beyond after-the-fact reporting to include predictive analysis or operational, near real-time, mission-critical functionality
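To see how the trend-analysis item translates into a data requirement, consider a minimal sketch in Python, assuming monthly sales totals stored as (year, month, total) tuples. The 24-month minimum is a hypothetical illustration, but it shows how a longer analysis window directly deepens the history the warehouse must retain:

```python
# Hypothetical illustration: a full year of year-over-year comparisons
# requires at least 24 months of history; a three-year trend needs 36.

def year_over_year_growth(monthly_sales):
    """monthly_sales: list of (year, month, total) tuples in
    chronological order. Returns (year, month, growth) tuples."""
    if len(monthly_sales) < 24:
        raise ValueError(
            "Need at least 24 months of history for a year of "
            "year-over-year trends; got %d" % len(monthly_sales))
    growth = []
    for i in range(12, len(monthly_sales)):
        year, month, current = monthly_sales[i]
        _, _, prior = monthly_sales[i - 12]      # same month, prior year
        rate = (current - prior) / prior if prior else None
        growth.append((year, month, rate))
    return growth
```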

Security needs are another critical factor. First-generation BI environments were often built and deployed with security standards set at the minimum that any mainstream toolset could provide; after all, part of the purpose was to make data more accessible. Organizations are now typically much smarter and more demanding about security, and second-generation BI systems must be better. BI products vary considerably in their security models, so choose your new toolset based on today's and tomorrow's security needs, not yesterday's.
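To make the "security model" point concrete, here's a minimal sketch of row-level security, one capability on which products differ sharply. The user names, regions, and query-rewriting approach are hypothetical; some toolsets enforce policies like this natively, while others leave you to bolt them on:

```python
# Hypothetical row-level security: silently constrain each user's
# queries to the rows they're entitled to see. In practice the
# entitlements would come from a directory service, not a dict.
ENTITLEMENTS = {
    "asmith": ["NORTHEAST"],
    "bjones": ["NORTHEAST", "SOUTHEAST"],
    "cfo":    ["*"],                     # unrestricted
}

def apply_row_filter(user, base_query):
    """Append a region predicate to a simple, WHERE-less query."""
    regions = ENTITLEMENTS.get(user, [])
    if not regions:
        raise PermissionError("No entitlements on file for " + user)
    if "*" in regions:
        return base_query
    in_list = ", ".join("'%s'" % r for r in regions)
    return "%s WHERE region IN (%s)" % (base_query, in_list)

print(apply_row_filter("bjones", "SELECT * FROM f_sales_detail"))
```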

Additionally, since you deployed your current BI system, the core technologies underpinning commercial software products — BI or otherwise — have almost certainly evolved. Very likely, your organization has some sort of "architectural blueprint" that governs permissible standards and platform technologies for all new applications; the blueprint offers a "playbook" of allowable interfaces among systems and components. If, for example, Web services, a common portal, and directory services integration are now required capabilities for all new deployments — whether transactional or analytic — then the technical compatibility requirements factored into your BI tool evaluation need to reflect this changing landscape.
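One practical way to factor the blueprint into a tool evaluation is to reduce it to an explicit checklist and score each candidate against it, as in this hypothetical sketch (the capability names and vendors are placeholders; your own blueprint supplies the real list):

```python
# Hypothetical blueprint check: required capabilities for any new
# deployment, compared against what each candidate toolset offers.
BLUEPRINT = {"web_services", "common_portal", "directory_services"}

CANDIDATES = {
    "Vendor A": {"web_services", "common_portal"},
    "Vendor B": {"web_services", "common_portal", "directory_services"},
}

for vendor, capabilities in sorted(CANDIDATES.items()):
    gaps = BLUEPRINT - capabilities
    print("%s: %s" % (vendor,
          "conforms" if not gaps else "gaps: " + ", ".join(sorted(gaps))))
```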

Since BI is an end-to-end proposition, it's likely that both the ETL and user-facing tools (reporting, OLAP, and so forth) must conform as a group to a set of forward-looking technical standards. The key point here is that you proceed at great peril if you undertake second-generation tool evaluation and selection based on an obsolete set of technical requirements.

New Functional Requirements

More often than not, first-generation BI applications do little beyond producing a mixture of standard and parameter-driven, after-the-fact reports and analyses that can't be altered without IT involvement. That lack of flexibility won't satisfy today's decision makers, who value self-service and want the ability to look ahead. Most likely, your organization now employs several packaged applications whose data the BI environment must tap; that data must pass through a new generation of ETL tools and reach users via the new BI tools you're evaluating. While it's clear that you'll need to factor in issues related to these new classes of data, don't neglect the metadata issues associated with the new data sources in your evaluation.

A critical shortcoming of most first-generation BI environments is inadequate metadata management, due in large part to the product-centric nature of most early tools. Develop a broader and bolder vision of the role metadata management will play in your future BI environment; thoroughly evaluate the candidate tools to see if they can implement that vision.
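It helps to be concrete about what that vision requires each metadata record to carry. The fields below are a hypothetical minimum for a single warehouse table, covering lineage, meaning, and freshness; a real metadata strategy goes considerably further:

```python
# A hypothetical minimum for table-level metadata: enough to answer
# "where did this come from?", "what does it mean?", and "how fresh
# is it?" for anything a BI user can touch.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class TableMetadata:
    name: str                      # warehouse table name
    source_systems: list           # upstream lineage
    business_definition: str       # plain-language meaning
    last_loaded: datetime          # freshness, for display to users
    transformations: list = field(default_factory=list)  # ETL steps

sales_meta = TableMetadata(
    name="f_daily_sales",
    source_systems=["order_entry", "pos_feed"],
    business_definition="Net sales by store, product, and day",
    last_loaded=datetime(2004, 11, 15, 6, 30),
    transformations=["currency normalization", "returns netted out"],
)
```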

Many organizations now have functional requirements that call for real-time data flows from source systems into the BI environment. This is more than a technical issue; the business success of the BI environment may depend on recognizing that, for example, the widely used "Daily Sales Activity Report" currently produced overnight must now be available, on demand, several times a day, which means intraday data must reach users across the enterprise. This sort of demand has far-reaching implications not only for extract, transform, and load tools (which must support multiple interface protocols between source systems and the data warehouse), but also for reporting and analysis tools. Consider how the system will alert users to the exact "age," or latency, of the information they're analyzing, and how the BI system will update information in OLAP cubes and caches.
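As a minimal sketch of the latency question, assume each subject area records the timestamp of its last successful load; the report header can then state exactly how old the data is and flag anything staler than its freshness target. The table names and targets here are hypothetical:

```python
# Hypothetical latency stamp for an on-demand report: compare each
# subject area's last successful load against its freshness target.
from datetime import datetime, timedelta

FRESHNESS_TARGETS = {
    "f_daily_sales": timedelta(hours=2),
    "f_inventory":   timedelta(hours=8),
}

def latency_banner(table, last_loaded, now=None):
    """Return a report header line stating the data's age."""
    now = now or datetime.now()
    age = now - last_loaded
    target = FRESHNESS_TARGETS[table]
    flag = "" if age <= target else "  ** STALE: exceeds %s target **" % target
    return "Data as of %s (age %s)%s" % (last_loaded, age, flag)

print(latency_banner("f_daily_sales",
                     last_loaded=datetime(2004, 11, 15, 9, 0),
                     now=datetime(2004, 11, 15, 10, 30)))
```

Whatever the mechanism, the design point is that data age becomes a first-class, user-visible attribute rather than something users must infer from the batch schedule.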
