News & Analysis

Compliance and interoperability demands spark interest in metadata repositories... Database vendors incorporate BI to satisfy current customers - and move up the food chain to attract more clients.

In this Issue:

Metadata Renaissance

Compliance and interoperability spark new interest in metadata repositories
The push to make metadata more palatable for business users is getting stronger, as C-level executives struggle to comply with Sarbanes-Oxley Act mandates and to understand customers' behavior patterns.

"Our customers want to verify figures and calculations around sales and revenues," notes Donna Burbank, product manager for Computer Associates' data repository suite.

That means business analysts need to see how data is transformed through IT systems, and IT needs to see the underlying components of application development. "That process can no longer happen with overnight batch scans; customers demand immediate access to changes in operational systems so their engineers can react quickly when transformations go out of sync," explains Scott McCurdy, metadata management product manager with Allen Systems Group.
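To make that shift concrete, the sketch below shows in plain Python what a push model for operational metadata can look like: systems report changes to the repository the moment they occur, and subscribers are alerted immediately instead of waiting for a nightly scan. The class and field names (MetadataRepository, SchemaChange) are invented for illustration and are not any vendor's actual API.

```python
# Hypothetical sketch: pushing operational metadata changes to a repository as
# they happen, instead of discovering them in an overnight batch scan.
# All names here (MetadataRepository, SchemaChange) are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class SchemaChange:
    system: str          # e.g. "billing_db"
    obj: str             # table, view, or transformation that changed
    description: str     # what changed
    observed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


class MetadataRepository:
    """Keeps a running log of changes and notifies subscribers immediately."""

    def __init__(self):
        self.changes: list[SchemaChange] = []
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def record(self, change: SchemaChange):
        self.changes.append(change)
        for notify in self.subscribers:   # push now, don't wait for a nightly scan
            notify(change)


if __name__ == "__main__":
    repo = MetadataRepository()
    repo.subscribe(lambda c: print(f"ALERT: {c.system}.{c.obj}: {c.description}"))
    # An operational system reports a change the moment it happens.
    repo.record(SchemaChange("billing_db", "revenue_fact",
                             "column 'net_sales' recalculated with new tax rule"))
```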

Demand for real-time access to "cuts" of data is growing in the struggle to understand how operational, software development, business process, and workflow areas affect one another. Such information is important to impact analysis.

According to Gartner's Michael Blechar, who recently wrote a "Magic Quadrant" report on metadata repositories, the cuts include service-oriented development of applications, which looks inside individual applications and their components; application architecture, which looks across multiple applications to understand how they work together; and a "global cut."

"[Global cut] unveils the interrelationships among business, finance, product distribution, sales, and order entry, which can translate into double-digit returns through improved productivity and better impact analysis," maintains Lou Agosta, Forrester Research's lead industry analyst for metadata and data warehousing.

To realize such returns, organizations have to fight through pain points around integrating data from ERP, legacy, and electronic-commerce systems — as well as the service-oriented architectures that are layered on top of those systems. The effort usually exposes problems with stovepipe processes: "Once you see the domino effect of how business requirements evolve from design to coding to end users, you get a picture of the life cycle so you can reuse artifacts without disrupting processes," explains Greg Coticchia, CEO of LogicLibrary.
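In practice, that "domino effect" view is an impact query over dependency links recorded in the repository. The short Python sketch below illustrates the underlying idea with an invented LineageGraph class and made-up artifact names; it is not LogicLibrary's actual data model, just the basic breadth-first traversal such tools perform.

```python
# Hypothetical sketch of repository-style impact analysis: artifacts
# (requirements, designs, code modules, reports) are nodes, and "depends on"
# links are edges; finding the domino effect of a change is a graph traversal.
# The artifact names are invented for illustration.
from collections import defaultdict, deque


class LineageGraph:
    def __init__(self):
        self.downstream = defaultdict(set)   # artifact -> artifacts that use it

    def add_dependency(self, upstream: str, dependent: str):
        self.downstream[upstream].add(dependent)

    def impact_of(self, artifact: str) -> set:
        """Everything that could be affected if `artifact` changes (breadth-first)."""
        impacted, queue = set(), deque([artifact])
        while queue:
            current = queue.popleft()
            for dep in self.downstream[current]:
                if dep not in impacted:
                    impacted.add(dep)
                    queue.append(dep)
        return impacted


if __name__ == "__main__":
    g = LineageGraph()
    g.add_dependency("requirement:track-net-sales", "design:sales-mart")
    g.add_dependency("design:sales-mart", "etl:load_sales")
    g.add_dependency("etl:load_sales", "report:quarterly_revenue")
    print(g.impact_of("requirement:track-net-sales"))
    # -> {'design:sales-mart', 'etl:load_sales', 'report:quarterly_revenue'}
```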

Reuse without disruption can lead to significant results: "We realized a 66% reduction in labor around upgrades in the back office," says Craig Drinkhall, senior vice president of product development and engineering at TelCove, a Pittsburgh-based local-exchange carrier. "Integrating specs for upgrades was taking up to two months — too long when you have to get services to new markets." To reuse code rather than continuously rewriting and tweaking it, the organization determined it had to make sense of the specs for modules around input/output, service acceptance, quality assurance, and system integration testing, as well as documentation of what was already coded. With metadata tools from LogicLibrary, the process flow among customer care, provisioning, and custom applications became apparent. "It now takes us two days to create specs for upgrades, as we now see the interrelationships among our many modules," says Drinkhall.

The level of sophistication needed varies. In his report, Gartner's Blechar acknowledges that less sophisticated tools may be better suited to some organizations. "A company looking to document legacy applications down to the line of code will require sophisticated tools that have code and data scanners, parsers and bridges, whereas matching databases to programs requires simpler tools," he says. The price is commensurate with sophistication, ranging from $150,000 to more than $1 million.
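At the simpler end of that spectrum, "matching databases to programs" can be as basic as scanning source text for known table names. The sketch below is a deliberately naive illustration with invented file and table names; real repository scanners rely on language-specific parsers and bridges rather than pattern matching.

```python
# Hypothetical sketch of the "simpler tools" end of the spectrum: matching
# databases to programs by scanning source files for known table names.
# File names, table names, and sample code are all invented for illustration.
import re

KNOWN_TABLES = {"CUSTOMER", "ORDERS", "REVENUE_FACT"}

SOURCES = {
    "billing_batch.cbl": "EXEC SQL SELECT NET_SALES FROM REVENUE_FACT END-EXEC.",
    "crm_sync.py": "cursor.execute('UPDATE CUSTOMER SET segment = %s WHERE id = %s')",
}


def tables_used(source: str) -> set:
    """Return the known table names referenced in a piece of source code."""
    words = set(re.findall(r"[A-Za-z_]+", source.upper()))
    return KNOWN_TABLES & words


if __name__ == "__main__":
    for program, code in SOURCES.items():
        print(program, "->", sorted(tables_used(code)))
```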

The companies cited as overall leaders in the Gartner report were Allen Systems Group, Computer Associates, LogicLibrary, and Flashline. Blechar designated as niche players Fujitsu, Data Advantage Group, ComponentSource, Select, Unicorn, and Adaptive. Among those he named "visionaries" were MetaMatrix, Troux, and Informatica. — Susana Schwartz

SUSANA SCHWARTZ is a New York-based freelance writer specializing in emerging technologies and their impact on IT infrastructure.
