Maybe you don't need to close down 40% of your facilities, like the federal government. But you can learn from Uncle Sam's missteps.

Michael Biddick, CEO, Fusion PPT

June 3, 2013

As the largest IT spender globally, the U.S. government has amassed more than 3,100 data centers and, as of January, about $9.1 billion worth of applications -- with almost no sharing of resources across, or even within, agencies. Besides the cost to support and maintain these data centers, they consume an eye-popping amount of energy: U.S. data centers as a whole were initially estimated to draw 100 billion kilowatt-hours, or 2.5% of total U.S. electricity consumption.

The good news is that the government is aware of the problem and has been working on it for years via the Federal Data Center Consolidation Initiative, a plan launched in 2010 (and recently panned by the Government Accountability Office) to chip away at data center energy consumption and operating costs. The current plan calls for closing 1,253 facilities by the end of 2015, but that goal is far from reality -- a lot more time is still being spent on inventories and plans than on actually making data centers more efficient. That's not to say there's no progress; the government expects to save $3 billion within the next few years. But every agency is off to a slow start. In fact, after years of planning, only the Department of Commerce has been deemed to have a complete plan to tackle the problem.

Obviously, data center consolidation is more difficult than anyone expected, a reality I've seen firsthand while working on the effort. The problems aren't unique to the public sector: sloppy record keeping and plain old resistance to change are universal.

I've learned a few lessons that may help enterprises run their data center consolidation efforts more efficiently than the feds have managed thus far.

1. Allocate time and resources to an inventory, but don't stop the presses. Not surprisingly, one of the biggest technical challenges for the feds is figuring out what's inside those 3,000 or so data centers -- one agency had more than 8,000 applications. Decades of record-keeping neglect have resulted in massive data-collection exercises that involve a combination of paper surveys (yes, really) and automated tools. The lesson is to put a detailed inventory process in place from the get-go to stop unmanaged growth before it eats your budget. But if it's too late for that, the next best thing is to start chipping away. Tempting as it is, don't wait for a perfect holistic picture that may never come into focus. You don't need a complete inventory before beginning a consolidation project or migrating some apps to the cloud. Early, steady progress helps justify the cost of the effort.

Once you have an application mapped, decide immediately what to do with it. Can it be decommissioned and the function eliminated? If not, can it be purchased in a software-as-a-service model? Can it be run on a virtual machine in-house or in the public cloud?
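
To make that triage concrete, here's a minimal sketch of what the decision logic might look like in code. The field names, rules and sample apps are assumptions for illustration, not any agency's actual criteria:

```python
# Hypothetical triage sketch -- field names and rules are illustrative,
# not an official framework.

def triage(app: dict) -> str:
    """Return a disposition for one inventoried application."""
    if not app["business_function_still_needed"]:
        return "decommission"            # retire the app and the function
    if app["saas_equivalent_available"]:
        return "replace with SaaS"       # buy it as a service instead
    if app["virtualization_supported"]:
        return "migrate to VM"           # in-house or in the public cloud
    return "flag for modernization"      # legacy; needs deeper analysis

inventory = [
    {"name": "payroll", "business_function_still_needed": True,
     "saas_equivalent_available": True, "virtualization_supported": True},
    {"name": "cobol-ledger", "business_function_still_needed": True,
     "saas_equivalent_available": False, "virtualization_supported": False},
]

for app in inventory:
    print(app["name"], "->", triage(app))
```

The point of writing the rules down, even this crudely, is that every newly mapped application gets a disposition the day it's discovered, instead of joining a pile to be analyzed later.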

The government has developed an approach based on the Federal Enterprise Architecture that involves collecting inventory using a Software Asset Template (Word document download) to capture critical technical characteristics for major systems, from servers, operating systems and platforms to software engineering and databases/storage to delivery servers.
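
In code terms, each template entry amounts to one structured record per system. Here's a rough sketch of such a record; the field names below are loosely modeled on the categories named above, not the official template's schema:

```python
from dataclasses import dataclass, field

# Rough sketch of an inventory record, modeled loosely on the categories
# the Software Asset Template covers. Field names are assumptions, not
# the official template's schema.

@dataclass
class SoftwareAssetRecord:
    system_name: str
    servers: list = field(default_factory=list)           # physical/virtual hosts
    operating_systems: list = field(default_factory=list)
    platforms: list = field(default_factory=list)          # app servers, middleware
    software_engineering: str = ""                         # languages, frameworks
    databases_storage: list = field(default_factory=list)
    delivery_servers: list = field(default_factory=list)   # web/delivery tier

record = SoftwareAssetRecord(
    system_name="grants-processing",
    servers=["app01", "app02"],
    operating_systems=["RHEL 5"],
    platforms=["WebLogic"],
    software_engineering="Java",
    databases_storage=["Oracle 10g"],
    delivery_servers=["web01"],
)
print(record)
```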

[Chart: Data center trends]

Paper-based data-collection exercises can be effective in small projects, but they don't scale. More progressive agencies are using application-discovery or dependency-mapping tools, such as BMC's Atrium Discovery and Dependency Mapping and Riverbed's AppMapper Xpert, that connect at the network layer, do packet inspections for a few weeks and produce automated application inventory reports. This can cut months off the discovery process.
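
The core idea behind those tools -- watch who talks to whom on the network, then infer dependencies -- can be sketched in a few lines. The flow records below are invented for illustration; real products build them from weeks of packet inspection:

```python
from collections import defaultdict

# Toy sketch of dependency mapping: aggregate observed network flows
# (source host, destination host, destination port) into a map of which
# systems depend on which services. These records are invented.

flows = [
    ("web01", "app01", 8080),
    ("app01", "db01", 1521),
    ("app02", "db01", 1521),
    ("web01", "app02", 8080),
]

dependencies = defaultdict(set)
for src, dst, port in flows:
    dependencies[src].add((dst, port))

for host, deps in sorted(dependencies.items()):
    targets = ", ".join(f"{d}:{p}" for d, p in sorted(deps))
    print(f"{host} depends on {targets}")
```

Knowing that web01 depends on app02, which depends on db01, is exactly what keeps a migration from breaking an application nobody realized was connected.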

Government agencies also tend to lack monitoring tools to track data center performance and application and energy usage. Don't make this mistake. Visibility into metrics is critical to relentlessly improving performance.
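
Two metrics worth tracking from day one are power usage effectiveness (PUE) and average server utilization. A quick sketch, with invented readings standing in for data from power meters and monitoring agents:

```python
# Sketch of two basic data center efficiency metrics. The readings are
# invented; in practice they come from power meters and monitoring agents.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 is ideal; 2.0 means half the power
    goes to cooling and overhead rather than computing."""
    return total_facility_kwh / it_equipment_kwh

def avg_utilization(samples: list) -> float:
    """Average CPU utilization across sampled servers, as a percentage."""
    return sum(samples) / len(samples)

print(f"PUE: {pue(1_800_000, 900_000):.2f}")                        # 2.00 -- heavy overhead
print(f"Avg utilization: {avg_utilization([8, 12, 5, 30]):.1f}%")   # low -- consolidation candidate
```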

2. Aim for meaningful consolidation. Shuffling gear from one facility to another might save space, but there's a big difference between simple physical consolidation and truly making a data center more efficient. Most government facilities have way too many physical servers and very low utilization because they're packed with legacy applications and databases that don't support virtualization and can't be run in the cloud. Sound familiar? Targeting those applications for elimination makes sense, but it's a wretched process. Never underestimate end users' desire to hang on to a legacy COBOL application.

To achieve real efficiency, someone has to make the hard decision to retire applications that, while helpful, aren't critical. And that brings us to the most challenging angle.
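
One way to keep those retirement decisions honest is a simple scoring model. The weights and attributes below are assumptions meant to make the trade-off concrete, not a standard methodology:

```python
# Hypothetical retirement scoring -- attributes and formula are assumptions
# meant to make the trade-off concrete, not a standard model.

def retirement_score(criticality: int, annual_cost: float, users: int) -> float:
    """Higher score = stronger retirement candidate.
    criticality: 1 (nice to have) .. 5 (mission critical)."""
    return annual_cost / (criticality * max(users, 1))

apps = [
    ("cobol-ledger", 2, 250_000.0, 12),    # pricey, barely used, low criticality
    ("case-mgmt",    5, 400_000.0, 900),   # expensive but mission critical
]

for name, crit, cost, users in apps:
    print(f"{name}: {retirement_score(crit, cost, users):,.0f}")
```

A ranked list like this won't make the call for you, but it forces the "helpful but not critical" conversation onto numbers rather than sentiment.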

3. Get personal. People hate change, especially when it puts them out of a job. In the past five years, I've seen both passive and active resistance to data center consolidation. It's a bigger obstacle than technology.

Passive resistance might mean an admin is slow to respond to requests or raises objections, such as iffy security concerns, that limit the ability to capture inventory, migrate applications or execute plans. Worse, agency divisions vie to run the new consolidated data center while retaining control over their own servers and apps. These turf battles will escalate as use of software-defined networking and private clouds increases. The hard reality is that when data centers shut down, part of the cost savings comes from reducing head count.

CIOs must use budgets as a strategic tool. Avoid recapitalizing equipment, so that when hardware dies, it's not replaced. Along the way, reduce the workforce through voluntary attrition. Yes, you may lose top-notch people, who can easily find new jobs. Use short-term contracts to fill gaps or provide surge resources. This approach will also bring in some fresh perspectives.

Eliminating a data center is a massive and traumatic undertaking. Instead of consolidation for its own sake, focus on making IT delivery as efficient as possible.

About the Author

Michael Biddick

CEO, Fusion PPT

As CEO of Fusion PPT, Michael Biddick is responsible for overall quality and innovation. Over the past 15 years, Michael has worked with hundreds of government and international commercial organizations, leveraging his blend of deep technology experience and business and information management acumen to help clients reduce costs, increase transparency and speed efficient decision making while maintaining quality. Prior to joining Fusion PPT, Michael spent 10 years with a boutique consulting firm and Booz Allen Hamilton, developing enterprise management solutions. He previously served on the academic staff of the University of Wisconsin Law School as Director of Information Technology. Michael earned a Master of Science degree from Johns Hopkins University and a dual bachelor's degree in political science and history from the University of Wisconsin-Madison. Michael is also a contributing editor at InformationWeek and Network Computing and has published more than 50 articles on cloud computing, federal CIO strategy, PMOs and application performance optimization. He holds multiple vendor technical certifications and is a certified ITIL v3 Expert.
