Practical Analysis: What Cloud Computing Really Means

If cloud adherents are using the technology as a means to drive the hard work of process reengineering, then bravo. But I don't give them that much credit.

Art Wittmann

April 15, 2010

I can't tell whether all the talk about cloud computing, and the benefits attached to it, comes from talkers who are fox-like clever or just woefully misguided. Hype is hype, and there will always be those who make excessive claims about whatever the new hot thing is. Cloud computing won't cure the common cold, even if someone somewhere claims that it will.

The first excessive claim was brought to my attention by colleague John Foley, who puzzled over an analyst note pointing out that storage utilization in federal data centers is a pathetically low 12% and, you guessed it, claiming that cloud computing is the fix.

At the same time, we just got the results of our survey of federal government workers and contractors on their plans for cloud computing. There's a lot of interest, planning, and, for lack of a better word, hope for cloud computing. The feds do like to plan--but not every plan becomes reality and not every envisioned benefit pays off. Here, the comments from our survey respondents are instructive. One contractor bemoaned the efforts of federal overseers to quash any effort he made to improve efficiency or save money. He wanted to spin up a new service for FEMA in Amazon's Elastic Compute Cloud, but was told he needed to do it in a Department of Homeland Security data center, where, he noted, the lead time to allocate a new server was one year.

While it isn't hard to imagine why EC2 might not be an appropriate place for a FEMA application, that 12-month provisioning time is enough to stop you in your tracks. I can imagine a backlog of requests, and I can imagine some lag time introduced by the need to coordinate server, storage, and networking considerations, but I can't imagine those things adding up to a year's delay. Knowing that it's DHS, I can imagine that applications have to be profiled and assessed for security and privacy before any resources are allocated. My point is that the actual provisioning might take days or weeks, but there's another 40 to 45 weeks of stuff going on here that probably has nothing to do with provisioning. So if the feds were 100% cloud tomorrow, how much would this provisioning problem actually change? Saving 10 to 12 weeks is nothing to sneeze at, but there's other systemic bureaucracy that's a far bigger deal.
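
To put the technology side of that delay in perspective, here's a minimal sketch of what raw EC2 provisioning involves, using boto, the Python interface to AWS. The AMI ID and key pair name are hypothetical placeholders, and credentials are assumed to be configured in the environment. The API call returns in seconds and the instance is typically running within minutes, which is the point: nothing below accounts for even a day of that year.

    # A minimal sketch of raw EC2 provisioning with the boto library
    # (Python's AWS interface). The AMI ID and key pair name are
    # hypothetical placeholders; AWS credentials are assumed to be
    # configured in the environment.
    import time
    import boto

    conn = boto.connect_ec2()

    # Ask EC2 for one small instance; the API call returns in seconds.
    reservation = conn.run_instances(
        'ami-12345678',            # placeholder machine image
        instance_type='m1.small',
        key_name='example-key',    # placeholder SSH key pair
    )
    instance = reservation.instances[0]

    # Poll until the instance is running -- typically a few minutes.
    while instance.state != 'running':
        time.sleep(10)
        instance.update()

    print('Provisioned %s at %s' % (instance.id, instance.public_dns_name))

If the machine itself is ready in minutes, the other 51-plus weeks of that lead time live somewhere in the approval chain, not in the data center.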

The storage issue presents the same problem. It's not technology that causes a 12% utilization rate; it's organizational policies and procedures. Sure, technologies like storage virtualization, thin provisioning, and data deduplication could help here (note that cloud computing isn't on my list), but the real issue is that storage needs to be managed differently.
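
To see how a number like 12% arises from policy rather than technology, consider a back-of-the-envelope sketch; the figures below are hypothetical illustrations, not data from the analyst note. When every application owner pads a multi-year capacity request up front and nothing is ever reclaimed, dismal utilization falls out of the arithmetic on its own.

    # Back-of-the-envelope sketch of how policy, not technology, produces
    # low storage utilization. All figures are hypothetical illustrations.
    requests = [
        # (TB requested up front, TB actually in use)
        (50, 8),    # app team pads its estimate "just in case"
        (100, 10),  # three-year growth projection, bought in year one
        (20, 5),
        (80, 7),
    ]

    allocated = sum(req for req, _ in requests)
    used = sum(u for _, u in requests)

    print('Allocated: %d TB, in use: %d TB' % (allocated, used))
    print('Utilization: %.0f%%' % (100.0 * used / allocated))
    # -> Utilization: 12%, with no technology failure in sight. Thin
    # provisioning would hand out the padded capacity lazily, but only a
    # change in how requests are made and reclaimed fixes the incentive.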

While the federal government can be a poster child for waste and inefficiency, it merely exhibits a more extreme form of maladies that are common in non-governmental organizations as well. For server provisioning, the majority of the delay from request to delivery won't be fixed by cloud computing per se; it'll be fixed by changing the way IT looks at the job of provisioning services. The problem is that most organizations have no stomach for reengineering their procedures to wring out the waste and delay, but they can get excited about a new technology like cloud computing.

So if the fox-like clever adherents of cloud computing are using the new technology as a means to drive the really hard work of process reengineering, then bravo. The thing to remember, especially for large organizations, is that the process mapping and engineering need to come first. If you implement technology without knowing your process, you can bet you'll be reworking that technology almost immediately.

Art Wittmann is director of InformationWeek Analytics, a portfolio of decision-support tools and analyst reports. Write to him at [email protected].
