Virtualization: How Much Is Enough?

Most organizations have virtualized one-half to two-thirds of their systems. For many, that's about right.

Art Wittmann

August 2, 2011


Anyone who has been around the IT block a few times knows that even as new technologies come to dominate, old technologies never die, and some won't even fade away. Skeptics, consider this: In its most recent quarterly statement, IBM said that System z mainframe revenues increased 61% over the same period last year. That's right--mainframe sales are on fire. It's the latest data point illustrating the enterprise struggle to find the right balance between workload-agnostic systems (virtualized generic servers, a.k.a. private cloud) and workload-specific systems--like mainframes, database machines, and all those appliances we've been snapping up for years.

Even for what we think of as appliance applications, the decision hasn't always been clear-cut. Back in the '90s, if you wanted the gold standard for routers, you shelled out $30,000 for a Cisco AGS+, but if you needed a fast router with basic functionality, a NetWare 3.1 server did a pretty good job. You paid Novell about $2,000 and the hardware ran around $5,000, quite a deal for cash-conscious organizations. Likewise, in many cases you had the option to purchase firewall software and run it on your own hardware, or to buy an appliance. You could roll your own for about $10,000 or spend $25,000 on the appliance version.

This distinction was sometimes lost on startup companies, but never on their funders. Want to make software and sell it for $5,000 a copy? No VC will touch your company. Want to put that same software on a $5,000 white-box PC running a stripped-down version of Linux and sell it for $50,000? The VCs will be happy to buy you lunch and talk about that idea.

The result is that we often run appliances in places where there's no performance-related reason to do so, though you can make the argument that because they run on a stripped-down OS, they're easier to manage and potentially more secure. I say potentially, because if there is a security vulnerability, it's less likely that you can do anything about it and, in many cases, less likely that the vendor will know about it.

Even at the lowest levels, we can debate where and how to run a service. DNS, DHCP, and LDAP could all run together on a virtualized server, or each could run in its own appliance. File and print services can run on a big, fancy NetApp-like box, or they could run on a virtualized server hooked up to local storage or to a SAN.

Of course, it's still pretty reasonable to run some services un-virtualized. If you have a database that runs well and uses all of the resources of a memory-loaded server with 32 cores and fast network connections, you might think twice before sliding virtualization into that mix. It's not like you're going to start running more VMs on that system. But there's also the clear advantage of being able to run that database on any 32-core box, not just the particular 32-core box where it runs now.

For larger, complex, mission-critical systems, the tradeoff in going virtual is less about performance and more about the testing and validation it takes to make that virtualized transition. It's not trivial, it takes time, and for just about every IT organization I've talked to, it takes resources they really just don't have. That leaves most organizations with one-half to two-thirds of their systems virtualized: They've done the easy stuff; the rest is much harder.

So, for that other one-third to one-half of systems and workloads, there's a completely valid argument to be made for choosing workload-specific hardware. That could be an argument for a database toaster like Oracle's Exadata, or it could be a good mainframe like your pa used to run. These devices are secure, they're very good at what they do, and they can be run by surprisingly few people--if you can find qualified staff (staffing is probably the biggest deterrent to System z adoption). Should 100% virtualization be your goal? For a lot of organizations, the answer is probably not.

Art Wittmann is director of InformationWeek Analytics, a portfolio of decision-support tools and analyst reports. You can write to him at [email protected].


More than 100 major reports will be released this year. Sign up or upgrade your InformationWeek Analytics membership.

At the 2011 InformationWeek 500 Conference, C-level executives from leading global companies will gather to discuss how their organizations are turbo-charging business execution and growth--how their accelerated enterprises manage cash more effectively, invest more wisely, delight customers more consistently, and manage risk more profitably. The conference will feature a range of keynote, panel, and workshop sessions. St. Regis Monarch Beach, Calif., Sept. 11-13. Find out more and register.

