The High Cost Of Cool

New bells and whistles on your favorite technology are exciting, but sometimes they're just noise.

Craig Mathias, Contributor

March 14, 2012


The pace of evolution in modern computing, which can trace its roots back to the 1940s, has slowed in most dimensions in recent years. Sure, we've got solid-state drives, multi-core processors, DDR3 DRAM, gigabit Ethernet, and more, but the fundamental nature of the computer hasn't changed much at all in decades--and that's a good thing.

In the 1970s we debated all manner of computer, network, and multiprocessor architectures; everything was possible then. But we've now converged on a core set of IT intrinsics, and the computer itself is no longer the hallmark of innovation it once was. And that's good, because IT buying is now mostly risk-free.

Except in one key dimension: user interface. We've gone from punch cards to teletypes to smart terminals to WIMP (windows/icons/mouse/pull-down menus) to--with tablets, handsets, and some notebooks--gestures and voice. In some cases the transition from one UI model to the next has been easy, but rapid evolution, especially in the last few years, is becoming increasingly difficult for many to swallow--and, quite frankly, most of this change is totally unnecessary.

Why? Because much (if not most) of the training and support cost that IT departments bear on a continuing basis goes toward one goal: simply helping users be effective and productive with a given device. Making routine tasks easy and sure. Enabling secure, transparent, and accurate data management.


All too often, however, these important goals are secondary to those of a cool new user interface. And the goal of that cool new interface? Product differentiation--providing an incentive to buy a cool new product, and, all too often, cool for cool's sake. Think iPad and you'll see what I mean. Sure, as we saw with Apple's recent announcement of a product so cool it needs only one name (more confusion afoot there, I think), the iPad improves in the hardware domain with each new edition. But it's still a big iPod Touch, and often inconvenient for even simple enterprise data-manipulation tasks. I never thought iTunes would be a business necessity, and I still don't think it should be.

I can't tell you how many IT shops I've visited that are still using that clunky, slow, and soon-to-be-unsupported Windows XP. Why? Well, apart from all that clunkiness and slowness, it works. But most importantly, XP retains market share because of latent pushback from the Windows Vista fiasco and fundamental user familiarity--and thus productivity. Why change the user interface of a given operating system or device just to do the same tasks as before, only differently?

The only real benefit here accrues to the suppliers of that coolness, who are, after all, in it for the money and need to continue to sell new stuff, needed or not, to keep the cash tumbling in. IT organizations get stuck with new training and support costs that they really can't afford, and overall productivity is impacted as users learn new ways to either do what they did before, or screw up in the process. It's bad enough that underlying implementation details and features of new operating systems change, forcing IT to re-evaluate, with each new release, such subtleties as reliability, integrity, application compatibility, and security, but forcing users to change for change's sake is simply going too far.

Please note that I'm not arguing against progress. I know there are still users of WordStar on CP/M out there, and that's by no means what I'm advocating. If there are real, demonstrable benefits to new user interfaces, and the cost of these can be successfully amortized, then, well, great--let's have them. But my personal appeal is for a little less "progress" here.

I switched to a Mac around the time of the aforementioned Vista fiasco, and, while Apple has issued numerous updates to OS X over the five or so intervening years, the essential integrity of the user experience remains intact. Do I like the Mac UI? No, not particularly--but it gets the job done, and I'm satisfied that security, integrity, and other requirements are being properly addressed in our IT operations. And that, and not coolness, must be the bottom line for any enterprise.

Craig Mathias is a Principal with Farpoint Group, a wireless and mobile advisory firm based in Ashland, MA. Craig is an internationally recognized expert on wireless communications and mobile computing technologies. He is a well-known industry analyst and frequent speaker at industry conferences and trade shows.




