Essentially, all models are wrong, but some are useful. -- George E. P. Box

Imre Kabai, Contributor

February 27, 2013

5 Min Read

One of my most embarrassing memories from college:

My lab partner and I went to a nuclear research reactor to expose samples to thermal neutrons and measure the decay spectrum. The reactor, one of two such facilities in the country, had to be started up. Technicians in white lab coats were busy turning knobs, and a large vacuum tube display indicated the increasing power output. Around the reactor core the bluish glow of the Cherenkov radiation outlined the deep underwater structure. The machine was amazing.

Then the chief scientist looked at our lab results from the previous week's experiment and noticed that we skipped the error calculations. He promptly gave us Fs and sent us home. The reactor had to be powered down. We wasted the time of many people.

How many times do we prioritize our IT projects and services based on "objective" scores while all the numbers are well within one standard deviation of one another? How many IT decisions should get an F because, behind the numbers, there's an unknown degree of uncertainty?
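To make the point concrete, here is a minimal sketch with made-up panel scores for two competing projects. The project names and numbers are hypothetical; the point is the check at the end: if the gap between the mean scores is smaller than the spread in the scores themselves, the "objective" ranking is mostly noise.

```python
import statistics

# Hypothetical panel scores (1-10) for two competing projects.
project_a = [7, 8, 6, 9, 7]
project_b = [8, 7, 9, 6, 8]

mean_a, sd_a = statistics.mean(project_a), statistics.stdev(project_a)
mean_b, sd_b = statistics.mean(project_b), statistics.stdev(project_b)

# If the difference between the means is smaller than the standard
# deviation of either sample, the ranking is within the noise.
gap = abs(mean_a - mean_b)
print(f"Project A: {mean_a:.1f} +/- {sd_a:.1f}")
print(f"Project B: {mean_b:.1f} +/- {sd_b:.1f}")
if gap < max(sd_a, sd_b):
    print("Ranking is within the noise -- don't trust it.")
```

Here the means differ by 0.2 while each sample's standard deviation is above 1.0, so ranking A over B (or B over A) on these scores alone would deserve that F.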


All IT decisions should be based on forecasts. We try to predict the immediate and future impact of our decisions, such as picking a product, choosing a vendor, funding a project or launching a new IT service. Forecasting is nothing but the collection of information and the improvement of the signal/noise ratio until we find the option with the best probable outcome.

Here is where it gets complicated. We have to evaluate tangible and intangible criteria such as risk, capex, opex, TCO, ROI, strategic value, business outcome, competitive advantage, fit in the current environment, institutional memory, customer perception, future IT trends and sustainability.

Since testing the different options -- implementing them and waiting a few years to see the results -- isn't feasible, we use models instead. These models can get rather sophisticated. We can bring in the vendors for short intro projects or install proof-of-concept pilot systems.


We usually make decisions within a group of business and IT experts who come from various areas and share little common ground or knowledge. And when it comes to new information technologies, often the only deep expert in a room is a biased vendor.

So how can we improve the IT decision-making process? Here's a small collection of ideas:

Human Factors

-- Express technology in terms of business outcome and value.

-- Assemble a well-balanced team of business and IT experts.

-- Deploy devil's advocates to shake the group out of groupthink.

-- Don't give short shrift to culture and institutional memory.

-- Be aware of biases such as anchoring, availability heuristics and loss aversion.

-- Use collective intelligence (Delphi Method).

-- Spend time to get on the same prior beliefs (Aumann's Agreement Theorem).

-- Don't decouple authority from accountability. Bad decisions must have consequences.

-- Incorporate sound governance. Optimize to the right level between local (group) and global (enterprise).

-- Understand the long-term value and cost.

-- Don't throw good money after bad. A past bad decision isn't a good reason for the next one.

-- Don't conduct a "supporting analysis" when a decision has already been made.

-- Spend time uncovering the hidden costs and risks (ambiguity effect).

-- Understand that you often don't know what you don't know when it comes to new technologies.

-- Understand that learning is part of the process of making a good decision.

Techniques

-- Use models, pilots and proof of concepts to improve the signal/noise ratio.

-- Remember the quote from Mr. Box: Models are just models. Although they can be useful, they're not equal to reality.

-- When it comes to models, apply Occam's razor: Go for the simplest one.

-- Avoid analysis paralysis. Cap the time spent on the decision.

-- Use decision matrices and sensitivity analysis when appropriate.

-- Remember that Magic Quadrant winners aren't always the best technology choices for your particular situation.

-- While delaying the decision is an option, understand the associated costs and risks.

-- Find out vendor weaknesses from competitors; they have in-depth knowledge.

-- Find your own references. Every vendor has a few great success stories.

-- Use sound requirements-gathering practices.
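The decision-matrix and sensitivity-analysis techniques above can be sketched in a few lines. The vendors, criteria, weights and scores below are all hypothetical; the sensitivity check simply perturbs each weight and asks whether the winner changes. If a small nudge to a weight flips the ranking, the decision rests on the weights, not on the options.

```python
# Hypothetical weighted decision matrix for three vendors, with a
# crude sensitivity analysis: perturb each weight and re-rank.

criteria = {"TCO": 0.4, "risk": 0.3, "strategic fit": 0.3}
scores = {  # 1-10, higher is better
    "Vendor A": {"TCO": 8, "risk": 6, "strategic fit": 7},
    "Vendor B": {"TCO": 6, "risk": 9, "strategic fit": 8},
    "Vendor C": {"TCO": 7, "risk": 7, "strategic fit": 6},
}

def winner(weights):
    """Return the vendor with the highest weighted score."""
    totals = {v: sum(weights[c] * s[c] for c in weights)
              for v, s in scores.items()}
    return max(totals, key=totals.get)

base = winner(criteria)
print("Base-case winner:", base)

# Sensitivity: shift each weight by +/-0.1, renormalize, re-rank.
for c in criteria:
    for delta in (-0.1, 0.1):
        w = dict(criteria)
        w[c] = max(0.0, w[c] + delta)
        total = sum(w.values())
        w = {k: v / total for k, v in w.items()}
        if winner(w) != base:
            print(f"Winner flips when {c} weight shifts by {delta:+.1f}")
```

In this made-up example the winner happens to be stable under every perturbation, which is the outcome you want to see before signing anything; a matrix whose winner flips on a 0.1 weight change is telling you the analysis hasn't reduced the noise enough.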

And when all else fails, dial 1-800 psychic hotline.

Great reads on human factors are Daniel Kahneman's Thinking, Fast and Slow and David McRaney's You Are Not So Smart.

For techniques, I loved Nate Silver's The Signal and the Noise.

Other IT worst practices and "core incompetencies" are discussed on the AntipatternZOO.


About the Author(s)

Imre Kabai

Contributor

Imre Kabai is director and chief architect at Granite, a $2.5B heavy construction company. Previously he worked as the enterprise architect of Stanford Healthcare, and chief architect of the SLAC National Accelerator Laboratory. His interests include enterprise architecture, systems engineering, emerging technologies, cyber security, and data science. Imre enjoys paddling and practicing aerobatics in his vintage airplane.
