Isn't there a danger that managers just go out and find the evidence that supports what they want to do anyway?
There's the full-blown "what's the weight of the evidence" approach. If you're going to do something really important, like a multibillion-dollar merger, then you might want to look at the best evidence about the conditions under which mergers are most and least likely to succeed. There's quite excellent evidence about that.
But that's the extreme case. If you just have a hunch that something is going to work, then maybe the best thing to do is to try a small-scale experiment to see if it will work, which is one of those things that organizations very often will not do, even though it's actually pretty cheap.
We use the example of 7-Eleven in the book. Very early in my career, 7-Eleven spent millions and millions of dollars trying to improve the quality of service in its stores. Yet they didn't collect any evidence about how good or bad the service actually was. The then-CEO, Jodie Thompson, had experienced a rude clerk, and that set off a chain of events that led to the spending of millions of dollars. The highlight was a "Thanks a million" contest, in which the manager of a store in Plano, Texas, won $1 million for the quality of service in her store, in a drawing conducted by Monty Hall. These things were costing serious amounts of money.
But they only started looking at actual evidence, including some stuff I was involved in, after they spent all that money. The guy who was head of field research, Larry Ford, kept saying to them, "Well, why don't you just do a couple of little experiments?" and "Why don't we run the numbers to see if this is actually worthwhile?" And they said, "No, we're going to go ahead and do it because Jodie wants it."
What 7-Eleven was missing was an intermediate approach that might have looked at things that might work under certain conditions and not others: a pilot study, for instance. You need to treat your organization as an unfinished prototype that you can update to accommodate the vast amount of information out there. And if you can treat such projects as temporary, to the extent possible, that reduces the risk as well. But it's amazing how few organizations actually use that approach.
Why is it so hard to get people to test something out before they leap?
There's a whole term for it, called "escalating commitment to a failing course of action." If you look at the conditions under which it happens, a lot of times managers make a public commitment to a course of action and spend a lot of resources. They mobilize a whole base of support around it, and their medium-term financial well-being is dependent on it.
At that point it's very hard to pull the plug and to convince others they should do so. The thing to do is to build in organizational checks and balances so you're allowed to question things and allowed to fight such projects.
I don't believe that management can be turned into a science. In fact, I don't think that if you educated managers to find and analyze data, and judge their best ideas, they would become any better managers. The first and foremost thing about management is that, like medicine, it's a craft, and there's no way that I can figure out how people can learn to practice it, other than by actually doing it.
Thus the analogy to medicine. If you look at the literature on medicine, the primary predictor of how effective your doctor or your hospital will be on any given procedure is how many times they've done it before. So this notion that it's a craft applies in both cases, but evidence-based management, like evidence-based medicine, provides an attitude and a body of knowledge that enables people to practice their craft better.
Is there any way to use information technology that would make companies better at evidence-based management?
There are a number of examples, such as Yahoo!, DaVita [kidney dialysis services], and Harrah's Entertainment, of organizations that rely heavily on their IT organizations to produce the best information so they can make decisions. In most modern organizations, IT plays a key role in getting good information back to users, and getting it back to them in a form they can actually use. I don't know how you could do this in most organizations without IT.
At the same time, one of the things I'm always amazed by in the IT world is the amount of time large software vendors spend talking to people in IT who buy their systems, but no time talking to users. The problem in IT is that no one seems clear on who's the customer.
What is the role of instinct and gut versus evidence? How do you balance the two?
It's interesting because it gets back to the current bestseller, Malcolm Gladwell's Blink, which claims that snap judgments are often better than deep analysis. It's based on a quite sound body of evidence.
But one of the points the book makes clear is that people who are especially good at making snap judgments are people who have a huge amount of clinical and especially evidence-based experience. Doctors, for instance, can make instant decisions because they have both clinical experience and the right evidence.
So there is an argument that if you do too much analysis, what you end up doing is wringing your hands and thinking about what to do under this condition, what to do under that condition. Again, you're better off if you can set up a situation where small failures and small successes can be learned from, as opposed to doing analysis after analysis after analysis and you still can't quite figure out what to do.
A.G. Lafley, chairman and CEO at Procter & Gamble, is a good example. The company has had a culture that has suffered from too much analysis and too much market research. But Lafley has pushed them to put products onto the market earlier just to see what happens, because it's hard to know otherwise.
So ultimately you need the right balance between being able to get the evidence, and not getting caught up in continually getting more and more. You have to be able to act, as well.
That's why the core idea of the book is this notion of wisdom: acting with knowledge while doubting what you know. You can't be so arrogant that you can't learn, nor so insecure that you can't act.