By Ellen Pearlman  |  Posted 02-06-2006

Robert I. Sutton: Making a Case for Evidence-Based Management

Robert I. Sutton, professor of management science and engineering at the Stanford Engineering School, has tried several times to complete a book about wisdom and management. After two stalled attempts to write a book with the prospective title The Attitude of Wisdom, he and Jeffrey Pfeffer, coauthor with Sutton of The Knowing-Doing Gap: How Smart Companies Turn Knowledge into Action (Harvard Business School Press, 2000), set out to write the forthcoming Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-Based Management (HBS Press, March 2006). When they finished it, says Sutton, "Jeff turned to me and said, 'We finally wrote the book on wisdom when we wrote this book.' "

Sutton, who is also codirector of the Center for Work, Technology, and Organization at Stanford, notes that, like the vast majority of ideas, the notion of evidence-based management is not new.

"You know, in 1911, Frederick Winslow Taylor had a bestseller called Scientific Management that examined the same ideas. Still, I think we have a somewhat different point of view on the topic." For Sutton, evidence-based management isn't just about sifting through mountains of management research before making decisions. It's about testing ideas, and learning from your mistakes. It's really an attitude—in fact, you could call it the Attitude of Wisdom.

Sutton, a member of CIO Insight's advisory board, spoke at length with Editor-in-Chief Ellen Pearlman about what makes a manager wise, and about how to further that attitude in your own organization. An edited version of their conversation follows.

CIO INSIGHT: How did you and Jeffrey Pfeffer come to write this book?

Sutton: The basic idea of our last book, The Knowing-Doing Gap, was that there are many situations where people in organizations know what the best practices are, or have a very strong belief about what it'll take to make their organization as effective as possible. But there's something about the organization that makes it difficult, or impossible, to turn that knowledge into action.

As we went around and worked with companies on the issue, we kept running into managers who would tell us about these great practices they were implementing. Yet we knew that, to the best of our knowledge, they were at least half wrong. We started looking at the notion that there's a lot of bad advice in the business-knowledge market that makes it harder for people to tell what's good and what's bad. So we felt we had to write the opposite book. I guess the sound bite is that the last book was on the knowing-doing gap, and this one is on the doing-knowing gap.

Can you define evidence-based management?

To me it's a simple willingness to find the best evidence you can, and then act on it. That may sound incredibly simplistic, yet the market for business knowledge works in such a way that it is incredibly hard to do that. Here's a political analogy: The Economist magazine, which is not exactly a bastion of liberal thinking, recently noted that during the run-up to the war in Iraq, if somebody had evidence for weapons of mass destruction, they had to jump over a matchbox, but if they had evidence against WMDs, they had to jump over a mountain.

All of us as human beings have trouble getting past our biases and ideologies. Moreover, it's one thing to seek out and find good evidence, but you must also face it and act on it. It's not an amazingly complicated idea, but it's amazing how rarely it's done and how hard it is to actually do it in everyday life.

Why is it so hard to find good evidence?

It's hard to tell what's right and what's wrong, and anybody can be a management expert. It's a signal-to-noise ratio kind of problem: There's just too much stuff out there. And what sells best is by no means the best way to actually practice management.

People are attracted to brand-new ideas and things. The way most knowledge is developed is that people build on one idea, or on nothing at all. A consequence is that the same new things get discovered every six or seven years, and just relabeled. Think of business process reengineering, which is built on a whole lot of earlier ideas.

One of my colleagues in the School of Education here describes our country as the United States of Amnesia. We keep pretending the same old ideas are brand-new, and we never learn from what's already happened. Consider the idea of incentive-based pay for teachers. Once again, it's the rage throughout the U.S. Yet this idea actually gets debunked about every eight years, going back almost a hundred years.

Yet there is good evidence for business decisions, right?

Yes, but it's just amazing to me how many companies refuse to accept even the strongest evidence. The two strongest pieces of evidence in business that are routinely ignored, as far as I can tell, are the evidence on the dangers of ERP implementations and the evidence that most mergers fail. Everybody who considers a merger either pretends they don't believe the evidence, or they say it can't happen to them.

Are Mergers a Good Idea?

The evidence on mergers is not good, but you're not saying don't do them. Are you just saying learn from where they can go wrong?

There are two parts to the issue. First, what are the conditions under which mergers succeed or don't succeed? The worst condition is when you have two companies of about equal size, and they're far away from each other. That's just about as bad as it gets—when they start talking about mergers of equals.

The other part has to do with the cultural integration of the two companies. Look at Cisco Systems. It's a company that actually does a pretty good job of acquiring companies. There are a number of reasons, but one thing is that they have an incredibly detailed merger integration process. And whenever they do a merger, they conduct a postmortem, and try to figure out how they should change their merger integration process as a result. That's a very different process from what you usually see. Cisco looks at the evidence about why mergers tend to succeed or fail, and then constantly updates its own merger process.

How should corporate managers—who have all kinds of pressures on them, and no time—go about finding real evidence?

That's a great question. The snide answer is that it's amazing to me how many managers don't have time to look for the evidence but do have time to make the same mistake over and over and over again.

First of all, look at the second table in Chapter 2 of the book (see table, page 64). You could use that as a guide to what to ignore. It's an old theme of mine: One of the most important things successful managers do is not to figure out what they should pay attention to, but what they should ignore.

Second, every manager needs to take the time to examine their management and business assumptions. Go out and cut through the layers and take a look at what's going on. Executives at General Motors—not just the top 20 people at the company, but further down in the organization than you might think—are provided with a car that's gassed up, washed and kept repaired. They don't have to deal with the horrible experience of buying a car from a dealer. Rather, they're isolated from the experience of the average American consumer buying a General Motors car. Not exactly a recipe for self-examination. Yet this sort of behavior is standard at lots of companies.

There are times when it's reasonable to dig up the best evidence before making certain decisions, especially expensive ones. But what we're really trying to say is that you should act on what you know now, and when you face problems and setbacks, you should take the time to update and learn from what has happened, both good and bad. One of my former students, Andy Hargadon, says that some companies have 50 years of experience, but it's really the same year of experience repeated 50 times. Other companies have 50 consecutive years of experience. To me that's the important distinction.

So it's just common sense that you shouldn't act unless you have done some homework. Yet so many people go by what they believe or what their company believes, and don't look for these truths. Why is that?

My perspective on business is that people in the most effective organizations usually have common sense and act on it. In fact, I believe that many of the most effective organizations are masters of the obvious. Dell, Toyota—it's not exactly like they have a business model their competitors don't understand. They just do it right. But common sense is uncommon.

There are certain things that organizations that are better at evidence-based management do, and one of them is create an environment where it's safe to speak up. It's very difficult to do. One thing I like about [Intel's] Andy Grove—and he's been consistent about this throughout his career—is that he wants to hear the bad news as well as the good, and he wants to have people argue with him.

Separating Evidence From Wishful Thinking

Isn't there a danger that managers just go out and find the evidence that supports what they want to do anyway?

There's the full-blown "what's the weight of the evidence" approach. If you're going to do something really important, like a multibillion-dollar merger, then you might want to look at the best evidence about the conditions under which mergers are most and least likely to succeed. There's quite excellent evidence about that.

But that's the extreme case. If you just have a hunch that something is going to work, then maybe the best thing to do is to try a small-scale experiment to see if it will work—which is one of those things that organizations very often will not do, even though it's actually pretty cheap.

We use the example of 7-Eleven in the book. Very early in my career, 7-Eleven spent millions and millions of dollars trying to improve the quality of service in its stores. Yet they didn't collect any evidence about how good or bad the service actually was. The then-CEO, Jodie Thompson, had experienced a rude clerk, and it set off a chain of events that led to the spending of millions of dollars. The highlight was a "Thanks a million" contest, in which the manager of a store in Plano, Texas, won $1 million for the quality of service in her store, in a drawing conducted by Monty Hall. These things were costing serious amounts of money.

But they only started looking at actual evidence, including some stuff I was involved in, after they spent all that money. The guy who was head of field research, Larry Ford, kept saying to them, "Well, why don't you just do a couple of little experiments?" and "Why don't we run the numbers to see if this is actually worthwhile?" And they said, "No, we're going to go ahead and do it because Jodie wants it."

What 7-Eleven was missing was an intermediate approach, such as a pilot study, that might have looked at what would work under certain conditions and not others. You need to treat your organization as an unfinished prototype that you keep updating as new information comes in. And if you can treat such projects as temporary, to the extent possible, that reduces the risk as well. But it's amazing how few organizations actually use that approach.

Why is it so hard to get people to test something out before they leap?

There's a whole term for it, called "escalating commitment to a failing course of action." If you look at the conditions under which it happens, a lot of times managers make a public commitment to a course of action and spend a lot of resources. They mobilize a whole base of support around it, and their medium-term financial well-being is dependent on it.

At that point it's very hard to pull the plug and to convince others they should do so. The thing to do is to build in organizational checks and balances so you're allowed to question things and allowed to fight such projects.

I don't believe that management can be turned into a science. And I don't think that simply educating managers to find and analyze data, and to judge their best ideas, would by itself make them better managers. The first and foremost thing about management is that, like medicine, it's a craft, and there's no way I can see for people to learn to practice it other than by actually doing it.

Thus the analogy to medicine. If you look at the literature on medicine, the primary predictor of how effective your doctor or your hospital will be on any given procedure is how many times they've done it before. So this notion that it's a craft applies in both cases, but evidence-based management, like evidence-based medicine, provides an attitude and a body of knowledge that enables people to practice their craft better.

Is there any way to use information technology that would make companies better at evidence-based management?

There are a number of examples—Yahoo!, DaVita [kidney dialysis services], Harrah's Entertainment—of organizations that rely heavily on their IT organizations to produce the best information so they can make decisions. In most modern organizations, IT plays a key role in getting good information back to users, and getting it back to them in a form they can actually use. I don't know how you could do this in most organizations without IT.

At the same time, one of the things I'm always amazed by in the IT world is the amount of time large software vendors spend talking to people in IT who buy their systems, but no time talking to users. The problem in IT is that no one seems clear on who's the customer.

What is the role of instinct and gut versus evidence? How do you balance the two?

It's interesting because it gets back to the current bestseller, Malcolm Gladwell's Blink, which claims that snap judgments are often better than deep analysis. It's based on a quite sound body of evidence.

But one of the points the book makes clear is that people who are especially good at making snap judgments are people who have a huge amount of clinical and, especially, evidence-based experience. Doctors are a good example: They can make instant decisions because they have both clinical experience and the right evidence.

So there is an argument that if you do too much analysis, what you end up doing is wringing your hands and thinking about what to do under this condition, what to do under that condition. Again, you're better off if you can set up a situation where small failures and small successes can be learned from, as opposed to doing analysis after analysis after analysis and you still can't quite figure out what to do.

A.G. Lafley, chairman and CEO at Procter & Gamble, is a good example. The company has had a culture that has suffered from too much analysis and too much market research. But Lafley has pushed them to put products onto the market earlier just to see what happens, because it's hard to know otherwise.

So ultimately you need the right balance between being able to get the evidence, and not getting caught up in continually getting more and more. You have to be able to act, as well.

That's why the core idea of the book is this notion of wisdom—acting with knowledge while doubting what you know. You can't be so arrogant that you can't learn, nor so insecure that you can't act.