As amazing as they are, these systems aren't hard to create. "The software is not that difficult," says Nigel Rayner, research director at Gartner. "These vendors are bright people. The bigger problem will be that it's difficult for companies to adopt."
Some familiar organizational problems emerge. "You get tactical purchasing of BI solutions in silos," Rayner says. "Sales and marketing will go out and buy a system, for example, but they won't talk to finance. The CEO and CFO should be much more active in addressing this and take ownership. Otherwise, you end up with BI application spaghetti."
There are other people problems. Bruce Kasanoff of Westport, Conn., an adviser to technology companies, has helped firms wrestle with these systems. "The flaw with a lot of BI software is the people using it," he says. "It gives people the capability to do analysis, and they're not necessarily doing it correctly. They're not able to validate the conclusions. Instead of having a statistician or business analyst trained in confidence levels and statistics, it's just a marketing manager sitting at his desk. BI is democratizing some functions that maybe aren't ready to be democratized. Companies have to be careful that they aren't using powerful tools without adult supervision."
When Kasanoff was making the detailed decisions about fields and queries in BI systems, he realized how easy it was to arrive at misleading conclusions. "I could access a sales database and run a report that says these are my best customers. But it didn't look at support calls or returns or payments or credit problems. It was generating a completely false impression. There are times when your highest-volume customer loses you more money than anyone else, because of returns or late payments."
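Kasanoff's point about gross sales crowning the wrong "best customer" can be made concrete with a toy calculation. This is a hypothetical sketch with invented figures, not anything from NewRoads or Kasanoff's clients: it simply ranks customers by raw sales and then again by net value once returns and support costs are subtracted.

```python
# Hypothetical sketch: gross sales alone can crown the wrong "best customer".
# All customer names and figures below are invented for illustration.

customers = {
    "Acme":   {"sales": 100_000, "returns": 35_000, "support_cost": 20_000},
    "Bolt":   {"sales":  60_000, "returns":  2_000, "support_cost":  3_000},
    "Corvid": {"sales":  45_000, "returns":  1_000, "support_cost":  2_000},
}

def net_value(c):
    """Sales minus the costs a sales-only report never sees."""
    return c["sales"] - c["returns"] - c["support_cost"]

best_by_sales = max(customers, key=lambda name: customers[name]["sales"])
best_by_net   = max(customers, key=lambda name: net_value(customers[name]))

print(best_by_sales)  # Acme: highest volume
print(best_by_net)    # Bolt: most profitable once returns and support are counted
```

In this toy data the highest-volume account is actually the second-least profitable one, which is exactly the "completely false impression" Kasanoff describes.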
Or the numbers might tell you that customers in the Northeast are 20 percent more profitable this year, and you might change your strategy based on that. "And yet the fact is that because you don't understand regression analysis, you don't know that maybe it was just that it didn't snow in the Northeast this year," Kasanoff says. "Unless you really know how to isolate the cause, you're not able to draw an accurate conclusion." One remedy, he says, is a test: Offer a subset of your customers a certain product or discount and see if your conclusions hold up.
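The holdout test Kasanoff recommends can be sketched in a few lines: randomly split customers into a treatment group that gets the offer and a control group that does not, then compare response rates. Everything here (customer IDs, response counts) is invented; in practice the responses would come from actual campaign results.

```python
import random

# Hedged sketch of Kasanoff's suggested test: offer a product or discount to a
# random subset of customers and compare response rates before changing strategy.
# Customer IDs and response data are invented for illustration.

random.seed(42)
customers = [f"cust-{i}" for i in range(1000)]
random.shuffle(customers)
treatment, control = customers[:500], customers[500:]  # random 50/50 split

# In a real test these sets would be populated from campaign results;
# here we hard-code a 16% vs. 10% response to illustrate the comparison.
responded = set(treatment[:80]) | set(control[:50])

def response_rate(group):
    return sum(1 for c in group if c in responded) / len(group)

lift = response_rate(treatment) - response_rate(control)
print(f"lift: {lift:.1%}")  # 6.0%
```

A real analysis would also check that the observed lift is larger than chance could explain (the "confidence levels" Kasanoff mentions), but the random split is what isolates the cause.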
The key to using data is to understand your business first, then look at the numbers, says Peter Fader, associate professor of marketing at The Wharton School at the University of Pennsylvania. "Instead of crunching numbers, you need to sit down and say, 'What is the story? How do our customers act, how does the customer story change over time?' Tell those stories in a qualitative way, then bring in the data to check them."
Fader blames the dot-com crash partly on new companies trying to use data to understand customers for the first time, and misreading it. Experienced companies like Lands' End Inc. knew how customers behaved and weren't led astray by the flood of Web data, Fader says. "I spend a lot of time looking at data patterns, and I get concerned about a lot of data mining initiatives. People are throwing tons of resources at these databases hoping to discover some stunning insights, but it can do as much harm as good. They are basically taking away any judgment or critical thinking and instead saying, 'Ah, we'll have the computer do it for us.'"
The system Lesica is deploying at NewRoads is from MicroStrategy Inc. in McLean, Va. "We ask end-users what they want to learn with this software," says Brian Brinkmann, a senior product manager at MicroStrategy. "Then we sit down with a fairly knowledgeable analyst who understands the business model and understands the data, and we build the metrics: sales, or gross profits, whatever." Based on what they have learned over the years from other clients, MicroStrategy's experts have created hundreds of "out-of-the-box" CRM reports, which Lesica can select from or customize for his individual customers.
"How do we stop an end-user from misinterpreting the data?" Brinkmann asks. "We train them so they will understand what it is they're interpreting, how to navigate, drill down, what supporting reports are available to interpret what they're seeing. We go through a scenario with them: sales are down, for instance." Typically, he says, companies will have "power users" in several departments who create the reports that others then use. So what Foster calls "data black belts" won't be out of work (in fact, Gartner sees a looming shortage of them), but their handiwork will be much more widely usable.
A similar conversation takes place with new customers at SAS Institute Inc., an analytics vendor in Cary, N.C. SAS first asks the company what one metric it wants to target daily. Then it applies data mining tools to all of the company's data to discover what variables drive the key target. The Vermont Country Store chose daily revenue as its Web site's target, the same target it tracks in its catalog and retail store channels. SAS crunched the numbers and determined which variables, and in what order of importance, made a difference in daily revenue. At the top of the list was requests for catalogs. Knowing this, the company changed the Web site to make it easier to request a catalog, and since the dashboard shows how requests are going every day, the company can test the impact of the change.
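The article doesn't say which techniques SAS used, but the basic idea of ranking candidate drivers of a daily target can be sketched with simple correlation scoring. The variable names and daily figures below are invented; real data mining would draw on far richer models and much more data.

```python
# Minimal sketch of ranking candidate drivers of daily revenue by the strength
# of their correlation with it. Column names and figures are invented; this is
# an illustration of the idea, not SAS's actual method.
import statistics

days = [
    # catalog_requests, site_visits, email_sends, revenue
    (120, 5000, 2000, 18_000),
    (150, 5200, 1000, 21_500),
    ( 90, 4800, 3000, 14_000),
    (200, 5100, 1500, 27_000),
    (110, 4900, 2500, 16_500),
]

def corr(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

revenue = [d[3] for d in days]
names = ["catalog_requests", "site_visits", "email_sends"]
ranked = sorted(
    names,
    key=lambda n: abs(corr([d[names.index(n)] for d in days], revenue)),
    reverse=True,
)
print(ranked[0])  # catalog_requests tops the list in this toy data
```

As Grant's caution about needing to "explain the result" implies, a ranking like this only surfaces candidates; it cannot by itself distinguish a driver from a coincidence.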
The notion that the key metric was catalog requests was not accepted until it was discussed with the company. "It has to make sense," says Tom Grant, director of e-business strategy at SAS. "There could always be a casual relationship between two variables. So you have to be able to explain the result. In this case, Vermont Country Store said it makes sense: people coming to the Web site for the first time want to see more items, so they request a catalog, and if they are that interested, they are also likely to buy."
The variables are recomputed nightly (although the key ones don't change often), and SAS meets with every client for two hours each week to discuss what they've learned. That saved another client, which watched catalog response drop off and was tempted to respond by reducing catalog circulation. Analyzing Web log data found the answer: Customers were looking at the catalogs all right, but instead of buying by phone, they were going to the Web site to order.
So analytics isn't people-free business in a box? "No," says John Brocklebank, vice president of e-business development at SAS and one of 500 or so employees with advanced degrees running around the SAS campus. "We don't give them a black box. We give them a PhD in a box."