Case Study: Nasdaq

Here’s a story that got lost as the dot-com bubble expanded. In 1998, as the Nasdaq Composite Index smashed through 2000—on its way to over 5000 just two years later—executives at The Nasdaq Stock Market Inc. were worried about illogically high equity valuations. Stock prices were going up faster than earnings, and outlandish price-to-earnings ratios—when there were any earnings to consider—weren’t dissuading investors from buying more and more shares. “Supply was racing to keep up with demand, and the traditional equilibria curves in which high P/Es curbed investor enthusiasm were falling by the wayside,” says Alfred Berkeley, who was president of Nasdaq at the time.

Berkeley was particularly concerned that the bubble then just beginning to expand would suddenly burst, taking with it both Nasdaq’s profits and its reputation; he couldn’t have known, of course, that the real run-up hadn’t even begun yet. And he was eager to address this problem before it got worse. Drawing on more than 20 years of experience as an investment banker focused on technology companies and startups before becoming president of Nasdaq in the mid-1990s, Berkeley was convinced that the deviation in market performance he had identified went well beyond anything traditional statistical modeling could handle. This was new territory—complex behavior that defied market expectations and broke routine economic rules—and he thought standard number-crunching would be too rigid to explain the anomalies.

So Berkeley contacted Michael Brown, Nasdaq’s chairman at the time (and a former chief financial officer at Microsoft Corp.), and asked him what Gates & Co. would do if faced with such a deeply perplexing business irregularity. Brown responded that Microsoft would hire the smartest people anywhere in the world to analyze it—and he told Berkeley that for this problem, he should contact BiosGroup Inc., a business strategy consultancy in Santa Fe, N.M.

Out of this conversation emerged an intriguing collaboration between Nasdaq and BiosGroup, in which a new, virtually untested programming technique called agent-based modeling was used to create a computer-based replica of the exchange that closely mimicked the behavior of Nasdaq and its participants. By running this dead-ringer simulation of Nasdaq over a period of nearly three years, BiosGroup scientists succeeded in forecasting the exchange’s performance under a range of market conditions. For instance, they mapped the influence of day traders on stock prices under varied circumstances, which provided clues about what was behind the big run-up in technology stocks. And they predicted what would happen to stock prices and liquidity when equity tick sizes were lowered to a penny from the traditional sixteenth of a dollar, or 6.25 cents, as the Securities and Exchange Commission was then planning to mandate. In the end, the approach didn’t provide Berkeley with the specific information he had originally wanted—that is, a formula for keeping the dot-com bubble from expanding.

The crash of 2000 took care of that. But in Berkeley’s view, it was the unexpected and unintended results of agent-based modeling that were most valuable: The program offered never-before-obtained insights into how markets like Nasdaq operate and how they respond to regulatory changes. “It was extremely useful to me because it had predictive power,” says Berkeley, now Nasdaq’s vice chairman. “It clarified for me which management issues are important to watch and that I must be prepared to address.”
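To give a flavor of the kind of experiment described above, the sketch below runs a deliberately toy agent-based simulation of a single stock's quote under two tick sizes, a sixteenth of a dollar versus a penny. Everything in it is a hypothetical illustration—the "zero-intelligence" traders, the $20 fundamental value, the reset rule—and is not BiosGroup's actual model, which was far more elaborate.

```python
import random

def round_to_tick(price, tick):
    """Snap a price to the nearest valid tick increment."""
    return round(round(price / tick) * tick, 10)

def simulate_avg_spread(tick, n_traders=2000, seed=42):
    """Simple zero-intelligence agents quote around a $20 fundamental
    value; returns the average bid-ask spread over the run."""
    rng = random.Random(seed)
    fundamental = 20.0
    best_bid, best_ask = fundamental - 0.5, fundamental + 0.5
    spreads = []
    for _ in range(n_traders):
        offset = rng.gauss(0, 0.10)  # each agent's noisy view of value
        if rng.random() < 0.5:
            # A buyer tries to improve the best bid.
            bid = round_to_tick(fundamental + offset - tick, tick)
            if best_bid < bid < best_ask:
                best_bid = bid
        else:
            # A seller tries to improve the best ask.
            ask = round_to_tick(fundamental + offset + tick, tick)
            if best_bid < ask < best_ask:
                best_ask = ask
        spreads.append(best_ask - best_bid)
        # Occasionally a marketable order clears the inside quote,
        # resetting the book around the fundamental value.
        if rng.random() < 0.10:
            best_bid, best_ask = fundamental - 0.5, fundamental + 0.5
    return sum(spreads) / len(spreads)

sixteenth = simulate_avg_spread(tick=0.0625)  # 1/16 of a dollar
penny = simulate_avg_spread(tick=0.01)
print(f"avg spread at 1/16 tick: ${sixteenth:.4f}")
print(f"avg spread at penny tick: ${penny:.4f}")
```

Even this crude sketch shows the point of the method: rather than solving equations about the market as a whole, you let many simple agents interact and observe what emerges—here, that a finer tick lets quotes converge more tightly, narrowing the average spread.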
