Morgan runs technology operations for the 215-year-old Philadelphia Stock Exchange, the nation's first stock exchange. But its age doesn't make its daily load any easier, with networks having to handle 120,000 messages per second and peaks of 200,000 messages per second.
But because it is a financial exchange, any delay, even a half-second, is unacceptable.
"We measure transactions in milliseconds these days. This business can't tolerate delays: We're pricing customer orders," Morgan said. "It's survival for us."
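Morgan's throughput figures imply a tight per-message time budget. A quick back-of-envelope sketch (using only the rates quoted in the article) shows why the exchange measures in milliseconds and below:

```python
# Per-message time budget implied by the article's throughput figures:
# 120,000 messages/second sustained, 200,000 messages/second at peak.
SUSTAINED_RATE = 120_000   # messages per second, typical load
PEAK_RATE = 200_000        # messages per second, peak load

budget_us_sustained = 1_000_000 / SUSTAINED_RATE   # microseconds per message
budget_us_peak = 1_000_000 / PEAK_RATE

print(f"sustained: {budget_us_sustained:.1f} microseconds per message")  # 8.3
print(f"peak:      {budget_us_peak:.1f} microseconds per message")       # 5.0
```

At peak load the system has, on average, only five microseconds to take in each message before the next one arrives.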
In recent years, financial activity has pushed that IT demand much higher. "If you go back five or six years, probably the number [of messages per second] was 10,000 or less," Morgan said. That is roughly one-twelfth of today's sustained volume.
Morgan delivers that real-time speed with homegrown applications running on Sun Microsystems Solaris 10 servers and Stratus fault-tolerant servers, connected by a Nortel network.
The CIO argues strongly for using as much standardized software as possible; the exchange's Web site runs on Windows, and e-mail runs on Microsoft Outlook.
"We use the standard Windows environment for all that, but not for our trading. On the trading side, there simply aren't many packages," he said.
"There are many for broker-dealers, but a select few for exchanges. There aren't that many exchanges and, because of the custom nature of each exchange's business, it's very hard to find an off-the-shelf" package.
Having the network deliver all of those messages per second (Morgan's people stress-test their system at 200,000 messages per second) is only part of the battle.
After the messages are delivered, they have to be stored, catalogued and archived. These days, that's about one-half billion messages every day.
All things considered, Morgan said, the storage is the easy part. "Data storage for us during the day is not the challenge. The challenge for us is retention," he said. "This is more about cost, given our size and the challenges."
At about 490 million messages a day, the government-required seven years of message retention adds up quickly.
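To get a rough sense of how quickly it adds up, the retention figures can be sketched as arithmetic. The daily count and seven-year mandate come from the article; the trading-day calendar and per-message size are assumptions for illustration only:

```python
# Back-of-envelope retention arithmetic from the article's figures:
# ~490 million messages per trading day, kept for seven years.
MESSAGES_PER_DAY = 490_000_000
RETENTION_YEARS = 7
TRADING_DAYS_PER_YEAR = 252          # assumption: typical U.S. market calendar
ASSUMED_BYTES_PER_MESSAGE = 200      # hypothetical average message size

total_messages = MESSAGES_PER_DAY * TRADING_DAYS_PER_YEAR * RETENTION_YEARS
total_bytes = total_messages * ASSUMED_BYTES_PER_MESSAGE

print(f"{total_messages:,} messages retained")                 # ~864 billion
print(f"{total_bytes / 1e12:.0f} TB at the assumed message size")
```

Even at a modest assumed message size, the seven-year archive runs to hundreds of terabytes, well beyond the roughly 10TB the exchange keeps on its headquarters SANs.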
The exchange splits messages into two tiers: recent messages (about three months' worth), which must be readily accessible, and the remainder, which can be held in offsite storage about 25 miles from headquarters.
Much of the data is managed in SANs (Storage Area Networks) with about 10TB of storage at headquarters.
Deciding when data no longer needs to be so readily available is primarily a budget issue: the most accessible data costs the most to maintain. "The longer you wait for it, the cheaper the storage," Morgan said.
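That tradeoff, fast access costs more, slow retrieval costs less, can be sketched as a minimal two-tier cost model. The hot/cold split mirrors the exchange's three-months-onsite policy; the per-TB prices and total size are purely hypothetical placeholders:

```python
# Minimal sketch of the hot/cold tiering tradeoff the article describes:
# recent data stays on fast SAN storage, the rest moves to cheaper offsite
# storage. All prices below are HYPOTHETICAL, for illustration only.
HOT_COST_PER_TB_MONTH = 100.0    # assumed cost of readily accessible SAN
COLD_COST_PER_TB_MONTH = 10.0    # assumed cost of offsite archive

def monthly_cost(total_tb: float, hot_fraction: float) -> float:
    """Monthly storage cost when hot_fraction of the data stays hot."""
    hot_tb = total_tb * hot_fraction
    cold_tb = total_tb - hot_tb
    return hot_tb * HOT_COST_PER_TB_MONTH + cold_tb * COLD_COST_PER_TB_MONTH

# Keeping everything hot vs. keeping only 3 months of a 7-year archive hot:
all_hot = monthly_cost(total_tb=170, hot_fraction=1.0)
tiered = monthly_cost(total_tb=170, hot_fraction=3 / (7 * 12))
print(f"all hot: ${all_hot:,.0f}/month, tiered: ${tiered:,.0f}/month")
```

Under these assumed prices, tiering cuts the monthly bill by nearly an order of magnitude, which is why the age-out point is, as Morgan says, primarily a budget decision.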
The 120,000 messages a second are primarily being managed by some Sun Fire 6800 series servers. Morgan estimates that a typical day handles about 400 million quotes.
"They were the largest servers we could get at the time," he said. "We try and leave as much spare capacity as possible. The key is to constantly be proactive, to be monitoring and measuring these systems. You have to always be watching, measuring."
In the never-ending argument over whether it's better to have a small number of big servers or a large number of medium or small servers, Morgan finds himself in the big-server camp, opting for vertical growth (adding CPU capacity to existing large servers) over horizontal growth (adding more servers).
"It's so much easier to plop a board in and run," Morgan said.