McLaren Has a Need for HANA Speed

By Michael Vizard

The difference between winning and losing a Formula One race is a matter of seconds. To help gain a competitive edge, McLaren now feeds gigabytes of data from sensors attached to its race cars into an SAP HANA in-memory computing system, allowing the team to analyze that data in real time during an actual race for the first time.

According to McLaren CIO Stuart Birrell, managing all that telemetry data was a major IT challenge. However, McLaren engineers can now ask any questions they want during a race and, unlike before, receive an answer.

“Engineers ask unusual questions. You have no idea what questions these guys are going to ask,” says Birrell. “They ask questions that we in IT could never predict.”

To provide the analytics environment that will allow its engineers to work in real time, McLaren has opted to deploy the SAP in-memory computing platform on a cloud computing platform managed by SAP. That approach effectively takes McLaren out of the business of building and setting up systems to run analytics applications, which makes it easier to respond to new requests for applications.

During a race, a McLaren Formula One car generates about 1GB of raw data. By deploying HANA, McLaren engineers can now query that data and get a response in a tenth of a second. This capability is not only changing the way McLaren builds and tests the cars it designs each year for Formula One races; it also allows McLaren to make adjustments in real time as conditions change on any race course.
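For illustration, here is a minimal sketch of what such a query might look like from an application built on SAP's hdbcli Python client. The connection details, table and column names are hypothetical, not McLaren's actual schema.

```python
# Minimal sketch: querying in-memory telemetry in HANA via SAP's hdbcli
# Python driver. Host, credentials, table and column names are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # hypothetical host
    port=30015,
    user="RACE_ENGINEER",
    password="********",
)

cursor = conn.cursor()
# Average tire temperature per lap for one car -- the kind of ad hoc
# question an engineer might pose mid-race.
cursor.execute(
    """
    SELECT lap_number, AVG(tire_temp_c) AS avg_tire_temp
    FROM telemetry
    WHERE car_id = ? AND session_id = ?
    GROUP BY lap_number
    ORDER BY lap_number
    """,
    (22, "RACE_SESSION_01"),
)
for lap, avg_temp in cursor.fetchall():
    print(lap, avg_temp)

cursor.close()
conn.close()
```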

“It used to take us two days just to model the data,” says Birrell.

Models and simulations can be used to build what-if scenarios that allow engineers to determine what might have happened if different decisions had been made.

HANA also allows McLaren to identify car components that are suffering fatigue so they can be replaced in a timely manner.

Based on a columnar database optimized for DRAM operating on Intel Xeon-class processors and solid-state storage drives, HANA allows analytic and transaction workloads to run simultaneously in real time. As such, HANA essentially eliminates the need for batch processing of analytic application workloads.
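As a rough illustration of why a columnar, in-memory layout favors analytics, the toy Python sketch below (not HANA code) contrasts row-oriented and column-oriented storage for a simple aggregate; the sample records are made up.

```python
# Toy illustration (not HANA itself): the same records stored row-wise and
# column-wise. An aggregate over one attribute only has to touch one
# contiguous array in the columnar layout, which is why in-memory columnar
# stores favor analytic scans.

# Row-oriented: each record is kept together.
rows = [
    {"lap": 1, "speed_kph": 301.2, "fuel_kg": 98.0},
    {"lap": 2, "speed_kph": 305.7, "fuel_kg": 96.1},
    {"lap": 3, "speed_kph": 298.4, "fuel_kg": 94.3},
]
avg_speed_rowwise = sum(r["speed_kph"] for r in rows) / len(rows)

# Column-oriented: each attribute is a contiguous array.
columns = {
    "lap": [1, 2, 3],
    "speed_kph": [301.2, 305.7, 298.4],
    "fuel_kg": [98.0, 96.1, 94.3],
}
avg_speed_columnar = sum(columns["speed_kph"]) / len(columns["speed_kph"])

assert avg_speed_rowwise == avg_speed_columnar
```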

By taking advantage of more efficient approaches to SQL and stored procedures, and by eliminating reliance on disk-based storage, HANA can shrink the total cost of managing the data center environment by up to 30 percent, largely because of the data compression algorithms used in the SAP HANA platform.
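Columnar stores typically get much of that compression from dictionary encoding, in which repeated values are replaced by small integer codes. Here is a simplified Python sketch of the idea; the sample column and values are made up for illustration.

```python
# Simplified sketch of dictionary encoding, the kind of compression a
# columnar store applies to repeated values. Sample data is illustrative.
def dictionary_encode(values):
    """Replace each value with an integer code into a shared dictionary."""
    dictionary = []  # code -> value
    codes = []       # one small integer per row
    index = {}       # value -> code
    for v in values:
        if v not in index:
            index[v] = len(dictionary)
            dictionary.append(v)
        codes.append(index[v])
    return dictionary, codes

def dictionary_decode(dictionary, codes):
    return [dictionary[c] for c in codes]

tire_compound = ["soft", "soft", "medium", "soft", "medium", "hard", "soft"]
dictionary, codes = dictionary_encode(tire_compound)

assert dictionary_decode(dictionary, codes) == tire_compound
# Only three distinct strings are stored; each row needs just a small code.
print(dictionary)  # ['soft', 'medium', 'hard']
print(codes)       # [0, 0, 1, 0, 1, 2, 0]
```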

The SAP HANA architecture provides support for different types of engines that can handle, for example, text analytics, spatial or geo-spatial data, and even legacy relational database applications. The system then makes use of data virtualization software, a columnar database construct and a Delta data store to fetch different data types being processed by each individual engine.
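The delta store mentioned above reflects a common columnar-database pattern: new writes land in a small write-optimized delta structure and are periodically merged into the read-optimized main store, while queries scan both. The following schematic Python sketch shows the general pattern, not SAP's implementation.

```python
# Schematic sketch of the delta-store pattern used by columnar engines:
# inserts go to a small write-optimized delta, reads scan main + delta,
# and a merge step periodically folds the delta into the main store.
class ColumnStore:
    def __init__(self, column_names):
        self.main = {name: [] for name in column_names}   # read-optimized
        self.delta = {name: [] for name in column_names}  # write-optimized

    def insert(self, record):
        for name, value in record.items():
            self.delta[name].append(value)

    def scan(self, column):
        # Queries see the main store and the delta together.
        return self.main[column] + self.delta[column]

    def merge_delta(self):
        # Periodically fold accumulated writes into the main store.
        for name, values in self.delta.items():
            self.main[name].extend(values)
            self.delta[name] = []

store = ColumnStore(["lap", "speed_kph"])
store.insert({"lap": 1, "speed_kph": 301.2})
store.insert({"lap": 2, "speed_kph": 305.7})
speeds = store.scan("speed_kph")
print(sum(speeds) / len(speeds))
store.merge_delta()
```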

With HANA available on premise and in the cloud, SAP envisions an IT world where all “hot data” now runs in-memory on an x86 server, while warm or cold data is stored on traditional disk drives. At the moment, SAP has about 1,500 HANA customers, the vast majority of which are still in pilot.
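That hot/warm/cold split amounts to an access-recency tiering policy. Below is a minimal Python sketch of the idea; the thresholds are chosen purely for illustration and are not SAP's actual policy.

```python
# Minimal sketch of temperature-based data tiering: recently touched data
# stays in memory ("hot"), older data falls to disk tiers. Thresholds are
# illustrative only.
import time

HOT_WINDOW_S = 60 * 60          # touched within the last hour
WARM_WINDOW_S = 30 * 24 * 3600  # touched within the last 30 days

def tier_for(last_access_ts, now=None):
    now = now if now is not None else time.time()
    age = now - last_access_ts
    if age <= HOT_WINDOW_S:
        return "hot (in-memory)"
    if age <= WARM_WINDOW_S:
        return "warm (disk, cached)"
    return "cold (disk / archive)"

now = time.time()
print(tier_for(now - 120, now))             # hot (in-memory)
print(tier_for(now - 5 * 24 * 3600, now))   # warm (disk, cached)
print(tier_for(now - 90 * 24 * 3600, now))  # cold (disk / archive)
```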

By essentially eliminating or sharply reducing the need for batch processing, SAP is making a case for simplifying the management of the data center by relying more on HANA for almost every type of application workload.

How long it will take for IT organizations to make the transition to in-memory computing platforms is still anybody’s guess. But given the better performance provided by these systems, the pressure to abandon slower applications running on legacy systems will be high.

“We’ve all seen how new innovations lead to extinction of products and even entire companies,” says SAP co-CEO Jim Hagemann Snabe. “HANA is the biggest innovation to come along in IT in the last 20 years.”
