How Facebook Handles Its Really Big Data

CIO Insight Staff

MENLO PARK, Calif. — Facebook is much like the Starship Enterprise: it likes to go where no company has gone before.

This is probably because not many IT companies, especially young ones, have had to serve upwards of 950 million registered users daily, a high percentage of them in real time. Not many have to sell advertising to about 1 million customers while keeping dozens of new products in the works, all at the same time.

Facebook takes a clear do-it-yourself approach to IT. It designs its own servers and networking, designs and builds its own data centers, writes most of its own applications, and creates virtually all of its own middleware. All of this operational IT comes together in one extremely large system used by internal staff and external users alike.

For example, Facebook’s human resources group, the accounting office, Mark Zuckerberg on email, and even you at your laptop checking your status are all using exactly the same gigantic, amorphous data center system that circles the globe in its power and scope.

Everything Facebook Does Involves Big Data

"So just about everything we do turns out to be a big data problem," said Jay Parikh, vice president of Infrastructure Engineering at Facebook, who spoke recently to a small group of journalists at the company headquarters. "This affects every layer of our staff. We’ve talked with some of you about the servers, storage, networking and the data center, as well as all the software, the operations, the visibility, the tools — it all comes together in this one application that we have to provide to all our users."

Big data is simply about gaining insight and using it to make an impact on your business, Parikh said.

"It’s really very simplistic. If you aren’t taking advantage of the data you are collecting and being kept in your business, then you just have a pile of a lot of data," Parikh said. "We are getting more and more interested in doing things with the data we are collecting."

Facebook doesn’t always know in advance what it wants to do with the user lists, Web statistics, geographic information, photos, stories, messages, Web links, videos and everything else it collects, Parikh said. "But we want to collect everything, we want to instrument everything: cameras, when that door opens and closes, the temperature in this room, who walks in and out of the lobby.

"We want to know who visits the site, what activities they do, where they do it on the site. So everything is interesting to us," he said.

Facebook opened its first wholly owned data center in spring 2011 in Prineville, Ore., following a two-and-a-half-year construction period. The facility is custom-built for Facebook’s purposes and uses the company’s Open Compute Project (OCP) architecture. The site holds two huge 330,000-square-foot buildings: one for daily operations and one for cold storage.

If you ask anybody at Facebook how much storage the company is running at any given time, you’ll never get a straight answer, because they honestly do not know.

Let’s just say that Facebook never leaves storage-buying mode.

Facebook launched the OCP on April 7, 2011, a first-of-its-kind effort to open-source the hardware and data center specifications it uses to efficiently power a social network of more than 950 million people.

As part of the project, Facebook has published specs and mechanical designs used to construct the motherboards, power supplies, server chassis, and server and battery cabinets for its data center. That’s unprecedented enough for a company of Facebook’s growing scale, but the social network is also open-sourcing specs for its data center’s electrical and mechanical construction.

To read the original eWeek article, click here: How Facebook Is Handling All That Really Big Data
