Facebook has opened the doors, so to speak, to its new data center
in Lulea, Sweden. The data center marks a first for the social networking giant in two ways: It's the first in Europe and it's the first to be equipped with all Facebook-designed servers. The servers are handling live traffic from around the world.
The data center debut is the first full fruits of the company's Open Compute Project. Facebook started the project nearly two years ago with a goal to build one of the most efficient computing infrastructures at the lowest possible cost. By releasing the Open Compute Project technologies as open hardware, Facebook's goal is to develop servers and data centers following the model traditionally associated with open source software projects.
"All the equipment inside is powered by locally generated hydro-electric energy. Not only is it 100 percent renewable, but the supply is also so reliable that we have been able to reduce the number of backup generators required at the site by more than 70 percent," the company said in a news release. "In addition to harnessing the power of water, we are using the chilly Nordic air to cool the thousands of servers that store your photos, videos, comments, and Likes. Any excess heat that is produced is used to keep our office warm."
Cutting Out the Middle Man
Facebook said its commitment to energy efficiency is also evident inside Lulea's giant data halls. Nearly all the technology in the facility, from the servers to the power distribution systems, is based on Open Compute Project designs. The designs are highly efficient and leave out unnecessary bits of metal and plastic. These designs are then shared with the broader community, so anyone can use or improve them.
According to Facebook, all this adds up to a pretty impressive power usage effectiveness (PUE) number. In early tests, Facebook's Lulea data center is averaging a PUE in the region of 1.07. As with its other data centers, the company said it will soon be adding a real-time PUE monitor so everyone can see how the facility is performing on a minute-by-minute basis.
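PUE is simply total facility power divided by the power delivered to IT equipment, so a value near 1.0 means almost no overhead for cooling and power distribution. A minimal sketch of the calculation (the kilowatt figures below are hypothetical, chosen only to illustrate how a 1.07 ratio arises):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt drawn by the facility reaches
    the servers; anything above 1.0 is cooling, lighting, and
    distribution overhead.
    """
    return total_facility_kw / it_equipment_kw


# Hypothetical example: an IT load of 1,000 kW inside a facility
# drawing 1,070 kW overall gives the roughly 1.07 figure reported.
print(round(pue(1070.0, 1000.0), 2))
```

For comparison, a conventional enterprise data center often runs closer to 1.5 or higher, which is why a figure near 1.07 is considered impressive.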
Charles King, principal analyst at Pund-IT, said Facebook isn't the first company to pursue the open hardware course. There have been rumors that Google and other large hyper-scale Web 2.0 cloud companies are pursuing initiatives to design their own servers and outsource the construction, rather than relying on traditional server vendors.
Eating Their Own Dog Food
"Where the Open Compute Project diverges from that a bit is that Facebook's idea was to create a design for servers that they then released into the wild as an open source project and invited other companies to contribute to and use," King told us. "The idea was if you came up with a single design for a server that both manufacturers and users could get on board with, it would break down a lot of the barriers."
One of those barriers is incompatibility. With the Facebook model, there would be no incompatibilities between components and no risk of getting locked into a single proprietary server architecture. King calls the initiative interesting from two standpoints.
"This is kind of a classic 'we're eating our own dog food' kind of announcement. Facebook is saying: 'We launched this initiative and look, we're using it, too. And we are using it in this brand new state-of-the-art data center. And it's a highly energy efficient data center'," King said. "There's a good deal of self-promotional back-patting going on here, but at the same time it's an effort that's worthy of attention and respect. They are moving forward with this in a way that I think is quite reasonable and sustainable."