Facebook already built its own data center and its own servers. And now the social-networking giant is building its own storage hardware — hardware for housing all the digital stuff uploaded by its more than 845 million users. “We store a few photos here and there,” says Frank Frankovsky, the ex-Dell man who oversees hardware design at Facebook. That would be an understatement. According to some estimates, the company stores over 140 billion digital photographs — and counting.

Like Google and Amazon, Facebook runs an online operation that’s well beyond the scope of the average business, and that translates to unprecedented hardware costs — and hardware complications. If you’re housing 140 billion digital photos, you need a new breed of hardware. As with its data center and server creations, Facebook intends to “open source” its storage designs, sharing them with anyone who wants them. The effort is part of the company’s Open Compute Project, which seeks to further reduce the cost and power consumption of data center hardware by facilitating collaboration across the industry.
Facebook is so big that it has its own flag:
Walking in, yes, we are in the right place:
Just past the Facebook sign is a monitor in the lobby that shows you the state of the datacenter and how well the cooling systems are working:
Inside the security door are these quilts, made by the local community as their interpretation of what a social network looks like:
As we walked in, Thomas Furlong, director of site operations at Facebook, brought us into a huge series of rooms that “process” the air. The first room filters the air. The second room filters it further.
Here’s Thomas showing us one of the huge walls of filters (these filters are similar to the ones in my home heating system, except here Facebook has a wall of them).
Here’s a better shot of just how massive this filtering room is:
Then the air goes into a third room, where it’s mixed to control humidity and temperature (if it’s cold outside, as it was today, they bring some heat up from inside the datacenter and mix it in here). On the other side there’s a huge array of fans, each with a five-horsepower motor (today the fans were running at about a third of full speed, which makes them more efficient).
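Why slower fans are more efficient comes down to the fan affinity laws: airflow scales roughly linearly with fan speed, but the power a fan draws scales roughly with the cube of its speed. Here’s a back-of-the-envelope sketch in Python; the five-horsepower rating comes from above, but treating it as the full-speed draw and assuming ideal cube-law behavior is my own simplification:

```python
# Back-of-the-envelope estimate of fan power at partial speed, assuming the
# ideal fan affinity laws: airflow ~ speed, power ~ speed^3.
# The 5 hp rating is from the article; the rest is an illustrative assumption.

HP_TO_WATTS = 745.7        # 1 mechanical horsepower in watts
RATED_POWER_HP = 5         # per-fan motor rating mentioned above
SPEED_FRACTION = 1 / 3     # fans were running at about a third of full speed

rated_power_w = RATED_POWER_HP * HP_TO_WATTS
estimated_power_w = rated_power_w * SPEED_FRACTION ** 3  # cube law

print(f"Rated power per fan:    {rated_power_w:,.0f} W")
print(f"Estimated power at 1/3: {estimated_power_w:,.0f} W "
      f"({SPEED_FRACTION ** 3:.1%} of rated)")
# => roughly 140 W per fan, about 4% of the full-speed draw,
#    while still moving about a third of the full-speed airflow.
```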
Here you can see the back sides of one of the huge banks of filters:
Here Thomas stands in front of the fans:
Here’s a closeup look at one of the fans that forces air through the datacenter and through the filtering/processing rooms:
Finally, the air moves through one last step before going downstairs into the datacenter: small jets spray a fine mist of water into the air. As the water evaporates, which it does very rapidly, it cools the air. (There’s a rough sketch of the math just after the next couple of photos.) One room I didn’t take photos in was filled with pumps and reverse osmosis filters, which make the water super pure so it works better for cooling the air this way. A final set of filters makes sure no water gets into the datacenter itself. Here’s a closer look at the array of water jets:
Here you can see the scale of the room that sprays that water:
Here’s a closeup of one of the jets of cooling water:
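Evaporative cooling like this works because evaporating water soaks up a large amount of latent heat from the air passing over it, pulling the air temperature down toward its wet-bulb temperature. Here’s a rough energy-balance sketch in Python; how much mist actually evaporates per kilogram of air is my own illustrative assumption, not a figure from Facebook:

```python
# Rough energy balance for evaporative cooling:
# heat given up by the air = latent heat absorbed by the evaporating water.
# The amounts of water below are illustrative assumptions.

LATENT_HEAT_KJ_PER_KG = 2450   # latent heat of vaporization of water near 25 C
AIR_SPECIFIC_HEAT = 1.006      # specific heat of air, kJ/(kg*K)

def temp_drop_c(grams_water_per_kg_air: float) -> float:
    """Temperature drop of 1 kg of air after evaporating the given mass of water."""
    heat_absorbed_kj = (grams_water_per_kg_air / 1000) * LATENT_HEAT_KJ_PER_KG
    return heat_absorbed_kj / AIR_SPECIFIC_HEAT

for grams in (1, 2, 4):
    print(f"Evaporating {grams} g of water per kg of air cools it by "
          f"~{temp_drop_c(grams):.1f} C")
# => roughly 2.4 C of cooling per gram of water evaporated into each kilogram
#    of air, which is why a fine mist is enough to temper warm outside air.
```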
Then we got to follow the air down into the datacenter itself, where a huge floor held dozens of rows, each packed rack after rack with servers.
Here Thomas stands in front of just one of those racks:
This 180-degree view gives you a look down the main corridor. The side you can see is only half the datacenter; these are the newer “Open Compute” servers. The other half, which they asked us not to photograph, held their older server technology.
What does this all mean? Well, for one, it brings jobs to Prineville, which is a small town with about 10,000 residents in a very rural county (we drove about half an hour through mostly farmland just to get to Prineville). But listen to Prineville’s mayor to hear what it means for her community.