post

Facebook open sources building an energy-efficient data center

Facebook's new custom-built Prineville Data Centre

Back in 2006 I was the co-founder of a data centre in Cork called Cork Internet eXchange. We decided, when building it out, that we would design and build it as a hyper energy-efficient data centre. At the time, I was also heavily involved in social media, so I had the crazy idea: if we were building out this data centre to be extremely energy-efficient, why not open source it? So we did.

We used blogs, Flickr and video to show everything from the arrival of the builders on-site to dig out the foundations, right through to the installation of customer kit and beyond. This was a first. As far as I know, no-one had done this before and, to be honest, no-one has replicated it since. Until today.

Today, Facebook is lifting the lid on its new custom-built data centre in Prineville, Oregon.

Not only are they announcing that the new data centre is online, they are also open sourcing its design and specifications, and even telling people who their suppliers were, so anyone (with enough capital) can approach the same suppliers and replicate the data centre.

Facebook are calling this the Open Compute Project and they have released a fact sheet [PDF] with details on their new data center and server design.

I received a pre-briefing from Facebook yesterday where they explained the innovations which went into making their data centre so efficient and boy, have they gone to town on it.

Data centre infrastructure
On the data centre infrastructure side of things, building the facility in Prineville, Oregon (a high desert area of Oregon, 3,200 ft above sea level with mild temperatures) means they will be able to take advantage of a lot of free cooling. Where they can’t use free cooling, they will utilise evaporative cooling to cool the air circulating in the data centre room. This means they won’t have any chillers on-site, which will be a significant saving in capital costs, in maintenance and in energy consumption. And in the winter, they plan to take the warm return air from the servers and use it to heat their offices!

By moving from centralised UPS plants to localised 48V UPSs, each serving six racks (around 180 Facebook servers), Facebook were able to re-design the electricity supply system, doing away with some of the conversion processes and creating a unique 480V distribution system which delivers 277V directly to each server, resulting in more efficient power usage. This reduces power losses in the utility-to-server chain from an industry average of 11-17% down to Prineville’s 2%.
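To make the arithmetic behind that concrete, here is a rough back-of-the-envelope sketch (the per-stage efficiencies are my own illustrative assumptions, not Facebook's published figures) showing how losses compound as power is converted along the chain, and how cutting out conversion stages gets you from a double-digit loss down towards 2%:

```python
# Illustrative only: the stage efficiencies below are assumptions, not
# Facebook's published numbers. Losses multiply along the chain.

def chain_loss_percent(stage_efficiencies):
    """Total percentage of power lost across a chain of conversion stages."""
    delivered = 1.0
    for efficiency in stage_efficiencies:
        delivered *= efficiency
    return (1.0 - delivered) * 100.0

# Typical chain: utility transformer -> double-conversion UPS -> PDU transformer
typical = [0.98, 0.92, 0.97]          # roughly 12-13% lost

# Prineville-style chain: one 480V transformation, 277V fed straight to the
# server, with the 48V UPS on standby rather than in the conversion path
prineville = [0.99, 0.99]             # roughly 2% lost

print(f"Typical chain loss:    {chain_loss_percent(typical):.1f}%")
print(f"Prineville-style loss: {chain_loss_percent(prineville):.1f}%")
```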

Finally, Facebook have significantly increased the operating temperature of the data center to 80.6°F (27°C), which is the upper limit of the ASHRAE standards. They also confided that in their next data centre, currently being constructed in North Carolina, they expect to run at 85°F – this will save enormously on the costs of cooling. And they claim that the reduction in the number of parts in the data center means they go from 99.999% uptime to 99.9999% uptime.
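To put that extra nine in perspective, the difference is roughly five minutes versus half a minute of downtime a year:

```python
# What an availability figure means in downtime per year.
MINUTES_PER_YEAR = 365.25 * 24 * 60

for availability in (0.99999, 0.999999):
    downtime_minutes = (1.0 - availability) * MINUTES_PER_YEAR
    print(f"{availability:.4%} uptime -> {downtime_minutes:.1f} minutes "
          f"of downtime per year")
```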

New Server design
Facebook also designed custom servers for their data centres. The servers contain no paint, logos, stickers, bezels or front panels. They are designed to be bare bones (using 22% fewer materials than a typical 1U server) and for ease of serviceability (snap-together parts instead of screws).

The servers are 1.5U tall to allow for larger heat sinks and larger (slower-turning and consequently more efficient) 60mm fans. These fans take only 2-4% of the energy of the server, compared to 10-20% for typical servers. The heat sinks are all spread across the back of the motherboard so that none of them receives pre-heated air from another heat sink, reducing the work required of the fans.

The server power supply accepts both 277V AC power from the electrical distribution system and 48V DC from the UPS in the event of a utility power failure. These power supplies have a peak efficiency of 94.5% (compared to a more typical 90% for standard PSUs) and they connect directly to the motherboard, simplifying the design and reducing airflow impedance.

Open Compute
Facebook relied heavily on open source in creating their site. Now, they say, they want to make sure the next generation of innovators don’t have to go through the same pain as Facebook in building out efficient data centre infrastructure. Consequently, Facebook is releasing all of the specification documentation which it gave to its suppliers for this project.

Some of the schematics and board layouts for the servers belong to the suppliers, so they are not being published for now. Facebook did say they are working with the suppliers to see if they will release them (or portions of them), but no agreement has been reached just yet.

Asked directly about their motivations for launching Open Compute, Facebook’s Jay Park came up with this classic reply:

… it would almost seem silly to do all this work and just keep it closed

Asking Facebook to unfriend coal
Greenpeace started a campaign to pressure Facebook into using more renewable energy because Pacific Power, the utility supplying Facebook’s Prineville data center, produces almost 60% of its electricity from burning coal.

Greenpeace, being Greenpeace, created a very viral campaign, using the Facebook site itself and the usual cadre of humorous videos, to pressure Facebook into sourcing its electricity from more renewable sources.

When we asked Facebook about this in our briefing, they did say that their data centre efforts are built around many more considerations than just the source of the energy that comes into the data centre. They went on to say that they are impressed by Pacific Power’s commitment to moving towards renewable sources of energy (the utility is targeting 2,000MW of renewable capacity by 2013). And they concluded by contending that the efficiencies they have achieved in Prineville more than offset the coal which powers the site.

Conclusion
Facebook tell us this new custom data centre at Prineville has a PUE of 1.07, which is very impressive.
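For anyone unfamiliar with the metric, PUE (Power Usage Effectiveness) is simply the total power drawn by the facility divided by the power delivered to the IT equipment, so 1.07 means very little energy is spent on anything other than the servers themselves. A quick sketch (the 1.5 comparison figure is an assumption for illustration, not from Facebook):

```python
# PUE = total facility power / IT equipment power.
it_load_kw = 100.0                     # illustrative IT load

for label, pue in (("Prineville (PUE 1.07)", 1.07),
                   ("Assumed industry figure (PUE 1.5)", 1.5)):
    overhead_kw = it_load_kw * pue - it_load_kw
    print(f"{label}: {overhead_kw:.0f} kW of cooling/distribution overhead "
          f"per {it_load_kw:.0f} kW of IT load")
```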

They have gone all out on innovating their data centre and the servers powering their hugely popular site. More than that though, they are launching the Open Compute Project giving away all the specs and vendor lists required to reproduce an equally efficient site. That is massively laudable.

It is unfortunate that their local utility has such a high proportion of coal in its generation mix, besmirching an otherwise great energy and sustainability win for Facebook. The good thing, though, is that as the utility adds to its portfolio of renewables, Facebook’s site will only get greener.

For more on this, check out the discussions on Techmeme.

You should follow me on Twitter here

Photo credit Facebook’s Chuck Goolsbee

post

Open source, open data, climate, energy and collaboration

Hard drive

James pointed me to a post yesterday about Transparency in energy usage – it made for interesting reading.

The article talks about the benefits of having open data in the energy arena. In this case, the post is specifically talking about the US Energy Information Administration’s publishing of historical annual energy statistics for the entire US – included are data on total energy production, consumption, and trade; overviews of petroleum, natural gas, coal, electricity, nuclear energy, renewable energy, international energy, as well as financial and environmental indicators; and data unit conversion tables.

The data is downloadable in PDF, XLS and CSV formats, sliced and diced in any number of ways, or you can download it in its entirety. Amazing stuff.
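As a quick illustration of how usable the CSV downloads are, here is a minimal Python sketch for reading one of the files and looking at year-on-year change (the filename and column headers are placeholders; substitute those of whichever EIA extract you download):

```python
# Minimal sketch: read an EIA annual-statistics CSV and print year-on-year
# change. "eia_total_energy.csv" and the column names are placeholders;
# use the actual file and headers from the dataset you download.
import csv

years, values = [], []
with open("eia_total_energy.csv", newline="") as f:
    for row in csv.DictReader(f):
        years.append(int(row["Year"]))
        values.append(float(row["Total Energy Consumption"]))

for year, previous, current in zip(years[1:], values, values[1:]):
    change = 100.0 * (current - previous) / previous
    print(f"{year}: {current:,.0f} ({change:+.1f}% on the previous year)")
```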

By the way, if you don’t think open data can be made fascinating, you haven’t seen what Hans Rosling can do with the visualisation of statistics.

The article correctly points out, though, that the data by itself is not easily digestible. It is the combination of open data and the collaborative efforts of GOOD and Hyperakt which makes this data far easier to digest.

And that’s key – we have seen the massive savings open source software has brought us (an estimated $60bn per annum in direct costs). The other real benefits of open source, though, come from its crowd-sourced nature (rapid bug-fixing, security, mitigation of vendor collapse/product discontinuation, community, etc.).

Given the two recent controversies around climate science, a bit more crowd-sourcing of its data and methodologies should be welcomed by anyone seeking to add to the science’s credibility.

With that in mind, it is great to see the Climate Code Foundation set up. The Climate Code Foundation say their goals are to promote public understanding of climate science:

  • by increasing the visibility and clarity of the software used in climate science, and by encouraging climate scientists to do the same
  • by encouraging good software development and management practices among climate scientists and
  • by encouraging the publication of climate science software as open source

The more openness and transparency there is in climate science (and every other science but especially climate science), the better for everyone (except perhaps the Lord Moncktons of the world).

You should follow me on Twitter here

Photo credit splorp

post

AMEE lands funding – Green is the new black!

Where there's muck, there's brass

Photo credit nickwheeleroz

I just received news that AMEE has landed funding. We have mentioned AMEE already a ton of times on GreenMonk, but that is because they share our open-source philosophy and they are Green. In fact, AMEE CEO Gavin Starks is the one who came up with the MegaTom concept!

From the press release:

AMEE, the World’s Energy Meter, has secured substantial Series A financing from leading VC funds in the USA and UK.

AMEE is a web-based service (API) that combines measurement, calculation, profiling and transactional systems, representing emissions data from 150 countries and regions. As a neutral data aggregation platform, AMEE’s vision is to enable the measurement of the “Carbon Footprint” of everything on Earth. AMEE aims to assist with the development of energy and carbon as a global currency, assisting governments and companies that need to account for and trade internationally in CO2 emissions.

The collaboration between O’Reilly Alphatech Ventures (OATV), Union Square Ventures (USV) and The Accelerator Group (TAG) will enable AMEE to expand its reach by enhancing its data, and extend globally.

The AMEE platform is already used internationally by many organisations including: UK Government (Defra/DECC), Morgan Stanley, Google, Radiohead, Nesta, the Irish Government, the Welsh Assembly, the Energy Saving Trust, BRE, Sun Microsystems, plus numerous other IT, business services and software companies.

Gavin Starks, Founder and CEO of AMEE commented, “AMEE’s growth over the past 12 months has been quite remarkable. We are delighted to have the financial and strategic support of such experienced investors to take our business forward.

AMEE is driving change by increasing the accuracy and transparency of emissions and consumption in a manner that has not been achieved by any legislation, market or service to date. We have developed and demonstrated a forward-thinking and innovative business model. It is based around neutrality, scale and collaboration. This reflects the dramatic changes that will impact our societies and their financial and social systems in the years to come.

The execution of the Climate Change Act in the UK last week, combined with President-Elect Obama’s forward-looking Federal Cap & Trade statements, are indicators of the scale of change approaching us.”

Bryony Worthington, Head of International Policy at AMEE added, “As one of the authors of the UK Climate Change Act, I am delighted to be bringing dedicated solutions to industry and consumers. The time to act is now.”

Mark P. Jacobsen, Managing Director of O’Reilly AlphaTech Ventures (“OATV”), commented: “AMEE’s vision to aggregate all of the energy data in the world fits OATV’s mission to invest in stuff that matters. With the recent sea change in America’s political climate, we look forward to AMEE bringing its platform-based data service to clients in the States.”

Albert Wenger, Partner at Union Square Ventures, commented, “We believe that emissions information is critical to better decision making by individuals and companies. We are excited that AMEE’s service helps to substantially improve the timeliness and accuracy of emissions measurement.”

Great news, congrats Gavin and team. With the slew of funding announcements in the Green space, despite the trying financial times (because of…), Green is quickly becoming the new black!

post

Clear Climate Code project needs your help

Climate record
Photo Credit vodstrup

I came across the Clear Climate Code project via a message from Sig.

Like all good ideas, this one is very simple –

The Clear Climate Code project writes and maintains software for climate modelling and analysis, with an emphasis on clarity and correctness.

The results of some climate-related software are used as the basis for important public policy decisions. If the software is not clearly correct, decision-making will be obscured by debates about it. The project goals are to clear away that obscurity, to increase the clarity and correctness of climate science software.

Ticks two of my favourite boxes straight away – open source and the climate!

The guys at Ravenbrook, who are coordinating this project, took this task on themselves and have decided to seek a little outside help in the process.

If you can program in Python and would like to help out, you can get in touch with Ravenbrook by email here.
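To give a flavour of the kind of contribution involved (clear, well-commented Python rather than clever one-liners), here is a trivial sketch of my own, in the spirit the project describes; it is not project code:

```python
# Illustration only, not Clear Climate Code project code: a small function
# written for clarity, with missing observations handled explicitly.

def annual_mean(monthly_temperatures, missing=None):
    """Return the mean of twelve monthly values, or None if any are missing.

    Being explicit about missing data, rather than silently skipping it,
    is the sort of clarity the project is aiming for.
    """
    if len(monthly_temperatures) != 12:
        raise ValueError("expected twelve monthly values")
    if any(t is None or t == missing for t in monthly_temperatures):
        return None
    return sum(monthly_temperatures) / 12.0

print(annual_mean([4.1, 4.9, 7.2, 9.8, 13.1, 16.0,
                   18.2, 18.0, 15.3, 11.4, 7.5, 5.0]))  # 10.875
```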

post

The future of source material. Learning to see the steps.


I’m busy helping to build Akvo at the moment, an open source wiki and set of collaboration and finance tools that pool the knowledge that already exists in different NGO and government silos, to help the world’s poorest communities quickly build water and sanitation facilities. It’s worth doing – today over 1.1 billion people are without safe drinking water and, globally, 2.6 billion lack basic sanitation. Each year, as a result, 1.8 million children die of diarrhoea and other diseases, 440 million school days are missed, and in sub-Saharan Africa alone $28.4 billion (USD) are lost in productivity and opportunity costs.

The great thing about tackling a problem this urgent is that you can challenge every aspect of how things are currently done – the assumptions that keep us constrained. What has become quickly apparent as we’ve presented the prototype and raised funds from development sector groups is that this field is wide open to act as a test-bed for our first game-changing element – that open source principles of knowledge sharing can change how development is organised. This component was surprisingly easy to explain to an NGO audience, in fact, with a panel debate at Stockholm World Water Week demonstrating a sound appreciation from relevant parties of the opportunities it presents to reduce costs and improve participation and technology re-use and longevity.

Yet underneath there is a tougher issue to deal with, and it becomes more apparent when dealing with that other movement of the moment – the opening up of knowledge systems via social media, and the tensions that creates for organisations built on hierarchical, command and control lines.

The problem is that organisations that have evolved as a hierarchy, with a clear chain of command, are not particularly effective when tasked with gathering and refining content in an emerging infrastructure shaped by social media and by processes that share every stage of a product’s (or story’s) development with anyone who is interested.

Because while digital material, by its nature, can be updated whenever there is a good reason to do so, it often isn’t. Instead, the vast majority of digital material today continues to be written, approved and published as if it was print material – it just happens to be made available digitally. Almost all marketing departments work this way.

And here’s where I’m going to collapse my lessons from open source and social media together. The central problem in most modern organisations is that there is no culture of shared, authentic core content. Traditional marketing and communications teams have developed stories in a linear fashion, with source material assumed to be the final polished product rather than the raw facts and figures. The source becomes the brochure, rather than the original interview that created insights for each section of that brochure. While technologies such as Wikipedia-style databases allow it, established processes of information gathering make it impossible to easily reference original source material in end products, and when that source material changes it is unusual for end products to be updated without considerable management activity.

This linear process of content creation and approval, favoured today, is designed to discard the real source content and create an improved edited reality, usually a report that is distorted to answer particular questions, or a document that tells a certain story to a certain audience. The organisation – or more accurately individual actors – try to hide any ‘weaknesses’ in the original source or make decisions along the way about which portions of the source should be published more widely and which should remain confidential. In other words, they attempt to control access to the source content. With emerging social media processes – pioneered in particular by the open source software movement – the philosophy is that the source content is open to all unless there is consensus that an individual should be excluded from either reading (unlikely) or editing (more likely) that content. The aim is to encourage all to feel they can contribute to and edit the source code – all actors are encouraged to improve the quality of the source code itself, perhaps by making connections between it and other things, or even simply by tidying it up. In all cases, what is changed can be tracked and large numbers of content editors constantly watch over changes and rigorously review and tweak material.

Yet over decades we have created organisations that usually have two parallel organisational realities – an internal organisation that is quirky, has politics, problems, secret plans, good people and bad people, versus an external organisation that is coherent, polished and near perfect.

The key beneficiary of maintaining two separate organisations is usually the marketing (or legal) department. Millions of man-hours are applied globally to take real scenarios and polish them into something suitable for external consumption. Maybe it’s time to refocus our efforts, giving people at every level and every stage in the process of product and service development the tools and skills needed to tell their own, real stories at every stage. Doing so is no longer a technology problem – it’s a management one.

Mark Charmer is director of The Movement Design Bureau. He co-edits Re*Move and is a contributor to Greenmonk.

post

An open perspective on global development

Anyone trying to make sense of the opportunity to apply the principles of open source beyond software needs to listen to the recent series of Reith Lectures, broadcast by BBC Radio 4.

The Beeb’s curious podcast policy means that the archives are only accessible via its radio player, but right now you can grab a proper mp3 download of the final lecture, ‘global politics in a complex age’.

Here’s a taste:

“Once the problems are recognized, and the deep science is understood, it is far easier to come up with solutions, which typically require the application of new technologies at a scale to address the challenge. Those technologies exist, or can be developed. Public policies will be needed to get them into place.

Fortunately, governments will not need to do all of the heavy lifting. Individual champions of solutions can make great headway in demonstrating what needs to be done. New technologies for specific problems can be proved at a small scale and then taken to global scale. Social entrepreneurs from every sector can step forward with proposed solutions. The main role of government is to stand prepared, with checkbook at hand and policy brief ready, to take working solutions to the needed scale.”

If you have time to listen to just one lecture, I’d go for lecture 3. If you’re interested in making sense of how open principles can be applied to speed up action on sustainability, or to transform the pace at which the world tackles the challenges of development, you’ll find inspiration here.

I’d also recommend dipping into Digital Formations: IT and New Architectures in the Global Realm. It’s a bit academic (read: heavy work) but the insights are worth it…

post

“the first open source project dedicated to ecology”

via worldchanging comes news of openecosource.org

“Our mission is to create a web base tool, which will help speed up the distribution of available knowledge and connect efforts that aim to create a sustainable environment.”

How cool is that – someone else focusing on Green From The Roots Up. Greenmonk is also about applying open source methods to greening issues, so we’ll definitely be tracking this resource, and almost certainly contributing in some form or other.