
IBM, Big Green, Rational and Eco-aware Programming

I am at an event at IBM South Bank looking at some data center futures. The current session is with Christopher O’Connor, vice-president of strategy and market management, Tivoli, who just raised an issue I have been thinking about a lot lately: just what will it mean to develop greener software? What would a green API look like? As usual, better performance is one answer to the problem. Lean is Green.

Chris said that Rational, IBM’s software development tools and process organisation, is now looking at “green-aware programming”. Good job. Chris mentioned one immediate area of concern – “the fetch”. That is, code that keeps calling a database tends to be performance intensive, and identifying fetch bottlenecks could be a great step towards writing code that consumes less power. We’re talking about heat maps for code.
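To make the fetch problem concrete, here is a minimal sketch (table and column names invented for illustration, using an in-memory SQLite database) of the classic pattern: many small per-row queries versus one batched query doing the same work.

```python
import sqlite3

# Hypothetical schema, purely for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, f"customer-{i % 10}") for i in range(100)])

# Fetch-heavy: one round trip to the database per row (100 queries).
totals_slow = {}
for i in range(100):
    row = conn.execute("SELECT customer FROM orders WHERE id = ?", (i,)).fetchone()
    totals_slow[row[0]] = totals_slow.get(row[0], 0) + 1

# Leaner: a single query does the same aggregation in one pass.
totals_fast = dict(conn.execute(
    "SELECT customer, COUNT(*) FROM orders GROUP BY customer"))

assert totals_slow == totals_fast
```

The two versions produce identical results; the difference is the number of round trips, which is exactly the kind of hotspot a “heat map for code” could surface.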

It will be interesting to see more of the Rational approach, and I will make an effort to do just that. But for now, it’s just good to report that IBM is thinking deeply about the problem and developing tools to support its findings.

On that note I am beginning to wonder if beautiful code is green code. Code generation tends to generate pretty ugly code – but is it less efficient? Developers who write beautiful code may end up in great demand for their green coding: but this is pure conjecture at this point…

Comments

  1. says

    May not be so serendipitous, unfortunately.

    There’s usually a trade-off between programmer work and computer work, with beautiful code pushing more of the effort to the computer.

    A simple example: beautiful code breaks your program into smaller functions for clarity and easier reuse. But more functions means more instructions executed. People really concerned about performance and minimizing computer instructions unwind their programs into large repeated chunks rather than call functions.
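A toy sketch of that trade-off (names invented for illustration): the same computation written as a small reusable function versus manually unwound into the loop body.

```python
def scale(x, factor=3):
    """Clear and reusable - but every call adds call/return overhead."""
    return x * factor

def total_with_calls(values):
    # The "beautiful" version: one function call per element.
    return sum(scale(v) for v in values)

def total_inlined(values):
    # The "unwound" version: same arithmetic with the call inlined,
    # so fewer instructions execute per element, at the cost of clarity.
    total = 0
    for v in values:
        total += v * 3
    return total

assert total_with_calls(range(1000)) == total_inlined(range(1000))
```

Both return the same answer; the difference is purely in how much work the machine does per element (many compilers inline such calls automatically, which blunts the trade-off in practice).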

    Or take garbage collection: the computer has to do more work to clean up after the programmer, who can forget about the memory she’s allocated.

  2. says

    Efficient code is good from the point of view of power into a server, but if it takes the engineer longer to write green code then they may end up boiling more water for tea or purchasing more aluminum caffeinated beverage cans. Then it becomes a lifecycle assessment question, I guess? However, if the engineer can take less time to write green code, that sounds like a competitive advantage.

  3. says

    Love the photos, James. They are infrared and normal light photos of racks in a data centre. They illustrate beautifully one of the major inefficiencies in current data centres – air mixing.

    You can see a clear temperature delta from the cooler bottom of the racks to the warmer top (where warm air from the back of the racks spills over the top and heats the air you have spent so much energy cooling).

    This has a knock-on effect on the lifespan of equipment in racks: the higher up a rack the equipment is, the sooner it fails!

    The best way to combat this is to stop the mixing of air in DCs by implementing a containment strategy. From an efficiency point of view, cold-aisle containment makes the most sense at this latitude.

    The disadvantage of cold-aisle containment is that it means the DC room is hot and this has issues of perception for any customers entering it. That needs to be overcome beforehand with education.

  4. says

    (Warning: I’ve never previously thought about this topic, so consider this all conjecture for the purposes of brainstorming :-)).

    I think James Urquhart makes an interesting point re: utility computing. When I was looking at this several years ago, we always talked about how, in traditional non-utility environments, servers run at an average of around 10-25% utilization, yet you need all of those servers to handle peak load. But most of the time you’re sucking a lot of power just to keep the fans and memory running on an otherwise idle server.

    Utility computing intends to solve this problem by spreading the load of many applications across many servers, the theory being that different applications have different peak times, so you can run at a much higher average utilization (similar to how insurance spreads risk across a large number of people). So you need fewer servers net for the same number of applications and, as JamesU mentioned, you pay as you go.
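A back-of-envelope sketch of the consolidation arithmetic in that comment, with made-up numbers: 20 applications, each peaking at 100 units of load but averaging only 15 (15% utilization on a dedicated server), and an assumed 50% headroom margin for the shared pool.

```python
import math

apps = 20
peak_per_app = 100   # each app must be able to handle this
avg_per_app = 15     # ...but usually runs at this (15% utilization)

# Dedicated model: one server sized for peak, per application.
dedicated_servers = apps  # 20 mostly idle servers

# Shared pool: if peaks don't coincide, size for the aggregate average
# plus 50% headroom rather than the sum of all peaks.
pool_capacity = apps * avg_per_app * 1.5          # 450 units
shared_servers = math.ceil(pool_capacity / peak_per_app)  # 5 servers

print(dedicated_servers, shared_servers)
```

Under these assumed numbers the shared pool needs 5 servers instead of 20 – the rough shape of the power saving, though the real answer depends on how correlated the applications’ peaks actually are.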

Trackbacks

  1. […] Greenmonk: IBM, Big Green, Rational and Eco-aware Programming – “I am beginning to wonder if beautiful code is green code. Code generation tends to generate pretty ugly code – but is it less efficient? Developers that write beautiful code may end up in great demand for their green coding: but this is pure conjecture at this point…“ […]