post

Big emphasis on mobility at the SAP for Utilities conference

SAP for Utilities

I attended the SAP for Utilities conference in San Antonio last week. I gave the closing keynote (which I’ll write up in another post).

I was interested, though, to note that two themes recurred in all the opening keynotes:
1. All of the opening keynote speakers made mention of Social Media – a huge relief, because my closing talk was due to be on Social Media, so the speakers were setting the stage nicely; and
2. Mobility was talked up big-time by the speakers.

I had expected some talk of mobility, along with HANA, Smart Grids, Cloud and Analytics – the usual gamut of topics at these events – and they were indeed all addressed, but there was a definite emphasis on mobility over all other topics.

It is understandable – with the advent of tablets and smartphones, computing is going mobile, no question about it. I think it was Cisco’s CTO Paul De Martini who dropped the stat that 200,000 new Android devices are being activated daily.

This impacts utility companies on two fronts:
1. On the customer front, utilities can now drop the idea of in-home energy management devices and instead assume that the vast majority of their customers have access to a smartphone or tablet; and
2. On the employee front, utilities have lots of mobile workers – the ability to connect them easily back into corporate applications will be game-changing.

In my talk on social media strategies for utilities, I suggested that utilities equip every truck-roll with a smartphone. That way, when the crew gets to a site to repair a downed line (or whatever), they can take a quick video of the damage and the people working on-site, and in the voice-over give a rough estimated time of recovery. This can be uploaded to YouTube at the touch of a button on the phone, and call center operators and social media departments can then direct enquiries to the video – immediately helping defuse the frustration of having power cut.

Programs like this can even be proactive, and the customer service benefits of rolling them out should not be underestimated.

Utilities are entering a new, more challenging era. Mobility solutions (especially when combined with social media) will be a powerful tool to help them meet these challenges.

Full disclosure – SAP is a GreenMonk client and paid travel and expenses for me to attend the conference.

Photo credit Tom Raftery

post

SAP’s sustainability customers are talking up SAP!

Cash register

When I interviewed SAP’s Jeremiah Stone at SapphireNow last May about SAP’s sustainability story, he mentioned that SAP was committed to an increasing focus on sustainability customer success stories:

I’m excited to hear less about SAP being sustainable in our vision and a lot more about SAP’s customers embedding our technology to have more sustainable business and more sustainable products of their own.

At SapphireNow, according to Jeremiah, around 80% of the Sustainability events were customer-led.

SAP seems to have continued that momentum, having recently published a series of videos of customers talking about how SAP’s sustainability solutions boosted their own sustainability programs. There are interviews with Lexmark, CSC, Dow and Air Products, amongst others.

It is one thing to report on your own sustainability initiatives, as SAP do in their online Sustainability Report, but quite another to be able to roll out customers who are willing to go on record saying your products helped them become more sustainable. That definitely takes the credibility up a notch.

Now, imagine if you could quantify the sustainability savings your customers have achieved as a result of deploying your software…

Photo credit skpy

post

Smarter cities – cities of almost any size can now go digital, with all the efficiency gains that brings

City

I attended an IBM Smarter Cities analyst event last week, and it was, not surprisingly, very interesting.

What is the whole rationale behind making cities smarter?

Well, there are a number of factors. For one, the world’s population has doubled in the last 40 years (from 3.5 billion to almost 7 billion). And with the mushrooming population has come increasing urbanisation (in 1800, 3% of the world’s population lived in cities, whereas in 2007 that figure went above 50% for the first time).

The surging number of people living in cities is increasing the demands on municipalities for services like water, energy, transportation, housing, healthcare and public safety. This is happening at a time of constrained resources and ageing infrastructure for many existing cities.

At the IBM Smarter Cities event, IBM showcased both some of the technologies they are providing to cities and also case studies of some of the solutions they have rolled out.

Intelligent Operations Center

The core of IBM’s offerings is its Intelligent Operations Center (IOC) – an application capable of taking information from virtually any IT system a city may have (water management, video surveillance, first responder systems, traffic management, etc.), combining that data and using it to kick off workflows, trigger alerts, populate dashboards and/or export the data.

The fact that the system can take in inputs from such a wide variety of systems is, in large part, due to its use of the Common Alerting Protocol (CAP) – an XML-based protocol for exchanging alerts between systems. From the CAP Wikipedia entry:

Alerts from the United States Geological Survey, the Department of Homeland Security, NOAA and the California Office of Emergency Services can all be received in the same format, by the same application. That application can, for example, sound different alarms based on the information received.
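
To make that concrete, here is a minimal sketch of what such an alert might look like and how a consuming application could pull out the fields it routes on. The alert content is invented for illustration; the element names come from the CAP schema, and the snippet uses only Python's standard library.

    import xml.etree.ElementTree as ET

    CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"  # CAP 1.2 namespace

    # A tiny, illustrative CAP alert - real alerts carry many more fields.
    sample_alert = """<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
      <identifier>DEMO-2011-0001</identifier>
      <sender>water.management@example-city.gov</sender>
      <status>Actual</status>
      <msgType>Alert</msgType>
      <info>
        <event>Water main pressure drop</event>
        <severity>Moderate</severity>
        <headline>Pressure drop detected in district 4</headline>
      </info>
    </alert>"""

    def summarise_alert(xml_text):
        """Pull out the fields a dashboard or rules engine might route on."""
        root = ET.fromstring(xml_text)
        ns = {"cap": CAP_NS}
        return {
            "identifier": root.findtext("cap:identifier", namespaces=ns),
            "sender": root.findtext("cap:sender", namespaces=ns),
            "event": root.findtext("cap:info/cap:event", namespaces=ns),
            "severity": root.findtext("cap:info/cap:severity", namespaces=ns),
        }

    print(summarise_alert(sample_alert))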

The IOC’s flexibility when it comes to data inputs ensures it can take in information from almost any IT system – it can also output that same data to other systems or run data through rules engines to kick off workflows. This means the IOC has huge potential as a way to take in information from many disparate sources, have it acted on, and display results in realtime to the responsible city officials.

However, it is those same city officials who may be the biggest barrier to the success of the IOC. To get the most from the IOC, it needs access to the relevant data, but that requires the buy-in of the data owners. Most city administrations are built around silos, and the people responsible for managing those silos may be inclined to view the data as their own fiefdom. A culture of data sharing will need to become far more widely accepted in city government for the IOC to reach its full potential.

The Smarter Cities sales cycle must be fascinating and likely involves more change management skills than sales ones.

One of the initial customers for IBM’s Smarter Cities solutions was Rio de Janeiro, but there was a burning platform there: Rio is hosting the 2016 Summer Olympics and the 2014 World Cup, so it needed to ensure it had all its systems in tip-top shape. Other cities have signed up for partial roll-outs (Washington, DC and the Sonoma County Water Authority for water management; Richmond, Va. and New York for crime reduction; and Bolzano, Italy for care of the elderly, to name a few). In their cases, increasing sales will be very much a matter of up-selling additional efficiency services.

One of the intriguing aspects of the IBM Smarter Cities solution is that there is a cloud-delivered version. Removing the need for an on-premise hardware installation can drastically cut costs, roll-out time and the IT administrative burden (backups, disaster recovery and availability), making it an ideal option for smaller urban areas which couldn’t previously have considered such a system.

For the first time, cities of almost any size can now go digital, with all the efficiency gains that brings.

Photo credit Nrbelex

post

TEPCO using realtime information to help reduce energy consumption in Japan

TEPCO realtime energy chart

TEPCO, the Japanese power company who own the Fukushima nuclear power plant, are in an unenviable position. Their Fukushima plant is the site of one of the world’s worst industrial accidents; they have been accused not just of incompetence but of falsifying safety records; and yet they have to continue to supply power to Japan.

Japan itself is facing significant challenges – only 17 of its 54 nuclear power reactors are operational heading into August, traditionally the month of peak demand. Japan needs to avoid rolling blackouts, and TEPCO has stepped up to help.

TEPCO energy message

On TEPCO’s home page they give top-line data for the maximum demand expected for the day, as well as the maximum amount of power that can be supplied. As long as demand doesn’t exceed supply, there are no blackouts.

TEPCO have gone further, though, with a realtime chart of energy demand (updated every five minutes) versus maximum supply, also graphed against demand on the same day in 2010 (see the chart at the top of this post). We have long argued here on GreenMonk that giving people access to information helps change behaviour. This campaign is a great example of realtime energy information in action, and it appears to be working – electricity consumption is down around 15% on last year.
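
To illustrate the arithmetic behind a chart like this, here is a rough sketch (with made-up numbers, not TEPCO's actual figures) of how a page might turn five-minute demand and supply readings into the utilisation percentage and a warning status.

    # Hypothetical five-minute readings: (time, demand in GW, available supply in GW)
    readings = [
        ("13:00", 42.1, 51.0),
        ("13:05", 43.8, 51.0),
        ("13:10", 47.5, 51.0),
    ]

    WARNING_THRESHOLD = 0.95  # warn when demand reaches 95% of available supply

    for time, demand_gw, supply_gw in readings:
        utilisation = demand_gw / supply_gw
        if demand_gw >= supply_gw:
            status = "BLACKOUT RISK"
        elif utilisation >= WARNING_THRESHOLD:
            status = "WARNING"
        else:
            status = "OK"
        print(f"{time}: {utilisation:.0%} of available supply ({status})")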

This information is certainly not the only thing helping people reduce their electricity consumption – TEPCO and others also publish energy reduction tips on their websites, and the tragedy of the earthquake, followed by the devastating tsunami, has galvanised a sense of national unity, such that anyone now seen to be wasting scarce electricity is ostracised.

People and companies are turning off lights, removing bulbs and changing the thermostats on air conditioning units in a way that would previously have been thought impossible. In fact, a certain pride is creeping into the campaign. Tatsuo Nakahara, administrative manager at Meiwa Rubber, a manufacturer of printing equipment with factories in Tokyo, said in an interview quoted in the New York Times:

The government’s figure is 15 percent, but we’re aiming to cut by 25 percent

He added that in the months after the March disaster, the company had already succeeded in conserving 20 percent.

Can this effort be sustained? Only time will tell, but if the Japanese can get through August, they’ll have passed through the worst of it. They should also consider giving people individual energy management dashboards. Something else that may help maintain the momentum, as I have posited here previously, is adding social media and gaming to the effort – letting people share their energy reduction achievements with their social networks, setting reduction targets, creating leaderboards, awarding achievement badges and so on.

TEPCO may want to consider this seriously. Its reputation, nationally and internationally, is in shreds. If they can be seen now as an agent of good in this crisis, they may be able to restore some pride in their brand.

Photo credit Tom Raftery

post

Carbon Disclosure Project’s emissions reduction claims for cloud computing are flawed

data center

The Carbon Disclosure Project (CDP) is a not-for-profit organisation which takes in greenhouse gas emissions, water use and climate change strategy data from thousands of organisations globally. This data is voluntarily disclosed by these organisations and is CDP’s lifeblood.

Yesterday the CDP launched a new study, Cloud Computing – The IT Solution for the 21st Century, a very interesting report which

delves into the advantages and potential barriers to cloud computing adoption and gives insights from the multi-national firms that were interviewed

The study, produced by Verdantix, looks great on the surface. They talked to 11 global firms that have been using cloud computing for over two years, and they have lots of data on the financial savings made possible by cloud computing. There is even reference to other advantages of cloud computing – reduced time to market, the shift from capex to opex, flexibility, automation, etc.

However, when the report starts to reference the carbon reduction potential of cloud computing, it makes a fundamental error – one highlighted by CDP Executive Chair Paul Dickinson in the foreword when he says:

allowing companies to maximize performance, drive down costs, reduce inefficiency and minimize energy use – and therefore carbon emissions

[Emphasis added]

The mistake here is presuming a direct relationship between energy use and carbon emissions. While this might seem a logical assumption, it is not necessarily valid.

If my company’s energy retailer sells me power generated primarily from nuclear or renewable sources, for example, and I move my applications to a cloud provider whose power comes mostly from coal, then the move to cloud computing will increase, not decrease, my carbon emissions.
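
The point is easy to see with a back-of-the-envelope calculation. The numbers below are purely illustrative (they are not from the CDP report), but they show how a move that cuts energy use can still raise emissions once the carbon intensity of each electricity supply is taken into account.

    # Emissions = energy consumed x carbon intensity of the electricity used.
    # All figures below are illustrative assumptions, not data from the report.
    on_premise_kwh = 100_000      # annual energy for the in-house deployment
    cloud_kwh = 60_000            # the cloud provider is more efficient: 40% less energy

    low_carbon_grid = 0.05        # kg CO2 per kWh, mostly nuclear/renewables
    coal_heavy_grid = 0.90        # kg CO2 per kWh, mostly coal

    on_premise_emissions = on_premise_kwh * low_carbon_grid   # 5,000 kg CO2
    cloud_emissions = cloud_kwh * coal_heavy_grid             # 54,000 kg CO2

    print(f"In-house: {on_premise_emissions:,.0f} kg CO2")
    print(f"Cloud:    {cloud_emissions:,.0f} kg CO2")
    # Energy use fell by 40%, yet emissions rose roughly tenfold.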

The report goes on to make some very aggressive claims about the carbon reduction potential of cloud computing. In the executive summary, it claims:

US businesses with annual revenues of more than $1 billion can cut CO2 emissions by 85.7 million metric tons annually by 2020

and

A typical food & beverage firm transitioning its human resources (HR) application from dedicated IT to a public cloud can reduce CO2 emissions by 30,000 metric tons over five years

But because these are founded on an invalid premise, the report could just as easily have claimed

US businesses with annual revenues of more than $1 billion can increase CO2 emissions by 85.7 million metric tons annually by 2020

and

A typical food & beverage firm transitioning its human resources (HR) application from dedicated IT to a public cloud can increase CO2 emissions by 30,000 metric tons over five years

This wouldn’t be an issue if the cloud computing providers disclosed their energy consumption and emissions information (something that the CDP should be agitating for anyway).

In fairness to the CDP, they do refer to this issue in a sidebar on a page of graphs when they say:

Two elements to be considered in evaluating the carbon impact of the cloud computing strategies of specific firms are the source of the energy being used to power the data center and energy efficiency efforts.

However, while this could be taken to imply that the CDP have taken data centers’ energy sources into account in their calculations, they have not. Instead they rely on models extrapolating from US datacenter PUE information [PDF] published by the EPA. Unfortunately, the PUE metric which the EPA used is itself controversial.
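
For reference, PUE is simply the ratio of total facility energy to the energy delivered to the IT equipment – so even a good PUE figure says nothing about where that energy comes from. A one-line illustration with made-up numbers:

    # PUE = total facility energy / IT equipment energy (illustrative figures).
    total_facility_kwh = 1_800_000   # servers plus cooling, lighting and power losses
    it_equipment_kwh = 1_000_000     # servers, storage and network only

    pue = total_facility_kwh / it_equipment_kwh
    print(f"PUE = {pue:.1f}")  # 1.8 - a measure of facility efficiency, not carbon content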

That a data-centric organisation like the CDP would come out with such unfounded claims of carbon reduction benefits from cloud computing may be at least partly explained by the fact that the expert interviews carried out for the report were with HP, IBM, AT&T and CloudApps – all of whom are cloud computing vendors.

The main problem, though, is that cloud computing providers still don’t publish their energy and emissions data. This is an issue I have highlighted on this blog many times over the last three years, and until cloud providers become fully transparent with their energy and emissions information, it won’t be possible to state definitively that cloud computing can help reduce greenhouse gas emissions.

Photo credit Tom Raftery

post

SAP talks up Sustainable Programming

Renewable energy source code

There is a lot of work happening to reduce the increasing energy footprint of ICT – at the infrastructure level, at the CPU level with ARM-based servers, and at the storage level. However, until recently, very little had been mentioned about the energy footprint of the software running on servers.

This is a topic I first raised with SAP back in 2008 at their TechEd event in Berlin. I urged them to have a Green coding theme to their next (2009) TechEd event, which, in fairness to them, they did.

SAP have now decided to address this issue at a broader level and recently published two blog posts (here and here) on what they call Sustainable Programming. Accompanying the blog posts are two PDF articles which go into the topics in a bit more detail (here and here).

These blog posts and the accompanying articles are aimed at the many developers SAP has in its SDN community globally (over 1 million at the time of writing).

These developers are typically working on large enterprise systems responsible for running organisations and often dealing with hundreds or thousands of transactions per minute. These transactions routinely involve database read/write operations, and each has a discrete energy footprint. Reducing the energy requirement of these transactions can have a significant effect on an application’s power requirements (and consequently its costs and CO2 footprint).
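
One concrete example of the kind of change involved – my example, not one from SAP's articles – is replacing row-by-row database writes with a single batched write, cutting round trips and the work the database does per transaction. A minimal sketch, using Python and sqlite3 purely for illustration (SAP's own stack would use ABAP and its database layer):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (meter_id TEXT, kwh REAL)")

    rows = [("M-001", 1.2), ("M-002", 0.9), ("M-003", 1.7)]

    # Wasteful pattern: one statement and one commit per row.
    for meter_id, kwh in rows:
        conn.execute("INSERT INTO readings VALUES (?, ?)", (meter_id, kwh))
        conn.commit()

    # Leaner pattern: one batched statement and a single commit.
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    conn.commit()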

Helping these developers learn sustainable programming techniques will benefit the organisations they work for, SAP by extension and, through reduced carbon emissions, the rest of us.

Reading through the documents, I can’t help thinking that they are light on specifics and pseudo-code. I suspect this initiative could do with a wiki for developers to add code snippets, have discussions and swap best practices.


Photo credit Tom Raftery

post

Hints for competitors in EDF’s 2012 Sustainable Design Challenge

Eiffel Tower

I was in Paris earlier this week as one of the judges for EDF’s Sustainable Design Challenge. If you are not familiar with EDF, they are the world’s largest utility company and, while they operate in Europe, Latin America, Asia, the Middle East and Africa, they are headquartered in Paris.

There were 31 entries in the competition, coming from a variety of design schools, universities and even an American high school. As well as prize money, the eight selected finalists will receive help developing their projects over the next year, and the projects will be displayed at the EDF pavilion at the 2012 London Olympics. This is even more impressive when you realise that the EDF pavilion will be one of only five pavilions at the Games.

The quality of the thirty-one entries was, in general, quite high. The thing which disappointed me, though, was the seeming lack of engineering knowledge behind the entries. Many seemed to be of the opinion that small piezoelectric generators can generate vast quantities of electricity (they can’t!).

More disappointing, though, was that three fundamental energy technologies were ignored entirely. While some of the entries used wind generation, none used solar as a key technology; similarly, none referenced energy storage, and not a single entry used any smart grid technologies.

Part of this has to come down to the fact that the participating schools were more design than engineering schools, but still, these were fairly big technologies to have been ignored.

Other than that, the competition was spectacularly well run, and kudos to EDF for raising awareness of sustainability in design by running it. This will be an annual competition, so for participants in the 2012 EDF Sustainable Design Challenge – you now know what you need to work on 😉

This was the entry by the US high school – it was one of the eight selected as finalists:

Photo credit Tom Raftery

post

Friday Green Numbers round-up for July 8th 2011

Green Numbers

With the summer slowdown in travel, I’m reinstating the Friday Green Numbers Round-up – and so without further ado…

  1. Whitehall surpasses 10% CO2 reduction target

    Whitehall has surpassed its target of slashing its CO2 emissions by ten percent in one year, achieving a cut of almost 14 percent.

    Prime minister David Cameron said central government emissions have fallen by 13.8 percent in the past year, reducing energy bills by an estimated £13 million.

    Topping the table was the Department for Education, which achieved a 21.5 percent cut, while the… Read on

  2. Britain’s richest man to build giant Arctic iron ore mine 300 miles inside Arctic Circle

    Lakshmi Mittal’s ‘mega-mine’ is believed to be the largest mineral extraction project in the region but threatens unique wildlife

    Britain’s richest man is planning a giant new opencast mine 300 miles inside the Arctic Circle in a bid to extract a potential $23bn (£14bn) worth of iron ore.

    The "mega-mine" – which includes a 150km railway line and two new ports – is believed to be the largest mineral extraction project in the Arctic and highlights the huge… Read on

  3. Amazon Resists Pressure To Disclose Data On Carbon Footprint

    Amazon revolutionized the retail industry in the United States, and for several years has had a strong presence in Europe and Asia. Its market cap among retailers lags only behind Walmart.

    Despite its successes, the e-commerce giant has attracted criticism for a perceived lack of transparency of its carbon footprint…. Read on

  4. Facebook in the top 10 most hated companies in America

    Business Insider posted an article titled "The 19 Most Hated Companies In America." The data was based on the American Customer Satisfaction Index (ACSI), which releases industry results monthly and updates its national index quarterly.

    Facebook was placed at number 10. I decided to take a look at just the 2010 data, which is the latest available if you want to see ratings from all the companies in the US…. Read on

  5. 7 ways cloud computing could be even greener

    Forrester Research is the latest organization to explore the link between cloud computing and green IT.

    Forrester notes that by its nature, cloud computing is more efficient. But here are seven ways that an IT professional can make his or her cloud computing even greener – regardless of whether or not the approach is public or private:…. Read on

  6. E-On investing $600 million in Illinois wind farms

    Northwest of Kokomo, along U.S. 24 near the Indiana-Illinois state line, the horizon is broken by the sight of dozens of wind turbines slowly turning in the breeze.

    There, in the small town of Watseka, Ill., E-On Climate & Renewables is putting the finishing touches on the Settler’s Trail Wind Farm, and the company soon will start work on the Pioneer Trail Wind Farm in a neighboring portion of Iroquois County.

    E-On also plans to construct a major wind farm across parts of Howard, Tipton, Grant and Madison counties.

    Construction on Phase 1 of the Wildcat Wind Farm is…. Read on

  7. UK’s two biggest solar installations start generating energy

    A huge solar farm in Lincolnshire and another in Cornwall started generating green electricity on Thursday to become the UK’s two biggest solar installations, as developers rushed to beat an imminent cut in government subsidies.

    The 1MW Fen Farm solar park and the 1.4MW Wheal Jane park in Truro are two of several such large-scale projects rushing to connect to the grid. They are trying to benefit from a…. Read on

  8. Missing: 163 Million Women

    Midway through his career, Christophe Guilmoto stopped counting babies and started counting boys. A French demographer with a mathematician’s love of numbers and an anthropologist’s obsession with detail, he had attended graduate school in Paris in the 1980s, when babies had been the thing.

    He did his dissertation research in Tamil Nadu.

    As it turned out, Tamil Nadu was in fact one of the states where girls had a better prospect of survival, while in 2001 the northwest, a wealthy region considered India’s breadbasket, reported a regional sex ratio at birth of 126 – that is, 126 boys for every 100 girls. (The natural human sex ratio at birth is 105 boys for every 100 girls.) The cause for this gap, Guilmoto quickly learned, was that pregnant women were taking advantage of a cheap and pervasive sex determination tool – ultrasound – and aborting if the fetus turned out to be female… Read on

Photo credit Tom Raftery

post

Computer storage systems rapidly taking on the energy efficiency challenge

In the video above, Dave Wright, founder and CEO of SolidFire, makes the point that, with ARM-based servers, OpenCompute and the like, there have recently been a lot of breakthroughs in making the compute side of servers more efficient, but very little innovation has happened in storage systems. Predictably, he has gone after storage modernisation with his new company SolidFire, which offers SSD-based enterprise storage solutions.

My laptop

One of the biggest advantages of SSDs as server storage is that they are incredibly fast, so you get an immediate performance win. I first noticed this when I changed my laptop to one with an SSD instead of a conventional HDD. The drive is far faster, and because the SSD generates very little heat, there is far less need for the fan to run. This makes the laptop cooler (no laptop burn), quieter, and it gives far longer battery life. Samsung confirmed the same holds in a server context when I talked to them earlier this year: because SSDs don’t need power-hungry fans to remove the heat created by spinning drives, the reduced power draw and heat generation are a big win in a data centre environment.

SolidFire are far from alone in this field. Just last week FlashSoft announced that they had secured $3m in first-round funding to develop flash virtualisation software for enterprises. They have nifty software which runs on servers with hybrid storage (some SSD and some HDD): it identifies regularly accessed data (hot data) and caches it on SSD, while moving less frequently accessed data to spinning disks. Keeping regularly accessed data in an SSD cache greatly increases the performance of the storage.
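
As a toy sketch of the idea behind that kind of hot/cold tiering (not FlashSoft's actual algorithm, which I haven't seen): count accesses per block and keep the most frequently hit blocks on the SSD tier, leaving everything else on spinning disk.

    from collections import Counter

    SSD_CAPACITY_BLOCKS = 2     # deliberately tiny, to show promotion in action
    access_counts = Counter()

    def record_access(block_id):
        """Record an access and return the set of blocks that should sit on SSD."""
        access_counts[block_id] += 1
        hot = [blk for blk, _ in access_counts.most_common(SSD_CAPACITY_BLOCKS)]
        return set(hot)

    ssd_tier = set()
    for blk in ["a", "b", "a", "c", "a", "b", "b", "c"]:
        ssd_tier = record_access(blk)

    print("Blocks cached on SSD:", ssd_tier)   # the two most frequently accessed blocks
    print("Everything else stays on HDD")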

The hybrid model is one way of getting around the cost differential between HDDs and SSDs. SolidFire take a different approach – they skip the hybrid model altogether. Instead, their all-SSD systems use a combination of data compression, de-duplication and thin provisioning to reduce the amount of space required for storage.
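
De-duplication in particular is easy to sketch: hash each block of data and store a block only if its hash hasn't been seen before, so repeated content is kept once. A minimal illustration (block size and data invented; real systems hash much larger blocks):

    import hashlib

    BLOCK_SIZE = 4  # bytes here for readability; real systems use 4KB blocks or larger

    def dedup(data: bytes):
        """Store each unique block once; the volume is just an ordered list of hashes."""
        store = {}     # hash -> block contents
        volume = []    # ordered hashes from which the original data can be rebuilt
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            store.setdefault(digest, block)
            volume.append(digest)
        return store, volume

    store, volume = dedup(b"AAAABBBBAAAABBBBCCCC")
    print(f"{len(volume)} logical blocks, {len(store)} unique blocks actually stored")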

A performance-enhancing tactic regularly employed with HDDs is to use only a small amount of the available space, on the outer edge of the disk, for your storage. The outer edge of the platter moves fastest, giving you faster read/write access. However, this is hugely inefficient, as most of the disk remains unused.

SolidFire do away with the need for any HDDs at all, making your storage far more efficient. In FlashSoft’s hybrid model, you can do away with the requirement for faster-spinning SAS drives and instead go for slower, cheaper SATA drives without taking a performance hit. Both approaches reduce your energy and cooling needs.

Then out of Japan comes news that, in response to the demand for energy efficiency there (the earthquake earlier this year having closed nuclear power plants), Nexsan have come up with new power-managed storage systems with in-built MAID (massive array of idle disks) support, capable of handling any combination of SATA, SAS and SSD drives. Because MAID allows disks to be spun down when not in use, Nexsan are claiming energy savings of up to 85% for these systems.

It is certainly true that SSDs have a shorter lifetime than HDDs, but even this is set to improve with IBM’s recent announcement that their new phase change memory (PCM) chips will be faster, cheaper and longer-lasting than today’s SSDs.

So while Dave, above, feels that there isn’t much innovation happening in the efficiency of storage, I would respectfully differ and say these are very exciting times to be looking into storage energy efficiency!

Photo credit Tom Raftery

post

Logica and SAP in exclusive joint bid for UK Smart Meter data provisioning

Smart meter

The UK has an interesting Smart Meter infrastructure model. Data from all the country’s Smart Meters will flow to a centralised data repository (called the DCC), from where energy retailers will pull the data for billing purposes. The beauty of this system is that consumers dictate who has access to their data, so switching energy providers is not held up by data ownership issues.
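
To sketch what "consumers dictate who has access to their data" might mean in practice – this is entirely hypothetical, as the DCC's access rules had not been defined at the time of writing – the central repository would check a consent register before releasing a meter's readings to a retailer:

    # Hypothetical consent register: meter id -> organisations the consumer has authorised.
    consent = {
        "meter-123": {"RetailerA"},
        "meter-456": {"RetailerA", "RetailerB"},
    }

    # Made-up half-hourly readings in kWh.
    readings = {
        "meter-123": [0.42, 0.38, 0.51],
        "meter-456": [1.10, 0.95, 1.20],
    }

    def fetch_readings(meter_id, requester):
        """Release data only to parties the consumer has authorised."""
        if requester not in consent.get(meter_id, set()):
            raise PermissionError(f"{requester} is not authorised for {meter_id}")
        return readings[meter_id]

    print(fetch_readings("meter-456", "RetailerB"))    # allowed
    # fetch_readings("meter-123", "RetailerB") would raise PermissionError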

The build-out of this system is still at a very early stage, with RFPs expected towards the end of the year, but SAP and Logica have come out of the blocks early with an announcement that they will put in a joint bid to become the data service provider for the DCC.

Logica and SAP are both heavily involved in the utilities sector in the UK, so it makes sense for them to bid for this – the interesting aspect is that they agreed to bid together and that their joint bid is exclusive.

The six main energy suppliers in the UK are all either running smart meter trials or about to start them. All six are using Logica’s head-end system for their trials, so if Logica and SAP win the bid, the transition to the DCC system should be relatively painless.

When I talked to Tara McGeehan, Logica’s Head of Utilities UK, on Monday, she said that the idea behind the bid was to move the debate away from technology and comms and onto the power of the data to affect things like micro-generation, energy efficiency and smart grids.

Having seen Centrica’s Smart Meter Analytics application, which runs on SAP’s HANA, earlier this year, I can say that the proposition that there is gold in them thar data certainly rings true.

Photo credit Tom Raftery