Search Results for: utilities

post

Oracle’s Utilities Meter Data Management taking off

Oracle

Photo credit Not Quite a Photographr

Interesting bits of news from Oracle on the Smart Grid front in the last couple of days:

  1. Oracle recently released Oracle Utilities Meter Data Management 1.5, which includes enhancements to accelerate advanced metering infrastructure (AMI) integrations. The aim is to lower implementation costs for utilities rolling out smart metering programs, detect outages more quickly, drive energy efficiency initiatives, and provide more accurate billing information to customers.
  2. UtiliPoint reported that Oracle won seven out of 14 major meter data management customers in 2008 (no link, sorry, as UtiliPoint charge around $3,000 for their reports!)
  3. Modesto Irrigation District is rolling out a Smart Meter project to more than 91,270 residential and about 12,700 commercial and industrial customers using Oracle’s Meter Data Management. Tom Kimball, MID’s Assistant General Manager for Transmission and Distribution, said

    Smart meters make good economic sense for consumers and utilities alike in this time of rising electric rates. Moreover, the California Energy Commission may soon require this type of electric meter, and the Legislature is moving in the same direction

  4. And news just in today that Italy’s Acea Distribuzione selected Oracle Utilities Meter Data Management to support its Automatic Meter Management (AMM) project, covering approximately 1.6 million meters – making it one of the largest AMM deployments in Europe to date.

    The Oracle solution will help us to provide our customers with advanced options including consumption profiles as well as consumption information online – ultimately allowing the consumer to make more informed decisions about their energy use

    said Delio Svaluto Moreolo, Metering Department, Acea Distribuzione S.p.A.

We have been writing a lot on this blog about the advantages of Smart Grids, and President Obama has recently called for the rollout of 40 million smart meters in the US, so it is great to see the big software vendors pushing out the necessary apps to help utilities make smart grids a reality.

post

Oracle Utilities, Smart Grids and vehicle-to-grid

I was talking to Guerry Waters, VP of Industry Strategy in the Oracle Utilities Global Business Unit, the other day.

Guerry was telling me about Oracle Utilities’ background and how they came about as the result of Oracle’s acquisition of SPL back in Nov 2006 and Lodestar in 2007.

We got onto the subject of Demand Response (surprise, surprise!) and I raised my concerns about utilities being too command and control. When I said that, for DR to really take off, consumers need to be in control of their devices, Guerry said:

The idea of automatic control of your Demand Response in the home is very intriguing but very much on the edge now, so what we are doing is we are working with a number of companies, like Tendril, that provides Home Area Networks (HANs) and control of devices in the home, where there can be parameters set from signals that are being passed to the HAN about price…. and bring that down to the HAN and let the HAN respond according to parameters that have been set by the consumer themselves…. and give the consumer opt-out capabilities from that.

Guerry went on to describe scenarios where your Home Area Network can contact you via SMS, for instance if you are away from home, to alert you that your HAN is about to respond to a DR signal and ask whether you want to override it or not.

Guerry did say that there are very few utilities thinking this far ahead, but the fact that there are any is hugely heartening!
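To make the idea a little more concrete, here is a minimal, illustrative sketch in Java of how a HAN controller might apply consumer-set parameters to an incoming Demand Response price signal, with an opt-out switch and an override window. All class, field and method names here are hypothetical, they are not from any Oracle, Tendril or other vendor product.

```java
// Hypothetical sketch: a HAN applying consumer-set parameters to a DR price signal.
import java.time.Duration;

public class HanDemandResponse {

    // Parameters the consumer configures themselves
    private final double maxAcceptablePricePerKwh; // the consumer's own price threshold
    private final boolean optedIn;                 // consumer opt-out switch
    private final Duration overrideWindow;         // time allowed to reply to the SMS alert

    public HanDemandResponse(double maxAcceptablePricePerKwh, boolean optedIn, Duration overrideWindow) {
        this.maxAcceptablePricePerKwh = maxAcceptablePricePerKwh;
        this.optedIn = optedIn;
        this.overrideWindow = overrideWindow;
    }

    /** Decide what to do when the utility broadcasts a price signal. */
    public String onPriceSignal(double pricePerKwh) {
        if (!optedIn) {
            return "IGNORE";      // consumer has opted out entirely
        }
        if (pricePerKwh <= maxAcceptablePricePerKwh) {
            return "NO_ACTION";   // price is within the consumer's comfort zone
        }
        // Price exceeds the consumer's threshold: alert them (e.g. via SMS) and
        // curtail flexible loads only if they don't override within the window.
        notifyConsumer("Price is " + pricePerKwh + "/kWh; curtailing in "
                + overrideWindow.toMinutes() + " min unless you override.");
        return "CURTAIL_UNLESS_OVERRIDDEN";
    }

    private void notifyConsumer(String message) {
        System.out.println("SMS to consumer: " + message); // placeholder for a real SMS gateway
    }

    public static void main(String[] args) {
        HanDemandResponse han = new HanDemandResponse(0.25, true, Duration.ofMinutes(15));
        System.out.println(han.onPriceSignal(0.40)); // prints CURTAIL_UNLESS_OVERRIDDEN
    }
}
```

The key point the sketch tries to capture is the one Guerry made: the thresholds, the opt-in flag and the override window all belong to the consumer, not the utility.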

Our conversation went on to discuss vehicle-to-grid technologies, and it was super to see that Oracle are thinking about the challenges to be overcome and the ways to roll out the technologies required to make this a reality.

With both SAP and Oracle rolling out enabling technologies in this space, the Electricity 2.0 vision is quickly becoming a reality.

post

Utilities are too top-down, command and control

Top-Down

Photo Credit Mikey aka DaSkinnyBlackMan

Utilities are top-down.

Whenever I talk to utilities about Smart Grids and Smart Meters, they always trot out the same speech. They want to use Demand Response for peak shaving, and they want to implement it by having a mechanism whereby they can come into their customers’ houses at times of maximum demand and turn down the settings on the aircon, immersion heater, etc.

Unfortunately, this kind of traditional top-down, command and control attitude is more likely to turn people off Demand Response programs than to sell them on the idea.

I know that as a consumer I want to be able to program my appliances myself, so that I decide when they turn on and off in response to price signals from the grid. The same is true for fridges, freezers and immersion heaters – I want them to change thermostat settings so they take in electricity when energy is cheap and not when it is expensive, by MY definitions of cheap and expensive.

I want control of my appliances. I do not want the utility deciding to come in and adjust or turn them on/off for me because it suits them.

Demand Response programs will be hugely beneficial to utilities and consumers alike, but they are complex to explain. Couple that with the utility having control of your appliances, and they suddenly become a far harder sell.

Give customers more control of their electricity bill. Allow them to reduce costs without reducing usage, through owner-controlled, programmatic time-shifting of consumption, and suddenly Demand Response programs become an easy sell.

And when you couple that with how Demand Response will stabilise the grid, facilitating greater penetration of variable supplies (i.e. weather-dependent renewables like wind and solar), you have a win, win, win!
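As a rough illustration of what owner-controlled, programmatic time-shifting could look like, here is a short Java sketch that picks the cheapest window in a day-ahead price curve for a deferrable load such as a dishwasher or immersion heater. The class name and the price curve are entirely made up for the example.

```java
// Illustrative only: pick the cheapest contiguous window for a deferrable load.
import java.util.List;

public class CheapestWindowScheduler {

    /**
     * Returns the starting hour (0-23) of the cheapest run of durationHours
     * consecutive hours, given 24 hourly prices. The owner, not the utility,
     * decides which prices count as cheap simply by letting this run.
     */
    public static int cheapestStartHour(List<Double> hourlyPrices, int durationHours) {
        int bestStart = 0;
        double bestCost = Double.MAX_VALUE;
        for (int start = 0; start + durationHours <= hourlyPrices.size(); start++) {
            double cost = 0;
            for (int h = start; h < start + durationHours; h++) {
                cost += hourlyPrices.get(h);
            }
            if (cost < bestCost) {
                bestCost = cost;
                bestStart = start;
            }
        }
        return bestStart;
    }

    public static void main(String[] args) {
        // A made-up day-ahead price curve (cents/kWh), cheapest overnight
        List<Double> prices = List.of(
            8.0, 7.5, 7.0, 6.5, 6.5, 7.0, 9.0, 12.0, 15.0, 14.0, 13.0, 12.5,
            12.0, 12.0, 12.5, 13.0, 16.0, 20.0, 22.0, 19.0, 15.0, 12.0, 10.0, 9.0);
        int start = cheapestStartHour(prices, 2); // a 2-hour dishwasher cycle
        System.out.println("Cheapest 2-hour window starts at hour " + start);
    }
}
```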

post

Smart Grid demo at the SAP for Utilities Conference

I attended the SAP for Utilities conference last week in San Antonio and was pleasantly surprised to find that many of the Utilities attending were thinking about rolling out Smart Grids or were already running pilot Smart Grids. There were even a couple who were well underway with their Smart Grid rollout project.

Demand Response was being discussed extensively and was cited by most as one of the principal advantages of Smart Grids.

Smart Grids and Demand Response are topics we have covered extensively here on GreenMonk.net and they are core to the Electricity 2.0 talk I gave in Berlin at the Web 2.0 Expo. The importance of Smart Grids and Demand Response cannot be overstated when it comes to energy efficiencies and energy demand management.

SAP are working closely with utilities through the Lighthouse Council to ensure that whenever a utility wants to go from a traditional grid to a smart grid infrastructure, SAP will have the necessary software pieces in place for them (Enterprise Asset Management, Customer Relationship Management, and the newer Energy Capital Management).

In the video above, Russell Boyer demonstrates a Smart Grid in action. In this use case, Russell acts as the utility call center agent for a customer who is moving out and wants their power disconnected. The Smart Grid allows the agent to take a meter reading and shut off power to the meter remotely. This isn’t the best demonstration of the potential of Smart Grids, but it was the first time I had seen Smart Grid technologies live in action, and I have to admit to being wowed by it.

[Disclosure – SAP covered my expenses for attending this conference]

post

IBM acquires Weather.com for Cloud, AI (aaS), and IoT

Raindrops keep falling...

IBM has announced the completion of its acquisition of The Weather Company’s B2B, mobile and cloud-based web properties: weather.com, Weather Underground, the Weather Company brand, and WSI, its global business-to-business brand.

Weather Channel screenshot

At first blush this may not seem like an obvious pairing, but The Weather Company’s products are not just the free apps on your smartphone; they have specialised products for the media industry, the insurance industry, energy and utilities, government, and even retail. All of these verticals would be traditional IBM customers.

Then, when you factor in that The Weather Company’s cloud platform takes in over 100 GB of information per day from 2.2 billion weather forecast locations and produces over 300 GB of added products for its customers, it quickly becomes obvious that the platform is highly optimised for Big Data and the Internet of Things.

This platform will now serve as a backbone for IBM’s Watson IoT.

Watson, you will remember, is IBM’s natural language processing and machine learning platform which famously took on and beat two former champions on the quiz show Jeopardy. Since then, IBM have opened up APIs to Watson to allow developers to add cognitive computing features to their apps, and more recently IBM announced the Watson IoT Cloud “to extend the power of cognitive computing to the billions of connected devices, sensors and systems that comprise the IoT”.

Given Watson’s relentless moves to cloud and IoT, this acquisition starts to make a lot of sense.

IBM further announced that it will use its network of cloud data centres to expand Weather.com into five new markets: China, India, Brazil, Mexico and Japan, “with the goal of increasing its global user base by hundreds of millions over the next three years”.

With Watson’s deep learning abilities, and all that weather data, one wonders if IBM will be in a position to help scientists researching climate change. At the very least it will help the rest of us be prepared for its consequences.

New developments in AI and deep learning are being announced virtually weekly now by Microsoft, Google and Facebook, amongst others. This is a space which, it is safe to say, will completely transform how we interact with computers and data.

post

Flexible Power Alliance develops open source software and standard for smart grid demand management

We have been talking here on GreenMonk about energy demand management since early 2008, and our take has always been that for demand management to work, it will need to be automated. Unfortunately, finding a decent automated demand management solution has proven elusive. The recent rise of Internet of Things technologies has, in part, helped spur more interest and development in this area.

Last week, for example, we attended the European Utility Week in Vienna, and amongst the many fascinating stands there, I came across the Flexible Power Alliance on CGI‘s stand.

The Flexible Power Alliance is an open source software alliance made up of software companies (CGI and Accenture), utilities (Alliander and Stedin), and research organisations (such as TNO). The Alliance has developed a standard called FAN, a communication layer between devices and energy services, and open source software called PowerMatcher, which helps match the supply and demand of electricity on a grid.

The software is developed in Java and OSGi, is Apache 2.0 licensed, and is available to download (or fork) on GitHub.

In the video above, we talk to Alliander DevOps Consultant Alexander Krstulovic, who demonstrates the software in action on a small microgrid. The software turns the consumption of a bank of LED lights up and down, and changes the price of electricity depending on the real-time availability of energy on the virtual market it creates.
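To give a feel for the kind of matching involved, here is a deliberately simplified toy in Java: devices submit price-dependent bids, and the market finds the price at which aggregate demand fits the available supply. This is only a sketch of the general idea; it is not PowerMatcher’s actual API or algorithm, and all names and numbers are invented.

```java
// Toy market matching: find a clearing price where demand fits supply.
import java.util.List;
import java.util.function.DoubleUnaryOperator;

public class ToyPowerMatcher {

    /**
     * Each bid maps a price to the watts the device wants at that price
     * (demand falls as price rises). Scan prices in small steps and return
     * the lowest price at which total demand no longer exceeds supply.
     */
    public static double clearingPrice(List<DoubleUnaryOperator> bids, double supplyWatts,
                                       double minPrice, double maxPrice, double step) {
        for (double price = minPrice; price <= maxPrice; price += step) {
            final double p = price;
            double demand = bids.stream().mapToDouble(b -> b.applyAsDouble(p)).sum();
            if (demand <= supplyWatts) {
                return price;
            }
        }
        return maxPrice; // supply never covers demand in this price range
    }

    public static void main(String[] args) {
        // Two hypothetical flexible loads: a bank of LED lights and a heater,
        // each dimming or curtailing as the price rises.
        DoubleUnaryOperator ledBank = price -> Math.max(0, 100 - 2 * price);  // watts
        DoubleUnaryOperator heater  = price -> Math.max(0, 500 - 10 * price); // watts
        double price = clearingPrice(List.of(ledBank, heater), 300.0, 0.0, 60.0, 0.5);
        System.out.println("Clearing price: " + price + " (toy units)");
    }
}
```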

It is worth pointing out that Alliander have trialled this software in the real world and are now in the process of commercialising it.

post

IBM’s InterConnect 2015, the good and the not so good

IBM InterConnect 2015

IBM invited me to attend their Cloud and Mobile Conference InterConnect 2015 last week.

Because of what IBM has done globally to help people get access to safe water, to help with solar forecasting, and to help deliver better outcomes in healthcare, for example, I tend to have a very positive attitude towards IBM.

So I ventured to the conference with high hopes for what I was going to learn there. And for the most part, I wasn’t disappointed. IBM had some very interesting announcements, more on which later.

However, there is one area where IBM has dropped the ball badly: their Cloud Services Division, SoftLayer.

IBM have traditionally been a model corporate citizen when it comes to reporting and transparency. They publish annual Corporate Responsibility reports with environmental, energy and emissions data going all the way back to 2002.

However, as noted here previously, when it comes to cloud computing, IBM appear to be pursuing the Amazon model of radical opaqueness. They refuse to publish any data about the energy or emissions associated with their cloud computing platform. This is a retrograde step, and one they may come to regret.

Instead of blindly copying Amazon’s strategy of non-reporting, shouldn’t IBM be embracing the approach of their new best buddies, Apple? Apple, fed up with being Greenpeace’d, and seemingly genuinely wanting to leave the world a better place, hired the former head of the EPA, Lisa Jackson, to head up its environmental initiatives, and hasn’t looked back.

Apple’s reporting on its cloud infrastructure energy and emissions, on its supply chain [PDF], and on its products’ complete life cycle analysis, is second to none.

This was made starker for me because, while at InterConnect, I read IBM’s latest cloud announcement about their spending $1.2bn to develop five new SoftLayer data centres in the last four months. While I was reading that, I saw Apple’s announcement that they were spending €1.7bn to develop two fully renewably powered data centres in Europe, and I realised there was no mention whatsoever of renewables anywhere in the IBM announcement.

GreenQloud Dashboard

Even better than Apple, though, is the Icelandic cloud computing company GreenQloud. GreenQloud host most of their infrastructure out of Iceland (where electricity is generated 100% from renewable sources – 70% hydro and 30% geothermal), and the remainder out of the Digital Fortress data center in Seattle, which runs on 95% renewable energy. Better again, GreenQloud give each customer a dashboard showing the total energy that customer has consumed and the amount of CO2 they have saved.

This is the kind of cloud leadership you expect from a company with a long tradition of openness, and the big data and analytics chops that IBM has. Now this would be A New Way to Think for IBM.

But, it’s not all bad news, as I mentioned at the outset.

IBM Predictive Maintenance

As you’d expect, there was a lot of talk at InterConnect about the Internet of Things (IoT). Chris O’Connor, general manager of IBM’s new IoT division, was keen to emphasise that, despite the wild hype surrounding IoT at the moment, there’s a lot of business value to be had there too. There was a lot of talk about IBM’s Predictive Maintenance and Quality solutions, for example, which are a natural outcome of IBM’s IoT initiatives. IBM has been doing IoT for years; it just hasn’t always called it that.

And when you combine IBM’s deep expertise in Energy and Utilities with its knowledge of IoT, you have an opportunity to create truly Smart Grids, not to mention the opportunities around connected cities.

In fact, IoT plays right into the instrumented, interconnected and intelligent Smarter Planet mantra that IBM has been talking about for some time now, so I’m excited to see where IBM go with this.

Fun times ahead.

Disclosure – IBM paid my travel and accommodation for me to attend InterConnect.

post

The coming together of the Internet of Things and Smart Grids

I was asked to speak at the recent SAP TechEd && d-code (yes, two ampersands, that’s the branding, not a typo) on the topic of the Internet of Things and Energy.

This is a curious space because, while the Internet of Things is all the rage now in the consumer space (the New Black, as it were), it is relatively old hat in the utilities sector. Because utilities have expensive, critical infrastructure in the field (think large wind turbines, for example), they need to be able to monitor it remotely. These devices use Internet of Things technologies to report back to base, and this is quite common on the high voltage part of the electrical grid.

On the medium voltage section, Internet of Things technologies aren’t as commonly deployed currently (no pun intended), but medium voltage equipment suppliers are increasingly adding sensors to their equipment so that it too can report back. In a recent meeting at Schneider Electric’s North American headquarters, CTO Pascal Brosset announced that Schneider were able to produce a System on a Chip (SoC) for $2, and, as a consequence, Schneider were going to add one to all their equipment.

And then, on the low voltage network, there is lots of innovation happening behind the smart meter. Nest thermostats, Smappee energy meters, and SmartThings energy apps are just a few of the many new IoT things released recently.

Now if only we could connect them all up, then we could have a really smart grid.

In case you are in the area and interested, I’ll be giving a more business-focussed version of this talk at our Business of IoT event in London on Dec 4th.

The slides for this talk are available on SlideShare.

post

Customer service, in-memory computing, and cloud? The utility industry is changing.

SAP For Utilities 2014 Exec Panel

I attended this year’s North American SAP for Utilities event and I was pleasantly surprised by some of the things I found there.

The utilities industry (electricity, gas, and water) is a regulated industry which can’t go down (or at least, shouldn’t go down). Because of this, the industry is very slow to change (the old “if it ain’t broke…” mindset). However, with technology relentlessly enabling more and more efficiencies at the infrastructure level, utilities need to learn how to be agile without affecting their service.

This is challenging, sure. But, on the other hand, organisations like Google, Facebook, and Microsoft are incredibly nimble, updating their technologies all the time, and yet I suspect they have far better uptime figures than most utilities (when was the last time Google was down for you, versus when your electricity last went out?).

Having said all that, at this year’s event I saw glimmers of hope.

There were a number of areas where change is being embraced:

  1. Customer Service – utility companies have traditionally not been very consumer friendly. This is the industry which refers to its customers as rate payers and end-points. However, that is starting to break down. This breakdown has been hastened in some regions by market liberalisation, and in all areas by the huge adoption of social media by utility customers.
    Utility companies are now starting to adopt social media and utilise some of the strategies we have spoken about and written about so often here.
    What was really encouraging, though, was to see that one of the four parallel tracks on the first day of the conference was dedicated to usability (admittedly geared more towards the usability of apps for utility employees, but there’s a knock-on benefit for customers too), and, even better, that on the second day of the conference one of the four parallel tracks was dedicated to customer engagement!
  2. In-memory computing – SAP has been pushing its SAP HANA in-memory computing platform to all its customers since it was announced in 2010. As mentioned previously, utility companies are slow to change, so it was interesting to listen to Snohomish County PUD CIO Benjamin Beberness, in the conference’s closing keynote, talking about his organisation’s decision to go all-in on SAP’s HANA in-memory platform. I shot an interview with Benjamin, which I’ll be publishing here in the next few days, in which he talks about some of the advantages of in-memory computing for Snohomish PUD.
  3. Cloud computing – and finally, there was some serious talk of the move to cloud computing by utilities. In the Utility Executive Panel (pictured above), Xcel Energy’s CIO and VP, David Harkness, said that before he retires his organisation will have closed its data center and moved its IT infrastructure entirely to the cloud. He then added that his retirement is not that far off.
    Given that this was the week after the celebrity photo leaks, there was also, understandably, some discussion about the need for cybersecurity, but there was broad acceptance of the inevitability of the move to cloud computing.

I have been attending (and occasionally keynoting) this SAP for Utilities event since 2008, so it has been very interesting to see these changes occurring over time. A year and a half ago I had a conversation with an SAP executive where I said it was too early to discuss cloud computing with utilities. And it was. Then. But now, cloud is seen by utilities as a logical addition to their IT roadmap. I wouldn’t have predicted that change coming about so soon.

Disclosure – SAP paid my travel and accommodation to attend the event.

post

Just where does Intel fit in the Smart Grid ecosystem?

Intel Solar Installation Vietnam

When we think of smart grids, Intel is not the first name we think of.

This is a perception that Intel is anxious to change. The multinational chip manufacturer, which has seen its revenues drop in recent years, is looking for new markets for its products. The smart grid space, with the increasing billions being spent on it annually, begins to look very attractive. And Intel’s place in it? Intel is positioning itself very much as the smart in the smart grid: the distributed compute and intelligence needed to power it.

One of the most important functions of an electricity grid is to precisely balance electricity supply from its various generators against the demand from consumers. This is an increasingly complex task in a world where consumers are more and more becoming prosumers (producers and consumers). Utilities need transparency to forecast how much energy will be consumed, and where, to help stabilise the energy flows and the price of energy on the grid.

To achieve this transparency, particularly towards the edges of the network, chips like Intel’s Quark SoC will be important.
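As a very rough sketch of what that edge-level transparency means in practice, here is how prosumer meter readings might be aggregated into the net load a feeder has to be balanced against. This is purely illustrative, with made-up names and numbers, and nothing Intel-specific about it.

```java
// Illustrative only: aggregate prosumer readings into a feeder's net load.
import java.util.List;

public class FeederNetLoad {

    /** One prosumer's instantaneous reading, in kW. */
    record ProsumerReading(double consumptionKw, double generationKw) {
        double netKw() { return consumptionKw - generationKw; } // negative = exporting
    }

    /** Net load the grid must supply (or absorb, if negative) for this feeder. */
    static double feederNetKw(List<ProsumerReading> readings) {
        return readings.stream().mapToDouble(ProsumerReading::netKw).sum();
    }

    public static void main(String[] args) {
        List<ProsumerReading> feeder = List.of(
            new ProsumerReading(3.2, 0.0),  // plain consumer
            new ProsumerReading(1.1, 4.5),  // solar prosumer, currently exporting
            new ProsumerReading(2.0, 0.5));
        System.out.printf("Feeder net load: %.1f kW%n", feederNetKw(feeder));
    }
}
```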

Intel isn’t relying on its silicon chops alone though. When I spoke recently to Intel’s Hannes Schwaderer (Director of Energy and Industrial Applications for EMEA), he was keen to point out the other strengths Intel brings to the table. Smart grids generate big data, and lots of it. Intel’s investments in Cloudera and Mashery give it a big footprint in the big data and analytics spaces, according to Hannes.

And then there’s the old security chestnut. Apart from your health data, no data is as personal, and so in need of privacy, as your energy information. Intel’s purchase of computer security company McAfee allows it to offer a unique combination of hardware (the chips), analytics, and security to its potential customers.

And just as end consumers were never Intel’s customers in the PC world, utilities are not the end customer for Intel’s smart grid solutions. Rather, expect to see Intel selling to the Schneider Electrics, the GEs, and the Siemens of this world.