Here comes the sun… IBM and solar forecasting

Concentrating solar power array

For decades, electricity grids have been architected the same way: large centralised generation facilities pumping out electricity to large numbers of distributed consumers. Generation was controlled and predictable. That model is breaking down fast.

In the last decade we have seen a massive upsurge in the amount of renewable generation making its way onto the grid, most of it from wind and solar. Just last year (2013), almost a third of all newly added electricity generation in the US came from solar. That is an unprecedented figure, and it points to a rapid move away from the old order.

This raises big challenges for grid operators and utilities. They are moving to a situation where generation is variable and not very predictable, while demand is also variable and only somewhat predictable. When supply and demand are both variable, grid stability becomes an issue.

To counter this, a number of strategies are being looked at, including demand response (managing demand so that it more closely mirrors supply), storage (where excess generation is stored as heat or potential energy and released once generation drops and/or demand increases), and better forecasting of generation from variable suppliers.

Some of the more successful work on forecasting generation from renewables is being undertaken by Dr Hendrik Hamann at IBM's TJ Watson Research Center in New York. Specifically, Dr Hamann is looking at improving the accuracy of solar power generation forecasts. Solar is extremely complex to forecast because factors such as cloud cover, cloud opacity and wind have to be taken into account.
IBM Solar Forecaster
Dr Hamann uses a deep machine learning approach to tackle the many petabytes of data generated by satellite images, ground observations, and solar databases. The results have apparently been impressive: according to Dr Hamann, solar forecasts produced this way are 50% more accurate than those of the next best forecasting model. The same approach can also be used to predict rainfall, surface temperature, and wind; in the case of wind, forecast accuracy is 35% better than the next best model.
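IBM has not published the details of the model, but the shape of the underlying problem is worth sketching: learn a mapping from weather observations to power output, then measure how far the learned model's predictions sit from reality. The toy below is purely illustrative, with invented feature names, an invented (linear) ground-truth relationship and synthetic data; the real system blends satellite imagery, ground observations and solar databases with far more sophisticated learning.

```python
# Toy illustration of the forecasting problem: learn a mapping from
# weather features to solar output. All features, coefficients and data
# here are invented for illustration.
import random

random.seed(42)

def true_output(cloud_cover, irradiance):
    # Hypothetical ground truth: output rises with irradiance and
    # falls as cloud cover increases.
    return 0.9 * irradiance - 0.4 * cloud_cover + 0.05

# Synthetic observations: (cloud_cover 0..1, clear-sky irradiance kW/m2)
features = [(random.random(), random.uniform(0.2, 1.0)) for _ in range(200)]
samples = [((c, i), true_output(c, i)) for c, i in features]

# Fit y ~ w1*cloud + w2*irradiance + b by batch gradient descent.
w1, w2, b = 0.0, 0.0, 0.0
lr = 0.5
for _ in range(3000):
    g1 = g2 = gb = 0.0
    for (c, i), y in samples:
        err = (w1 * c + w2 * i + b) - y
        g1 += err * c
        g2 += err * i
        gb += err
    n = len(samples)
    w1 -= lr * g1 / n
    w2 -= lr * g2 / n
    b -= lr * gb / n

# Mean absolute forecast error of the fitted model.
mae = sum(abs((w1 * c + w2 * i + b) - y) for (c, i), y in samples) / len(samples)
print(round(mae, 4))
```

Because the invented ground truth is linear, the fit converges almost exactly; real solar output is of course nothing like this tidy, which is why the genuine forecasting problem needs deep learning and petabytes of data.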

This is still very much a research project so there is no timeline yet on when (or even if) this will become a product, but if it does, I can see it being an extremely valuable tool for solar farm operators (to avoid fines for over-production, for example), for utilities to plan power purchases, and for grid management companies for grid stability purposes.

The fact that it is a cloud-delivered (pun intended, sorry) solution means that if IBM brings it to market, it should have a reduced cost and time to delivery, potentially bringing it within reach of smaller operators. And with the number of solar operators on the grid growing (140,000 individual solar installations in the U.S. in 2013), highly accurate forecasting is becoming more important by the day.

Microsoft, big data and smarter buildings

Smarter building dashboard

If you checked out the New York Times' Snow Fall site (the story of the avalanche at Tunnel Creek), then Microsoft's new 88 Acres site will look familiar. If you haven't seen Snow Fall, go check it out; it is a beautiful and sensitive telling of a tragic story, and you won't regret the few minutes you spend viewing it.

Microsoft’s 88 Acres is an obvious homage to that site, except that it tells a good news story, thankfully, and tells it well. It is the story of how Microsoft is turning its 125-building Redmond HQ into a smart corporate campus.

Microsoft's campus had been built over several decades with little thought given to integrating the building management systems there. When Darrell Smith, Microsoft's director of facilities and energy, joined the company in 2008, he priced a 'rip and replace' option to get the disparate systems talking to each other, but when it came in at in excess of $60m, he decided they needed to brew their own. And that's just what they did.

Using Microsoft's own software, they built a system capable of taking in data from the over 30,000 sensors throughout the campus and detecting and reporting on anomalies. They first piloted the solution on 13 buildings on the campus, and as they explain on the 88 Acres site:

In one building garage, exhaust fans had been mistakenly left on for a year (to the tune of $66,000 of wasted energy). Within moments of coming online, the smart buildings solution sniffed out this fault and the problem was corrected.
In another building, the software informed engineers about a pressurization issue in a chilled water system. The problem took less than five minutes to fix, resulting in $12,000 of savings each year.
Those fixes were just the beginning.
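Microsoft hasn't published its detection logic, but the garage-fan example gives the flavour: compare what a piece of equipment is doing against what its schedule says it should be doing, and put a price on the gap. Here is a minimal sketch of that idea; the sensor names, schedules, power draws and tariff are all invented.

```python
# Toy fault detector in the spirit of the 88 Acres examples: flag
# equipment drawing power outside its scheduled hours and estimate the
# annual cost of leaving it that way. All names, schedules and the
# tariff are invented for illustration.
TARIFF = 0.10  # assumed $ per kWh

# (sensor id, kW draw right now, current hour of day, scheduled on-hours)
readings = [
    ("garage_exhaust_fan_3", 7.5, 2, range(7, 19)),   # running at 2am
    ("ahu_12_supply_fan", 11.0, 14, range(6, 20)),    # running in-schedule
    ("chiller_2_pump", 4.0, 23, range(8, 18)),        # running at 11pm
]

def find_faults(readings):
    """Return (sensor, estimated annual $ cost) for off-schedule running."""
    faults = []
    for sensor, kw, hour, schedule in readings:
        if kw > 0 and hour not in schedule:
            off_hours = 24 - len(schedule)  # hours per day it should be off
            annual_cost = kw * off_hours * 365 * TARIFF
            faults.append((sensor, round(annual_cost)))
    return faults

for sensor, cost in find_faults(readings):
    print(f"{sensor}: ~${cost}/yr wasted if left running")
```

Even this crude rule, run continuously over 30,000 sensors, would surface exactly the kind of always-on exhaust fan the pilot caught within moments of coming online.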

The system balances factors like the cost of a fix, the money that will be saved by the fix, and the disruption a fix will have on employees. It then prioritises the issues it finds based on these factors.
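Microsoft hasn't disclosed its actual weighting, but a prioritisation of this kind can be sketched simply: score each fault by how quickly the fix pays for itself, discounted by how disruptive the work would be. The formula and the example faults below are invented for illustration.

```python
# Sketch of the prioritisation described above: rank faults by payback
# speed, discounted by disruption to occupants. The weighting is
# invented; Microsoft has not published its formula.
def priority(fix_cost, annual_savings, disruption):
    """Higher score = fix sooner. disruption runs 0 (none) to 1 (severe)."""
    if annual_savings <= 0:
        return 0.0
    payback_years = fix_cost / annual_savings
    return (1.0 / (1.0 + payback_years)) * (1.0 - 0.5 * disruption)

# (description, fix cost $, annual savings $, disruption 0..1) - invented
faults = [
    ("exhaust fan left on", 500, 66000, 0.0),
    ("chilled water pressurisation", 200, 12000, 0.1),
    ("replace AHU, floor closed a week", 40000, 15000, 0.9),
]

ranked = sorted(faults, key=lambda f: priority(*f[1:]), reverse=True)
for name, *_ in ranked:
    print(name)
```

On these numbers the cheap, instant-payback fixes float to the top of the work queue while the expensive, disruptive replacement sinks, which matches the behaviour the 88 Acres write-up describes.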

Microsoft facilities engineer Jonathan Grove sums up how the new system changes his job “I used to spend 70 percent of my time gathering and compiling data and only about 30 percent of my time doing engineering,” Grove says. “Our smart buildings work serves up data for me in easily consumable formats, so now I get to spend 95 percent of my time doing engineering, which is great.”

The facilities team are now dealing with enormous quantities of data. According to Microsoft, the 125 buildings contain 2,000,000 data points outputting around 500,000,000 data transactions every 24 hours. The charts, graphics and reports the system produces lead to about 32,300 work orders being issued per quarter, and 48% of the faults found are corrected within 60 seconds. Microsoft forecasts energy savings of 6-10% per year, with an implementation payback of 18 months.
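A quick back-of-envelope calculation on those figures shows what the per-sensor load looks like:

```python
# Back-of-envelope on Microsoft's published figures: how often does
# each of the 2,000,000 data points report, on average?
data_points = 2_000_000
transactions_per_day = 500_000_000

per_point_per_day = transactions_per_day / data_points
minutes_between_readings = 24 * 60 / per_point_per_day
print(per_point_per_day, round(minutes_between_readings, 2))
```

That works out to roughly 250 transactions per data point per day, or one reading about every six minutes per point, which is a sensible polling cadence for building systems.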

Because Microsoft's smart building tool was built using off-the-shelf Microsoft technologies, it is now being productised and will be offered for sale. It joins a slew of other smarter building solutions currently on the market from the likes of IBM, Echelon, Cisco et al, but given that this one is built with basic Microsoft technologies, it will be interesting to see where it comes in on pricing.

Price will certainly be one of the big deciding factors in any purchasing decision: any building management tool will need to repay its costs within 18 months or so to merit consideration. Functionality will be another primary purchase filter, and what is not at all clear from the Microsoft report is whether their solution can handle buildings across multiple sites or geographies. If I hear back either way from Microsoft on this, I will update this post.

This is a market that is really starting to take off. Navigant Research (formerly Pike Research) issued a report last year estimating that the smart building managed services market alone will grow from $291m in 2012 to $1.1bn by 2020, while IMS Research estimated the Americas market for integrated and intelligent building systems would be worth more than $24 billion in 2012.

One thing is for sure, given that buildings consume around 40% of our energy, any new entrant into the smarter buildings arena is to be welcomed.

Image credit nicadlr

Sustainability, social media and big data

The term Big Data has become the buzzword du jour in IT, popping up everywhere, and with good reason: more and more data is being collected, curated and analysed today than ever before.

Dick Costolo, CEO of Twitter, announced last week that Twitter is now publishing 500 million tweets per day. Not only is Twitter publishing them, it is organising them and storing them in perpetuity. That's a lot of storage, and 500 million tweets per day (and rising) is big data, no doubt.

And Facebook similarly announced that 2.5 billion content items are shared per day on its platform, and it records 2.7 billion Likes per day. Now that’s big data.

But for really big data, it is hard to beat the fact that CERN’s Large Hadron Collider creates 1 petabyte of information every second!

And this has what to do with Sustainability, I hear you ask.

Well, it is all about the information you can extract from that data – and there are some fascinating use cases starting to emerge.

A study published in the American Journal of Tropical Medicine and Hygiene found that Twitter was as accurate as official sources in tracking the cholera epidemic in Haiti in the wake of the deadly earthquake there. The big difference was speed: Twitter picked up the epidemic two weeks faster than the official sources. There's a lot of good that can be done in crisis situations with a two-week head start.

Another fascinating use case I came across is using social media as an early predictor of faults in automobiles. A social media monitoring tool developed by Virginia Tech's Pamplin College of Business can provide car makers with an efficient way to discover and classify vehicle defects. Although still at an early stage of development, it shows promising results, and anything that can improve the safety of automobiles can have a very large impact (no pun intended).

GE's Grid IQ Insight social media monitoring tool

GE have come up with another fascinating way to mine big data for good. Their Grid IQ Insight tool, slated for release next year, can mine social media for mentions of electrical outages. When those posts are geotagged (as many social media posts now are), utilities using Grid IQ Insight can get early notification of outages in their area. Clusters of mentions can help with confirmation and localisation, and photos or videos added of trees down, or (as in this photo) of a fire in a substation, can help the utility decide which personnel and equipment to add to the truck roll to repair the fault. Speeding up the repair process and getting customers back onto a working electricity grid can be critical in an age when so many of our devices rely on electricity to operate.
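GE hasn't published how Grid IQ Insight groups mentions, but the confirmation-by-clustering idea can be sketched simply: group geotagged posts that fall close together and flag any cluster large enough to suggest a genuine outage rather than a one-off complaint. The coordinates, distance threshold and cluster-size cutoff below are all invented.

```python
# Toy version of the clustering idea: group geotagged outage mentions
# that lie close together and keep clusters big enough to suggest a
# real outage. Coordinates and thresholds are invented.
import math

def close(a, b, km=2.0):
    # Rough planar distance in km; adequate at city scale.
    dx = (a[0] - b[0]) * 111.0  # ~111 km per degree of latitude
    dy = (a[1] - b[1]) * 111.0 * math.cos(math.radians(a[0]))
    return math.hypot(dx, dy) <= km

def clusters(points, min_size=3):
    """Greedy single-link grouping: a point joins the first group
    containing any point within range, else starts a new group."""
    groups = []
    for p in points:
        for g in groups:
            if any(close(p, q) for q in g):
                g.append(p)
                break
        else:
            groups.append([p])
    return [g for g in groups if len(g) >= min_size]

# Hypothetical geotagged "power is out" posts (lat, lon).
posts = [
    (40.001, -75.001), (40.002, -75.003), (39.999, -75.000),  # one street
    (40.500, -75.500),                                        # lone mention
]
confirmed = clusters(posts)
print(len(confirmed), len(confirmed[0]))
```

The three mentions from one street form a confirmed cluster while the lone faraway mention is discarded, mirroring how clusters of posts help a utility confirm and localise a fault before rolling a truck.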

Finally, many companies are now using products like Radian6 (now re-branded as Salesforce Marketing Cloud) to actively monitor social media for mentions of their brand, so they can respond in a timely manner. Gatorade in the video above is one good example. So too is Dell. Dell have a Social Media Listening Command Centre staffed by 70 employees who listen for and respond to mentions of Dell products 24 hours a day in 11 languages (English, plus Japanese, Chinese, Portuguese, Spanish, French, German, Norwegian, Danish, Swedish, and Korean). The sustainability angle of this story is that Dell took the learnings from setting up this command centre and used them to help the American Red Cross set up a similar command centre, also contributing funding and equipment to help get this off the ground.

No doubt the Command Centre is proving itself invaluable to the American Red Cross this week mining big data to help people in need in the aftermath of Hurricane Sandy.