The future of electric utilities – change and disruption ahead

The utilities industry has typically been change averse, often for good reason. But with the technological advances of the past few years, the low-carbon imperative, and pressure from customers, utilities are going to have to figure out how to disrupt their own business, or they will be disrupted.

I gave the opening keynote at this year’s SAP for Utilities event in Huntington Beach on the topic of the Convergence of IoT and Energy (see the video above). Interestingly, with no coordination beforehand, all the main speakers referred to the turmoil coming to the utilities sector, and each independently referenced Tesla and Uber as examples of tumultuous changes happening in other industries.

What are the main challenges facing the utilities industry?

As noted here previously, due to the Swanson effect, the cost of solar is falling all the time, with no end in sight. The result will be more and more distributed generation being added to the grid, which utilities will have to manage. Added to that, utilities will have reduced income from electricity sales, as more and more people generate their own.

On top of that, with the recent launch of its Powerwall product, Tesla ensured that in-home energy storage is set to become a thing.

Battery technology is advancing at a dizzying pace, and as a consequence:

1) the cost of lithium-ion batteries is dropping constantly (Chart: Battery Cost)

2) the energy density of the batteries is increasing all the time (Chart: Li-Ion Battery Energy Density)

(Charts courtesy of Prof Maarten Steinbuch, Director Graduate Program Automotive Systems, Eindhoven University of Technology)

With battery prices falling, solar prices falling, and battery energy density increasing, there is a very real likelihood that many people will opt to go “off-grid” or drastically reduce their electricity needs.

How will utility companies deal with this?

There are many possibilities but, as we have noted here previously, an increased focus by utilities on energy services seems like an obvious one. This is especially true now, given the vast quantities of data that smart meters are providing utility companies, and the fact that the Internet of Things (IoT) is ensuring that a growing number of our devices are smart and connected.

Further, with the cost of (solar) generation falling, I can foresee a time when utility companies move to the landline model. You pay a set amount per month for the connection, and your electricity is free after that. Given that, it is all the more imperative that utility companies figure out how to disrupt their own business, if only to find alternative revenue streams to ensure their survival.

So, who’s going to be the Uber of electricity?


Flexible Power Alliance develops open source software and standard for smart grid demand management

We have been talking here on GreenMonk about energy demand management since early 2008, and our take has always been that for demand management to work, it will need to be automated. Unfortunately, finding a decent automated demand management solution has proven elusive, though the recent rise of Internet of Things technologies has helped spur more interest and development in this area.

Last week, for example, we attended European Utility Week in Vienna and, amongst the many fascinating stands there, I came across the Flexible Power Alliance on CGI's stand.

The Flexible Power Alliance is an open source software alliance made up of software companies (CGI and Accenture), utilities (Alliander and Stedin), and research organisations (such as TNO). The Alliance has developed a standard called FAN, a communication layer between devices and energy services, and open source software called PowerMatcher, which helps to match the supply and demand of electricity on a grid.

The software is developed in Java and OSGi, is Apache 2.0 licensed, and is available to download (or fork) on GitHub.

And in the video above, we talk to Alliander DevOps consultant Alexander Krstulovic, who demonstrates the software in action on a small microgrid. The software turns the consumption of a bank of LED lights up and down, and changes the price of electricity depending on the realtime availability of energy on the virtual market created by the software.
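
To make the idea concrete, here is a minimal Python sketch of that kind of price-based matching. It is purely illustrative: the device names, bid curves, and numbers are invented, and the real PowerMatcher (Java/OSGi) aggregates bid curves across a hierarchy of agents rather than brute-force searching prices.

```python
# Minimal sketch of price-based supply/demand matching, in the spirit of
# the PowerMatcher demo above. Illustrative only -- not the PowerMatcher API.

def total_demand(price, bids):
    """Total power (kW) the devices want to draw at a given price.
    Each bid maps a price (per kWh) to a desired consumption."""
    return sum(bid(price) for bid in bids)

def clearing_price(bids, supply_kw, max_price=0.50, steps=1000):
    """Brute-force search for the price where demand best matches supply."""
    best_price, best_gap = 0.0, float("inf")
    for i in range(steps + 1):
        price = max_price * i / steps
        gap = abs(total_demand(price, bids) - supply_kw)
        if gap < best_gap:
            best_price, best_gap = price, gap
    return best_price

# A dimmable LED bank: draws its full 2 kW when power is cheap, and dims
# linearly to zero as the price climbs towards 0.50 per kWh.
led_bank = lambda price: max(0.0, 2.0 * (1 - price / 0.50))
# An inflexible load that draws 1 kW no matter what the price is.
fridge = lambda price: 1.0

price = clearing_price([led_bank, fridge], supply_kw=2.0)
print(f"clearing price: {price:.2f}/kWh, LED draw: {led_bank(price):.2f} kW")
```

When the virtual supply drops, the clearing price rises and the flexible LED bank dims, which is exactly the behaviour demonstrated in the video.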

It is worth pointing out that Alliander has trialled this software in the real world, and is now in the process of commercialising it.


What do we do in a world where energy is in abundance?

(Chart: Swanson Effect)

The cost of solar power is falling in direct relation to the volume of solar modules being produced. With no end in sight to this price reduction, we should soon be in a world where energy is in abundance.

(Chart: Solar PV Installed Globally)

Moore's Law, which says the number of transistors in computers doubles approximately every two years, has an equivalent in solar power called Swanson's Law. Swanson's Law says that the price of solar panels tends to drop 20% for every doubling of cumulative shipped volume. This leads to a positive feedback loop: lower prices mean more solar PV is installed, which leads to lower prices, and so on. Consequently, as the price of solar panels has dropped (see top graph), the amount of installed solar globally has increased exponentially (see graph on right).
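
Swanson's Law is easy to play with in code. Here is a quick back-of-the-envelope in Python, with illustrative starting figures rather than historical data:

```python
import math

def swanson_price(start_price, start_volume, volume):
    """Module price after cumulative shipped volume grows from start_volume
    to volume, assuming Swanson's 20% price drop per doubling."""
    doublings = math.log2(volume / start_volume)
    return start_price * 0.80 ** doublings

# Illustrative figures only: a $1.00/W module price at 100 GW of cumulative
# shipments falls to ~$0.41/W after four doublings (1,600 GW shipped).
for gw in (100, 200, 400, 800, 1600):
    print(f"{gw:>5} GW cumulative -> ${swanson_price(1.00, 100, gw):.2f}/W")
```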

This law has held true since 1977, and according to the Economist:

technological developments that have been proved in the laboratory but have not yet moved into the factory mean Swanson’s law still has many years to run

This positive feedback loop is manifesting itself in China, where in May of this year the National Development and Reform Commission announced that China would target a more than tripling of its installed solar capacity to 70 gigawatts (GW) by 2017.

Meanwhile in India, Prime Minister Narendra Modi and his cabinet recently approved increasing the country's solar target fivefold, to a goal of reaching 100GW by 2022.

To put those numbers in perspective, according to the International Energy Agency’s Snapshot of Global PV Markets 2014 report, the total amount of solar PV installed globally reached 177GW at the end of 2014.

And it is not just Asia; Brazil and the US this week reached an historic climate agreement which will require

the US to triple its production of wind and solar power and other renewable energies. Brazil will need to double its production of clean energy. The figures do not include hydro power.

And according to GTM Research, by 2020 Europe will install 42GW and account for 31% of the global solar market.

Is this having a significant effect on pricing?

Absolutely it is. The price for installed solar hit another new low at the end of last year, when Dubai utility DEWA awarded a contract to Riyadh-based consortium Acwa Power to build and operate a 200MW solar park for a guaranteed purchase price of 5.84 US cents per kWh for 25 years.

To fully appreciate the significance of this price, it is necessary to understand that the price of natural gas generation – which supplies 99% of the UAE's electricity – stands at 9 US cents per kWh. So, in the United Arab Emirates at least, solar power currently costs 65% of the next cheapest form of electricity production. And its price will continue to decline for the foreseeable future.
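
The arithmetic behind that 65% figure is straightforward:

```python
# The arithmetic behind the 65% claim above.
solar = 5.84  # US cents per kWh (DEWA / Acwa Power contract price)
gas = 9.0     # US cents per kWh (gas-fired generation, the next cheapest)
print(f"solar is {solar / gas:.0%} of the cost of gas generation")  # ~65%
```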

So, solar power is cheap (in some cases 65% of the cost of the next nearest competitor), its price is constantly dropping, and with hundreds of gigawatts of orders coming into the pipeline, the price reductions may even accelerate.

In this scenario we are headed into a world where solar power rates, for all intents and purposes, approach zero. In that situation, the question becomes, what do we do in a world where energy is in abundance?


GreenTouch release tools and technologies to significantly reduce mobile networks' energy consumption


Mobile industry consortium GreenTouch today released tools and technologies which, they claim, have the potential to reduce the energy consumption of communication networks by 98%.

The world is now awash with mobile phones.

According to Ericsson's June 2015 mobility report [PDF warning], the total number of mobile subscriptions globally in Q1 2015 was 7.2 billion. By 2020, that number is predicted to increase by another 2 billion, to 9.2 billion.

Of those 7.2 billion subscriptions, around 40% are associated with smartphones, and this number is increasing daily. In fact, the report predicts that by 2016 the number of smartphone subscriptions will surpass those of basic phones, and smartphone numbers will reach 6.1 billion by 2020.

(Chart: Number of connected devices)

When you add to that the number of connected devices now on mobile networks (M2M, consumer electronics, laptops/tablets/wearables), we are looking at roughly 25 billion connected devices by 2020.

That's a lot of data being moved around the networks. And, as you would expect, that number is increasing at an enormous rate as well. There was 55% growth in data traffic between Q1 2014 and Q1 2015, and there is expected to be a 10x growth in smartphone traffic between 2014 and 2020.
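
For a sense of what that 10x increase means in annual terms, a quick calculation (my arithmetic, not Ericsson's):

```python
# Annual growth rate implied by a 10x increase in smartphone traffic
# over the six years from 2014 to 2020.
years = 2020 - 2014
cagr = 10 ** (1 / years) - 1  # compound annual growth rate
print(f"10x over {years} years implies ~{cagr:.0%} compound annual growth")
```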

So how much energy is required to shunt all this data to and fro? Estimates cite ICT as being responsible for 2% of the world's energy consumption, with mobile networking making up roughly half of that. With the number of smartphones set to more than double globally between now and 2020, that figure is shooting up too.

(Chart: Global power consumption by telecommunications networks)

Fortunately, five years ago an industry organisation called GreenTouch was created by Bell Labs and other stakeholders in the space, with the objective of reducing mobile networking's footprint. In fact, the goal of GreenTouch when it was created was to come up with technologies to reduce the energy consumption of mobile networks 1,000x by 2015.

Today, June 18th, in New York, they announced the results of their last five years' work: they have come up with ways for mobile companies to reduce their energy consumption, not by the 1,000x they were aiming for, but by 10,000x!

The consortium also announced

research that will enable significant improvements in other areas of communications networks, including core networks and fixed (wired) residential and enterprise networks. With these energy-efficiency improvements, the net energy consumption of communication networks could be reduced by 98%

And today GreenTouch also released two tools for organisations and stakeholders interested in creating more efficient networks: GWATT and the Flexible Power Model.

They went on to announce some of the innovations which led to this potential huge reduction in mobile energy consumption:


  • Beyond Cellular Green Generation (BCG2) — This architecture uses densely deployed small cells with intelligent sleep modes and completely separates the signaling and data functions in a cellular network to dramatically improve energy efficiency over current LTE networks.
  • Large-Scale Antenna System (LSAS) — This system replaces today’s cellular macro base stations with a large number of physically smaller, low-power and individually-controlled antennas delivering many user-selective data beams intended to maximize the energy efficiency of the system, taking into account the RF transmit power and the power consumption required for internal electronics and signal processing.
  • Distributed Energy-Efficient Clouds – This architecture introduces a new analytic optimization framework to minimize the power consumption of content distribution networks (the delivery of video, photo, music and other larger files – which constitutes over 90% of the traffic on core networks) resulting in a new architecture of distributed “mini clouds” closer to the end users instead of large data centers.
  • Green Transmission Technologies (GTT) – This set of technologies focuses on the optimal tradeoff between spectral efficiency and energy efficiency in wireless networks, optimizing different techniques, such as single-user and multi-user MIMO, coordinated multi-point transmissions and interference alignment, for energy efficiency (a simple illustration of this tradeoff follows this list).
  • Cascaded Bit Interleaving Passive Optical Networks (CBI-PON) – This advancement extends the previously announced Bit Interleaving Passive Optical Network (BiPON) technology to a Cascaded Bi-PON architecture that allows any network node in the access, edge and metro networks to efficiently process only the portion of the traffic that is relevant to that node, thereby significantly reducing the total power consumption across the entire network.
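
To illustrate the spectral-efficiency versus energy-efficiency tradeoff that the GTT work optimises, here is a small sketch using the textbook Shannon capacity formula. This is not GreenTouch's actual model; the bandwidth, noise, and path-loss figures are illustrative assumptions.

```python
import math

# Spectral efficiency (SE) vs energy efficiency (EE) for a single wireless
# link, using the Shannon capacity formula. All figures are illustrative.
B = 20e6           # channel bandwidth: 20 MHz
N0 = 4e-21         # noise power spectral density: ~ -174 dBm/Hz
path_gain = 1e-10  # 100 dB of path loss between transmitter and receiver

for p_tx in (0.01, 0.1, 1.0, 10.0):        # transmit power in watts
    snr = (p_tx * path_gain) / (N0 * B)    # signal-to-noise ratio at receiver
    capacity = B * math.log2(1 + snr)      # achievable rate, bits per second
    se = capacity / B                      # spectral efficiency, bit/s/Hz
    ee = capacity / p_tx                   # energy efficiency, bits per joule
    print(f"P={p_tx:>5} W  SE={se:5.2f} bit/s/Hz  EE={ee:.2e} bit/J")
```

Raising transmit power improves spectral efficiency only logarithmically, while energy efficiency falls almost linearly with power, which is why GTT-style schemes back off transmit power (or lean on extra antennas and coordination) when energy is the priority.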

Now that these innovations have been released, mobile operators hoping to reduce their energy costs will be looking closely to see how they can integrate these new tools and technologies into their networks. For many, realistically, the first opportunity to architect them in will be with the rollout of 5G networks post-2020.

Mobile phone mast

Having met (and exceeded) its five year goal, what’s next for GreenTouch?

I put this question to GreenTouch chairman Thierry Van Landegem on the phone earlier in the week. He replied that the organisation is now looking to set a new, bold goal. They are looking at the energy efficiency of areas such as cloud, network virtualisation, and the Internet of Things, and will likely announce their next objective early next year.

I can’t wait to see what they come up with next.

Mobile phone mast photo credit: Pete S


Technology is moving us, finally, towards the vision of personalised medicine

We attended this year’s SapphireNow event (SAP’s customer and partner conference) in Orlando and were very impressed with some of the advances SAP and their ecosystem are making in the field of healthcare.

Why is this important?

Healthcare has been stagnant for many decades now when it comes to technological disruption. Go to most hospitals today and you will still see doctors using paper and clipboards for their patient notes. Don't just take our word for it: in her highly anticipated 2015 Internet Trends report, Mary Meeker clearly identified that the impact of the Internet on healthcare is far behind most other sectors.

But this is changing, and changing rapidly. The changes coming to the healthcare sector will be profound, and will happen faster than anyone is prepared for.

(Chart: DNA sequencing cost per genome)

And one of the main catalysts of this change has been the collapse in the cost of gene sequencing over the last ten years. See that collapse charted in the graph to the right, and note that the y-axis, showing the cost of sequencing, uses a logarithmic scale. The cost of sequencing is falling far faster than the price of the processing power required to analyse the genetic data, which means the overall cost is now influenced more by data analysis than by data collection. This has been a remarkable turn of events, especially given the first human genome was only published fourteen years ago, in 2001.

The advances in data analytics are picking up pace too. In-memory databases, such as SAP's HANA, and cognitive computing platforms, such as IBM's Watson, are contributing enormously to this.

To get an idea of just how much the analytics is advancing, watch the analysis of data from 100,000 patients by Prof Christof von Kalle, director of Heidelberg's National Center for Tumor Diseases, in the video below. Keep in mind that each of the 100,000 patients has 3bn base pairs in their genome, and he is analysing them in realtime (Prof von Kalle's demo starts at 1:00:03 in the video, and lasts a little over 5 minutes).

As he says at the conclusion, two years ago a similar study conducted over several years by teams of scientists was published as a paper in the journal Nature. That’s an incredible rate of change.

IBM are also making huge advances in this field with their cognitive computing engine, Watson. In a recent announcement, IBM detailed how they have teamed up with fourteen North American cancer institutes to analyse the DNA of their patients to gain insights into the cancers involved, and to speed up the era of personalised medicine.

Personalised medicine is where a patient's DNA is sequenced, as is the DNA of their tumour (in the case of cancer), and an individualised treatment, specific to the genotype of their cancer, is designed and applied.

This differs from the precision medicine offerings available today from Molecular Health, and discussed by Dr Alexander Picker in the video at the top of this post.

Precision medicine is where existing treatments are analysed to see which is best equipped to tackle a patient's tumour, given the patient's genotype and the genotype of their cancer. One thing I learned from talking to Dr Picker at SapphireNow is that cancers used to be classified by their morphology (lung cancer, liver cancer, skin cancer, etc.) and treated accordingly. Now cancers are starting to be classified according to their genotype, not their morphology, and tackling cancers this way is a far more effective form of therapy.

Finally, SAP and IBM are far from being alone in this space. Google, Microsoft and Apple are also starting to look seriously at healthcare.

With all this effort being poured into personalised medicine, I think it is safe to say Ms. Meeker's 2016 slide featuring health will look a little different.

UPDATE – Since publishing this post SAP have uploaded a video to YouTube showcasing their internal application of Molecular Health’s solution for employees of SAP who are diagnosed with cancer. You can see that below:

Full disclosure – SAP paid my travel and accommodation to attend their SapphireNow event.


Equinix rolls out 1MW fuel cell for Silicon Valley data center


Equinix is powering one of its Silicon Valley data centers with a 1MW Bloom Energy fuel cell

As we have pointed out here many times, the main cloud providers (particularly Amazon and IBM) are doing a very poor job of either powering their data centers with renewable energy, or of reporting on the emissions associated with their cloud computing infrastructure.

Given the significantly increasing use of cloud computing by larger organisations, and the growing economic costs of climate change, the sources of the electricity used by these power-hungry data centers are now more relevant than ever.

Against this background, it is impressive to see Equinix, a global provider of carrier-neutral data centers (with a fleet of over 100 data centers) and internet exchanges, announce a 1MW Bloom Energy biogas fuel cell project at its SV5 data center in Silicon Valley. Biogas is methane captured from decomposing organic matter, such as that from landfills or animal waste.
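
For a sense of scale, a quick back-of-the-envelope; the capacity factor is my assumption, not a figure from Equinix:

```python
# Rough annual output of a 1MW fuel cell. Fuel cells run close to
# continuously, so assume a 95% capacity factor (my assumption).
capacity_mw = 1.0
hours_per_year = 8760
capacity_factor = 0.95
annual_gwh = capacity_mw * hours_per_year * capacity_factor / 1000
print(f"~{annual_gwh:.1f} GWh of electricity per year")  # ~8.3 GWh
```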

Why would Equinix do this?

Well, the first phase of California's cap and trade program for CO2 emissions commenced in January 2013, and this could, in time, lead to increased costs for electricity. Indeed, in their 2014 SEC filing [PDF], Equinix note that:

The effect on the price we pay for electricity cannot yet be determined, but the increase could exceed 5% of our costs of electricity at our California locations. In 2015, a second phase of the program will begin, imposing allowance obligations upon suppliers of most forms of fossil fuels, which will increase the costs of our petroleum fuels used for transportation and emergency generators.

We do not anticipate that the climate change-related laws and regulations will force us to modify our operations to limit the emissions of GHG. We could, however, be directly subject to taxes, fees or costs, or could indirectly be required to reimburse electricity providers for such costs representing the GHG attributable to our electricity or fossil fuel consumption. These cost increases could materially increase our costs of operation or limit the availability of electricity or emergency generator fuels.

In light of this, self-generation using fuel cells looks very attractive, both from the point of view of energy cost stability, and reduced exposure to increasing carbon related costs.

On the other hand, according to today’s announcement, Equinix already gets approximately 30% of its electricity from renewable sources, and it plans to increase this to 100% “over time”.

Even better than that, Equinix is 100% renewably powered in Europe despite its growth. So Equinix is walking the walk in Europe, at least, and has a stated aim to go all the way to 100% renewable power.

What more could Equinix do?

Well, two things come to mind immediately:

  1. Set an actual hard target date for reaching 100% renewables, and
  2. Start reporting all emissions to the CDP (and the SEC)

Given how important a player Equinix is in global internet infrastructure, the sooner we see them hit their 100% target, the better for all.


Utilities and change – times are changing

I attended this year's International SAP for Utilities event in Berlin, and despite its being co-located with the SAP Oil and Gas conference, which led to some unfortunate references in the keynotes, it was an interesting event.

Among the numbers I learned at the event: of SAP's 5,800 utility customers, 65 are on HANA, and the first Suite on HANA customer (Snohomish County Public Utility District) will go live in September.

SAP also let it be known that it has two utility customers on its new S/4HANA platform. One is a net new customer, while the other is migrating so as to become a "Utility of the Future".

Apart from that, I had conversations with several SAP utilities customers, and I was surprised at how utilities, which have traditionally been averse to change (witness the 65 out of 5,800 that have moved to HANA), are starting to realise that technological change is inevitable, and so are starting to embrace it, albeit slowly.

I spoke to Khalid Al Dossary from Saudi Electricity Company (see video above) and he told me of two projects they have recently rolled out. The first was a move to paperless invoicing, because they want to move completely away from paper. The second, even more interesting, was the rollout of SuccessFactors for talent management (HR).

Why is this interesting? Well, SuccessFactors is cloud delivered, and utilities have been seen as cloud averse. Only two years ago I remember having conversations with utility executives who said they'd never move to the cloud. It's funny how time moves on.

I also spoke to Hydro Tasmania’s Rick Quarmby (see below) who talked about two projects Hydro Tasmania rolled out recently. One using OpenText for document management, and the other was a workforce productivity app (which you can see the employees rave about here).

I also recall, in the mid-to-late nineties, having conversations with people who said their businesses didn't need a website. Or who said they would roll out email and Internet access within the company, but only to certain employees who might need it.

Thankfully those days are long gone, and now it is unusual for organisations not to have a website, or to block internet access for their employees.

In a similar vein, with the increasing pace of technological change, I fully expect the vast majority of utilities to have moved fully to the cloud ten years from now, and any who haven’t will be viewed as laggards.


IBM to increase the amount of renewable electricity it procures

IBM branded battery

After returning from IBM’s InterConnect conference recently we chided IBM for their radical opaqueness concerning their cloud emissions, and their lack of innovation concerning renewables.

However, some better news emerged in the last few days.

The White House last week hosted a roundtable of some of the largest Federal suppliers to discuss their GHG reduction targets or, if they didn't have any, to create and disclose them.

Coming out of that roundtable, IBM announced its commitment to procure electricity from renewable sources for 20% of its annual electricity consumption by 2020. To do this, IBM will contract over 800 gigawatt-hours (GWh) per year of renewable electricity.

And IBM further committed to:

Reduce CO2 emissions associated with IBM’s energy consumption 35% by year-end 2020 against base year 2005 adjusted for acquisitions and divestitures.

To put this in context, in the energy conservation section of IBM’s 2013 corporate report, IBM reports that it sourced 17% of its electricity from renewable sources in 2013.

It is now committing to increase that from the 2013 figure of 17% to 20% by 2020. Hmmm.

IBM committed to purchasing 800 GWh of renewable electricity per year by 2020. How does that compare to some of its peers?

In 2014, the EPA reported that Intel purchased 3,102 GWh of renewable electricity, and Microsoft purchased 2,488 GWh, which in both cases amounted to 100% of their total US electricity use.

In light of this, 800 GWh amounting to 20% of total electricity use (implying total consumption of roughly 4,000 GWh per year) looks a little under-ambitious.
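
Spelling that comparison out (the implied total follows from IBM's own 20% figure):

```python
# IBM's 800 GWh is stated to be 20% of its annual electricity consumption,
# which implies a total of roughly 4,000 GWh per year.
ibm_renewable_gwh = 800
ibm_renewable_share = 0.20
ibm_total_gwh = ibm_renewable_gwh / ibm_renewable_share
print(f"IBM implied total consumption: {ibm_total_gwh:,.0f} GWh/year")

# For comparison (EPA figures for 2014, 100% of US electricity use):
print("Intel:     3,102 GWh of renewables")
print("Microsoft: 2,488 GWh of renewables")
```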

On the other hand, at least IBM are doing something.

Amazon, as noted earlier, have steadfastly refused to do any reporting of their energy consumption or their emissions. This may well be, at least in part, because Amazon doesn't sell enough to the government to appear on the US Federal government's Greenhouse Gas Management Scorecard for significant suppliers.

With the news this week that 2015 will likely be the hottest year on record, and that the Antarctic ice sheets are melting at unprecedented rates, it is time for organisations that can make a significant difference, to do so.

Google purchased 32% of their total US energy from renewables in 2014. But more than that, this week it emerged that Google are considering moving climate-denying sites down the list of Google search results.

And Dell have introduced AirCarbon packaging for its products, which is externally certified as carbon negative.

These are the kinds of measures that can make a difference.

Come on IBM. If this were your Spring Break report card, it’d read “IBM – could work harder”.


Apple launches ResearchKit – secure, private, open source medical research


Apple announced a new initiative at its Spring Forward event yesterday – ResearchKit.

What is ResearchKit? Apple's SVP of Operations, Jeff Williams, described it as a framework for medical researchers to create and deploy mobile apps which collect medical data from phone users (with their permission) and share it with the researchers.

Why is this important? Previously it has proven difficult for research organisations to recruit volunteers for research studies, and the data from such studies is often collected, at best, quarterly.

With this program, Apple hopes to help researchers more easily attract volunteers, and collect their information far more frequently (up to once a second), yielding far richer data.

The platform itself launches next month, but already there are five apps available, targeting Parkinson's, diabetes, heart disease, asthma, and breast cancer. These apps have been developed by medical research organisations in conjunction with Apple.

The success of this approach can be seen already in this tweet:

I downloaded mPower, the Parkinson's app, to try it out but, for now, they are only signing up people who are based in the US.

As well as capturing data for the researchers, mPower also presents valuable information to the user, tracking gait and tremor, and seeing if they improve over time when combined with increased exercise. So the app is a win both for the research organisations and for the users.

Apple Does Not See Your Data

Apple went to great pains to stress that the user is in complete control over who gets to see the data. And Apple itself never gets to see your data.

This is obviously a direct shot at Google, whose advertising platform depends on seeing your data. Expect to hear this mantra repeated more and more by Apple in future launches.

This focus on privacy, along with Apple’s aggressive stance on fixing security holes, and defaulting to encryption on its devices, is becoming a clear differentiator between Apple and Android (and let’s face it, in mobile, this is a two horse race, for now).

ResearchKit Open Source

Finally, Williams concluded the launch by saying Apple wants ResearchKit on as many devices as possible. Consequently, Apple are going to make ResearchKit open source. It remains to be seen which open source license they will opt for.

But open sourcing ResearchKit is a very important step, as it lends transparency to the privacy and security which Apple say is built in, as well as validating Apple's claim that they don't see your data.

And it also opens ResearchKit up to other mobile platforms to use (Android, Windows, Blackberry), vastly increasing the potential pool of participants for medical research.

We have documented here on GreenMonk numerous times how Big Data, and analysis tools are revolutionising health care.

Now we are seeing mobile getting in on the action too. And how.


IBM’s InterConnect 2015, the good and the not so good


IBM invited me to attend their Cloud and Mobile Conference InterConnect 2015 last week.

Because of what IBM has done globally to help people get access to safe water, to help with solar forecasting, and to help deliver better outcomes in healthcare, for example, I tend to have a very positive attitude towards IBM.

So I ventured to the conference with high hopes of what I was going to learn there, and for the most part I wasn't disappointed. IBM had some very interesting announcements, more on which later.

However, there is one area where IBM has dropped the ball badly – its cloud services division, SoftLayer.

IBM have traditionally been a model corporate citizen when it comes to reporting and transparency. They publish annual Corporate Responsibility reports with environmental, energy and emissions data going all the way back to 2002.

However, as noted here previously, when it comes to cloud computing, IBM appear to be pursuing the Amazon model of radical opaqueness. They refuse to publish any data about the energy or emissions associated with their cloud computing platform. This is a retrograde step, and one they may come to regret.

Instead of blindly copying Amazon's strategy of non-reporting, shouldn't IBM be embracing the approach of their new best buddies, Apple? Apple, fed up with being Greenpeace'd, and seemingly genuinely wanting to leave the world a better place, hired the former head of the EPA, Lisa Jackson, to head up its environmental initiatives, and hasn't looked back.

Apple’s reporting on its cloud infrastructure energy and emissions, on its supply chain [PDF], and on its products complete life cycle analysis, is second to none.

This was made more stark for me because, while at InterConnect, I read IBM's latest cloud announcement about spending $1.2bn to develop five new SoftLayer data centres in the previous four months. While reading that, I saw Apple's announcement that it was spending €1.7bn to develop two fully renewably powered data centres in Europe, and I realised there was no mention whatsoever of renewables anywhere in the IBM announcement.

GreenQloud Dashboard

Even better than Apple, though, is the Icelandic cloud computing company GreenQloud. GreenQloud hosts most of its infrastructure out of Iceland (Iceland's electricity is generated 100% from renewable sources – 70% hydro and 30% geothermal), and the remainder out of the Digital Fortress data center in Seattle, which runs on 95% renewable energy. Better again, GreenQloud gives each customer a dashboard showing the total energy that customer has consumed and the amount of CO2 they have saved.

This is the kind of cloud leadership you expect from a company with a long tradition of openness, and the big data and analytics chops that IBM has. Now this would be A New Way to Think for IBM.

But, it’s not all bad news, as I mentioned at the outset.

IBM Predictive Maintenance

As you'd expect, there was a lot of talk at InterConnect about the Internet of Things (IoT). Chris O'Connor, general manager of IBM's new IoT division, was keen to emphasise that despite the wild hype surrounding IoT at the moment, there's a lot of business value to be had there too. There was a lot of talk about IBM's Predictive Maintenance and Quality solutions, for example, which are a natural outcome of IBM's IoT initiatives. IBM has been doing IoT for years; it just hasn't always called it that.

And when you combine IBM's deep expertise in Energy and Utilities with its knowledge of IoT, you have an opportunity to create truly Smart Grids, not to mention the opportunities around connected cities.

In fact, IoT plays right into the instrumented, interconnected and intelligent Smarter Planet mantra that IBM has been talking about for some time now, so I'm excited to see where IBM go with this.

Fun times ahead.

Disclosure – IBM paid my travel and accommodation for me to attend InterConnect.