post

IBM acquires Weather.com for Cloud, AI (aaS), and IoT

Raindrops keep falling...

IBM has announced the completion of its acquisition of The Weather Company’s B2B, mobile and cloud-based web properties: weather.com, Weather Underground, The Weather Company brand, and WSI, its global business-to-business brand.
Weather Channel screenshot
At first blush this may not seem like an obvious pairing, but The Weather Company’s products are not just its free apps for your smartphone; it has specialised products for the media industry, the insurance industry, energy and utilities, government, and even retail. All of these verticals would be traditional IBM customers.

Then when you factor in that The Weather Company’s cloud platform takes in over 100 Gbytes per day of information from 2.2 billion weather forecast locations, and produces over 300 Gbytes of added products for its customers, it quickly becomes obvious that The Weather Company’s platform is highly optimised for Big Data and the Internet of Things.

This platform will now serve as a backbone for IBM’s Watson IoT.

Watson, you will remember, is IBM’s natural language processing and machine learning platform which famously took on and beat two former champions on the quiz show Jeopardy. Since then, IBM have opened up APIs to Watson to allow developers to add cognitive computing features to their apps, and more recently IBM announced the Watson IoT Cloud “to extend the power of cognitive computing to the billions of connected devices, sensors and systems that comprise the IoT”.

Given Watson’s relentless moves to cloud and IoT, this acquisition starts to make a lot of sense.

IBM further announced that it will use its network of cloud data centres to expand Weather.com into five new markets: China, India, Brazil, Mexico and Japan, “with the goal of increasing its global user base by hundreds of millions over the next three years”.

With Watson’s deep learning abilities, and all that weather data, one wonders if IBM will be in a position to help scientists researching climate change. At the very least it will help the rest of us be prepared for its consequences.

New developments in AI and deep learning are being announced virtually weekly now by Microsoft, Google and Facebook, amongst others. This is a space which, it is safe to say, will completely transform how we interact with computers and data.

post

GreenTouch release tools and technologies to significantly reduce mobile networks’ energy consumption

Mobile Phone

Mobile industry consortium GreenTouch today released tools and technologies which, they claim, have the potential to reduce the energy consumption of communication networks by 98%.

The world is now awash with mobile phones.

According to Ericsson’s June 2015 mobility report [PDF warning], the total number of mobile subscriptions globally in Q1 2015 was 7.2 billion. By 2020, that number is predicted to increase another 2 billion to 9.2 billion handsets.

Of those 7.2 billion subscriptions, around 40% are associated with smartphones, and this number is increasing daily. In fact, the report predicts that by 2016 the number of smartphone subscriptions will surpass those of basic phones, and smartphone numbers will reach 6.1 billion by 2020.

Number of connected devices

When you add to that the number of connected devices now on mobile networks (M2M, consumer electronics, laptops/tablets/wearables), we are looking at roughly 25 billion connected devices by 2020.

That’s a lot of data being moved around the networks. And, as you would expect, that number is increasing at an enormous rate as well. There was 55% growth in data traffic between Q1 2014 and Q1 2015, and there is expected to be a 10x growth in smartphone traffic between 2014 and 2020.

So how much energy is required to shunt all this data to and fro? Estimates cite ICT as being responsible for the consumption of 2% of the world’s energy, with mobile networking making up roughly half of that. With the number of smartphones set to more than double globally between now and 2020, that figure is shooting up too.

Global power consumption by telecommunications networks

Fortunately, five years ago an industry organisation called GreenTouch was created by Bell Labs and other stakeholders in the space, with the objective of reducing mobile networking’s footprint. In fact, the goal of GreenTouch when it was created was to come up with technologies to reduce the energy consumption of mobile networks by a factor of 1,000 by 2015.

Today, June 18th, in New York, they announced the results of their last five years’ work: they have come up with ways for mobile companies to reduce their energy consumption, not by the 1,000x they were aiming for, but by 10,000x!

The consortium also announced

research that will enable significant improvements in other areas of communications networks, including core networks and fixed (wired) residential and enterprise networks. With these energy-efficiency improvements, the net energy consumption of communication networks could be reduced by 98%
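To see how a 10,000x efficiency gain and a 98% net reduction can both be true, it helps to separate energy-per-bit from traffic growth. Here is my own back-of-envelope sketch with illustrative numbers (not GreenTouch's model):

```python
def net_energy_factor(efficiency_gain, traffic_growth):
    """Net network energy relative to today, if energy-per-bit improves
    by `efficiency_gain` while total traffic grows by `traffic_growth`."""
    return traffic_growth / efficiency_gain

# Even with 100x more traffic, a 10,000x energy-per-bit improvement
# leaves the network consuming 1% of today's energy: a 99% net cut.
factor = net_energy_factor(efficiency_gain=10_000, traffic_growth=100)
print(round((1 - factor) * 100, 1))  # 99.0
```

The point of the sketch is that efficiency gains are quoted per bit, while net consumption depends on how much traffic grows in the meantime.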

And today GreenTouch also released two tools for organisations and stakeholders interested in creating more efficient networks: GWATT and the Flexible Power Model.

They went on to announce some of the innovations which led to this potentially huge reduction in mobile energy consumption:


  • Beyond Cellular Green Generation (BCG2) — This architecture uses densely deployed small cells with intelligent sleep modes and completely separates the signaling and data functions in a cellular network to dramatically improve energy efficiency over current LTE networks.
  • Large-Scale Antenna System (LSAS) — This system replaces today’s cellular macro base stations with a large number of physically smaller, low-power and individually-controlled antennas delivering many user-selective data beams intended to maximize the energy efficiency of the system, taking into account the RF transmit power and the power consumption required for internal electronics and signal processing.
  • Distributed Energy-Efficient Clouds – This architecture introduces a new analytic optimization framework to minimize the power consumption of content distribution networks (the delivery of video, photo, music and other larger files – which constitutes over 90% of the traffic on core networks) resulting in a new architecture of distributed “mini clouds” closer to the end users instead of large data centers.
  • Green Transmission Technologies (GTT) – This set of technologies focuses on the optimal tradeoff between spectral efficiency and energy efficiency in wireless networks, optimizing different technologies, such as single user and multi-user MIMO, coordinated multi-point transmissions and interference alignment, for energy efficiency.
  • Cascaded Bit Interleaving Passive Optical Networks (CBI-PON) – This advancement extends the previously announced Bit Interleaving Passive Optical Network (BiPON) technology to a Cascaded Bi-PON architecture that allows any network node in the access, edge and metro networks to efficiently process only the portion of the traffic that is relevant to that node, thereby significantly reducing the total power consumption across the entire network.

Now that these innovations have been released, mobile operators hoping to reduce their energy costs will be looking closely at how they can integrate these new tools and technologies into their networks. For many, realistically, the first opportunity to architect them in will come with the rollout of 5G networks post-2020.

Mobile phone mast

Having met (and exceeded) its five year goal, what’s next for GreenTouch?

I put this question to GreenTouch chairman Thierry Van Landegem on the phone earlier in the week. He replied that the organisation is now looking to set a new, bold goal. They are looking at the energy efficiency of areas such as cloud, network virtualisation, and the Internet of Things, and will likely announce their next objective early next year.

I can’t wait to see what they come up with next.


Mobile phone mast photo Pete S

post

IBM’s InterConnect 2015, the good and the not so good

IBM InterConnect 2015

IBM invited me to attend their Cloud and Mobile Conference InterConnect 2015 last week.

Because of what IBM has done globally to help people get access to safe water, to help with solar forecasting, and to help deliver better outcomes in healthcare, for example, I tend to have a very positive attitude towards IBM.

So I ventured to the conference with high hopes of what I was going to learn there, and for the most part I wasn’t disappointed. IBM had some very interesting announcements, more on which later.

However, there is one area where IBM has dropped the ball badly – their Cloud Services Division, SoftLayer.

IBM have traditionally been a model corporate citizen when it comes to reporting and transparency. They publish annual Corporate Responsibility reports with environmental, energy and emissions data going all the way back to 2002.

However, as noted here previously, when it comes to cloud computing, IBM appear to be pursuing the Amazon model of radical opaqueness. They refuse to publish any data about the energy or emissions associated with their cloud computing platform. This is a retrograde step, and one they may come to regret.

Instead of blindly copying Amazon’s strategy of non-reporting, shouldn’t IBM be embracing the approach of their new best buddies Apple? Apple, fed up with being Greenpeace’d, and seemingly genuinely wanting to leave the world a better place, hired the former head of the EPA, Lisa Jackson, to head up its environmental initiatives, and hasn’t looked back.

Apple’s reporting on its cloud infrastructure energy and emissions, on its supply chain [PDF], and on its products complete life cycle analysis, is second to none.

This was made more stark for me because, while at InterConnect, I read IBM’s latest cloud announcement about its spending $1.2bn to develop five new SoftLayer data centres in the previous four months. While I was reading that, I saw Apple’s announcement that it was spending €1.7bn to develop two fully renewably powered data centres in Europe, and I realised there was no mention whatsoever of renewables anywhere in the IBM announcement.

GreenQloud Dashboard

Even better than Apple, though, is the Icelandic cloud computing company GreenQloud. GreenQloud hosts most of its infrastructure out of Iceland (Iceland’s electricity is generated 100% from renewable sources – 70% hydro and 30% geothermal), and the remainder out of the Digital Fortress data center in Seattle, which runs on 95% renewable energy. Better still, GreenQloud gives each customer a dashboard showing the total energy that customer has consumed and the amount of CO2 they have saved.
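The arithmetic behind such a dashboard is straightforward: multiply the energy consumed by the difference in carbon intensity between a typical grid mix and the power actually used. A minimal sketch, with emissions factors that are illustrative placeholders rather than GreenQloud's actual figures:

```python
def co2_saved_kg(kwh_consumed, grid_factor, actual_factor):
    """CO2 (kg) avoided by running a workload on low-carbon power
    instead of a typical grid mix. Factors are in kg CO2 per kWh and
    are illustrative placeholders, not GreenQloud's real numbers."""
    return kwh_consumed * (grid_factor - actual_factor)

# 1,000 kWh on near-zero-carbon hydro/geothermal, versus an assumed
# 0.5 kg CO2/kWh conventional grid mix:
print(co2_saved_kg(1000, grid_factor=0.5, actual_factor=0.0))  # 500.0
```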

This is the kind of cloud leadership you would expect from a company with IBM’s long tradition of openness and big data and analytics chops. Now this would be A New Way to Think for IBM.

But, it’s not all bad news, as I mentioned at the outset.

IBM Predictive Maintenance

As you’d expect, there was a lot of talk at InterConnect about the Internet of Things (IoT). Chris O’Connor, general manager of IBM’s new IoT division, was keen to emphasise that, despite the wild hype surrounding IoT at the moment, there’s a lot of business value to be had there too. There was a lot of talk about IBM’s Predictive Maintenance and Quality solutions, for example, which are a natural outcome of IBM’s IoT initiatives. IBM has been doing IoT for years; it just hasn’t always called it that.

And when you combine IBM’s deep expertise in Energy and Utilities, with its knowledge of IoT, you have an opportunity to create truly Smart Grids, not to mention the opportunities around connected cities.

In fact, IoT plays right into the instrumented, interconnected and intelligent Smarter Planet mantra that IBM has been talking about for some time now, so I’m excited to see where IBM go with this.

Fun times ahead.

Disclosure – IBM paid my travel and accommodation for me to attend InterConnect.

post

EPRI releases open, standards based software, to connect smart homes to the smart grid

Smart Appliance Screen

Automated Demand Response (ADR) is something we’ve talked about here on GreenMonk for quite a while now. And in other fora, at least as far back as 2007.

What is Automated Demand Response? Well, demand response is the process whereby electricity consumers (typically commercial) reduce their usage in response to a signal from the utility that the grid is in a period of peak demand. The signal often takes the form of a phone call.

Automated demand response, as you would imagine, is when this procedure is automated using software signals (often signals of price fluctuation). The development of ADR technologies received a big boost with the development of the OpenADR standard, and the subsequent formation in 2010 of the OpenADR Alliance to promote its use.

Consequently, EPRI’s recent announcement that it has developed automated demand response software is to be welcomed.

In their announcement EPRI say the new software will:

provide a common way for devices and appliances on the electric grid to respond automatically to changes in price, weather, and demand for power, a process called automated demand response (ADR).

ADR makes it possible to translate changes in wholesale markets to corresponding changes in retail rates. It helps system operators reduce the operating costs of demand response (DR) programs while increasing its resource reliability. For customers, ADR can reduce the cost of electricity by eliminating the resources and effort required to achieve successful results from DR programs.
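At its core, an ADR client is a small piece of logic that maps a price or event signal to a load decision. A toy sketch of the concept (this is not EPRI's software and not the OpenADR wire protocol, just the idea of price-responsive load):

```python
def respond_to_price(price_per_kwh, threshold, base_kw, flexible_kw):
    """Decide an appliance's draw from a price signal: shed the
    flexible portion of the load whenever the price crosses the
    threshold. All figures here are made-up illustrative values."""
    if price_per_kwh > threshold:
        return base_kw                # peak pricing: defer flexible load
    return base_kw + flexible_kw      # off-peak: run normally

print(respond_to_price(0.40, threshold=0.25, base_kw=0.2, flexible_kw=1.5))  # 0.2
print(respond_to_price(0.10, threshold=0.25, base_kw=0.2, flexible_kw=1.5))  # 1.7
```

The real standard adds much more (event scheduling, opt-outs, reporting, security), but the translation of a market signal into a retail-side action is the essence of it.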

The EPRI ADR software was certified by the OpenADR Alliance. “Making this software freely available to the industry will accelerate the adoption of standards-based demand response,” said Mark McGranaghan, vice president of Power Delivery and Utilization at EPRI.

This software has the potential to finally bring the smart grid into the home, allowing smart appliances to adjust their behaviour depending on the state of the grid. Some manufacturers have been fudging this functionality already with a combination of internet connected devices and cloud computing resources (see Whirlpool 6th Sense device above). And others, like GE are planning to bring older appliances into the connected fold, by sending out wifi modules that add new sensor capabilities.

Connecting appliances to the grid has the potential to make them far smarter. We’ll be discussing this, and more IoT topics, in far more detail at ThingMonk, our upcoming Internet of Things event, in Denver next month. Hope to see you there.

post

Tips for starting out coding for the Internet of Things

We attended the SAP TechEd && d-code events recently. One of the more interesting parts of the show floor was the Internet of Things (IoT) area, where there were demos of IoT technologies currently in use by the likes of the Port of Hamburg, SK Solutions’ intelligent crane solutions (of which we’ll publish a video in a subsequent post), and Internet-connected vending machines, amongst other displays.

Even more interesting than the demos, though, was the IoT hacking area. Here, SAP staff worked to create interesting Internet of Things connected devices, and there were machines available with Arduino, Tessel, and BeagleBone boards, along with instructions on how to connect them to sensors, pull data from the sensors, and push that data up to the HANA Cloud Platform.
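The pattern in the hacking area (read a sensor, wrap the reading as a message, push it up to the cloud) can be sketched in a few lines. The payload shape below is my own assumption for illustration; the real cloud platform's IoT services define their own expected message format:

```python
import json
import time

def package_reading(device_id, sensor, value, ts=None):
    """Wrap one sensor reading as a JSON message for upload.
    The field names here are illustrative; a real IoT cloud
    endpoint defines its own payload schema."""
    return json.dumps({
        "device": device_id,
        "sensor": sensor,
        "value": value,
        "ts": int(ts if ts is not None else time.time()),
    })

# A fixed timestamp is passed here purely to make the output repeatable.
print(package_reading("arduino-01", "temperature", 21.4, ts=1400000000))
```

On a real board the loop would sample the sensor on a timer and POST each message to the platform's ingestion endpoint over HTTP or MQTT.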

At the Las Vegas event the SAP staff created the scarecrow seen in the video above. This scarecrow would flash the LEDs in its eyes, move its head, move its arms, and fire a Nerf gun when commanded to do so over Twitter. In the slo-mo video above, it does all the actions at once. Apologies for the quality of the video; it was shot using a smartphone lying prone on the floor.

We spoke to SAP’s Craig Cmehil subsequently to get hints on how to start out learning about hacking Internet of Things projects at home and he supplied us with a list of resources.

Craig recommended getting started with one of the following kits:

For Arduino

While for Raspberry Pi there’s

The links above are direct links to these items on Amazon, and there are many more accessories available on the Sparkfun site.

 

Not sure what the differences are between an Arduino and a Raspberry Pi? Check out this great explainer on Read Write Web.

Now, having decided on your IoT platform, what about some good resources? Well:

Arduino Starter Kit

  • if you are planning to include your kids in the process, then Raspberry Pi kid is a good blog to check out
  • Coder for Raspberry Pi is an open source project to teach kids how to build websites using the Raspberry Pi
  • Adafruit has some great lessons on coding for Raspberry Pi, like this one for temperature sensing
  • Adafruit also has lots for Arduino
  • The Arduino site has lots of resources available for all levels of learner

And if you are wondering about connecting these devices to the cloud, Rui Nogueira has a great two-part blog post with detailed instructions for Raspberry Pi here.

If hacking microcontrollers is your thing, or you think it could be, then our ThingMonk event next week in the UK is the place to be. It is a three-day event: day one is hacking, day two is IoT tech talks, and day three (called Business of IoT) is business-related IoT talks.

post

The coming together of the Internet of Things and Smart Grids

I was asked to speak at the recent SAP TechEd && d-code (yes, two ampersands, that’s the branding, not a typo) on the topic of the Internet of Things and Energy.

This is a curious space because, while the Internet of Things is all the rage now in the consumer space, the New Black, as it were, this is relatively old hat in the utilities sector. Because utilities have expensive, critical infrastructure in the field (think large wind turbines, for example), they need to be able to monitor it remotely. These devices use Internet of Things technologies to report back to base. This is quite common on the high voltage part of the electrical grid.

On the medium voltage section, Internet of Things technologies aren’t as commonly deployed currently (no pun intended), but medium-voltage equipment suppliers are increasingly adding sensors to their equipment so that it too can report back. In a recent meeting at Schneider Electric’s North American headquarters, CTO Pascal Brosset announced that Schneider was able to produce a System on a Chip (SoC) for $2, and that as a consequence Schneider was going to add one to all its equipment.

And then on the low voltage network, there are lots of innovations happening behind the smart meter. Nest thermostats, Smappee energy meters, and SmartThings energy apps are just a few of the many new IoT things being released recently.

Now if only we could connect them all up, then we could have a really smart grid.

In case you are in the area, and interested, I’ll be giving a more business-focussed version of this talk at our Business of IoT event in London on Dec 4th.

The slides for this talk are available on SlideShare.

post

Schneider Electric – focussed on making organisations more efficient

Schneider Influencer Summit

We were invited to attend this year’s Schneider Electric Influencer Summit and jumped at the chance. Why? Schneider Electric is a fascinating company with fingers in lots of pies, and we were keen to learn more.

Schneider Electric was founded in 1836, so the company is coming up on 180 years old. Schneider reported revenue of almost €23.5bn in 2013, of which €1.9bn was profit, and employs in the order of 152,000 people globally. So, not an insignificant organisation.

The Influencer Summit coincided with the opening of its Boston One campus, Schneider Electric’s new facility in Andover. This site is now Schneider’s main R&D lab, as well as its North American HQ. Situating its main R&D labs in its HQ says a lot about how Schneider views the importance of research and development. In fact, at the event Schneider EVP and North American CEO Laurent Vernerey reported that Schneider devotes 4-5% of sales to R&D annually.

At the influencer event, we discovered the breadth of Schneider’s portfolio went far beyond what we were aware of. Not only are they heavily involved in electrical automation, control and distribution systems, but they also help make highly energy efficient data centres (they bought APC back in 2007), they have building management solutions, a cybersecurity suite (developed especially for critical infrastructure), water management solutions, a smart cities business, a weather forecasting arm (with a staff of 80 meteorologists!), and a strong services division. See, fingers in lots of pies!

Schneider Electric, as its name suggests, was traditionally more of a hardware company, but with the move to the digitisation of infrastructure, that has changed fundamentally, and Schneider is now very much a software company as well as a hardware one. Of the 20,000 employees in North America, 1,200 are software engineers.

This digitisation of infrastructure is happening at an ever increasing pace, helped by the constantly falling price of electronics and sensors. If it costs a mere $2.50 to put an SoC on a piece of infrastructure, why wouldn’t you do it? Particularly when adding the SoC makes the device IP addressable. Now it can report back on its status in realtime. As Schneider CMO Chris Hummel said, “connected systems will fundamentally change everything”.

Addressing potential security issues associated with making critical infrastructure IP addressable, Schneider said that connected devices are more secure than disconnected ones, because they can be monitored and everything that’s done to them can be tracked.

With that in mind, it is not surprising that Schneider is a member of the Industrial Internet Consortium.

While it is always instructive to hear a company’s executives talking about their organisation, it is far more interesting to hear their customers speak. And this event didn’t disappoint on that score. The customer speaker in this case was Todd Isherwood, the Energy Efficiency and Alternative Energy project manager for the City of Boston. Todd discussed how the City of Boston, with 15,000 employees, 2,700 utility accounts and a $50m electricity spend, was working with Schneider Electric on its journey to becoming a more sustainable city.

Boston launched its Greenovate Boston campaign and passed its Building Energy Reporting and Disclosure Ordinance (BERDO). This Ordinance requires Boston’s large- and medium-sized buildings to report their annual energy and water use to the City of Boston, after which the City makes the information publicly available. All of which will have helped Boston achieve its ranking of most energy efficient city in the US.

The biggest takeaway from the event though, was that Schneider Electric is, at its core, hugely interested in helping organisations become more efficient. And seemingly for all the right reasons. That’s not something you can say about many companies. And because of that, we’ll be watching Schneider with great interest from here on out.

Disclosure – Schneider Electric paid my travel and accommodation expenses to attend this event.

post

Technology for Good – episode thirty four with Salesforce’s John Taschek

Welcome to episode thirty four of the Technology for Good hangout. In this week’s episode our guest was Salesforce SVP of Strategy, John Taschek. John and I are both longtime members of the Enterprise Irregulars, but this was the first time John and I had had a conversation outside of email!

Some of the more fascinating stories we looked at on the show included a very successful Kickstarter campaign for a small router which can completely anonymise your internet activity, Lockheed Martin announcing that it has made a breakthrough on nuclear fusion technology, and Satya Nadella’s response to his gaffe last week about women seeking a raise.

Here is the full list of stories that we covered in this week’s show:

 

Climate

Energy

Hardware

Internet of Things

Wearables

Mobility

Comms

Privacy

Open Source

Sustainability

post

Technology for Good – episode twenty six with Open Data Institute’s James Smith

Welcome to episode twenty six of the Technology for Good hangout. In this week’s episode we had The Open Data Institute’s James Smith as the guest on our show.

I was very keen to have James on the show, especially since he recently announced that he is standing for election to the UK parliament next year. James is running on the principles in the OpenPolitics Manifesto, an open source plan for the UK that anyone can contribute to. This is obviously an extremely innovative approach to electioneering, as well as being uniquely democratic. Believing as I do in the Geek Manifesto, I think it is vital we elect scientifically literate people to the world’s parliaments, so I definitely wanted James to come onto the show. And if he’s willing, I’ll ask him on again closer to election time.

We covered a lot of topics in the show, including the US public being in favour of a carbon tax, the new Airbus electric plane, and Google’s new moonshot project, the human body.

Here is the full list of stories that we covered in this week’s show:

Climate

Energy

Internet of Things

Drones

Health

Mobile

Security

Diversity

post

Technology for Good – episode twenty five with SAP Mentor Chris Kernaghan

Welcome to episode twenty five of the Technology for Good hangout. In this week’s episode we had cloud architect and SAP Mentor Chris Kernaghan as the guest on our show. This was not Chris’ first time co-hosting the Technology for Good show, so as an old hand, I knew this was going to be a fun show, and so it was. We covered a lot of topics in the show, including the repeal of carbon tax in Australia, IBM and Apple’s enterprise partnership, and Microsoft’s shedding of 18,000 employees (and a platform!).

Here is the full list of stories that we covered in this week’s show:

Climate

Renewables

Big Data

Mobile

Apps

Internet of Things

Comms

Wearables

Cognitive Computing

Misc