
The IoT in Palo Alto: connecting America’s digital city

Palo Alto is not your average city. Established by the founder of Stanford University, it is the soil from which Google, Facebook, Pinterest and PayPal (to name a few) have sprung. Indeed, Palo Alto has probably done more to transform human life in the last quarter century than any other city. So, when we think of how the Internet of Things is going to affect life in the coming decades, we can be reasonably sure where much of the expected disruption will originate.

All of which makes Palo Alto a great place to host the first IoT Data Analytics & Visualization event (February 9 – 11, 2016). Additionally fitting: the event is set to be kicked off by Dr. Jonathan Reichental, the city’s Chief Information Officer. Reichental is the man entrusted with the hefty task of keeping the city as digital, smart and technologically up to date as befits a place that has been called home by the likes of Steve Jobs, Mark Zuckerberg, Larry Page and Sergey Brin.

Thus far, Reichental’s tenure has been a great success. In 2013, Palo Alto was credited with being the number one digital city in the US, and has made the top five year upon year – in fact, it so happens that, following our long and intriguing telephone interview, Reichental is looking forward to a small celebration to mark its latest nationwide ranking.

BCN: Jonathan, you’ve been Palo Alto’s CIO now for four years. What’s changed most during that time span?

Dr Jonathan Reichental: I think the first new area of substance would be open government. I recognise open government’s been a phenomenon for some time, but over the course of the last four years, it has become a mainstream topic that city and government data should be easily available to the people. That it should be machine readable, and that an API should be made available to anyone that wants the data. That we have a richer democracy by being open and available.

We’re still at the beginning, however. I have heard that there are approximately 90,000 public agencies in the US alone. And every day and week I hear about a new federal agency or state or city of significance saying, ‘you can now go to our data portal and freely access the data of the city or the public agency.’ The shift is happening but it’s got some way to go.

Has this been a purely technical shift, or have attitudes had to evolve as well?

I think if you kind of look at something like cloud, cloud computing and cloud as a capability for government – back when I started ‘cloud’ was a dirty word. Many government leaders and government technology leaders just weren’t open to the option of putting major systems off-premise. That has begun to shift quite positively.

I was one of the first to say that cloud computing is a gift to government. Cloud eliminates the need to have all the maintenance that goes with keeping systems current and keeping them backed up and having disaster recovery. I’ve been a very strong proponent of that.

Then there’s social media – government has fully embraced that now, having been reluctant early on. Mobile is beginning to emerge though it’s still very nascent. Here in Palo Alto we’re trying to make all services that make sense accessible via smartphone. I call it ‘city in a box.’ Basically, by bringing up an app on the smartphone you should be able to interact with government – get a pet license, pay a parking fee, pay your electrical bill: everything should really be right there on the smartphone, you shouldn’t need to go to City Hall for many things any more.

The last thing I’d say is there has been an uptake in community participation in government. Part of it is it’s more accessible today, and part of it is there’s more ways to do so, but I think we’re beginning also to see the fruits of the millennial generation – the democratic shift in people wanting to have more of a voice and a say in their communities. We’re seeing much more in what is traditionally called civic engagement. But ‘much more’ is still not a lot. We need to have a revolution in this space for there to be significant change to the way cities operate and communities are effective.

Palo Alto is hosting the IoT Data Analytics & Visualization event in February. How have you innovated in this area as a city?

One of the things we did with data is make it easily available. Now we’re seeing a community of people in the city and beyond, building solutions for communities. One example of that is a product called Civic Insight. This app consumes the permit data we make available and enables users to type in an address and find out what’s going on in their neighbourhood with regard to construction and related matters.

That’s a clear example of where we didn’t build the thing, we just made the data available and someone else built it. There’s an economic benefit to this. It creates jobs and innovation – we’ve seen that time and time again. We saw a company build a business around Palo Alto releasing our budget information. Today they are called OpenGov, and they sell the solution to over 500 cities in America, making it easy for communities to understand where their taxpayer dollars are being spent. That was born and created in Palo Alto because we made our data available.
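For readers curious what consuming such a feed looks like, below is a minimal sketch of an address lookup against an open permit dataset. The endpoint, parameters and field names are hypothetical placeholders, not the city’s actual portal or Civic Insight’s code.

```python
import requests

# Hypothetical open-data endpoint and field names; a real portal
# (a Socrata- or CKAN-style API, for instance) will differ in detail.
PERMITS_URL = "https://data.example-city.gov/resource/building-permits.json"

def permits_for_address(address: str, limit: int = 20):
    """Return recent permit records filed for a given street address."""
    params = {"address": address, "$limit": limit}
    resp = requests.get(PERMITS_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for permit in permits_for_address("123 Forest Ave"):
        print(permit.get("permit_type"), permit.get("status"), permit.get("issue_date"))
```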

Now we get to today, and the Internet of Things. We’re still – like a lot of folks, especially in the government context – defining this. It can be as broad or as narrow as you want. There’s definitely a recognition that when infrastructure systems can begin to share data with each other, we can get better outcomes.

The Internet of Things is obviously quite an elastic concept, but are there areas you can point to where the IoT is already very much a reality in Palo Alto?

The clearest example I can give of that today is our traffic signal system here in the city. A year and a half ago, we had a completely analogue system, not connected to anything other than a central computer, which would have created a schedule for the traffic signals. Today, we have a completely IP-based traffic system, which means it’s basically a data network. So we have enormous new capability.

For example, we can have schedules that are very dynamic. When schools are being let out, traffic signals can run one way; at night they can run another way – you can have very granular information. Next you can start to have traffic signals communicate with each other. If there is a long strip of road and there is some congestion five signals down, all the other traffic signals can dynamically change to try and make the flow better.

It goes even further than this. Now we can start to take that data – recording, for example, the frequency and volume of vehicles, as well as weather and other ambient characteristics of the environment – and send it to the car companies. Here in Palo Alto, almost every car company has an innovation lab. Whether it’s Ford, General Motors, Volkswagen, BMW, Google (who are getting into the car business now) – they’re all here and they all want our data. They’re like: ‘this is interesting, give us an API, we’ll consume it into our data centres and then we’ll push it into cars so maybe they can make better decisions.’

You have the Internet of Things, you’ve got traffic signals, cloud analytics solutions, APIs, and cars as computers and processors. We’re starting to connect all these related items in a way we’ve never done before. We’re going to follow the results.
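To make the corridor-coordination idea concrete, here is a toy sketch of upstream signals extending their green phase when a downstream detector reports congestion. It is purely illustrative and bears no relation to the city’s actual control logic.

```python
# Toy model of corridor-level signal coordination: when congestion is
# detected at one intersection, upstream signals extend their green phase.
# Real adaptive signal systems are far more sophisticated.

BASE_GREEN_S = 30   # default green phase, in seconds
EXTENSION_S = 10    # extra green granted per congested downstream signal
MAX_GREEN_S = 60

def green_times(occupancy, threshold=0.8):
    """occupancy[i] is the detector occupancy (0-1) at intersection i,
    ordered in the direction of travel; returns a green time per signal."""
    times = []
    for i in range(len(occupancy)):
        # Count congested intersections downstream of signal i.
        congested_downstream = sum(1 for o in occupancy[i + 1:] if o > threshold)
        green = min(BASE_GREEN_S + EXTENSION_S * congested_downstream, MAX_GREEN_S)
        times.append(green)
    return times

# Congestion at the third and fourth intersections lengthens upstream greens.
print(green_times([0.3, 0.4, 0.9, 0.95, 0.5]))  # [50, 50, 40, 30, 30]
```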

What’s the overriding ambition would you say?

We’re on this journey to create a smart city vision. We don’t really have one today. It’s not a product or a service, it’s a framework. And within that framework we will have a series of initiatives that focus on things that are important to us. Transportation is really important to us here in Palo Alto. Energy and resources are really important: we’re going to start to put sensors on important flows of water so we can see the amount of consumption at certain times but also be really smart about leak detection, potentially using little sensors connected to pipes throughout the city. We’re also really focused on the environment. We have a chief sustainability officer who is putting together a multi-decade strategy around what PA needs to do to be part of the solution around climate change.

That’s also going to be a lot about sensors, about collecting data, about informing people and creating positive behaviours. Public safety is another key area. Being able to respond intelligently to crimes, terrorism or natural disasters. A series of sensors again sending information back to some sort of decision system that can help both people and machines make decisions around certain types of behaviours.
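As a rough illustration of the sensor-driven leak detection Reichental mentions, one simple heuristic is to flag a meter whose overnight flow never drops towards zero. The sketch below assumes hourly flow readings and an arbitrary threshold; it is not the city’s method.

```python
# Naive leak heuristic: on a healthy supply line, flow should drop to (near)
# zero at some point overnight; a sustained minimum flow suggests a leak.
# Takes a dict of hourly readings in litres/hour; purely illustrative.

def likely_leak(hourly_flow, night_hours=range(1, 5), min_flow_threshold=5.0):
    overnight = [hourly_flow[h] for h in night_hours if h in hourly_flow]
    return bool(overnight) and min(overnight) > min_flow_threshold

readings = {0: 12.0, 1: 8.5, 2: 7.9, 3: 8.2, 4: 9.1, 5: 20.0}
print(likely_leak(readings))  # True: flow never drops off overnight
```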

How do you expect this whole IoT ecosystem to develop over the next decade?

Bill Gates has a really good saying on this: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”  It’s something that’s informed me in my thinking. I think things are going to move faster and in more surprising ways in the next ten years for sure: to the extent that it’s very hard to anticipate where things are headed.

We’re disrupting the taxi business overnight, the hotel business, the food business. Things are happening at lightning speed. I don’t know if we have a good sense of where it’s all headed. Massive disruption across all domains, across work, play, healthcare, every sort of part of our lives.

It’s clear that – I can say this – ten years from now won’t be the same as today. I think we’ve yet to see the full potential of smart phones – I think they are probably the most central part of this ongoing transformation.

I think we’re going to connect many more things than we’re saying right now. I don’t know what the number will be: I hear five billion, twenty billion in the next five years. It’s going to be more than that. It’s going to become really easy to connect. We’ll stick a little communication device on anything. Whether it’s your key, your wallet, your shoes: everything’s going to be connected.

Palo Alto and the IoT Data Analytics & Visualization event look like a great matchup. What are you looking forward to about taking part?

It’s clearly a developing area and so this is the time when you want to be acquiring knowledge, networking with some of the big thinkers and innovators in the space. I’m pleased to be part of it from that perspective. Also from the perspective of my own personal learning and the ability to network with great people and add to the body of knowledge that’s developing. I’m going to be kicking it off as the CIO for the city.

Wind River launches comprehensive cloud suite

Embedded software vendor Wind River has launched what it describes as a ‘comprehensive cloud suite’ for multi-architecture operating systems.

The new Wind River range includes the Helix Cloud, Rocket and Pulsar Linux offerings which are designed to communicate across multiple devices, gateways and microcontroller units (MCUs).

The Helix Cloud is a family of software-as-a-service (SaaS) products including development tools, virtual labs and deployed devices. Their joint mission is to simplify and automate the building and managing of IoT technologies at every stage of the life cycle of a system, from design to decommissioning. The Helix Lab Cloud is a virtual hardware lab for simulating and testing IoT devices and complex systems. Meanwhile, the Device Cloud is designed for managing IoT devices and their data.

Wind River claims it can simplify edge-to-cloud development with a single operating system controlling all dialogue between the device and the cloud. Wind River’s Rocket is described as a tiny-footprint, commercial-grade real-time operating system that connects directly to its Helix Cloud. This, the company claims, provides support for multiple architectures and applications running on the type of 32-bit MCUs used in small-footprint sensor hubs, wearables and edge devices.

Pulsar Linux is a small-footprint, commercial-grade binary Linux OS based on the Wind River Linux distribution that connects directly to the Helix Cloud and runs applications scaling from 32-bit MCUs to 64-bit CPUs.

The platform-independent Rocket and Pulsar Linux support Intel and ARM architectures and a range of mainstream commercial boards, so that apps can run on any device and developers can create an open, collaborative ecosystem.

Wind River partners include Advantech, Freescale, HCL Technologies, Texas Instruments and Xilinx. It has also launched a new developer programme for ISVs, OEMs, systems integrators, ODMs and cloud operators.

Azure to lift elevators into the cloud with IoT-based maintenance service

The world’s one billion lift users could benefit from a new cloud-based system which could elevate the mode of transport from ‘out of service’ to ‘as a service’, through a series of technical improvements.

Lift maker ThyssenKrupp has launched MAX, a system of ‘urban efficiency’ which runs on Microsoft Azure’s Internet of Things (IoT) technology.

With 12 million lifts across the world shifting a billion people each day, downtime of this increasingly critical transport mechanism is an ever bigger problem. Yet the management of lifts has not kept pace with technology developments, and service disruptions amount to over 190 million cumulative hours. The amount of time wasted as staff are forced to walk up several floors, or wait for a crowded alternative lift, has not been calculated but is expected to be several times higher. ThyssenKrupp Elevator says its new MAX system will improve productivity by cutting lift service outages in half.

The MAX system aims to raise the reliability of lifts with a predictive and pre-emptive service that uses remote monitoring to dramatically increase their availability. It uses intelligent agents to tell service technicians what each lift needs, including the identification of repairs, component replacements and proactive system maintenance.

Data collected from millions of connected ThyssenKrupp elevators is sent to a system running in Microsoft’s Azure cloud, where an algorithm calculates the remaining lifetime of key systems and components in each elevator. From this, ThyssenKrupp’s team of 20,000 global service engineers and technicians can inform building owners in advance when systems or components will need attention. These programmed interventions help avert lift downtime, according to Andreas Schierenbeck, CEO of ThyssenKrupp Elevator.

“Our mission is to transform a century-old industry that has relied on established technology until now. We are very pleased to take ThyssenKrupp into the digital era and change the way the elevator industry offers maintenance services,” he said.
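The article does not describe the algorithm behind those lifetime calculations, but the underlying idea of estimating remaining component life from usage data can be illustrated with a simple sketch. The rated cycle count and usage figures below are invented for the example; they are not ThyssenKrupp’s model.

```python
# Illustrative remaining-useful-life estimate for an elevator component,
# based on a rated number of door cycles and the observed usage rate.

def remaining_days(rated_cycles, cycles_so_far, avg_cycles_per_day):
    """Days until the component reaches its rated cycle count at the
    current usage rate."""
    remaining_cycles = max(rated_cycles - cycles_so_far, 0)
    return remaining_cycles / avg_cycles_per_day if avg_cycles_per_day else float("inf")

# A door operator rated for 1,000,000 cycles, with 720,000 already used
# and averaging 800 cycles a day, has roughly 350 days of life left.
print(round(remaining_days(1_000_000, 720_000, 800)))  # 350
```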

ThyssenKrupp aims to connect 180,000 units in North America and Europe, with the US, Germany, and Spain as pilot countries and other key countries in Europe, Asia and South America following shortly after. In two years, the offering will be expanded to all continents, becoming available to 80% of all elevators worldwide.

IBM to buy The Weather Company and make it elementary to Watson

IBM has entered into an agreement to buy The Weather Company’s B2B, mobile and cloud-based web properties, in a bid to extend its Internet of Things range.

The acquired assets include WSI, weather.com, Weather Underground and The Weather Company brand. The Weather Channel will not be part of the acquisition but it will license weather forecast data and analytics from IBM under a long-term contract.

IBM says the combination of technology and expertise from the two companies will be the foundation for the new Watson IoT Unit and Watson IoT Cloud platform, part of its $3 billion investment strategy in this sector.

The Weather Company’s cloud data system runs the fourth most-used mobile app daily in the United States and handles 26 billion inquiries a day.

On closing the deal, IBM will acquire The Weather Company’s product and technology assets, which include meteorological data science experts, precision forecasting and a cloud platform that ingests, processes, analyses and distributes petabyte-sized data sets instantly. The Weather Company’s models analyse data from three billion weather forecast reference points, more than 40 million smartphones and 50,000 airplane flights per day, allowing it to offer a broad range of data-driven products and services to 5,000 clients in the media, aviation, energy, insurance and government industries.

The Weather Company’s mobile and web properties serve 82 million unique monthly visitors. IBM said it plans to develop The Weather Company’s digital advertising platform and skills, commercialising weather information through data-driven advertising with additional ad-sponsored consumer and business solutions.

“The next wave of improved forecasting will come from the intersection of atmospheric science, computer science and analytics,” said Weather Company CEO David Kenny. “Upon closing of this deal, The Weather Company will continue to be able to help improve the precision of weather forecasts and deepen IBM’s Watson IoT capabilities.”

Oracle announces new levels of cloud, mobile and IoT integration in its Cloud Platform

Oracle has announced at OpenWorld a ‘comprehensive’ suite of integration services to help clients connect their cloud, mobile and IoT systems into the Oracle Cloud Platform.

The Oracle Cloud Platform for Integration portfolio now includes Oracle’s IoT Cloud, Integration Cloud, SOA Cloud and API Manager Cloud range of services.

Oracle says its Integration Cloud is ideal for non-technical users such as citizen integrators, applications staff in IT departments and line-of-business managers who need to integrate software-as-a-service (SaaS) applications. To this end it comes with a simple, intuitive, web-based, point-and-click user interface.

At the other end of the technical competence spectrum, Oracle’s SOA Cloud was designed for integration developers. It provides a full integration platform, including service virtualisation, process orchestration, B2B integration, managed file transfer and business activity monitoring dashboards. Reflecting the more detailed nature of its typical users’ work, it offers fine-grained control and support for a wide range of use cases.

Oracle’s integration cloud services are fully portable, it claims, so that users can move their integration workloads between on-premises systems and the cloud as business requirements change.

The IT architectures that organisations have relied on for decades are too rigid and inflexible for the digital age, according to Amit Zavery, Oracle’s senior VP of Cloud Platform and Integration products. “Organisations need to rethink API management and service integration across cloud, mobile and IoT initiatives,” said Zavery.

The Oracle Cloud Platform’s suite of integration services will provide the flexibility to let them adapt, which will boost productivity, slash costs and catalyse inventive processes, Zavery argued.

The Oracle Internet of Things Cloud Service should make it easy to connect any device, whether it generates or analyses data, and to extend business processes within enterprise applications, says Oracle. This, it says, will lead to faster development of IoT applications, with preventive maintenance and asset tracking pre-integrated with other Oracle systems such as Oracle PaaS, Oracle SaaS, Oracle JD Edwards, Oracle E-Business Suite and Oracle Fusion.

Meanwhile, the Oracle API Manager Cloud Service will help developers to create and expose APIs to internal or external consumers quickly but without compromising on security, according to the vendor.

The Oracle Cloud Platform is part of the Oracle Cloud. Oracle says its Cloud offering supports 70 million users and more than 34 billion transactions each day and runs on more than 50,000 devices and more than 800 petabytes of storage in 19 data centres around the world.

Microsoft and Azul Systems say Zulu Embedded will encourage IoT in Windows

Azul Systems and Microsoft are to give Java developers open source development tools, device I/O libraries and a Java runtime targeting Internet of Things (IoT) applications on Windows 10.

The two vendors have created Zulu Embedded for Windows 10 IoT, which comprises a Java Development Kit (JDK), a Java Virtual Machine (JVM) and a set of device I/O libraries. The package is based on OpenJDK, has been certified by Azul for use with Windows 10 IoT Core and is compliant with the Java SE 8 specification.

Microsoft Windows 10 IoT Core is a modified version of Windows 10 that has been tailored to suit cheap, small-footprint embedded devices such as those based on the Raspberry Pi 2 and MinnowBoard MAX.

The aim of the partnership is to ensure Zulu Embedded meets Java development and runtime requirements for Microsoft’s IoT initiatives. The success of the joint effort will be gauged by the number of Java compatibility updates, security patches and the levels of support for additional IoT device connectivity, control and communication, according to a joint statement.

There are Java developers around the world using Windows 10 IoT Core, according to Steve Teixeira, Director of Program Management for the Windows Internet of Things team at Microsoft. This new initiative means they will be assured of a high-quality foundation for their Java projects if they use the latest advances in OpenJDK.

“Developers have many development and deployment choices for their IoT applications,” said Teixeira. By giving them more support, they are more likely to stay in the Microsoft cloud camp, he said. “Microsoft and Azul have made it easy for those who prefer Java to build premier IoT devices running Windows.”

Azul Systems is committed to updating and evolving Zulu Embedded to meet the specific requirements of Microsoft’s IoT platforms, said Scott Sellers, CEO of Azul Systems.

Zulu Embedded for Windows 10 IoT is free to download and use and may be distributed without restriction.

Mercedes-Benz and Pivotal forge smart car apps on Cloud Foundry

Using your mobile phone while driving could become compulsory, thanks to a new connected car application being jointly developed by Mercedes-Benz and Pivotal.

The Mercedes me app will give drivers real-time information about the status of their cars through their smartphones and smart watches.

Pivotal and Mercedes-Benz are building the app on Pivotal’s Cloud Foundry and Spring in a bid to give Mercedes drivers information about their car’s vital signs (such as oil, water and petrol levels) and remote control of everything from heating and locks to navigation. The system will work with a navigation tool via iPhone and Apple Watch.

According to Mercedes-Benz, by 2020 all vehicles will be emission-free and will feature autonomous driving and deep levels of Internet connectivity. To support these initiatives, it is using Pivotal’s cloud-native platform, Pivotal Cloud Foundry, with the developer framework Spring Boot.

With Daimler and Mercedes-Benz both anxious to meet emissions targets, their developers were keen to explore all the possibilities of Pivotal’s modern agile software development methods, said Scott Yara, Co-President at Pivotal. “They are now also a great software company,” he added.

Daimler’s work with Pivotal’s cloud platform shortened its innovation cycle by helping it develop the system faster than ever, according to Christoph Hartung, Head of Connected Cars at Mercedes-Benz. “Our collaboration with Pivotal will define a new digital driving culture with state-of-the-art information technologies, online communication systems and automotive services,” said Hartung.

New Teradata apps enable IoT analytics

Teradata customers can now listen to Internet of Things data thanks to two new software products designed to turn those streams into insights.

Teradata’s new Listener and Aster Analytics applications can intelligently listen in real-time and then use analytics to see the distinctive patterns in massive streams of IoT data, it says.

Teradata Listener is an intelligent system that can follow multiple streams of sensor and IoT data wherever they exist globally and feed them to a choice of different analytical systems. Data sent to the Teradata Integrated Big Data Platform 1800 provides access to large volumes of data through its native support of JSON (JavaScript Object Notation). Alternatively, data fed to a Hadoop-based system can be analysed at scale with Teradata Aster Analytics on Hadoop.
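To give a flavour of the JSON-formatted records such a pipeline ingests, here is a hypothetical sensor payload and a minimal routing rule. The field names, thresholds and destinations are illustrative, not part of Teradata’s API.

```python
import json

# Hypothetical IoT sensor record of the kind a stream listener might ingest.
raw = '{"device_id": "pump-042", "ts": "2015-11-04T10:32:00Z", "temp_c": 71.5, "vibration_mm_s": 4.2}'

record = json.loads(raw)

# Simple routing rule: hot or heavily vibrating pumps go to an "alerts"
# stream for immediate analysis, everything else to bulk storage.
destination = "alerts" if record["temp_c"] > 70 or record["vibration_mm_s"] > 5 else "archive"
print(destination, record["device_id"])  # alerts pump-042
```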

Teradata Listener helps data scientists, business analysts and developers to analyse new data streams for faster answers to business questions. Users can analyse data from numerous sources including sensors, telematics, mobile events, click streams, social media feeds and IT server logs, without seeking technical help from the IT department.

Teradata Aster Analytics on Hadoop has 100 pre-configured analytics techniques and seven vertical industry applications to run directly on Hadoop.

In a hospital, data from magnetic resonance imaging (MRI), radiography and ultrasound imaging equipment might be streamed as text logs. This information, describing patient behaviour alongside sensor data, could be streamed into a Hadoop data lake. The new systems allow users to run text analytics on the data in order to find out how effectively personnel are working and how efficiently expensive resources, such as MRI scanners, are being used.
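A toy version of that utilisation analysis, parsing scan start and end events out of text logs and computing how busy a scanner was over a shift, might look like the following. The log format is invented for illustration.

```python
from datetime import datetime

# Invented log format: "<timestamp> <scanner_id> <event>"
log_lines = [
    "2015-11-04T08:00:00 MRI-1 SCAN_START",
    "2015-11-04T08:40:00 MRI-1 SCAN_END",
    "2015-11-04T09:10:00 MRI-1 SCAN_START",
    "2015-11-04T09:55:00 MRI-1 SCAN_END",
]

def utilisation(lines, shift_hours=8.0):
    """Fraction of a shift during which the scanner was actually in use."""
    busy_seconds, start = 0.0, None
    for line in lines:
        ts, _, event = line.split()
        t = datetime.fromisoformat(ts)
        if event == "SCAN_START":
            start = t
        elif event == "SCAN_END" and start is not None:
            busy_seconds += (t - start).total_seconds()
            start = None
    return busy_seconds / (shift_hours * 3600)

print(f"{utilisation(log_lines):.0%}")  # ~18% of an eight-hour shift
```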

“Customers can combine IoT data with business operations and human behavioural data to maximise analytic value,” said Hermann Wimmer, Teradata’s co-president. “Teradata Listener and Aster Analytics on Hadoop are breakthrough IoT technologies that push the analytic edge, making the ‘Analytics of Everything’ possible.”

The collection and analysis of sensor and IoT data has been integral to driving the efficiency of the rail business, according to railways expert Gerhard Kress, director of Analytical Services at Siemens’ Mobility Division.

Kii and KDDI say their joint platform will make IoT safe on cloud

Japanese telco KDDI is working with Internet of Things (IoT) cloud platform provider Kii to create a risk-averse environment in which enterprises can develop mobile apps.

The KDDI cloud platform service (KCPS) is described as a mobile backend-as-a-service (mBaaS) offering that uses Kii’s software to create mobile and IoT apps on a private network. The two companies have worked together on ways to apply cloud disciplines for efficient sharing of resources within the confines of an intranet environment. The object of the collaboration is to allow companies to develop machine-to-machine systems without exposing them to the public cloud while they are in development.

According to KDDI, the KCPS uses the telco’s Wide Area Virtual Switch to integrate a number of different virtual network layers with Kii’s software, which together create fast connections across the intranet. KCPS also provides a service environment for intranet-conscious customers who need high standards of security and enterprise functions without resorting to the public Internet, according to the vendor.

KDDI claims this is the first instance in which both Intranet and Internet services can work seamlessly with any mobile application developed on the KCPS platform.

KDDI’s application development support will allow developers to build better quality, lower priced applications in a short period of time, it claims. The platform is designed to help developers manage application development, devices and data, while providing essential features like push notifications and geo-location information. KCPS should be compatible with mobile apps on Android and iOS, according to KDDI.

“As the IoT gains mass acceptance, we see tremendous value in helping mobile app developers get more IoT devices into the hands of consumers,” said Kii CEO Masanari Arai. “Our collaboration will use the cloud to build the backend support for these apps in Japan.”

Amazon continues Internet of Things push with AWS IoT

Amazon’s new AWS IoT platform is designed to allow IoT devices to connect to the AWS cloud, and includes a managed cloud service to help process the data they generate.

AWS IoT has been launched in beta, which usually means it’s not quite ready yet, but it needs people to try it out in order to iron out lingering bugs. In essence it appears to be Amazon’s play to put itself in the thick of the IoT land-grab, as the repository of all the data constantly being generated by the billions of sensors expected to comprise the IoT.

In many ways Amazon’s previous launches and announcements at this year’s AWS re:Invent seem to have been leading up to this, as they’ve all been about making it easier to transfer data into the AWS cloud. Specifically, Amazon Kinesis Firehose, which is designed to make it easier to upload streaming data to the AWS cloud, seems to have been launched with the IoT in mind.
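For instance, pushing a sensor reading into a Firehose delivery stream from Python takes only a few lines. The stream name below is a placeholder, and the snippet assumes AWS credentials and the boto3 SDK are already set up.

```python
import json
import boto3

# Assumes AWS credentials are configured and a Firehose delivery stream
# named "iot-sensor-stream" (a placeholder) already exists.
firehose = boto3.client("firehose", region_name="us-east-1")

reading = {"device_id": "thermostat-17", "temp_c": 21.4, "ts": "2015-10-08T12:00:00Z"}

firehose.put_record(
    DeliveryStreamName="iot-sensor-stream",
    Record={"Data": (json.dumps(reading) + "\n").encode("utf-8")},
)
```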

“The promise of the Internet of Things is to make everyday products smarter for consumers, and for businesses to enable better, data-driven offerings that weren’t possible before,” said Marco Argenti, VP of Mobile and IoT at AWS.

“World-leading organizations like Philips, NASA JPL, and Sonos already use AWS services to support the back-end of their IoT applications. Now, AWS IoT enables a whole ecosystem of manufacturers, service providers, and application developers to easily connect their products to the cloud at scale, take action on the data they collect, and create a new class of applications that interact with the physical world.”

Device connections are handled by a device gateway, which provides tools for predetermining responses to data received. AWS IoT also creates a virtual version of each device in the cloud so that it can be interacted with even during periods of intermittent connectivity. A dedicated SDK aims to make it easier for developers to do clever things with IoT devices, and a number of semiconductor companies have already got on board by embedding the SDK into IoT chips, including Broadcom, Intel, Marvell, Mediatek, Microchip, Qualcomm and TI. There are also a number of IoT starter kits which can, of course, be bought on Amazon.
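Under the hood the device gateway speaks standard protocols such as MQTT, so a device can publish telemetry with any MQTT client. The sketch below uses the generic paho-mqtt library (1.x API) with placeholder endpoint, topic and certificate paths rather than the AWS-provided SDK.

```python
import json
import ssl
import paho.mqtt.client as mqtt

# Placeholders: your AWS IoT endpoint and the certificates provisioned
# for this device ("thing"). Assumes the paho-mqtt 1.x package.
ENDPOINT = "example-ats.iot.us-east-1.amazonaws.com"

client = mqtt.Client(client_id="sensor-001")
client.tls_set(ca_certs="root-ca.pem",
               certfile="device-cert.pem",
               keyfile="device-key.pem",
               tls_version=ssl.PROTOCOL_TLSv1_2)

client.connect(ENDPOINT, port=8883)
client.loop_start()

payload = {"device_id": "sensor-001", "temp_c": 22.8}
client.publish("sensors/sensor-001/telemetry", json.dumps(payload), qos=1)

client.loop_stop()
client.disconnect()
```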

“At Philips we aim to empower people to take greater control of their health with digital solutions that support healthy living and improved care coordination,” said Jeroen Tas, CEO Healthcare Informatics, Solutions and Services at Philips. “Our HealthSuite digital platform and its device cloud are already managing more than seven million connected, medical-grade and consumer devices, sensors, and mobile apps.

“With the addition of AWS IoT, we will greatly accelerate the pursuit of our vision. It will be easier to acquire, process, and act upon data from heterogeneous devices in real-time. Our products, and the care they support, are enabled to grow smarter and more personalized over time.”

On top of launches like the Dash Button, its automated ordering service for consumables, this move cements Amazon’s ambition to be a major IoT player, with AWS at the core. If it delivers on the promise of making the IoT easier for companies and developers, all the other tech giants currently involved in the IoT land grab may need to raise their game.