Category archive: Features

Overcoming the data integration challenge in hybrid and cloud-based environments


Industry experts estimate that data volumes are doubling in size every two years. Managing all of this is a challenge for any enterprise, but it’s not just the volume of data as much as the variety of data that presents a problem. With SaaS and on-premises applications, machine data, and mobile apps all proliferating, we are seeing the rise of an increasingly complicated value-chain ecosystem. IT leaders need to incorporate a portfolio-based approach and combine cloud and on-premises deployment models to sustain competitive advantage. Improving the scale and flexibility of data integration across both environments to deliver a hybrid offering is necessary to provide the right data to the right people at the right time.

The evolution of hybrid integration approaches creates requirements and opportunities for converging application and data integration. The definition of hybrid integration will continue to evolve, but its current trajectory is clearly headed to the cloud.

According to IDC, cloud IT infrastructure spending will grow at a compound annual growth rate (CAGR) of 15.6 percent between now and 2019, at which point it will reach $54.6 billion. In line with this, customers need to advance their hybrid integration strategy to best leverage the cloud. At Talend, we have identified five phases of integration, starting from the oldest and most mature right through to the most bleeding-edge and disruptive. Here we take a brief look at each and show how businesses can optimise the approach as they move from one step to the next.

Phase 1: Replicating SaaS Apps to On-Premises Databases

The first stage in developing a hybrid integration platform is to replicate SaaS applications to on-premises databases. Companies in this stage typically either need analytics on some of the business-critical information contained in their SaaS apps, or they are sending SaaS data to a staging database so that it can be picked up by other on-premises apps.

In order to increase the scalability of existing infrastructure, it’s best to move to a cloud-based data warehouse service within AWS, Azure, or Google Cloud. The scalability of these cloud-based services means organisations don’t need to spend cycles refining and tuning the databases. Additionally, they get all the benefits of utility-based pricing. However, with the myriad of SaaS apps today generating even more data, they may also need to adopt a cloud analytics solution as part of their hybrid integration strategy.
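To make Phase 1 concrete, here is a minimal sketch of replicating records from a SaaS application into a cloud data warehouse. It is illustrative only: the SaaS endpoint, credentials and table are placeholders, and Redshift’s PostgreSQL-compatible interface simply stands in for whichever warehouse service is chosen.

```python
# Minimal sketch of Phase 1: replicating SaaS records into a cloud data warehouse.
# The SaaS endpoint, table and warehouse credentials below are hypothetical placeholders.
import requests
import psycopg2  # works against Amazon Redshift's PostgreSQL-compatible endpoint

SAAS_URL = "https://api.example-saas.com/v1/opportunities"   # hypothetical SaaS API
API_TOKEN = "replace-me"                                     # placeholder credential

def extract_opportunities():
    """Pull records from the SaaS application's REST API."""
    resp = requests.get(SAAS_URL, headers={"Authorization": f"Bearer {API_TOKEN}"})
    resp.raise_for_status()
    return resp.json()["records"]

def load_to_warehouse(records):
    """Replicate the records into a warehouse table for analytics."""
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # hypothetical cluster
        dbname="analytics", user="loader", password="replace-me", port=5439)
    with conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO saas_opportunities (id, name, amount) VALUES (%s, %s, %s)",
            [(r["id"], r["name"], r["amount"]) for r in records])

if __name__ == "__main__":
    load_to_warehouse(extract_opportunities())
```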

Phase 2: Integrating SaaS Apps Directly with On-Premises Apps

Each line of business has its preferred SaaS app: sales has Salesforce, marketing has Marketo, HR has Workday, and finance has NetSuite. However, these SaaS apps still need to connect to a back-office, on-premises ERP system.

Due to the complexity of back-office systems, there isn’t yet a widespread SaaS solution that can serve as a replacement for ERP systems such as SAP R/3 and Oracle EBS. Businesses would be best advised not to try to integrate with every single object and table in these back-office systems – but rather to accomplish a few use cases really well so that their business can continue running, while also benefiting from the agility of cloud.
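As an illustration of the “few use cases done really well” approach, the hedged sketch below syncs newly created Salesforce accounts into a staging table that an on-premises ERP could poll. The credentials are placeholders, only the fields the use case needs are queried, and SQLite merely stands in for the ERP’s staging database.

```python
# Minimal sketch of Phase 2: syncing one narrow use case (new Salesforce accounts)
# into an on-premises ERP staging table, rather than integrating every object.
# Credentials are placeholders; sqlite3 stands in for the ERP's staging database.
import sqlite3
from simple_salesforce import Salesforce  # assumes this third-party client is installed

def fetch_new_accounts(sf):
    """Query only the fields the ERP use case actually needs."""
    result = sf.query("SELECT Id, Name, BillingCountry FROM Account WHERE CreatedDate = TODAY")
    return result["records"]

def stage_for_erp(records, db_path="erp_staging.db"):
    """Land the records in a staging table the back-office ERP can pick up."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS account_staging "
                 "(sf_id TEXT PRIMARY KEY, name TEXT, country TEXT)")
    conn.executemany(
        "INSERT OR REPLACE INTO account_staging VALUES (?, ?, ?)",
        [(r["Id"], r["Name"], r.get("BillingCountry")) for r in records])
    conn.commit()

if __name__ == "__main__":
    sf = Salesforce(username="user@example.com", password="replace-me", security_token="replace-me")
    stage_for_erp(fetch_new_accounts(sf))
```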

Phase 3: Hybrid Data Warehousing with the Cloud

Databases and data warehouses on a cloud platform are geared toward supporting data warehouse workloads: low-cost, rapid proof-of-value projects as well as ongoing data warehouse solutions. As the volume and variety of data increases, enterprises need a strategy for moving their data from on-premises warehouses to newer, Big Data-friendly cloud resources.

While they take time to decide which Big Data protocols best serve their needs, they can start by creating a Data Lake in the cloud with a cloud-based service such as Amazon Web Services (AWS) S3 or Microsoft Azure Blobs. These lakes can relieve the cost pressures imposed by on-premises relational databases and act as a “demo area”, enabling businesses to process information using their Big Data protocol of choice and then transfer it into a cloud-based data warehouse. Once enterprise data is held there, the business can enable self-service with Data Preparation tools, capable of organising and cleansing the data prior to analysis in the cloud.
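A minimal sketch of that landing step, assuming Amazon S3 as the lake and the boto3 client: raw extracts are copied into a “raw” zone partitioned by source and date, from where a Big Data engine or data preparation tool can later pick them up. The bucket and path names are illustrative placeholders.

```python
# Minimal sketch of Phase 3: landing raw on-premises extracts in a cloud data lake
# (Amazon S3 here), where they can later be processed and moved into a warehouse.
# Bucket name and file paths are illustrative placeholders.
import boto3
from pathlib import Path

s3 = boto3.client("s3")               # assumes AWS credentials are configured in the environment
LAKE_BUCKET = "example-data-lake"     # hypothetical bucket
RAW_PREFIX = "raw/crm/2016-02-16/"    # partition raw data by source and date

def land_raw_extracts(local_dir="exports"):
    """Copy each exported file into the lake's raw zone, keyed by source and date."""
    for path in Path(local_dir).glob("*.csv"):
        s3.upload_file(str(path), LAKE_BUCKET, RAW_PREFIX + path.name)
        print(f"landed {path.name} in s3://{LAKE_BUCKET}/{RAW_PREFIX}")

if __name__ == "__main__":
    land_raw_extracts()
```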

Phase 4: Real-time Analytics with Streaming Data

Businesses today need insight at their fingertips in real-time. In order to prosper from the benefits of real-time analytics, they need an infrastructure to support it. These infrastructure needs may change depending on use case—whether it be to support weblogs, clickstream data, sensor data or database logs.

As big data analytics and ‘Internet of Things’ (IoT) data processing moves to the cloud, companies require fast, scalable, elastic and secure platforms to transform that data into real-time insight. The combination of Talend Integration Cloud and AWS enables customers to easily integrate, cleanse, analyse, and manage batch and streaming data in the Cloud.
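For illustration, here is a minimal sketch of the ingestion end of such a pipeline, assuming Amazon Kinesis as the streaming service: each clickstream event is published to a stream from which downstream analytics can consume in near real time. The stream name and event shape are invented for the example, not a specific product integration.

```python
# Minimal sketch of Phase 4: pushing clickstream events onto a streaming service
# (Amazon Kinesis here) so they can be analysed in near real time.
import json
import time
import boto3

kinesis = boto3.client("kinesis")   # assumes AWS credentials are configured
STREAM = "clickstream-events"       # hypothetical stream name

def publish_click(user_id, page):
    """Publish one clickstream event; downstream consumers analyse the stream."""
    event = {"user": user_id, "page": page, "ts": time.time()}
    kinesis.put_record(
        StreamName=STREAM,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=user_id)  # keeps each user's events ordered within a shard

if __name__ == "__main__":
    publish_click("user-42", "/pricing")
```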

Phase 5: Machine Learning for Optimized App Experiences

In the future, every experience will be delivered as an app through mobile devices. By providing the ability to discover patterns buried within data, machine learning has the potential to make applications more powerful and more responsive. Well-tuned algorithms allow value to be extracted from disparate data sources without the limits of human thinking and analysis. For developers, machine learning offers the promise of applying business-critical analytics to any application in order to accomplish everything from improving customer experience to serving up hyper-personalised content.

To make this happen, developers need to:

  • Be “all-in” with the use of Big Data technologies and the latest streaming big data protocols
  • Have large enough data sets for the machine-learning algorithm to recognize patterns
  • Create segment-specific datasets using machine-learning algorithms (see the sketch after this list)
  • Ensure that their mobile apps have properly-built APIs to draw upon those datasets and provide the end user with whatever information they are looking for in the correct context
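As a toy illustration of the segment-specific datasets point above, the sketch below clusters users into behavioural segments with k-means and then splits the data per segment. The features and figures are invented, and k-means simply stands in for whatever algorithm a team actually chooses.

```python
# Minimal sketch: cluster users into behavioural segments with k-means, then build
# one dataset per segment so an app can serve hyper-personalised content.
import numpy as np
from sklearn.cluster import KMeans

# each row: [sessions_per_week, avg_basket_value, pct_mobile_traffic] (invented data)
usage = np.array([
    [2, 15.0, 0.90], [3, 18.0, 0.80], [14, 120.0, 0.20],
    [12, 95.0, 0.30], [1, 5.0, 0.95], [15, 150.0, 0.10],
])

model = KMeans(n_clusters=2, random_state=0).fit(usage)

# split the dataset by discovered segment
segments = {label: usage[model.labels_ == label] for label in set(model.labels_)}
for label, rows in segments.items():
    print(f"segment {label}: {len(rows)} users, mean basket {rows[:, 1].mean():.2f}")
```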

Making it Happen with iPaaS

In order for companies to reach this level of ‘application nirvana’, they will need to have first achieved or implemented each of the four previous phases of hybrid application integration.

That’s where we see a key role for integration platform-as-a-service (iPaaS), which is defined by analysts at Gartner as ‘a suite of cloud services enabling development, execution and governance of integration flows connecting any combination of on-premises and cloud-based processes, services, applications and data within individual or across multiple organisations.’

The right iPaaS solution can help businesses achieve the necessary integration, and even bring in native Spark processing capabilities to drive real-time analytics, enabling them to move through the phases outlined above and ultimately successfully complete stage five.

Written by Ashwin Viswanath, Head of Product Marketing at Talend

Cloud academy: Rudy Rigot and his new Holberton School

Business Cloud News talks to Container World (February 16 – 18, 2016, Santa Clara Convention Center, USA) keynote Rudy Rigot about his new software college, which opens today.

Business Cloud News: Rudy, first of all – can you introduce yourself and tell us about your new Holberton School?

Rudy Rigot: Sure! I’ve been working in tech for the past 10 years, mostly in web-related stuff. Lately, I’ve worked at Apple as a full-stack software engineer for their localization department, which I left this year to found Holberton School.

Holberton School is a 2-year community-driven and project-oriented school, training software engineers for the real world. No classes, just real-world hands-on projects designed to optimize their learning, in close contact with volunteer mentors who all work for small companies or large ones like Google, Facebook, Apple, … One of the other two co-founders is Julien Barbier, formerly the Head of Community, Marketing and Growth at Docker.

Our first batch of students started last week!

What are some of the challenges you’ve had to anticipate?

Since we’re a project-oriented school, students are mostly graded on the code they turn in, which they push to GitHub. Some of this code is graded automatically, so we needed to be able to run each student’s code (or each team’s code) automatically, in a fair and equal way.

We needed to get information on the “what” (what is returned in the console), but also on the “how”: how long does the code take to run? How much resource is being consumed? What is the return code? Also, since Holberton students are trained on a wide variety of languages, how do you ensure you can grade a Ruby project, and later a C project, and later a JavaScript project, and so on, with the same host while minimizing issues?

Finally, we had to make sure that a student can commit code that is as malicious as they want without a human needing to check it before it runs: it should only break their own program, not the whole host.

So how on earth do you negotiate all these?

Our project-oriented training concept is new in the United States, but it’s been successful for decades in Europe. We knew that the European schools, which built their programs before containers became mainstream, typically run the code directly on a host system with all of the required software installed, and then simply run a chroot before running the student’s code. This didn’t solve all of the problems, while containers did, in a very elegant way; so we took the container road!

HolbertonCloud is the solution we built to that end. It fetches a student’s code on command, then runs it based on a Dockerfile and a series of tests, and finally returns information about how that went. The information is then used to compute a score.

What’s amazing about it is that by using Docker, building the infrastructure has been trivial; the hard part has been about writing the tests, the scoring algorithm … basically the things that we actively want to be focused on!
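A minimal sketch of that flow, assuming the Docker CLI is available: build the student’s repository from its Dockerfile, run it in an isolated container, and capture the return code, console output and elapsed time that feed the score. The image name, resource limits and timeout are illustrative choices rather than Holberton’s actual settings.

```python
# Minimal sketch of the HolbertonCloud idea described above: build a student's repo
# from its Dockerfile, run it in an isolated container, and record the return code,
# console output and wall-clock time used to compute a score.
import subprocess
import time

def grade(repo_dir, image="student-submission", timeout=30):
    # build the image from the Dockerfile shipped with the student's code
    subprocess.run(["docker", "build", "-t", image, repo_dir], check=True)

    start = time.monotonic()
    try:
        run = subprocess.run(
            ["docker", "run", "--rm", "--net=none", "--memory=256m", image],
            capture_output=True, text=True, timeout=timeout)
        exit_code, stdout = run.returncode, run.stdout
    except subprocess.TimeoutExpired:
        exit_code, stdout = None, ""          # treat a hang as a failed run
    elapsed = time.monotonic() - start

    return {"exit_code": exit_code, "stdout": stdout, "seconds": round(elapsed, 2)}

if __name__ == "__main__":
    print(grade("./student-repo"))
```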

So you’ve made use of containers. How much disruption do you expect their development to engender over the coming years?

Since I’m personally more on the “dev” end of devops, I see how striking it is that containers restore focus on actual development for my peers. So, I’m mostly excited by the innovation that software engineers will be focusing on instead of the issues that containers are taking care of for them.

Of course, it will be very hard to measure which of those innovations were only able to exist because containers are involved; but they will be innovations touching virtually every corner of the tech industry, so that’s really exciting!

What effect do you think containers are going to have on the delivery of enterprise IT?

I think one takeaway from the very specific HolbertonCloud use case is that cases where code can be run trivially in production are getting rare, and one needs guarantees that only containers can bring efficiently.

Also, a lot of modern architectures fulfil needs with systems made up of more and more micro-services, since we now have enough hindsight to see the positive outcomes for their resilience. Each micro-service may have different requirements and therefore be best built with a different technology, so managing a growing set of different software configurations is becoming increasingly relevant. Considering the positive outcomes, this trend will only keep growing, making the need for containers keep growing as well.

You’re delivering a keynote at Container World. What’s the main motivation for attending?

I’m tremendously excited by the stellar line-up! We’re all going to get amazing insight from many different and relevant perspectives, that’s going to be very enlightening!

The very existence of Container World is exciting too: it’s remarkable how far containers have come over the span of just a few years.

Click here to learn more about Container World (February 16 – 18, 2016 Santa Clara Convention Center, USA)

The IoT in Palo Alto: connecting America’s digital city

Palo Alto is not your average city. Established by the founder of Stanford University, it was the soil from which Google, Facebook, Pinterest and PayPal (to name a few) have sprung forth. Indeed, Palo Alto has probably done more to transform human life in the last quarter century than any other city. So, when we think of how the Internet of Things is going to affect life in the coming decades, we can be reasonably sure where much of the expected disruption will originate.

All of which makes Palo Alto a great place to host the first IoT Data Analytics & Visualization event (February 9 – 11, 2016). Additionally fitting: the event is set to be kicked off by Dr. Jonathan Reichental, the city’s Chief Information Officer. Reichental is the man entrusted with the hefty task of ensuring the city is as digital, smart and technologically up-to-date as a place that has been called home by the likes of Steve Jobs, Mark Zuckerberg, Larry Page and Sergey Brin should be.

Thus far, Reichental’s tenure has been a great success. In 2013, Palo Alto was credited with being the number one digital city in the US, and has made the top five year upon year – in fact, it so happens that, following our long and intriguing telephone interview, Reichental is looking forward to a small celebration to mark its latest nationwide ranking.

BCN: Jonathan, you’ve been Palo Alto’s CIO now for four years. What’s changed most during that time span?

Dr Jonathan Reichental: I think the first new area of substance would be open government. I recognise open government’s been a phenomenon for some time, but over the course of the last four years, it has become a mainstream topic that city and government data should be easily available to the people. That it should be machine readable, and that an API should be made available to anyone that wants the data. That we have a richer democracy by being open and available.

We’re still at the beginning however. I have heard that there are approximately 90,000 public agencies in the US alone. And every day and week I hear about a new federal agency or state or city of significance saying, ‘you can now go to our data portal and freely access the data of the city or the public agency.’ The shift is happening but it’s got some way to go.

Has this been a purely technical shift, or have attitudes had to evolve as well?

I think if you kind of look at something like cloud, cloud computing and cloud as a capability for government – back when I started ‘cloud’ was a dirty word. Many government leaders and government technology leaders just weren’t open to the option of putting major systems off-premise. That has begun to shift quite positively.

I was one of the first to say that cloud computing is a gift to government. Cloud eliminates the need to have all the maintenance that goes with keeping systems current and keeping them backed up and having disaster recovery. I’ve been a very strong proponent of that.

Then there’s social media  – government has fully embraced that now, having been reluctant early on. Mobile is beginning to emerge though it’s still very nascent. Here in Palo Alto we’re trying to make all services that make sense accessible via smart phone. I call it ‘city in a box.’ Basically, bringing up an app on the smart phone you should be able to interact with government – get a pet license, pay a parking fee, pay your electrical bill: everything should really be right there on the smartphone, you shouldn’t need to go to City Hall for many things any more.

The last thing I’d say is there has been an uptake in community participation in government. Part of it is it’s more accessible today, and part of it is there’s more ways to do so, but I think we’re beginning also to see the fruits of the millennial generation – the democratic shift in people wanting to have more of a voice and a say in their communities. We’re seeing much more in what is traditionally called civic engagement. But ‘much more’ is still not a lot. We need to have a revolution in this space for there to be significant change to the way cities operate and communities are effective.

Palo Alto is hosting the IoT Data Analytics & Visualization event in February. How have you innovated in this area as a city?

One of the things we did with data is make it easily available. Now we’re seeing a community of people in the city and beyond, building solutions for communities. One example of that is a product called Civic Insight. This app consumes the permit data we make available and enables users to type in an address and find out what’s going on in their neighbourhood with regard to construction and related matters.
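The pattern behind an app like Civic Insight is straightforward to sketch: query the city’s open permit dataset for a given address and display what comes back. The endpoint and field names below are hypothetical placeholders rather than Palo Alto’s actual portal.

```python
# Minimal sketch of consuming a city's open permit data, as apps like Civic Insight do.
# The endpoint and field names are hypothetical placeholders.
import requests

PERMITS_URL = "https://data.example-city.gov/resource/building-permits.json"  # placeholder

def permits_for_address(address):
    """Return recent permit records filed for a given street address."""
    resp = requests.get(PERMITS_URL, params={"address": address})
    resp.raise_for_status()
    return resp.json()

for permit in permits_for_address("123 Example Ave"):
    print(permit.get("permit_type"), "-", permit.get("status"))
```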

That’s a clear example of where we didn’t build the thing, we just made the data available and someone else built it. There’s an economic benefit to this. It creates jobs and innovation – we’ve seen that time and time again. We saw a company build a business around Palo Alto releasing our budget information. Today they are called OpenGov, and they sell the solution to over 500 cities in America, making it easy for communities to understand where their tax payer dollars are being spent. That was born and created in Palo Alto because of what we did making our data available.

Now we get to today, and the Internet of Things. We’re still – like a lot of folks, especially in the government context – defining this. It can be as broad or as narrow as you want. There’s definitely a recognition that when infrastructure systems can begin to share data with each other, we can get better outcomes.

The Internet of Things is obviously quite an elastic concept, but are there areas you can point to where the IoT is already very much a reality in Palo Alto?

The clearest example I can give of that today is our traffic signal system here in the city. A year-and-a-half ago, we had a completely analogue system, not connected to anything other than a central computer, which would have created a schedule for the traffic signals. Today, we have a completely IP based traffic system, which means it’s basically a data network. So we have enormous new capability.

For example, we can have schedules that are very dynamic. When schools are being let out, traffic signals work one way; at night they can work another way; you can have very granular information. Next you can start to have traffic signals communicate with each other. If there is a long strip of road and five traffic signals down there is some congestion, all the other traffic signals can dynamically change to try and make the flow better.

It goes even further than this. Now we can start to take that data – recording, for example, the frequency and volume of vehicles, as well as weather, and other ambient characteristics of the environment – and we can start to send this to the car companies. Here at Palo Alto, almost every car company has their innovation lab. Whether it’s Ford, General Motors, Volkswagen, BMW, Google (who are getting into the car business now) – they’re all here and they all want our data. They’re like: ‘this is interesting, give us an API, we’ll consume it into our data centres and then we’ll push into cars so maybe they can make better decisions.’

You have the Internet of Things, you’ve got traffic signals, cloud analytics solutions, APIs, and cars as computers and processors. We’re starting to connect all these related items in a way we’ve never done before. We’re going to follow the results.

What’s the overriding ambition would you say?

We’re on this journey to create a smart city vision. We don’t really have one today. It’s not a product or a service, it’s a framework. And within that framework we will have a series of initiatives that focus on things that are important to us. Transportation is really important to us here in Palo Alto. Energy and resources are really important: we’re going to start to put sensors on important flows of water so we can see the amount of consumption at certain times but also be really smart about leak detection, potentially using little sensors connected to pipes throughout the city. We’re also really focused on the environment. We have a chief sustainability officer who is putting together a multi-decade strategy around what PA needs to do to be part of the solution around climate change.

That’s also going to be a lot about sensors, about collecting data, about informing people and creating positive behaviours. Public safety is another key area. Being able to respond intelligently to crimes, terrorism or natural disasters. A series of sensors again sending information back to some sort of decision system that can help both people and machines make decisions around certain types of behaviours.

How do you expect this whole IoT ecosystem to develop over the next decade?

Bill Gates has a really good saying on this: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”  It’s something that’s informed me in my thinking. I think things are going to move faster and in more surprising ways in the next ten years for sure: to the extent that it’s very hard to anticipate where things are headed.

We’re disrupting the taxi business overnight, the hotel business, the food business. Things are happening at lightning speed. I don’t know if we have a good sense of where it’s all headed. Massive disruption across all domains, across work, play, healthcare, every sort of part of our lives.

It’s clear that – I can say this – ten years from now won’t be the same as today. I think we’ve yet to see the full potential of smart phones – I think they are probably the most central part of this ongoing transformation.

I think we’re going to connect many more things than we’re saying right now. I don’t know what the number will be: I hear five billion, twenty billion in the next five years. It’s going to be more than that. It’s going to become really easy to connect. We’ll stick a little communication device on anything. Whether it’s your key, your wallet, your shoes: everything’s going to be connected.

Palo Alto and the IoT Data Analytics & Visualization event look like a great matchup. What are you looking forward to about taking part?

It’s clearly a developing area and so this is the time when you want to be acquiring knowledge, networking with some of the big thinkers and innovators in the space. I’m pleased to be part of it from that perspective. Also from the perspective of my own personal learning and the ability to network with great people and add to the body of knowledge that’s developing. I’m going to be kicking it off as the CIO for the city.

BT and the IoT

It is often said that the Internet of Things is all about data. Indeed, at its absolute heart, the whole ecosystem could even be reduced to four distinct layers, ones that are essentially applicable to any vertical.

First of all, you have the sensing layer: somehow (using sensors, Wi-Fi, beacons: whatever you can!) you have to collect the data in the first place, often in harsh environments. From there you need to transport the data on a connectivity layer. This could be mobile or fixed, Wi-Fi or something altogether more cutting edge.

Thirdly, you need to aggregate this data, to bring it together and allow it to be exchanged. Finally, there’s the crucial matter of analytics, where the raw data is transformed into something useful.
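Those four layers can be sketched in a few lines of illustrative code – a toy pipeline in which a reading is sensed, transported, aggregated and finally analysed. The functions are placeholders for whole classes of technology, not any particular BT product.

```python
# Toy sketch of the four-layer view described above: sense, transport, aggregate, analyse.
from statistics import mean

def sense():                      # 1. sensing layer: collect a raw measurement
    return {"sensor": "car-park-7", "occupied_spaces": 113}

def transport(reading):           # 2. connectivity layer: move it over a network
    return dict(reading, received=True)

def aggregate(readings, hub):     # 3. aggregation layer: bring readings together
    hub.extend(readings)
    return hub

def analyse(hub):                 # 4. analytics layer: turn raw data into insight
    return mean(r["occupied_spaces"] for r in hub)

hub = aggregate([transport(sense()) for _ in range(3)], [])
print(f"average occupancy: {analyse(hub):.0f} spaces")
```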

Operators such as BT sense the opportunities in this process – particularly in the first three stages. Some telcos may have arrived a little late to the IoT table, but there’s no question that – with their copious background developing vast, secure infrastructures – they enjoy some fundamental advantages.

“I see IoT as a great opportunity,” says Hubertus von Roenne, VP Global Industry Practices, BT Global Services. “The more the world is connected, the more you have to rely on a robust infrastructure, whether it’s connectivity or data centres, and the more you have to rely on secure and reliable environment. That’s our home turf. We are already active on all four layers, not only through our global network infrastructure, but also via our secure cloud computing capabilities and a ‘Cloud of Clouds’ technology vision that enables real time data crunching and strategic collaboration across very many platforms.”

An example of how BT is positioning itself can be seen in Milton Keynes, a flagship ‘smart city’ in the UK, with large public and private sector investment. BT is one of over a dozen companies from various industries testing out different use cases for a smarter, more connected city.

“In Milton Keynes we are the technology partner that’s collecting the data. We’ve created a data hub where we allow the information to be passed on, but also make it compatible and usable. The governance body of this Milton Keynes project decided very early to make it open source, open data, and allow small companies or individuals to play around with the data and turn it into applications. Our role is not necessarily to go onto the application layer – we leave that to others – our role is to allow the collection and transmission of data, and we help turn data into usable information.”

One use case BT is involved in is smart parking – figuring out how to help traffic management, reduce carbon footprint, and help the council to reduce costs and better plan for parking availability. “Lots of ideas which can evolve as you collect the data, and that’s BT’s role.”

Another good example of how BT can adapt its offerings to different verticals is its work in telecare and telehealth, where the telco currently partners with the NHS, providing the equipment, monitoring system, and certain administrative and operational units, leaving the medical part to the medical professionals.

While BT’s established UK infrastructure makes it well positioned to assume these kinds of roles in developing smarter cities and healthcare, in other, more commercial areas there are no place-specific constraints.

“Typically our core customer base for global services are the large multinational players,” says von Roenne, “and these operate around the world. We are bringing our network and cloud integration capabilities right down to the manufacturing lines or the coal face of our multinational customers. Just a few weeks ago, we announced a partnership with Rajant Corporation, who specialise in wireless mesh deployments, to enable organisations to connect and gather data from thousands of devices such as sensors, autonomous vehicles, industrial machinery, high-definition cameras and others.”

Indeed, there are countless areas where data can be profitably collated and exploited, and next month von Roenne will be attending Internet of Things World Europe in Berlin, where he will be looking to discover new businesses and business opportunities. “I think there is already a lot of low hanging fruit out there if we just do some clever thinking about using what’s out there,” he says, adding that, often, the area in which the data could really be useful is not necessarily the same as the one it’s being collected in.

The capacity to take a bird’s eye view, bringing together different sectors of the economy for everyone’s mutual benefit, is another advantage BT will point to as it positions itself for the Internet of Things.

Make your Sunday League team as ‘smart’ as Borussia Dortmund with IoT

IoT can help make your football team smarter

How, exactly, is IoT changing competitive sports? And how might you, reader, go about making your own modest Sunday League team as ‘smart’ as the likes of AC Milan, Borussia Dortmund and Brazil?

We asked Catapult, a world leader in the field and responsible for connecting all three (as well as Premier League clubs including Tottenham, West Brom, Newcastle, West Ham and Norwich) exactly how the average sporting Joe could go about it. Here’s what the big teams are increasingly doing, in five easy steps.

Link-up play

The technology itself consists of a small wearable device that sits (a little cyborg-y) at the top of the spine under the uniform, measuring every aspect of an athlete’s movement using a GPS antenna and motion sensors. The measurements include acceleration, deceleration, change of direction and strength – as well as more basic things like speed, distance and heart rate.

Someone’s going to have to take a bit of time off work though! You’ll be looking at a one- or two-day installation on-site with the team, where a sports scientist would set you up with the software.

Nominate a number cruncher

All the raw data you’ll collect is then put through algorithms that provide position-specific and sport-specific data output to a laptop. Many of Catapult’s Premier League and NFL clients hire someone specifically to analyse the massed data. Do any of your team-mates work in IT or accountancy?

Tackle number crunching

Now you’ve selected your data analyst, you’ll want to start them out on the simpler metrics. Everyone understands distance, for instance (probably the easiest way to understand how hard an athlete has worked). From there you can look at speed. Combine the two and you’ll have a fuller picture of how much of a shift Dean and Dave have really put in (hangovers notwithstanding).

Beyond this, you can start looking at how quickly you and your team-mates accelerate (not very, probably), and the effect of deceleration on your intensity afterwards. Deceleration is usually the most damaging when it comes to tissue injuries.
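For the data-minded, these simple metrics reduce to straightforward arithmetic over timestamped position fixes, as the hedged sketch below shows. The sample points are invented, and real trackers sample far more often than once per second.

```python
# Minimal sketch: estimating distance, speed and acceleration from timestamped GPS fixes
# like those a wearable tracker records. The sample points are invented.
from math import radians, sin, cos, asin, sqrt

def haversine_m(p1, p2):
    """Great-circle distance in metres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p1, *p2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6_371_000 * 2 * asin(sqrt(a))

# (timestamp in seconds, latitude, longitude) – invented one-second fixes
fixes = [(0, 51.5000, -0.1400), (1, 51.5001, -0.1399), (2, 51.5003, -0.1397)]

speeds, total_distance = [], 0.0
for (t0, lat0, lon0), (t1, lat1, lon1) in zip(fixes, fixes[1:]):
    d = haversine_m((lat0, lon0), (lat1, lon1))
    total_distance += d
    speeds.append(d / (t1 - t0))                     # metres per second

# change in speed between consecutive one-second fixes, i.e. m/s per second
accelerations = [v1 - v0 for v0, v1 in zip(speeds, speeds[1:])]

print(f"distance {total_distance:.1f} m, top speed {max(speeds):.1f} m/s, "
      f"peak acceleration {max(accelerations):.2f} m/s²")
```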

Higher still up the spectrum of metrics, you can encounter a patented algorithm called inertial movement analysis, used to capture ‘micro-movements’ and the like.

Pay up!

Don’t worry, you won’t have to actually buy all the gear (which could well mean your entire team re-mortgaging its homes): most of Catapult’s clients rent the devices…

However, you’ll still be looking at about £100 per unit/player per month, a fairly hefty additional outlay.

Surge up your Sunday League!

However, if you are all sufficiently well-heeled (not to mention obsessively competitive) to make that kind of investment, the benefits could be significant.

Florida State Football’s Jimbo Fisher recently credited the technology with reducing injuries by 88 per cent. It’s one of a number of similarly impressive success stories: reducing injuries is Catapult’s biggest selling point, meaning player shortages and hastily arranged stand-ins could be a thing of the past.

Of course if the costs sound a bit too steep, don’t worry: although the timescale is up in the air, Catapult is ultimately planning to head down the consumer route.

The day could yet come, in the not too distant future, when every team is smart!

How will the wearables market continue to change and evolve? Jim Harper (Director of Sales and Business Development, Bittium) will be leading a discussion on this very topic at this year’s Internet of Things World Europe (Maritim proArte, Berlin, 6th – 7th October 2015)

A tale of two ITs

Werner Knoblich, senior vice president and general manager of Red Hat in EMEA

Gartner calls it ‘bimodal IT’; Ovum calls it ‘multimodal IT’; IDC calls it the ‘third platform’. Whatever you choose to call it, they are all euphemisms for the same evolution in IT: a shift towards deploying more user-centric, mobile-friendly software and services that are more scalable, flexible and easily integrated than the previous generation of IT services. And while the cloud has evolved as an essential delivery mechanism for the next generation of services, it’s also prompting big changes in IT, says Werner Knoblich, senior vice president and general manager of Red Hat in EMEA.

“The challenge with cloud isn’t really a technology one,” Knoblich explains, “but the requirements of how IT needs to change in order to support these technologies and services. All of the goals, key metrics, ways of doing business with vendors and service providers have changed.”

Most of what Knoblich is saying will resonate with any large organisation managing a large legacy estate while trying to adopt more mobile and cloud services; the ‘two ITs’ can be quite jarring.

The chief goal used to be reliability; now it’s agility. In the traditional world of IT the focus was on price for performance; now it’s about customer experience. In traditional IT the most common approach to development was the classic ‘waterfall’ approach – requirements, design, implementation, verification, maintenance; now it’s all about agile and continuous delivery.

Most assets requiring management were once physical; now they’re all virtualised machines and microservices. The applications being adopted today aren’t monolithic beasts as they were traditionally, but modular, cloud-native apps running in Linux containers or platforms like OpenStack (or both).

It is not just the suppliers that have changed, but also the way they are sourced. In the traditional world, long-term, large-scale multifaceted deals were the norm; now there are lots of young, small suppliers, contracted on short terms or on a pay-as-you-go basis.

“You really need a different kind of IT, and people who are very good in the traditional mode aren’t necessarily the ones that will be good in this new hybrid world,” he says. “It’s not just hybrid cloud but hybrid IT.”

The challenges are cultural, organisational, and technical. According to the 2015 BCN Annual Industry Survey, which polled over 700 senior IT decision makers, over 67 per cent of enterprises plan to implement multiple cloud services over the next 18 months, but close to 70 per cent were worried about how those services would integrate with other cloud services and 90 per cent were concerned about how they will integrate those cloud services with their legacy or on-premises services.

That said, open source technologies that also make use of open standards play a massive role in ensuring cloud-to-cloud and cloud-to-legacy integrations are achievable and, where possible, seamless – one of the main reasons why Linux containers are gaining so much traction and mind share today (workload portability). And open source technology is something Red Hat knows a thing or two about.

Beyond its long history in server and desktop operating systems (Red Hat Enterprise Linux) and middleware (JBoss), the company is a big sponsor and early backer of OpenStack, the increasingly popular cloud-building software built on a Linux foundation. It helped create an open source platform as a service, OpenShift. The company is also working on Atomic Host, an open source container host based on a slimmed-down version of RHEL, with support for other open source container technologies including Kubernetes and Docker, the darlings of the container community.

“Our legacy in open source is extremely important and even more important in cloud than the traditional IT world,” Knoblich says.

“All of the innovation happening today in cloud is open source – think of Docker, OpenStack, Cloud Foundry, Kubernetes, and you can’t really think of one pure proprietary offering that can match these in terms of the pace of innovation and the rate at which new features are being added,” he explains.

But many companies, mostly the large supertankers, don’t yet see themselves as ready to embrace these new technologies and platforms – not just because they don’t have the type or volume of workloads to migrate, but because they require a huge cultural and organisational shift. And cultural as well as organisational shifts are typically rife with political struggles, resentment, and budgetary wrestling.

“You can’t just install OpenStack or Dockerise your applications and ‘boom’, you’re ready for cloud – it just doesn’t work that way. Many of the companies that are successfully embracing these platforms and digitising their organisations set up a second IT department that operates in parallel to the traditional one, and can only seed out the processes and practices – and technologies – they’ve embraced when critical mass is reached. Unless that happens, they risk getting stuck back in the traditional IT mentality.”

An effective open hybrid approach ultimately means not only embracing the open source solutions and technologies, but recognising that some large, monolithic applications – say, Cobol-based mainframe apps – won’t make it into this new world; neither will the processes needed to maintain those systems.

“For some industries, like insurance for instance, there isn’t a recognised need to ditch those systems and processes. But for others, particularly those being heavily disrupted, that’s not the case. Look at Volkswagen. They don’t just see Mercedes, BMW and Tesla as competitors – they see Google and Apple as competitors too because the car becomes a technology platform for services.”

“No industry is secure from disruption, particularly from players that scarcely existed a few years ago, which is why IT will be multi-modal for many, many years to come,” he concludes.

This interview was developed in partnership with Red Hat

Jennifer Kent of Parks Associates on IoT and healthcare

BCN spoke to Jennifer Kent, Director of Research Quality and Product Development at Parks Associates, on the anticipated impact IoT will have on healthcare.

BCN: Can you give us a sense of how big an impact the Internet of Things could have on health in the coming years?

Jennifer Kent: Because the healthcare space has been slow to digitize records and processes, the IoT stands to disrupt healthcare to an even greater extent than will be experienced in other industries. Health systems are just now getting to a point where medical record digitization and electronic communication are resulting in organizational efficiencies.

The wave of new data that will result from the mass connection of medical and consumer health devices to the Internet, as part of the IoT, will give care providers real insight for the first time into patients’ behaviour outside of the office. Parks Associates estimates that the average consumer spends less than 1% of their time interacting with health care providers in care facilities. The rest of consumers’ lives are lived at home and on-the-go, engaging with their families, cooking and eating food, consuming entertainment, exercising, and managing their work lives – all of which impact their health status. The IoT can help care providers bridge the gap with their patients, and can potentially provide insight into the sources of motivation and types of care plans that are most effective for specific individuals.

 

Do you see IoT healthcare as an essentially self-enclosed ecosystem, or one that will touch consumer IoT?

IoT healthcare will absolutely touch consumer IoT, at least in healthcare markets where consumers have some responsibility for healthcare costs, or in markets that tie provider payments to patients’ actual health outcomes. In either scenario, the consumer is motivated to take a greater interest in their own self-care, driving up connected health device and application use. While user-generated data from consumer IoT devices will be less clinically accurate or reliable, this great flood of data still has the potential to result in better outcomes, and health industry players will have an interest in integrating that data with data produced via IoT healthcare sources.

 

Medical data is very well protected – and quite rightly – but how big a challenge is this to the development of effective medical IoT, which after all depends on the ability to effectively share information?

All healthcare markets must have clear regulations that govern health data protection, so that all players can ensure that their IoT programs are in compliance with those regulations. Care providers’ liability concerns, along with the investments in infrastructure that are necessary to protect such data, have created the opportunity for vendors to create solutions that take on the burden of regulatory compliance for their clients. Furthermore, application and device developers on the consumer IoT side that border very closely on the medical IoT vertical can seek regulatory approval – even if it is not required – as a means of attaining premium brand status with consumers and differentiation from the many untested consumer-facing applications on the market.

Finally, consumers can be motivated to permit their medical data to be shared, for the right incentive. Parks Associates data show that no less than 40% of medical device users in the U.S. would share the data from their devices in order to identify and resolve device problems. About a third of medical device users in the US would share data from their devices for a discount on health insurance premiums. Effective incentives will vary, depending on each market’s healthcare system, but care providers, device manufacturers, app developers, and others who come into contact with medical device data should investigate whether potential obstacles related to data protection could be circumvented by incentivizing device end-users to permit data sharing.

 

You’re going to be at Internet of Things World Europe (5 – 7 October 2015 Maritim proArte, Berlin). What are you looking forward to discussing there and learning about?

While connected devices have been around for decades, the concept of the Internet of Things – in which connected devices communicate in a meaningful way across silos – is at a very early and formative stage. Industry executives can learn much from their peers and from visionary thinkers at this stage, before winners and losers have been decided, business plans hardened, and innovation slowed. The conversations among attendees at events like Internet of Things World Europe can shape the future and practical implementation of the IoT. I look forward to learning how industry leaders are applying lessons learned from early initiatives across markets and solution types.

Enabling smart cities with IoT

The Internet of Things will help make cities smarter

The population of London swells by an additional 10,000 a month, a tendency replicated in cities across the world. To an extent such growth reflects the planet’s burgeoning wider population, and there is even an interesting argument that cities are an efficient way of providing large numbers with their necessary resources. What we know as the ‘smart city’ may well prove to be the necessary means to manage this latest shift at scale.

Justin Anderson is sympathetic to this assessment. As the chairman of Flexeye, vice chair of techUK’s Internet of Things Council, and a leader of government-funded tech consortium Hypercat and London regeneration project Old Oak Common, he is uniquely positioned to comment on the technological development of our urban spaces.

“We are in an early stage of this next period of the evolution of the way that cities are designed and managed,” he says. “The funny thing about ‘smart’ of course, is that if you look back 5000 years, and someone suggested running water would be a good idea, that would be pretty smart at the time. ‘Smart’ is something that’s always just over the horizon, and we’re just going through another phase of what’s just over the horizon.”

There’s some irony in the fact that Anderson finds himself so profoundly involved in laying the foundations for smarter cities, since architects have been in his family for 400 years, and he intended to go in that direction himself before falling into the study of mathematics – which then led to a career in technology.

“There are lots of similarities between the two,” he says. “Stitching lots of complex things together and being able to visualise how the whole thing might be before it exists. And of course the smart city is a world comprised of both the physical and virtual aspects of infrastructure, both of which need to be tied together to be able to manage cities in a more efficient way.”

Like many of the great urban developments, the smart city is mostly going to be something invisible, something we quickly take for granted.

“We’re not necessarily all going to be directly feeling the might of processing power all around us. I think we’ll see a lot of investment on the industrial level coming into the city that’s going to be invisible to the citizen, but ultimately they will benefit because it’s a bit more friction taken out of their world. It’ll be a gradual evolution of things just working better – and that will have a knock on effect of not having to queue for so long, and life just being a little bit easier.”

There are, however, other ways connectivity could change urban life in the coming years: by reversing the process of urban alienation, and allowing online communities to come together and effect real world change.

“If you can engage citizens in part of that process as a way that they live, and make sure that they feel fully accountable for what the city might be, then there’s also a lot of additional satisfaction that could come from being a part of that city, rather than just a pawn in a larger environment where you really have no say and just have to do what you’ve got to do. Look at something like air quality – to be able to start to get that united force and be able to then put more pressure upon the city authorities to do something about it. Local planning policy is absolutely core in all of this.”

Anderson sees technology as an operative part of the trend towards devolution, with cities and their citizens gaining more and more control of their destiny. “If you build that sort of nuclear community around issues rather than just around streets or neighbourhoods, you get new levels of engagement.” For such changes to be effected, however, there is plenty that still needs doing on the technical level – a message Anderson will bringing to Internet of Things World Europe event in Berlin this October.

“I think the most important thing right now is that technology companies come together to agree on a common urban platform that is interoperable, allowing for different components to be used appropriately, and that we don’t find ourselves locked into large systems that mean cities can’t evolve in a flexible and fluid way in the future. We have to have that flexibility designed into this next stage of evolution that comes from interoperability. My drive is to make sure everyone is a believer in interoperability.”

Lighting and the Internet of Things

Philips is experimenting with using connected lights for everything from keeping us abreast of important messages to making video games more interactive and impactful on our senses

When was the last time you thought about your lights? Whether you are outside or in, you will probably see 4, 5 or more sources of artificial light within view. There are an estimated 50 billion individual light points in the world today – seven or so per person; of all technology, the light bulb is arguably the most ubiquitous.

It is perhaps because of this ubiquity that light has all but disappeared from our conscious minds. We utilise it with minimal thought, though sometimes its complexities are impossible to ignore. If we were preparing a romantic dinner, for instance, we would tailor the lighting accordingly. We do this because lighting doesn’t merely reflect mood, but dictates it, something connectivity is increasingly enabling us to take advantage of.

“We’ve evolved for the last however many millions of years to expect light from the sun,” says George Yianni, head of technology at Philips Lighting Home Systems. “If there’s a bright, cool white light at midday from the sun, our brain is hardwired to be awake and alert. If there is a very warm dim light such as you get around sunset, our brain is hardwired to start winding down and relaxing.”

Yianni is a technological evangelist. In a very literal way he has seen the light, and he wants to harness this physiological sensitivity to light (among other responses) to help us to relax, to deal with jet lag, to concentrate better and much more. Due to the degree to which we take lighting for granted, however, it’s an area that poses obvious challenges to innovators:  “As a consumer, the only time you think about a light bulb inside your house is when one breaks and you have to try to find one that fits in the same socket and gives the same light output. But actually it is amazing how light can transform how a space looks, how you feel in a space.”

One of the first projects Yianni was involved in was the use of tunable white light in some German schools, giving teachers the ability to modify the lighting by changing the colour temperature, to calm students down, help them wake up, or enhance concentration (Yianni says their test scores improved significantly as a result). It was after working on a succession of such projects – including outdoor street lighting, office lighting, football stadiums, and more – that he accepted the challenge of introducing these kinds of effects and improvements into the home in the form of Philips Hue connected lighting. “I wanted to make this kind of lighting accessible, understandable and impactful for normal people. I wanted people to think about lighting more than just when it’s broken.”

Some of the results and available use cases will be familiar to anyone with an eye open to contemporary commercial IoT. Lighting that knows when you’ve come home, for example, and can ensure that you don’t step into a dark, inhospitable house after a trip or long day at work. By the same token, remotely controllable or programmable lighting that can give people added peace of mind when they’re away – by making it look like they’re not.

Familiar as this latter use case might be, it also points towards another intriguing capacity of lighting. Usually, we turn lighting on and off according to whether we need it: but a house burglar may translate this as whether we are at home or out. Far from going unnoticed, lighting speaks volumes to would-be burglars.

The potential of lighting to communicate in other, less nefarious contexts is something Philips is encouraging its customers to exploit.

“We’re enabling people to use Philips Hue lights inside their homes and, by extension, the homes themselves to communicate simple notifications,” says Yianni. “So, in the morning, if the Philips Hue light in your porch is blue you know it’s going to rain that day; if it’s yellow you know it’s not, so you can plan whether to bring an umbrella or not. Other customers are using Philips Hue lights to notify them about important email messages. There’s a wide range of ways in which people are actually using connected lighting in their homes to keep them informed in a less distracting way than an alarm or a buzzer.”
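For the technically curious, a notification like the rain-warning porch light can be sketched against the Hue bridge’s local REST interface in a few lines. The bridge address, app key, light id and weather check below are placeholders, and the colour values are only approximate.

```python
# Minimal sketch of the weather-notification idea above: set a Philips Hue lamp
# blue if rain is forecast, yellow otherwise, via the bridge's local REST API.
import requests

BRIDGE_IP = "192.168.1.2"          # hypothetical bridge address on the LAN
APP_KEY = "your-registered-app-key"
PORCH_LIGHT = 1                    # id of the porch lamp on the bridge

def is_rain_forecast():
    return True                    # stand-in for a real weather API call

def show_weather():
    # hue sits on a 0-65535 colour wheel; roughly 46920 is blue, 12750 is yellow
    colour = 46920 if is_rain_forecast() else 12750
    requests.put(
        f"http://{BRIDGE_IP}/api/{APP_KEY}/lights/{PORCH_LIGHT}/state",
        json={"on": True, "hue": colour, "sat": 254, "bri": 200})

if __name__ == "__main__":
    show_weather()
```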

Another popular use case for smarter lighting concerns home entertainment. Whether we’re watching movies or TV, playing video games, or listening to music, Philips Hue is unique in that it can greatly enhance the experience through more than 300 third-party apps. Philips Hue launched the first video game, movies and TV shows with ‘scripted’ lighting, programmed by the content creators to sync with the lights and deliver a more immersive experience in the home. Yianni provides some examples: “As your health drops in the video game Chariot, the Philips Hue lights turn red in your lounge. As a protagonist enters a dark cave in a movie, the Philips Hue lights will dim down.”

“For the last hundred years, people have been used to expecting nothing more than on and off from a light bulb,” says Yianni. “We are changing that.”

In September Yianni will be appearing at Internet of Things World Europe in Berlin, where he’ll be using lighting to really illuminate the potential for IoT to revolutionise some of the most fundamental and taken-for-granted details of our day-to-day lives, as well as the central importance of communicating this to consumers. 

IoT security and the world of US medicine

IoT in healthcare faces its fair share of challenges

Internet of Things security is anything but a homogenous concept. It is, rather, extremely dependent on the type of products being developed and – in many cases – the sort of regulatory restrictions they are subject to.

Of all the sectors where IoT is proliferating, however, it is arguably medicine that is the most fraught. In medical IT, developers have to operate in a minefield of intense regulation, life-and-death safety issues, and an unusually high (and of course very much unwelcome) degree of scrutiny from hackers.

The hacking of medical data is a popular criminal enterprise, particularly in the US, where just last week UCLA Health hospitals said hackers may have accessed the personal information and medical records of as many as 4.5 million patients.

However, while no-one would be overjoyed at the thought of something as intimate as their medical records falling into the hands of digital crooks, it is arguably the patient who has the least to worry about here. The main targets of medical data theft are US insurance companies and the institutions that administer Medicare. In the US, patients usually collect medication and leave it to pharmacists to bill the insurance companies.

A single refill for five months’ medication can easily add up to a few thousand dollars, so the rewards for effective fraud – with hackers posing as pharmacists – are large. Insurance companies, of course, foot the bill, while for those impersonated the results can cost time, stress, and in worst case scenarios a potentially dangerous delay in securing their medication.

It’s just one example of why security around medical data – medical IoT’s bread and butter – has to be so tight.

Someone extremely familiar with the territory is Sridhar Iyengar, one of the founders of AgaMatrix. At AgaMatrix, Iyengar helped develop the first iPhone-connected medical device, a glucose monitor called iBGStar, then a revolutionary innovation for diabetes sufferers.

Nowadays Iyengar’s focus is on Misfit, a wearables company focussing on fitness rather than illness, but he is still deeply involved with issues surrounding IoT, health, and security. In September, he will attend the Internet of Things Security conference in Boston as a keynote speaker, where he will draw on his expertise in diabetes to illustrate the wider challenges confronted by developers in the realm of medical IoT.

“The Holy Grail in this world of diabetes is what they call an artificial pancreas,” he says, “meaning that, if you can sense how much glucose is in your blood, you can pump in the right amount of insulin to automatically regulate it. Nobody has made a commercial version of that. Partly because the folks who make a glucose sensor are different to the folks that make the pumps, and it has been difficult for the two to cooperate due to trade secrets and the complexities of sharing the liability of devices from different manufacturers that must work in unison. The patients are left to suffer.”

In one famous incident, this frustrating discontinuity was first overcome by a “citizen scientist,” a father who hacked his diabetic child’s separate devices and was able to link the two together. While this was never marketed, it signalled that the race for a commercially viable artificial pancreas was very much on. However, while no-one would resent such intrepid ingenuity on the part of the “citizen scientist,” Iyengar points out that it also demonstrates that the devices in question were very much hackable.

“If somebody hacks into an insulin pump you could kill someone,” he says. “They overdose, they go into a coma, they die. None of these insulin pump manufacturers are going to open source anything: they can’t, because of the deadly consequences of someone hacking it.”

Ultimately, it will prove an interesting challenge for future regulators to establish precisely where to draw the line on issues such as this. Still, the capacity for others to easily take control of (for instance) a connected pacemaker is bound to generate a degree of concern.

Many of these issues are complicated by existing regulations. The US Health Insurance Portability and Accountability Act (HIPAA) requirements state that medical data can only be shared after it has been completely anonymised, which presents something of a paradox to medical IoT, and frequently requires complex architectures and dual databases, with pointers enabling healthcare professionals to blend the two together and actually make sense of them.
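The dual-database idea can be sketched simply: identifying details live in one store, de-identified readings in another, and an opaque pointer joins them only when an authorised clinician needs the full record. The snippet below is an illustration of the architecture, not a compliance recipe.

```python
# Minimal sketch of the dual-database pattern described above: identifiers in one store,
# de-identified clinical readings in another, joined only through an opaque pointer.
import uuid

identity_db = {}   # pointer -> identifying details (kept in the protected store)
clinical_db = {}   # pointer -> anonymised readings (safer to share for analytics)

def admit_patient(name, date_of_birth):
    pointer = str(uuid.uuid4())              # opaque link between the two stores
    identity_db[pointer] = {"name": name, "dob": date_of_birth}
    clinical_db[pointer] = []
    return pointer

def record_reading(pointer, glucose_mg_dl):
    clinical_db[pointer].append({"glucose_mg_dl": glucose_mg_dl})

def clinician_view(pointer):
    """Blend the two stores back together for an authorised user."""
    return {**identity_db[pointer], "readings": clinical_db[pointer]}

p = admit_patient("Jane Doe", "1970-01-01")
record_reading(p, 105)
print(clinician_view(p))
```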

Issues like this mean developers can’t rely on industry standard architectures.

“You can’t rely on this network immune system that exists in the consumer software space, where many different parties are vigilant in monitoring breaches and bugs because multiple vendors’ code is used by a product,” says Iyengar, picking an apt metaphor. “If you want to develop security-related features you kind of have to do it yourself.” In turn this means that, if there are breaches, you have to address them yourself. “It raises this interesting dilemma,” he says. “On the one hand, the way that software’s written in the medical field, it’s supposed to be safer. But in some situations it may backfire and the entire industry suffers.”