Category Archives: Interviews

Lighting and the Internet of Things

Philips is experimenting with using connected lights for everything from keeping us abreast of important messages to making video games more interactive and immersive

When was the last time you thought about your lights? Whether you are outside or in, you will probably see four, five or more sources of artificial light within view. There are an estimated 50 billion individual light points in the world today – seven or so per person; of all technologies, the light bulb is arguably the most ubiquitous.

It is perhaps because of this ubiquity that light has all but disappeared from our conscious minds. We utilise it with minimal thought, though sometimes its complexities are impossible to ignore. If we were preparing a romantic dinner, for instance, we would tailor the lighting accordingly. We do this because lighting doesn’t merely reflect mood, but dictates it, something connectivity is increasingly enabling us to take advantage of.

“We’ve evolved for the last however many millions of years to expect light from the sun,” says George Yianni, head of technology at Philips Lighting Home Systems. “If there’s a bright, cool white light at midday from the sun, our brain is hardwired to be awake and alert. If there is a very warm dim light such as you get around sunset, our brain is hardwired to start winding down and relaxing.”

Yianni is a technological evangelist. In a very literal way he has seen the light, and he wants to harness this physiological sensitivity to light (among other responses) to help us to relax, to deal with jet lag, to concentrate better and much more. Due to the degree to which we take lighting for granted, however, it’s an area that poses obvious challenges to innovators:  “As a consumer, the only time you think about a light bulb inside your house is when one breaks and you have to try to find one that fits in the same socket and gives the same light output. But actually it is amazing how light can transform how a space looks, how you feel in a space.”

One of the first projects Yianni was involved in was the use of tunable white light in some German schools, giving teachers the ability to modify the lighting by changing the colour temperature, to calm students down, help them wake up, or enhance concentration (Yianni says their test scores improved significantly as a result). It was after working on a succession of such projects – including outdoor street lighting, office lighting, football stadiums, and more – that he accepted the challenge of introducing these kinds of effects and improvements into the home in the form of Philips Hue connected lighting. “I wanted to make this kind of lighting accessible, understandable and impactful for normal people. I wanted people to think about lighting more than just when it’s broken.”

Some of the results and available use cases will be familiar to anyone with an eye open to contemporary commercial IoT. Lighting that knows when you’ve come home, for example, and can ensure that you don’t step into a dark, inhospitable house after a trip or long day at work. By the same token, remotely controllable or programmable lighting that can give people added peace of mind when they’re away – by making it look like they’re not.

Familiar as this latter use case might be, it also points towards another intriguing capacity of lighting. Usually, we turn lighting on and off according to whether we need it: but a house burglar may translate this as whether we are at home or out. Far from being oblivious to lighting, would-be burglars read it closely; it speaks volumes to them.

The potential of lighting to communicate in other, less nefarious contexts is something Phillips is encouraging its customers to exploit.

“We’re enabling people to use Philips Hue lights inside their homes, and by extension the homes themselves, to communicate simple notifications,” says Yianni. “So, in the morning, if the Philips Hue light in your porch is blue you know it’s going to rain that day; if it’s yellow you know it’s not, so you can plan whether to bring an umbrella. Other customers are using Philips Hue lights to notify them about important email messages. There’s a wide range of ways in which people are actually using connected lighting in their homes to keep them informed in a less distracting way than an alarm or a buzzer.”
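
The mechanics behind such notifications are straightforward: the Hue bridge on the local network exposes a REST API that lets an application set any lamp’s colour. Below is a minimal Python sketch of the rain-forecast porch light Yianni describes; the bridge address, API username, light ID and weather check are placeholder assumptions rather than details from Philips.

```python
import requests

BRIDGE_IP = "192.168.1.2"           # placeholder: local IP of the Hue bridge
API_USERNAME = "your-api-username"  # placeholder: key issued by the bridge
PORCH_LIGHT_ID = 1                  # placeholder: ID of the porch lamp

# Approximate hue values on the bridge's 0-65535 colour wheel
BLUE = 46920    # rain expected
YELLOW = 12750  # dry day

def set_porch_colour(hue_value: int) -> None:
    """Set the porch lamp to a solid colour via the Hue bridge's REST API."""
    url = f"http://{BRIDGE_IP}/api/{API_USERNAME}/lights/{PORCH_LIGHT_ID}/state"
    payload = {"on": True, "hue": hue_value, "sat": 254, "bri": 200}
    requests.put(url, json=payload, timeout=5)

def rain_forecast_today() -> bool:
    """Stub: query a weather service of your choice and return True if rain is expected."""
    return False

if __name__ == "__main__":
    set_porch_colour(BLUE if rain_forecast_today() else YELLOW)
```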

Another popular use case for smarter lighting concerns home entertainment. Whether we’re watching movies or TV, playing video games, or listening to music, Philips Hue can greatly enhance the experience through more than 300 third-party apps. Philips Hue launched the first video games, movies and TV shows with ‘scripted’ lighting, programmed by the content creators to sync with the lights and deliver a more immersive experience in the home. Yianni provides some examples: “As your health is dropping down in the video game Chariot, the Philips Hue lights turn red in your lounge. As a protagonist enters a dark cave in a movie, the Philips Hue lights will dim down.”

“For the last hundred years, people have been used to expecting nothing more than on and off from a light bulb,” says Yianni. “We are changing that.”

In September Yianni will be appearing at Internet of Things World Europe in Berlin, where he’ll be using lighting to really illuminate the potential for IoT to revolutionise some of the most fundamental and taken-for-granted details of our day-to-day lives, as well as the central importance of communicating this to consumers. 

IoT security and the world of US medicine

IoT in healthcare faces its fair share of challenges

Internet of Things security is anything but a homogenous concept. It is, rather, extremely dependent on the type of products being developed and – in many cases – the sort of regulatory restrictions they are subject to.

Of all the sectors where IoT is proliferating, however, it is arguably medical that is the most fraught. In medical IT, developers have to operate in a minefield of intense regulation, life and death safety issues, and an unusually high (and of course very much unwelcome) degree of scrutiny from hackers.

The hacking of medical data is a popular criminal enterprise, particularly in the US, where just last week UCLA Health hospitals said hackers may have accessed the personal information and medical records of as many as 4.5 million patients.

However, while no-one would be overjoyed at the thought of something as intimate as their medical records falling into the hands of digital crooks, it is arguably the patient who has the least to worry about here. The main targets of medical data theft are US insurance companies and the institutions that administer Medicare. In the US, patients usually collect medication and leave it to pharmacists to bill the insurance companies.

A single refill for five months’ medication can easily add up to a few thousand dollars, so the rewards for effective fraud – with hackers posing as pharmacists – are large. Insurance companies, of course, foot the bill, while for those impersonated the results can include lost time, stress and, in worst-case scenarios, a potentially dangerous delay in securing their medication.

It’s just one example of why security around medical data – medical IoT’s bread and butter – has to be so tight.

Someone extremely familiar with the territory is Sridhar Iyengar, one of the founders of AgaMatrix. At AgaMatrix, Iyengar helped develop the first iPhone-connected medical device, a glucose monitor called iBGStar, at the time a revolutionary innovation for diabetes sufferers.

Nowadays Iyengar’s focus is on Misfit, a wearables company focussing on fitness rather than illness, but he is still deeply involved with issues surrounding IoT, health, and security. In September, he will attend the Internet of Things Security conference in Boston as a keynote speaker, where he will draw on his expertise in diabetes to illustrate the wider challenges confronting developers in the realm of medical IoT.

“The Holy Grail in this world of diabetes is what they call an artificial pancreas,” he says, “meaning that, if you can sense how much glucose is in your blood, you can pump in the right amount of insulin to automatically regulate it. Nobody has made a commercial version of that. Partly because the folks who make a glucose sensor are different to the folks that make the pumps, and it has been difficult for the two to cooperate due to trade secrets and the complexities of sharing the liability of devices from different manufacturers that must work in unison. The patients are left to suffer.”

In one famous incident, this frustrating discontinuity was first overcome by a “citizen scientist,” a father who hacked his diabetic child’s separate devices and was able to link the two together. While this was never marketed, it signalled that the race for a commercially viable artificial pancreas was very much on. However, while no-one would resent such intrepid ingenuity on the part of the “citizen scientist,” Iyengar points out that it also demonstrates that the devices in question were very much hackable.

“If somebody hacks into an insulin pump you could kill someone,” he says. “They overdose, they go into a coma, they die. None of these insulin pump manufacturers are going to open source anything: they can’t, because of the deadly consequences of someone hacking it.”

Ultimately, it will prove an interesting challenge for future regulators to establish precisely where to draw the line on issues such as this. Still, the capacity for others to easily take control of (for instance) a connected pacemaker is bound to generate a degree of concern.

Many of these issues are complicated by existing regulations. The US Health Insurance Portability and Accountability Act (HIPAA) requirements state that medical data can only be shared after it has been completely anonymised, which presents something of a paradox to medical IoT, and frequently requires complex architectures and dual databases, with pointers enabling healthcare professionals to blend the two together and actually make sense of them.
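
In practice, that often means a pseudonymisation pattern: de-identified clinical data lives in one store under an opaque key, while a separate, access-controlled store holds the pointer back to the patient’s real identity. The Python sketch below illustrates the idea only; the record fields and the access rule are illustrative assumptions, not a description of any particular vendor’s architecture.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class IdentityStore:
    """Tightly controlled mapping from pseudonym back to real patient identity."""
    _by_pseudonym: dict = field(default_factory=dict)

    def register(self, patient_name: str, date_of_birth: str) -> str:
        pseudonym = uuid.uuid4().hex
        self._by_pseudonym[pseudonym] = {"name": patient_name, "dob": date_of_birth}
        return pseudonym

    def reidentify(self, pseudonym: str, authorised: bool) -> dict:
        # Only an authorised clinician may follow the pointer back to the identity.
        if not authorised:
            raise PermissionError("re-identification requires an authorised role")
        return self._by_pseudonym[pseudonym]

@dataclass
class ClinicalStore:
    """De-identified readings, safe to share for analytics."""
    _readings: list = field(default_factory=list)

    def add_glucose_reading(self, pseudonym: str, mmol_per_l: float, taken_at: str) -> None:
        self._readings.append({"patient": pseudonym, "glucose": mmol_per_l, "time": taken_at})

# Usage: the analytics side only ever sees the pseudonym.
identities = IdentityStore()
clinical = ClinicalStore()
pid = identities.register("Jane Doe", "1980-04-02")
clinical.add_glucose_reading(pid, 5.6, "2015-07-20T08:00:00Z")
patient = identities.reidentify(pid, authorised=True)  # blend the two back together
```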

Issues like this mean developers can’t rely on industry standard architectures.

“You can’t rely on this network immune system that exists in the consumer software space, where many different parties are vigilant in monitoring breaches and bugs because multiple vendors’ code is used by a product,” says Iyengar, picking an apt metaphor. “If you want to develop security-related features you kind of have to do it yourself.” In turn this means that, if there are breaches, you have to address them yourself. “It raises this interesting dilemma,” he says. “On the one hand the way that software’s written in the medical field, it’s supposed to be more safe. But in some situations it may backfire and the entire industry suffers.”

Hybrid cloud issues are cultural first, technical second – Ovum

CIOs are still struggling with their hybrid cloud strategies

This week has seen a number of hybrid cloud deals which suggest the industry is making significant progress delivering the platforms, services and tools necessary to make hybrid cloud practical. But if anything they also serve as a reminder that IT will forever be multimodal, which creates challenges that begin with people, not technology, explains Ovum’s principal analyst for infrastructure solutions, Roy Illsley.

There has been no shortage of hybrid cloud deals this week.

Rackspace and Microsoft announced a deal that would see the hosting and cloud provider expand its Fanatical Support to Microsoft Azure-based hybrid cloud platforms.

Google announced both that it would support Windows technologies on its cloud platform and that it would formally sponsor the OpenStack Foundation – a move aimed at supporting container portability between multiple cloud platforms.

HP announced it would expand its cloud partner programme to include CenturyLink, which runs much of its cloud platform on HP technology, in a move aimed at bolstering HP’s hybrid cloud business and CenturyLink’s customer reach.

But one of the more interesting hybrid cloud stories this week came from the enterprise side of the industry. Copper and gold producer Freeport-McMoRan announced it is embarking on a massive overhaul of its IT systems. In a bid to become more agile, the firm said it would deploy its entire application estate on a combination of private and public cloud platforms – though, somewhat ironically, the company said the entire project would take five years to wrap up (which, being pragmatic about IT overhauls, could mean far longer).

“The biggest challenge with hybrid cloud isn’t the technology per se – okay, so you need to be able to have one version of the truth, one place where you can manage most of the platforms and applications, one place where to the best of your abilities you can orchestrate resources, and so forth,” Illsley explains.

Of course you need all of those things, he says. There will be some systems that won’t fit into that technology model and will likely be left out (mainframes, for example). But there are tools out there to fit current hybrid use cases.

“When most organisations ‘do’ hybrid cloud, they tend to choose where their workloads will sit depending on their performance needs, scaling needs, cost and application architecture – and then the workloads sit there, with very little live migration of VMs or containers. Managing them while they sit there isn’t the major pain point. It’s about the business processes; it’s the organisational and cultural shifts in the IT department that are required in order to manage IT in a multimodal world.”

“What’s happening in hybrid cloud isn’t terribly different from what’s happening with DevOps. You have developers and you have operations, and sandwiching them together in one unit doesn’t change the fact that they look at the world – and the day-to-day issues they need to manage or solve – in their own developer or operations-centric ways. In effect they’re still siloed.”

The way IT is financed can also create headaches for CIOs intent on delivering a hybrid cloud strategy. Typically IT is funded in an ‘everyone pitches into the pot’ sort of way, but one of the things that led to the rise of cloud in the first place is lines of business allocating their own budgets and going out to procure their own services.

“This can cause both a systems challenge – shadow IT and the security, visibility and management issues that come with that – and a cultural challenge, one where LOB heads see little need to fund a central organisation that is deemed too slow or inflexible to respond to customer needs. So as a result, the central pot doesn’t grow.”

While vendors continue to ease hybrid cloud headaches on the technology front with resource and financial (i.e. chargeback) management tools, app stores or catalogues, and standardised platforms that bridge the on-prem and public cloud divide, it’s less likely the cultural challenges associated with hybrid cloud will find any straightforward solutions in the short term.

“It will be like this for the next ten or fifteen years at least. And the way CIOs work with the rest of the business as well as the IT department will define how successful that hybrid strategy will be, and if you don’t do this well then whatever technologies you put in place will be totally redundant,” Illsley says.

Exclusive: How Virgin Active is getting fit with the Internet of Things

Virgin Active wants to use IoT to make its service more holistic and improve customer retention

Virgin Active is embarking on an ambitious redesign of its facilities that uses the Internet of Things to improve the service it offers to customers and reduce subscriber attrition rates, explains Andy Caddy, chief information officer of Virgin Active.

“Five years ago you didn’t really need to be very sophisticated as a health club operator in terms of your IT and digital capability,” Caddy says. “But now I would argue that things have changed dramatically – and you have to be very smart about how you manage your relationship with customers.”

The health club sector is an unusual subscription-based business, in part because the typical attrition rate is around 50 per cent – meaning that by the end of the year a club has lost half of the members it started out with, and needs to sign up at least that many new members just to stand still, let alone grow in aggregate. That’s quite a challenge to tackle.

Much of how Virgin Active intends to address this is through cleverer use of data, using cloud-based software and IoT sensors to better understand what its customers are doing inside and beyond the gym. The company’s vision involves creating one consolidated view of the customer, collating information stored on customers’ smartphones with health data generated by the wearable sensors and gym machines those customers use.
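
In data terms, that consolidated view is essentially a join across several feeds keyed on the member: wristband events from the club, readings from gym machines, and activity pulled from wearables outside it. A minimal sketch of the idea, with invented field names, might look like this:

```python
from collections import defaultdict

# Illustrative feeds; the field names are invented for the sketch.
wristband_events = [
    {"member_id": "M123", "event": "gym_entry", "time": "2015-06-01T18:02"},
    {"member_id": "M123", "event": "locker_opened", "time": "2015-06-01T18:05"},
]
machine_sessions = [
    {"member_id": "M123", "machine": "treadmill", "calories": 320, "minutes": 30},
]
wearable_activity = [
    {"member_id": "M123", "source": "smartwatch", "steps": 9500, "date": "2015-06-01"},
]

def consolidated_view(*feeds):
    """Collate every feed under a single record per member."""
    view = defaultdict(lambda: {"events": [], "workouts": [], "activity": []})
    keys = ("events", "workouts", "activity")
    for key, feed in zip(keys, feeds):
        for record in feed:
            view[record["member_id"]][key].append(record)
    return dict(view)

members = consolidated_view(wristband_events, machine_sessions, wearable_activity)
print(members["M123"]["activity"][0]["steps"])  # 9500
```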

The company is already in the process of trialling this vision with a new fitness club at Cannon Street, London, which opens later this month. Originally announced last year, the club, which Caddy says is to be Virgin Active’s flagship technology club, uses membership wristbands with embedded RFID chips that members can use for everything from entering the gym and logging cardiovascular data from the machines they use to buying drinks at the café, renting towels and accessing lockers.

“Now we start to see what people are doing in the clubs, which gives us a richer set of data to work with, and it starts to generate insights that are more relevant and engaging and perhaps also feeds our CRM and product marketing,” he says. “Over the next few months we’ll be able to compare this data with what we see at other clubs to find out a few important things – are we becoming more or less relevant to customers? Is customer retention improving?”

Combine that with IoT data from things like smartwatches that are worn outside the confines of the gym, and the company can get a better sense of how to improve what it suggests as a health or fitness activity from a holistic standpoint. It also means more effective marketing, which, by Caddy’s own admission, will demand a more sophisticated way of handling and acting on data than the company manages today.

“The kinds of questions I want to be able to answer for my customers are things like: What’s the kind of lunch I can eat tomorrow based on today’s activity? How should I change my calendar next week based on my current stress levels? These are the really interesting questions that would absolutely add value to [a customer’s] life and also create a reasonable extension of the role we’re already playing as a fitness provider.”

But Caddy says the vendors themselves, while pushing the boundaries in IoT from a technical standpoint, pose the biggest threats to the sector’s development.

“We want standards, because it’s very hard to do anything when Nike want to talk about Fuel and Fitbit want to talk about Steps and Apple want to talk about Activity, and none of these measures mean the same thing,” he explains. “What we really want is for some of these providers to start thinking about how you do something smart with that information, and what you need in order to do that, but I’m always surprised by how few vendors are asking those kinds of questions.”

“It’s an inevitable race to the bottom in sensor tech; the value is all in the data.”

Companies like Apple and Microsoft know this – and in health specifically are attempting to build out their own data services that developers can tap into for their own applications. But again, those are closed, proprietary systems, and it may be some time before the IoT sector opens up to effectively cater to a multi-device, multi-cloud world.

“We’re lucky in a sense because health and fitness is one of the first places where IoT has taken off in a real sense. But to be honest, we’re still a good way from where we want to be,” he says.

Ericsson details strategic plans beyond telecoms sector

Swedish networking giant Ericsson has made no attempt to hide the fact that it needs to diversify in order to survive and the nature of that diversification just got a bit clearer, explains Telecoms.com.

In his exclusive interview with Telecoms.com late last year, CEO Hans Vestberg detailed the five main areas of diversification his company has identified: IP networks, Cloud, OSS/BSS, TV & Media and Industry & Society. Ericsson has spoken freely about the first four but had chosen to keep quiet about its Industry & Society initiative until it was ready.

That moment has now arrived, so Telecoms.com spoke to Nadine Allen, who heads up Industry & Society for Ericsson in Western and Central Europe. She explained that Ericsson sees a massive opportunity in helping other industries to capitalize on the way the telecoms and IT industries are evolving and converging, with IoT being a prime example.

“The evolved use of ICT is becoming increasingly important to all industries as they address the opportunities and challenges that the networked society will bring,” said Allen. “There is a growing need for ICT connectivity and services in market segments outside the traditional customer base of Ericsson, such as: utilities, transport and public safety.”

Ericsson has identified five key industries to focus on: Automotive, Energy & Utilities, Road & Rail, Safety & Security and Shipping. As you can see these are mainly quite industrial sectors, and this is in keeping with how things like IoT are evolving, with the main commercial applications being of a B2B type.

“Ericsson has been a transformation partner to our customers for many decades and supported them in shaping their strategies,” said Allen. “This is a key strength relevant to customers inside and outside the telco space as they develop their connected strategies.

“We are a leading software provider and developer across all areas of the network, including OSS and BSS – these capabilities we see as being key to what will be needed to flexibly support the plethora of future use cases, some of which we can only imagine right now.”

Allen brought our attention to some specific use-cases, illustrated in the slide below. In utilities, for example, things like smart grids and smart metering are already emerging as a way to increase efficiency, while intelligent transport systems are doing the same for that sector.

Ericsson industry & society slide

All of this makes a lot of sense on paper, and Ericsson unquestionably has a lot of tools at its disposal to help industries get smarter, but combining these capabilities into coherent solutions and competing against companies such as the big systems integration and consulting firms will be a challenge. The Ericsson brand is strong in telcos, but not necessarily in transport, and it still needs to establish its consulting credentials beyond its home territory.

To conclude, we asked Allen how she sees these underlying trends evolving. “We believe the Internet of Things will have a profound impact in the future; enabling anything to be connected and providing ‘smartness’ to these connected things will bring value across many sectors,” she said.

“The vision of IoT is a key part of the networked society and in one line I would say it is well described by ‘where everything that can benefit from being connected will be connected’. For example in a world of connected things, value will shift from the physical properties of a product to the services that it provides.”

Living in a hybrid world: From public to private cloud and back again

Orlando Bayter, chief exec and founder of Ormuco

The view often propagated by IT vendors is that public cloud is already capable of delivering a seamless extension between on-premise private cloud platforms and public, shared infrastructure. But Orlando Bayter, chief executive and founder of Ormuco, says the industry is only at the outset of delivering a deeply interwoven fabric of private and public cloud services.

Demand for that kind of seamlessness hasn’t been around for very long, admittedly. It’s no great secret that in the early days of cloud, demand for public cloud services was spurred largely by the slow pace at which traditional IT organisations tend to move. As a result, every time a developer wanted to build an application they would simply swipe the credit card and go, billing back to IT at some later point. So the first big use case for hybrid cloud emerged when developers then needed to bring their apps back in-house, where they would live and probably die.

But as the security practices of cloud service providers continue to improve, along with enterprise confidence in cloud more broadly, cloud bursting – the ability to use a mix of public and private cloud resources to fit the utilisation needs of an app – has become more widely talked about. It’s usually cost-prohibitive and far too time-consuming to scale private cloud resources quickly enough to meet the changing demands of today’s increasingly web-based apps, so cloud bursting has become the natural next step in the hybrid cloud world.
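
Stripped down, cloud bursting is a control loop: keep the steady-state load on private capacity and overflow to a public provider when utilisation crosses a threshold. The sketch below shows only that decision logic; the thresholds and the provisioning calls are placeholders that would map onto whichever orchestration API an organisation actually uses.

```python
import time

BURST_THRESHOLD = 0.80        # assumed: burst when private cloud is 80% utilised
SCALE_BACK_THRESHOLD = 0.50   # assumed: retire public capacity below 50%

def private_cloud_utilisation() -> float:
    """Stub: return current private cloud utilisation as a fraction (0.0 to 1.0)."""
    return 0.85

def provision_public_instances(count: int) -> None:
    """Stub: call the public cloud provider's API to add capacity."""
    print(f"bursting: provisioning {count} public instances")

def retire_public_instances(count: int) -> None:
    """Stub: hand back public capacity once the spike has passed."""
    print(f"scaling back: retiring {count} public instances")

def bursting_loop(poll_seconds: int = 60) -> None:
    """Poll utilisation and burst or scale back accordingly."""
    while True:
        load = private_cloud_utilisation()
        if load > BURST_THRESHOLD:
            provision_public_instances(2)
        elif load < SCALE_BACK_THRESHOLD:
            retire_public_instances(2)
        time.sleep(poll_seconds)

# bursting_loop() would run as a long-lived scheduler process.
```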

Orlando will be speaking at the Cloud World Forum in London June 24-25. Click here to register.

There are, however, still precious few platforms that offer this kind of capability in a fast and dynamic way. Open source projects like OpenStack, or more proprietary variants like VMware’s vCloud or Microsoft’s Azure Stack (and all the tooling around these platforms or architectures), are at the end of the day all being developed with a view towards supporting the deployment and management of workloads that can exist in as many places as possible, whether on-premise or in a cloud vendor’s datacentre.

“Let’s say as a developer you want to take an application you’ve developed in a private cloud in Germany and move it onto a public cloud platform in the US. Even for the more monolithic migration jobs you’re still going to have to do all sorts of re-coding, re-mapping and security upgrades, to make the move,” Bayter says.

“Then when you actually go live, and have apps running in both the private and public cloud, the harsh reality is most enterprises have multiple management and orchestration tools – usually one for the public cloud and one for the private; it’s redundant, and inefficient.”

Ormuco is one company trying to solve these challenges. It has built a platform based on HP Helion OpenStack and offers both private and public instances, which can be managed through a single pane of glass; it has built its own layer in between to abstract the resources underneath.

It has multiple datacentres in the US and Europe from which it offers both private and public instances, as well as the ability to burst into its cloud platform using on-premise OpenStack-based clouds. The company is also a member of the HP Helion Network, which Bayter says gives it a growing channel and the ability to offer more granular data protection tools to customers.

“The OpenStack community has been trying to bake some of these capabilities into the core open source code, but the reality is it only achieved a sliver of these capabilities by May this year,” he said, alluding to the recent OpenStack Summit in Vancouver where new capabilities around federated cloud identity were announced and demoed.

“The other issue is simplicity. A year and a half ago, everyone was talking about OpenStack but nobody was buying it. Now service providers are buying but enterprises are not. Specifically with enterprises, the belief is that OpenStack will be easier and easier as time goes on, but I don’t think that’s necessarily going to be the case,” he explains.

“The core features may become a bit easier, but the whole solution may not; there are so many things going into it that it’s likely going to get clunkier, more complex, and more difficult to manage. It could become prohibitively complex.”

That’s not to say federated identity or cloud federation is a lost cause – on the contrary, Bayter says it’s the next horizon for cloud. The company is currently working on a set of technologies that would enable any organisation whose infrastructure lies significantly underutilised for long periods to rent out that infrastructure in a federated model.

Ormuco would verify and certify the infrastructure, and allocate a performance rating that would change dynamically along with the demands being placed on that infrastructure – like an AirBnB for OpenStack cloud users. Customers renting cloud resources in this market could also choose where their data is hosted.
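
As a sketch, that “AirBnB for OpenStack” marketplace amounts to a set of certified listings whose performance rating is recalculated as load on the donor infrastructure changes, with renters filtering by where their data would be hosted. The fields and the rating rule below are invented purely to illustrate the shape of it.

```python
from dataclasses import dataclass

@dataclass
class FederatedListing:
    """Hypothetical listing for spare, certified OpenStack capacity."""
    provider: str
    region: str            # where the renter's data would be hosted
    certified: bool
    baseline_score: float  # score assigned at certification time
    current_load: float    # 0.0 (idle) to 1.0 (saturated), updated continuously

    @property
    def performance_rating(self) -> float:
        # Invented rule: the rating degrades as the donor's own load rises.
        return round(self.baseline_score * (1.0 - self.current_load), 2)

listings = [
    FederatedListing("university-lab", "DE", True, baseline_score=9.0, current_load=0.2),
    FederatedListing("science-lab", "US", True, baseline_score=8.5, current_load=0.7),
]

# A renter who needs data hosted in Germany picks the best-rated certified listing there.
european = [l for l in listings if l.certified and l.region == "DE"]
best = max(european, key=lambda l: l.performance_rating)
print(best.provider, best.performance_rating)
```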

“Imagine a university or a science lab that scales and uses its infrastructure at very particular times; the rest of the time that infrastructure is fairly underused. What if they could make money from that?”

There are still many unanswered questions – like whether the returns for renting organisations would justify the extra costs (energy, for example) associated with running that infrastructure, or where the burden of support lies (enterprises need solid SLAs for production workloads) and how that influences what kinds of workloads end up on rented kit – but the idea is interesting and definitely consistent with the line of thinking being promoted by the OpenStack community, among others, in open source cloud.

“Imagine the power, the size of that cloud,” says Bayter. “That’s the cloud that will win out.”

This interview was produced in partnership with Ormuco

Food retail, robotics, cloud and the Internet of Things

Ocado is developing a white-label grocery delivery service

With a varied and fast-moving supply chain, loads of stock moving quickly through warehouses, delivery trucks and stores, and an increasingly digital mandate, the food retail sector is unlike any other retail segment. Paul Clarke, director of technology at Ocado, a leading online food retailer, explains how the cloud, robotics, and the Internet of Things are increasingly at the heart of everything the company does.

Ocado started 13 years ago as an online supermarket where consumers could quickly and easily order food goods. It does not own or operate any brick-and-mortar stores, though it effectively competes with all other food retailers, in some ways now more than ever because of how supermarkets have evolved in the UK. Most of them offer online ordering and food delivery services.

But in 2013 the company struck a £216m deal with Morrisons that would see Ocado effectively operate as the Morrisons online food store, a shift from its previous strategy of offering a standalone end-to-end grocery service with its own brand on the front-end – and a move that would become central to its growth strategy moving forward. The day the Morrisons platform went live in early 2014 the company set to work on re-platforming the Ocado service and turning it into the Ocado Smart Platform (OSP), a white-label end-to-end grocery service that can be deployed by food retailers globally. Clarke was fairly tight-lipped about some of the details for commercial reasons, but suggested “there isn’t a continent where the company is not currently in discussions” with major food retailers to deliver OSP.

The central idea behind this is that standing up a grocery delivery service – the technical infrastructure as well as support services – is hugely expensive for food retailers and involves lots of technical integration, so why not simply deploy a white label end-to-end service that will still retain the branding of said retailer but offer all the benefits?

Paul Clarke is speaking at the Cloud World Forum in London June 24-25. Click here to register!

“In new territories you don’t need the size of facilities that we have here in the Midlands. For instance, our site in the Midlands costs over £230m, and that is fine for the UK which has an established online grocery business and our customer base, but it wouldn’t fit well in a new territory where you’re starting from scratch, nor is there the willingness to spend such sums,” he explains.

The food delivery service operates in a hub-and-spoke model. The cloud service being developed by Ocado connects the ‘spokes’, smaller food depots (which could even be large food delivery trucks), to customer fulfilment centres, the larger warehouses that house the majority of the stock (the ‘hub’).
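
Modelled as data, that hub-and-spoke arrangement is just a fulfilment centre (hub) feeding a list of smaller depots or delivery trucks (spokes), with the cloud service routing orders between them. A rough sketch with invented names and a deliberately naive routing rule:

```python
from dataclasses import dataclass, field

@dataclass
class Spoke:
    """Smaller food depot or delivery truck serving a local area."""
    name: str
    capacity_totes: int

@dataclass
class Hub:
    """Customer fulfilment centre holding the majority of the stock."""
    name: str
    spokes: list = field(default_factory=list)

    def route_order(self) -> Spoke:
        # Deliberately naive rule: pick the first spoke with free capacity.
        for spoke in self.spokes:
            if spoke.capacity_totes > 0:
                spoke.capacity_totes -= 1
                return spoke
        raise RuntimeError("no spoke capacity; fulfil directly from the hub")

midlands_hub = Hub("Midlands CFC", spokes=[Spoke("North London depot", 120),
                                           Spoke("Delivery truck 42", 8)])
print(midlands_hub.route_order().name)  # "North London depot"
```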

The company is developing and hosting the service on a combination of AWS and Google’s cloud platforms – for the compute and data side, respectively.

“The breadth and depth of our estate is huge. You have robotics systems, vision systems, simulation systems, data science applications, and the number of different kinds of use cases we’re putting in the cloud is significant. It’s a microservices architecture that we’re building with hundreds of different microservices. A lot of emphasis is being put on security through design, and robust APIs so it can be integrated with third party products – it’s an end-to-end solution but many of those incumbents will have other supply chain or ERP solutions and will want to integrate it with those.”

AWS and Google complement each other well, he says. “We’re using most things that both of those companies have in their toolbox; there’s probably not much that we’re not using there.”

The warehousing element, including the data systems, will run on a private cloud in the actual product warehouses, so low-latency, real-time control systems will run in the private cloud, while pretty much everything else will run in the public cloud.

The company is also looking at technologies like OpenStack, Apache Mesos and CoreOS because it wants to run as many workloads as possible in Linux containers, which are more portable than VMs; and because of the variation in legislation and performance between the regions where it will operate, the company may have to change quite quickly whether it deploys certain workloads in a public or private cloud.

The Internet of Things and the Great Data Lakes

IoT is very important for the company in several areas. Its warehouses are like little IoT worlds all on their own, Clarke says, with lots of M2M, hundreds of kilometres of conveyor, and thousands of things on the move at any one time including automated cranes and robotics.

Then there’s all the data the company collects from drivers for route optimisation and operational improvement – wheel speed, tyre pressure, road speed, engine revs, fuel consumption, cornering performance – all of which is fed back to the company in real time and used to track driver performance.

There’s also a big role for wearables in those warehouses. Clarke says down the line wearables have the potential to help it improve safety and productivity (“we’re not there yet but there is so much potential.”)

But where IoT can have the biggest impact in food retail, and where it’s most underestimated, Clarke explains, is the customer element: “This is where many companies underestimate the scale of transformation IoT is going to bring, the intersection of IoT and smart machines. In our space we see that in terms of the smart home, smart appliances, smart packaging; it’s all very relevant. The customers living in this world are going to demand this kind of smartness from all the systems they use, so it’s going to raise the bar for all the mobile apps and services we build.”

“Predictive analytics are going to play a big part there, as will machine learning, to help them do their shop, in our case, or knowing what they want before they even have a clue themselves. IoT has a very important part to play in that, in terms of delivering that kind of information to the customer to the extent that they wish to share it,” he says.

But challenges, ones that straddle the legal, technical and cultural, persist in this nascent space. One of them, largely technical, is data management, which isn’t insurmountable. The company has implemented a data lake built on Google BigQuery, publishing a log of pretty much every business event onto a backbone that it persists through that service, along with the data exhaust from its warehouse logs, alerts, driver monitoring information, clickstream data and front-end supply chain information (at the point of order); it uses technologies like Dataflow and Hadoop for number crunching.
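
The pattern Clarke describes, publishing every business event onto a backbone and persisting it into BigQuery for later crunching, can be sketched roughly as below. The table name, fields and query are invented for illustration; the client calls are the standard google-cloud-bigquery ones, and this is not Ocado’s actual schema.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes default Google Cloud credentials are configured

# Invented table for the sketch: one row per business event.
EVENTS_TABLE = "my-project.datalake.business_events"

def publish_event(event_type: str, payload: dict) -> None:
    """Persist a single business event into the data lake."""
    row = {"event_type": event_type, **payload}
    errors = client.insert_rows_json(EVENTS_TABLE, [row])
    if errors:
        raise RuntimeError(f"failed to persist event: {errors}")

def orders_per_hour(day: str):
    """Example of the kind of number crunching run over the lake."""
    sql = f"""
        SELECT EXTRACT(HOUR FROM TIMESTAMP(event_time)) AS hour, COUNT(*) AS orders
        FROM `{EVENTS_TABLE}`
        WHERE event_type = 'order_placed' AND DATE(TIMESTAMP(event_time)) = '{day}'
        GROUP BY hour
        ORDER BY hour
    """
    return list(client.query(sql).result())

publish_event("order_placed", {"order_id": "o-1001", "event_time": "2015-06-01T10:15:00Z"})
```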

Generally speaking, Clarke says, grocery is just fundamentally different to non-grocery retail in ways that have data-specific implications. “When you go to buy stationery or a printer cartridge you usually buy one or two items. With grocery there can often be upwards of 50 items, there are multiple suppliers and multiple people involved, sometimes at different places, often on different devices and different checkouts. So stitching that order, that journey, together is a challenge from a data perspective in itself.”

Bigger challenges in the IoT arena, where more unanswered questions lie, include security and identity management, discoverability, data privacy and standards – or the lack thereof. These are the problems that aren’t so straightforward.

“A machine is going to have to have an identity. That whole identity management question for these devices is key and so far goes unanswered. It’s also linked to discoverability. How do you find out what the device functions are? Discovery is going to get far too complex for humans. You get into a train station these days and there are already 40 different Wi-Fi networks, and hundreds of Bluetooth devices visible. So the big question is: How do you curate this, on a much larger scale, for the IoT world?”

“The type of service that creates parameters around who you’re willing to talk to as a device, how much you’re willing to pay for communications, who you want to be masked from, and so forth – that’s going to be really key, as well as how you implement this so that you don’t make a mistake and share the wrong kinds of information with the wrong device. It’s core to the privacy issue.”
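
That sort of per-device policy could be pictured as a small document: the peers a device will talk to, the parties it must stay masked from, and a cap on what it may spend on communication. A hypothetical representation, not tied to any existing IoT standard:

```python
from dataclasses import dataclass, field

@dataclass
class DevicePolicy:
    """Hypothetical per-device communication policy of the kind described above."""
    device_id: str
    allowed_peers: set = field(default_factory=set)   # identities this device will talk to
    masked_from: set = field(default_factory=set)     # parties it must remain invisible to
    max_comms_spend_pence: int = 0                    # cap on what it may pay to communicate

    def may_talk_to(self, peer_id: str) -> bool:
        return peer_id in self.allowed_peers and peer_id not in self.masked_from

fridge = DevicePolicy(
    device_id="fridge-7f3a",
    allowed_peers={"grocery-service", "home-hub"},
    masked_from={"station-wifi"},
    max_comms_spend_pence=50,
)
print(fridge.may_talk_to("home-hub"))           # True
print(fridge.may_talk_to("unknown-beacon-01"))  # False
```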

“The last piece is standardisation. How these devices talk to one another – or don’t – is going to be key. What is very exciting is the role that all the platforms like Intel Edison, Arduino, BeagleBone have played in lowering the barrier by providing amazing Lego with which to prototype, and in some cases build these systems; it has allowed so many people to get involved,” he concluded.

Food retail doesn’t have a large industry-specific app ecosystem, which in some ways has benefited a company like Ocado. And as it makes the transition away from being the sole vendor of its product towards being a platform business, Clarke said the company will inevitably have to develop some new capabilities, from sales to support and consultancy, which it didn’t previously depend so strongly upon. But its core development efforts will only accelerate as it ramps up to launch the platform. It has 610 developers and is looking to expand to 750 by January next year across its main development centre in Hatfield and two others in Poland, one of which is being set up at the moment.

“I see no reason why it has to stop there,” he concludes.

Real-time cloud monitoring too challenging for most providers, TFL tech lead says

Reed says TFL wants to encourage greater use of its data

Getting solid data on what’s happening in your application in real time seems to be a fairly big challenge for most cloud service providers out there, explains Simon Reed, head of bus systems & technology at Transport for London (TFL).

TFL, the executive agency responsible for transport planning and delivery for the city of London, manages a slew of technologies designed to support over 10 million passenger journeys each day. These include back-office ERP, routing and planning systems, and mammoth databases tapped into by line-of-business applications as well as customer-facing apps (such as real-time travel planning apps and the journey planner website), along with all the vehicle telematics, monitoring and tracking technologies.

A few years ago TFL moved its customer facing platforms – the journey planner, the TFL website, and the travel journey databases – over to a scalable cloud-based platform in a bid to ensure it could deal with massive spikes in demand. The key was to get much of that work completed before the Olympics, including a massive data syndication project so that app developers could more easily tap into all of TFL’s journey data.

“Around the Olympics you have this massive spike in traffic hitting our databases and our website, which required highly scalable front and back-ends,” Reed said. “Typically when we have industrial action or a snowstorm we end up with 10 to 20 times the normal use, often triggered in less than half an hour.”

Simon Reed is speaking at the Cloud World Forum in London June 24-25. Register for the event here.

The organisation processes bus arrival predictions for all 19,000 bus stops in London, which are constantly dumped into the cloud in a leaky-tap model, and there’s a simple cloud application that allows subscribers to download the data in a number of formats, plus APIs to build access to that data directly into applications. “As long as developers aren’t asking for predictions nanoseconds apart, the service doesn’t really break down – so it’s about designing that out and setting strict parameters on how the data can be accessed and at what frequency.”
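
On the consuming side, those strict parameters amount to never requesting predictions more often than the feed refreshes. A minimal polling client might look like the sketch below; the endpoint URL, query parameter and interval are placeholders rather than TFL’s actual published interface.

```python
import time
import requests

# Placeholder endpoint; the real syndicated feed and its formats are published by TFL.
PREDICTIONS_URL = "https://example-feed.tfl.gov.uk/bus-predictions"
MIN_POLL_INTERVAL_SECONDS = 30  # assumed minimum interval between requests

def poll_predictions(stop_id: str, polls: int = 3):
    """Fetch arrival predictions for one stop, respecting the minimum poll interval."""
    results = []
    for _ in range(polls):
        response = requests.get(PREDICTIONS_URL, params={"stop": stop_id}, timeout=10)
        response.raise_for_status()
        results.append(response.json())
        time.sleep(MIN_POLL_INTERVAL_SECONDS)  # never request more often than allowed
    return results

if __name__ == "__main__":
    for snapshot in poll_predictions("490008660N", polls=1):
        print(snapshot)
```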

But Reed said gaining visibility into the performance of a cloud service out of the box seems to be a surprisingly difficult thing to do.

“I’m always stunned by how little information there is out of the box when it comes to monitoring in the cloud. You can always add something in, but really, should I have to? Surely everyone else is in the same position where monitoring actual usage in real time is fairly important. The way you often have to do this is to specify what you want and then script it, which is a difficult approach to scale,” he said. “You can’t help but think surely this was a ‘must-have’ when people had UNIX systems.”

Monitoring (and analytics) will be important for Reed’s team as they expand their use of the cloud, particularly within the context of the journey data TFL publishes. Reed said those systems, while in a strong position currently, will likely see much more action as TFL pursues a strategy of encouraging use of the data outside the traditional transport or journey planning app context.

“What else can we do to that data? How can we turn it around in other ways? How can other partners do the same? For us it’s a question of exploiting the data capability we have and moving it into new areas,” he said.

“I’m still not convinced of the need to come out of whatever app you’re in – if you’re looking at cinema times you should be able to get the transportation route that gets you to the cinema on time, and not have to come out of the cinema listings app. I shouldn’t have to match the result I get in both apps in order to plan that event – it should all happen in one place. It’s that kind of thinking we’re currently trying to promote: to think more broadly than single-purpose apps, which is where the market currently sits.”

BMJ CTO: ‘Consumerisation of IT brings massive risks’

Sharon Cooper, CTO of BMJ

As we approach Cloud World Forum in London this June, BCN had the opportunity to catch up with one of the conference speakers, Sharon Cooper, chief technology officer of BMJ, to discuss her views on the risks brought about by the consumerisation of IT.

What do you see as the most disruptive trend in enterprise IT today?

For me it is the consumerisation of IT, but not because I’m worried that the IT department is being put out of business, or because business users don’t know what tools they need to run their business. My concern about the disruption is that there are hidden risks, potentially massive costs and unknown dangers, because many of today’s applications and tools are so deceptively simple to use that business users are not aware of things that might be critical to them, in part because the IT department always controlled everything and hid much of the complexity from them.

Tools are so easy to use that someone just signs up with their email address, uploads a large spreadsheet full of personal customer data, and then they leave; they forget to tell anyone that they have that account, which might even be under their personal email address. So the company has no idea where its corporate assets are being stored, and when a customer asks to be removed from the company’s databases, nobody has any idea that the customer’s details are hidden away in locally used Google Drives, Dropboxes or other applications.

If nobody in the company has a view over what tools are used, by whom and what’s in them, is the company, or are the individual employees using these tools, even aware of the risk? Business users are reasonably savvy people, but they probably won’t check the T&Cs or remember that extremely boring mandatory information governance training module they had to complete last year.

I really encourage people in my organisation to find good tools (SaaS, cloud-based, apps), but I ask them to ensure that my team knows what they are, so we can give them a quick review to see whether they are genuine and not some sort of route for activists, and check over the T&Cs. I remind them that they are now totally responsible for any personal customer data or sensitive corporate information in those applications, and that they will be the ones impacted if the ICO comes calling.

What do you think the industry needs to work on in terms of cloud service evolution?

Trying to get legislation to catch up with the tech, or even be in the same century.

What does BMJ’s IT estate look like? What are the major services needing support?

We have a bit of everything, like most companies, although I believe we have made fairly significant moves into cloud and SaaS/managed services.

Our desktop IT, which is provided by our parent company, is very much traditional/on-premise, although we have migrated our part of the business to Google Apps for Business, which has dramatically transformed staff’s ability to work anywhere. We’re migrating legacy bespoke CRM systems to cloud-based solutions, and use a number of industry-specific managed services to provide the back office systems that we use directly, rather than via our parent.

Our business is in digital publishing and the tools that we use to create the IP and the products that drive our revenue are predominantly open source, cloud-based, and moving increasingly that way. Our current datacentre estate includes private cloud, with some public cloud, and we believe we will move more towards public over the next 2-3 years.

Can you describe some of the unique IT constraints or features particular to your company or the publishing sector? How are you addressing these?

Our parent company is in effect a UK trade union, and its needs are very, very different from ours; we were originally their publishing department and are now an international publisher with the majority of our revenues coming from outside the UK. There is some overlap but it is diminishing over time.

Our market is relatively slow to change in some ways, so our products are not always driven as fast by changes in technology or in the consumer IT markets.

Traditionally, academic publishing is not seen as a huge target for attack, but the nature of what we publish, which can be considered by some to be dangerous, has the potential to increase our risks above those of some of our peers – for example, around controversies over the accuracy of medical treatments. We were the journal that produced the evidence that Andrew Wakefield’s research into MMR was wrong, and he has pursued us through the courts for years. If that story had broken today, would we have been a target of trolling or even hacktivists? We sell products into the Middle East that contain information on alcohol-related diseases, and we’ve been asked to remove them because there is supposedly no alcohol-related disease in those countries (we have not bowed to this government pressure).

As knowledge at the point of care becomes ever more available via devices that can be used by anyone, anywhere, so grows the additional burden of medical device regulation and other challenges which, coming from a print publishing background, were never relevant before.

Are there any big IT initiatives on the horizon at BMJ? What are the main drivers of those?

We have probably under-invested in many applications over the last several years; a policy of really sweating an IT asset was in place for years. We have a range of systems we will be replacing and consolidating over time – for example, we have five different e-commerce systems, and revenue is processed in more than three applications.

As with most companies a focus on data and analytics in all of its guises will be critical as we move forward.

Why do you think it’s important to attend Cloud World Forum?

It’s always good to see what vendors are offering and to hear what others have done to solve problems in their industries which might have relevance to yours; quite often it means you don’t feel quite so bad about your own situation when you hear other people’s tales.

Phil Carnelley, research director at IDC, on cloud, big data and the Internet of Things

Philip Carnelley shares his views on the big disrupters in IT

As we approach Cloud World Forum in London this June, BCN had the opportunity to catch up with one of the conference speakers, Philip Carnelley, software research director at IDC Europe, to discuss his views on the most disruptive trends in IT today.

What do you see as the most disruptive trend in enterprise IT today?

This is a tricky one, but I think it’s got to be the Internet of Things – extending the edge of the network. We’re expecting a dramatic rise in internet-connected cars, buildings, homes, sensors for health and industrial equipment, wearables and more.

IDC expects some 28 billion IoT devices to be operational by 2020. Amongst other things, this will change the way a lot of companies operate, turning device providers into service providers and allowing device manufacturers to sell directly to, and service, their end customers in ways they didn’t before.

What do you think is lacking in the cloud sector today?

There are two things. First, many organizations still have concerns about security, privacy and compliance in a cloud-centric world. The industry needs to make sure that organizations understand that these needs can be met by today’s solutions.

Second, while most people buy into the cloud vision, it’s often not easy to get there from where they are today. The industry must make it as easy as possible, with simple solutions that don’t require fleets of highly trained people to understand and implement.

Are you seeing more enterprises look to non-relational database tech for transactional uses?

Absolutely. We’re seeing a definite rise in the use of NoSQL databases, as IT and DB architects become much more ready to choose databases on a use-case basis rather than just going for the default choice. A good example is the use of Basho Riak at the National Health Service.

Is cloud changing the way mobile apps and services are developed in enterprises?

Yes, there is a change towards creating mobile apps and services that draw on ‘mobile back-end-as-a-service’ technologies for their creation and operation.

Why do you think it’s important to attend Cloud World Forum?

Because cloud is the fundamental platform for what IDC calls the 3rd Platform of Computing. We are in the middle of a complete paradigm shift to cloud-centric computing – with the associated technologies of mobile, social and big data – which is driving profound changes in business processes and even business models (think Uber, AirBnB, Netflix). Any company that wants to remain competitive in this new era needs to embrace these technologies, to learn more about them, in the way it develops and runs its operations for B2E, B2B and B2C processes.