Category Archives: Interviews

Rackspace CTO: no-one is bigger than the software revolution

Rackspace CTO John Engates speaking at Rackspace: Solve 2016

While the concept of cloud computing has been normalized to a degree, the industry is now looking to the broader benefits which can be derived from the technology. For the majority of companies evaluating cloud technologies, reducing CAPEX and OPEX simply isn’t a strong enough business case anymore.

That is certainly the view of Rackspace CTO John Engates. In fact, we’re starting to see the beginning of a new trend which will define the future of a vast number of organizations: the ability and desire to experiment. Those who can experiment with new technology, and are prepared to fail, will succeed. And those who don’t, won’t.

Although savings can be made through the transition to a cloud environment, early adopters are now looking beyond them. Cloud will underpin the growth and success of the next wave of technologies, whether that is virtual reality, artificial intelligence or autonomous vehicles. The early adopters are already defining how these technologies will take their business to the next level; the risk for the rest is how far they will be left behind if they don’t get up to speed quickly.

“Cloud as a technology is just about hitting the mainstream now,” said Engates. “Everything pre-2015 has been early adopters, but for mass markets it was business as usual.

“The main problem is that the majority of these companies are two or three steps away from the cloud. The cloud is not about saving money, but freeing up your developers so they can experiment with new technologies, learn new languages and take the company forward. If you’re not thinking about these technologies now, how far behind are you? You’re probably going to be in a very difficult position in a couple of years.”

Blockbuster is a classic example. Blockbuster and Netflix were in a similar position pre-digitalization; most people now forget that Netflix initially rose to fame by delivering DVDs to its customers through the post. Fast forward to the digital era, where Netflix evolved and created its current market position, one which a number of major players are now trying to emulate, and Blockbuster no longer exists.

For Engates, this example highlights the importance of experimentation. Netflix was a company which allowed its developers to play with new technologies and methods of delivery, whereas Blockbuster attempted to hold onto the traditional model. This will be the same for other verticals in the coming years: those who embrace the new digital era, adapt their models and allow their developers freedom to innovate will continue to be competitive; those who don’t will take the same route as Blockbuster.

“The successful companies of the future will be software companies,” said Engates. “They may not sell software but they will use it as a means to define their business and be creative in the marketplace. The likes of Google, Facebook, Uber and Netflix are all software companies. They aren’t people companies or infrastructure companies, they are software. If you want to compete with these companies you need to get better at creating the software experience.”

Nike and Under Armour are two more companies highlighted by Engates. While both are lifestyle and sportswear brands, each has had to create a digital experience to meet the demands of customers. A few years ago industry giants such as Nike and Under Armour were too big to be bothered by such trends, but the cloud computing era has levelled the playing field. No-one is bigger than the software revolution.

“I think that companies have to enable some of their organization to be innovative and to be creative,” said Engates. “Most of IT has been behind the wall; they haven’t been innovators, they’ve been keeping the lights on. It wasn’t about transforming the company into something new and different; that was product development’s job, or marketing’s. But today, inventing the new it-thing means you have to have a digital component, to connect with your users through their mobile devices.”

Mobile devices are now redefining business and consumer attitudes. For the most part this is how consumers connect with new companies; it’s almost exclusively digital, and if your company is not embracing this, Engates thinks it won’t be too long before you’re not relevant.

But will companies take those risks? “Not all of them will,” said Engates. “Not every company will make that leap. The ones that don’t will be left behind. Now even banks are starting to do this as you’re starting to see more automated investing and digital advisors. Why would you need to go to the branch if you can do it over the phone?”

For innovation to occur within an organization, the conditions have to be right. In the majority of large-scale organizations, innovation is very difficult to achieve. There are too many risks, too much red tape and too much politics. The notion that a new idea might not succeed, or might not reap short-term benefits, scares the board and stakeholders, which in turn inhibits innovation. It’s a difficult loop to get out of, and for a number of larger, stodgy organizations, it will be immensely difficult.

“The reason cloud is so important is because to innovate you need to be using the most modern tools, for example data science, continuous integration, containers,” said Engates. “You need APIs to interact with, you don’t want to wait six weeks for a server. You want to experiment and do things quickly. If you want to do analytics, you need storage and compute power; you need to have the cloud.

“A lot of the people who want to work on these projects have a lot of options. There are a lot of smaller companies who have these conditions to be innovative, so they attract these developers. Companies have to adapt to them, not force them to adapt to the company. Decision makers need to change their organization to have the modern environment for these developers to work in, to be innovative and to make the company competitive in the digital era.”

Virtustream enters European cloud marketplace

When looking at the European cloud market ecosystem, most people would be forgiven for not looking much past the three largest holders of market share: AWS, Microsoft Azure and Google. But there are alternatives, even if they are less well known. This is the challenge facing Virtustream MD Simon Walsh.

Although Virtustream has been in operation since 2009, the team consider themselves in start-up mode, taking position to pounce on the European market over the coming months. The company was acquired by EMC last year and now forms the tech giant’s managed cloud services business, a position which could be envied by other cloud companies aiming to loosen the top three’s firm grasp on the cloud market.

“EMC is the largest storage company in the world,” said Walsh. “And we’re aiming to leverage that position. We’re taking a rifle shot approach to building business in the European markets, but in parallel we’ve partnered with EMC because they own us, and we’ve partnered with Dell because they own EMC. With these relationships, we have access to multiple routes to market, and we plan on leveraging the recognition of companies like EMC and Dell to scale Virtustream rapidly.”

Virtustream is currently one of six EMC Federation companies (EMC2, Pivotal, RSA, VCE, VMware and Virtustream), and will continue as an independent brand following the introduction of Dell Technologies, and its subsequent sub-brands, in the coming months. While the brand is relatively unknown in the EMEA and APJ markets, this is not the case in North America, where it holds a number of certifications for federal government contracts and a number of enterprise customers.

Growth in the European market will firstly draw on the certifications Virtustream holds in the US to provide credibility with public sector organizations in Europe, and secondly leverage customers who have already bought into the EMC and Dell portfolios.

The new EMC/Dell execs are seemingly learning lessons from Microsoft’s rise to the top of the market segment rather than from AWS’: becoming an end-to-end enterprise IT vendor (like Microsoft) as opposed to a specialist public cloud company (like AWS). While AWS is widely recognised as the cloud leader worldwide, a recent study from JP Morgan will give the new team confidence.

The research found that 46.9% of the CIOs surveyed named Microsoft as the most critical and indispensable vendor to the future of their IT environment, whereas AWS accounted for only 13%. The all-encompassing portfolio offered by Microsoft (cloud, desktop, server, database etc.) was more appealing than AWS’ offering.

Virtustream Managing Director Simon Walsh

Virtustream can offer cloud capabilities across the board, from cloud native to traditional systems of record, and the team have now connected the cloud storage options directly to three EMC platforms in the software. The team are betting that trust in the EMC brand, straightforward upgrades and simple integration between the offerings of all the federation businesses will give customers a portfolio which can’t be matched in the industry.

“EMC globally has 30% of the storage market; if we go to the installed base and connect these customers to our storage cloud, we should be able to scale pretty quickly (in terms of growth of the Virtustream business),” said Walsh. “We may not be number one in the cloud storage market, but I can see us being in the top three for market share in the near future.”

One area the team are targeting in particular is core business applications. Most enterprise organizations could be said to have a reluctance, even a paranoia, about running core business applications in the cloud, though Virtustream believes it has an offering which can counter this trend.

“Yes, I see paranoia, however it is geographically different,” said Walsh. “Europe is behind. Europe is still clinging onto building it themselves or outsourcing. There’s nothing wrong with outsourcing, but in the Americas they are much bolder at adopting cloud.

“Most people have used cloud to date for dev and test or they’ve used it for web front end or scale out systems of engagement, hardly anybody actually has an application which they run their business on in the cloud. It’s either in their own data centre which they run themselves or they’ve outsourced, and they have someone doing application management services. We have a hybrid solution which can change all this.”

The team have combined the public cloud attributes of agility and tiered payment with the outsourcing attributes of a core app, an SLA and a performance guarantee, to create a hybrid proposition which they claim is unique in the industry. Walsh and his team now face the task of convincing European decision makers that running core business applications in the cloud is a feasible option.

“The entire world has not shifted to cloud native,” said Walsh. “There is still a substantial amount of money spent by corporations around the world on running core business applications. We have a proposition which can run cloud native but can also run core business applications in the cloud, on demand and on consumption. No-one else in the industry does that. We can run all the systems on the same billing engine, the same cloud management tools and the same application management tools, which gives us a differentiator in the market.”

Data, data, data. The importance of backing up data

More often than not, a morning browse of the internet will reveal that today is in fact “Talk Like a Pirate Day”, or “Hug a Vegetarian Day”, or something equally humorous. Today’s awareness day, by contrast, holds some genuine use for the world on the whole.

World Backup Day encourages consumers to back up their family photos, home videos, documents and emails on more than one device. The World Backup Day website lists numerous ways in which a consumer’s data or documents can be lost, but the day is also very applicable to the world of enterprise IT.

“The rapid increase in the amount of data that consumers and organisations store is one of the biggest challenges facing the backup industry,” says Giri Fox, Director of Technical Services at Rackspace. “Organisations aren’t always sure what data they should be keeping, so to make sure they don’t discard any important data they sometimes end up keeping everything, which adds to this swell of data.

“For many companies, a simple backup tool is no longer enough to make sure all these company assets are safe and available, they need support in keeping up with the sheer scale of data and to fix problems when a valuable file or database goes missing.”

The volume of data being utilized (and in some cases not utilized) has grown astronomically, but to a certain degree security and employee behaviour have not kept pace with this growth. Cyber criminals always seem to be one step ahead of the enterprise when attempting to access data, but what is more worrying is the trend of employee indifference to IT security.

A recent survey highlighted employee negligence and indifference to IT policy as one of the most significant inhibitors of cloud security, with only 35% of respondents saying that they use passwords at work.

Giri Fox, Director of Technical Services at Rackspace

“Over recent years, organisations have become far more aware of the importance of backing up their data and we’ve noticed the impact here at Rackspace, where we currently back up 120PB per month globally,” adds Fox. “One of the main challenges for us is that businesses don’t just want to back up more data than ever before, they want it done quicker than ever before.

“Also, the process has become more complex than it used to be because companies are more conscious than ever of the compliance regulations they have to adhere to. Fortunately, with the development of deduplication techniques, we are now able to back up unique sections of data rather than duplicating large pools continuously, which has sped up the backup process.”
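
Fox doesn’t detail Rackspace’s implementation, but the principle behind deduplication is simple: split data into chunks, identify each chunk by a hash of its content, and store any given chunk only once. Here is a minimal Python sketch of that idea, assuming fixed-size chunks (production systems typically use smarter, content-defined chunking):

    import hashlib

    class ChunkStore:
        """Toy content-addressed store: identical chunks are kept only once."""

        def __init__(self, chunk_size=4096):
            self.chunk_size = chunk_size
            self.chunks = {}  # sha256 digest -> chunk bytes

        def backup(self, data: bytes) -> list:
            """Split data into fixed-size chunks; return a 'recipe' of digests."""
            recipe = []
            for i in range(0, len(data), self.chunk_size):
                chunk = data[i:i + self.chunk_size]
                digest = hashlib.sha256(chunk).hexdigest()
                self.chunks.setdefault(digest, chunk)  # store only unseen chunks
                recipe.append(digest)
            return recipe

        def restore(self, recipe: list) -> bytes:
            """Reassemble the original data from its chunk digests."""
            return b"".join(self.chunks[d] for d in recipe)

Run two near-identical backups through a store like this and the second one adds only the chunks that changed, which is what makes repeated backups faster.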

Beyond employee indifference to cloud security, changes to EU-US data protection policy have highlighted the significance of data backup and the prevention of data loss. An instance of data loss could be crippling for an organization, whether through financial penalties or the loss of information which could prove invaluable in the future.

“Initiatives like World Backup Day are a great way of highlighting the importance of backing up in an age where, as Ann Winblad put it, ‘data is the new oil’,” comments Fox.

In a world where data can be seen as one of the most important commodities for any business, the value of securing, backing up and encrypting data cannot be overstated. That said, the majority of the working world (outside of the IT department) does not appreciate the value of security, mostly not out of malice, but because they don’t know any better.

“In the post-Edward Snowden era we’re also seeing just how seriously companies are thinking about encryption. Many companies now want to make sure their backed up data is no longer just encrypted when it goes outside the four walls of a data centre, but inside it as well,” says Fox.

G-Cloud – why being certified matters

It might surprise you to know that more than £900m worth of sales have now taken place via the G-Cloud platform since its launch. The Government initiated the G-Cloud programme in 2012 to deliver computing capability (from fundamental resources such as storage and processing to full-fledged applications) using the cloud, and it has been hugely successful, providing benefits to customers and suppliers alike.

The G-Cloud framework is offered via the Digital Marketplace and is provided by the Crown Commercial Service (CCS), an organisation which acts on behalf of the Crown to save money for the public sector and the taxpayer and to improve the quality of commercial and procurement activity. The CCS’ procurement services can be used by central government departments and organisations across the public sector, including local government, health, education, not-for-profit and devolved administrations.

G-Cloud approves framework agreements with a number of service providers and lists those services on a publicly accessible portal known as the Digital Marketplace. This way, public sector organisations can approach the services listed on the Digital Marketplace without needing to go through a full tender process.

G-Cloud has substantial benefits both for providers and for customers looking to buy services. For vendors the benefit is clear: being named an official G-Cloud supplier demonstrates that the company has met, and remains compliant with, the standards laid out in the G-Cloud framework. It also opens up exciting new opportunities to supply the UK public sector as it looks to reduce its costs, brings recognition to the brand and further emphasises the company’s position as a reputable provider of digital services.

Where public sector organisations are concerned, G-Cloud gives quick and easy access to a roster of approved and certified suppliers that have been rigorously assessed, cutting down the time needed to research and find such vendors in the marketplace. This gives organisations a head start in finding the cloud services that will best address their business and technical needs.

I am proud to say that iland was awarded a place on the G-Cloud framework agreement for supplying Infrastructure-as-a-Service (IaaS) and Disaster-Recovery-as-a-Service (DRaaS) at the end of last year. We deliver flexible, cost-effective and secure Infrastructure-as-a-Service solutions from data centres in London and Manchester, including Enterprise Cloud Services with Advanced Security and Compliance, Disaster-Recovery-as-a-Service and Cloud Backup.

So if you are looking to source a cloud provider, I would recommend that you start your search with those that have been awarded a place on the G-Cloud framework agreement. It is important to then work with prospective providers to ensure their platform, service level agreements, native management tools and support teams can deliver the solutions that best address your business goals as well as your security and compliance requirements. Ask questions up front. Ensure the provider gives you full transparency into your cloud environment. Get a demonstration. You will then be well on your way to capitalizing on the promises of cloud.

Written by Monica Brink, EMEA Marketing Director, iland

IBM: “The level of innovation is being accelerated”

Dr. Angel Diaz joined the research division of IBM in the late nineties, where he helped co-author many of the web standards we enjoy today. Nowadays, he’s responsible for all of IBM’s cloud and mobile technology, as well as the architecture of its ambient cloud. Here, ahead of his appearance at Container World (February 16 – 18, Santa Clara Convention Center, CA) later this month, BCN caught up with him to find out more about the tech giant’s evolving cloud strategy.

BCN: How would you compare your early days at IBM, working with the likes of Tim Berners-Lee, with the present?

Dr. Angel Diaz: Back then, the industry was focused on developing web standards for a very academic purpose, in particular the sharing of technical information. IBM had a strategy around accelerating adoption and increasing skill. This resulted in a democratization of technology, by getting developers to work together in open source and standards. If you fast forward to where we are now with cloud, mobile, data, analytics and cognitive, you see a clear evolution of open source.

The aperture of open source development and ecosystems has grown to include users, and is now grounded on solid open governance and meritocracy models. What we have built is an open cloud architecture, starting with an open IaaS based on OpenStack, an open PaaS with Cloud Foundry and an open container model with the Open Container Initiative and the Cloud Native Computing Foundation. When you combine an open cloud architecture with open APIs defined by the Open API Initiative, applications break free. I have always said that no application is an island – these technologies make it so.

What’s the ongoing strategy at IBM, and where do containers come into it?

It’s very much hybrid cloud. We’ve been leveraging containers to help deliver hybrid applications and accelerate development through DevOps, so that people can transform and improve their business processes. This is very similar to what we did in the early days of the web – better business processes mean better business. At the end of the day, the individual benefits. Applications can be tailored to the way we like to work, and the way that we like to behave.

A lot of people in the container space say: wow, containers have been around a long time, why are we all interested in this now? Well, the technology has gotten easier to use, open communities have rallied around it, and it provides a very nice way of marrying concepts of operations and service oriented architecture, which the industry missed in the 2000s.

What does all this innovation ultimately mean for the ‘real world’?

It’s not an exact analogy, but remember the impact of HTML and JavaScript – they allowed almost anyone to become a webmaster. That led to the Internet explosion. If you look at where we are now, what we’re doing with cloud: that stack of books you need to go and buy has been reduced. The concept count of things you need to know to develop an application – the level of sophistication needed to build an application, scale an application, secure an application – is being reduced.

So what does that do? It increases participation in the business process, in what you end up delivering. Whether it’s human facing or whether it’s an internal business process, it reduces that friction and it allows you to move faster. What’s starting to happen is the level of innovation is being accelerated.

And how do containers fit into this process? 

Previously there was this strict line: you develop software and then operate it and make tweaks, but you never really fundamentally changed the architecture of the application. Because of the ability to quickly stand up containers, to quickly iterate, etc., people are changing their architectures because of operations and getting better operations because of it. That’s where the microservices notion comes in.

And you’ll be talking at Container World. What message are you bringing to the event?

My goal is to help people take a step back and understand the moment we’re in, because sometimes we all forget that. Whether you’re struggling with security in a Linux kernel or trying to define a microservice, you can forget what it is you’re trying to accomplish.

We are in a very special moment: it’s about the digital disruption that’s occurring, and the container technology we’re building here allows much quicker iteration on the business process. That’s one dimension. The second is what IBM is doing, not just in our own implementation of containers but in the open source world, to help democratize the technology, so that the level of skill and the number of people who build on this grow.

AWS – we view open source as a companion

In one of the last installments of our series marking the upcoming Container World (February 16 – 18, Santa Clara Convention Center, CA, USA), BCN talks to Deepak Singh, General Manager of Amazon EC2 Container Service, AWS.

Business Cloud News: First of all – how much of the container hype is justified would you say?

Deepak Singh: Over the last 2-3 years, starting with the launch of Docker in March 2013, we have seen a number of AWS customers adopt containers for their applications. While many customers are still early in their journey, we have seen AWS customers such as Linden Lab, Remind, Yelp, Segment, and Gilt Groupe all adopt Docker for production applications. In particular, we are seeing enterprise customers actively investigating Docker as they start re-architecting their applications to be less monolithic.

How is the evolution of containers influencing the cloud ecosystem?

Containers are helping people move faster towards architectures that are ideal for the AWS cloud. For example, one of the common patterns we have seen with customers using Docker is to adopt a microservices architecture. This is especially true for our enterprise customers, who see Docker as a way to bring more applications onto AWS.

What opportunities does this open up to AWS?

For us, it all comes down to customer choice. When our customers ask us for a capability, we listen. They come to us because they want something done the Amazon way: easy to use, easy to scale, lower cost, and where they don’t have to worry about the infrastructure running behind it.

As mentioned, many of our customers are adopting containers and they expect AWS to support them. Over the past few years we have launched a number of services and features to make it easier for customers to run Docker-based applications. These include Docker support in AWS Elastic Beanstalk and the Amazon EC2 Container Service (ECS). We also have a variety of certified partners that support Docker and AWS and integrate with various AWS services, including ECS.

What does the phenomenon of open source mean to AWS? Is it a threat or a friend?

We view open source as a companion to AWS’s business model. We use open source and have built most AWS services on top of open source technology. AWS supports a number of open source applications, either directly or through partners. Examples of open source solutions available as AWS services include Amazon RDS (which supports MySQL, Postgres, and MariaDB), Amazon Elastic MapReduce (EMR), and Amazon EC2 Container Service (ECS). We are also an active member of the open source community. The Amazon ECS agent is available under an Apache 2.0 license, and we accept pull requests and allow our customers to fork our agent as well. AWS contributes code to Docker (e.g. CloudWatch logs driver), and was a founder member of the Open Container Initiative, which is a community effort to develop specifications for container runtimes.

As we see customers asking for services based on various open source technologies, we’ll keep adding those services.

You’ll be appearing at Container World this February. What do you think the biggest discussions will be about?

We expect customers will be interested in learning how they can run container-based applications in production, the most popular use cases, and hear about the latest innovations in this space.

Cloud academy: Rudy Rigot and his new Holberton School

Business Cloud News talks to Container World (February 16 – 18, 2016, Santa Clara Convention Center, USA) keynote Rudy Rigot about his new software college, which opens today.

Business Cloud News: Rudy, first of all – can you introduce yourself and tell us about your new Holberton School?

Rudy Rigot: Sure! I’ve been working in tech for the past 10 years, mostly in web-related stuff. Lately, I’ve worked at Apple as a full-stack software engineer for their localization department, which I left this year to found Holberton School.

Holberton School is a two-year, community-driven and project-oriented school, training software engineers for the real world. No classes, just real-world hands-on projects designed to optimize learning, in close contact with volunteer mentors who all work for small companies or large ones like Google, Facebook, Apple, … One of the other two co-founders is Julien Barbier, formerly the Head of Community, Marketing and Growth at Docker.

Our first batch of students started last week!

What are some of the challenges you’ve had to anticipate?

Since we’re a project-oriented school, students are mostly graded on the code they turn in, which they push to GitHub. Some of this code is graded automatically, so we needed to be able to run each student’s code (or each team’s code) automatically, in a fair and equal way.

We needed to get information on the “what” (what is returned in the console), but also on the “how”: how long does the code take to run? How much resource is being consumed? What is the return code? Also, since Holberton students are trained on a wide variety of languages, how do you ensure you can grade a Ruby project, and later a C project, and later a JavaScript project, etc., with the same host while minimizing issues?

Finally, we had to make sure that a student can commit code that is as malicious as they want: we can’t have a human check it before running it, and it should only break their program, not the whole host.

So how on earth do you negotiate all these?

Our project-oriented training concept is new in the United States, but it’s been successful for decades in Europe. We knew that the European schools, who built their programs before containers became mainstream, typically run the code directly on a host system that has all of the software they need installed, and then simply run a chroot before running the student’s code. This didn’t solve all of the problems, while containers did, in a very elegant way; so we took the container road!

HolbertonCloud is the solution we built to that end. It fetches a student’s code on command, then runs it based on a Dockerfile and a series of tests, and finally returns information about how that went. The information is then used to compute a score.
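
As a rough illustration of that pattern — a sketch, not HolbertonCloud’s actual code — the following Python snippet runs an untrusted submission in a throwaway Docker container with no network and capped resources, capturing the console output, exit code and runtime a grader could score. The image name, limits and command are illustrative assumptions:

    import subprocess
    import time

    def run_submission(image, command, timeout=10):
        """Run untrusted code in a throwaway container; report what happened.

        The image, limits and command here are illustrative: a real grader
        would first build the image from the project's Dockerfile.
        """
        start = time.time()
        try:
            result = subprocess.run(
                ["docker", "run", "--rm",
                 "--network", "none",   # no network access for student code
                 "--memory", "128m",    # cap memory
                 "--pids-limit", "64",  # stop fork bombs
                 image, "sh", "-c", command],
                capture_output=True, text=True, timeout=timeout,
            )
            return {"stdout": result.stdout, "exit_code": result.returncode,
                    "seconds": round(time.time() - start, 3)}
        except subprocess.TimeoutExpired:
            return {"stdout": "", "exit_code": None, "seconds": timeout,
                    "error": "timed out"}

    # e.g. report = run_submission("student-project:latest", "./my_prog")

Because the timeout and the memory and process caps apply to the container alone, a malicious or runaway submission only breaks its own run, never the host — the guarantee described above.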

What’s amazing about it is that by using Docker, building the infrastructure has been trivial; the hard part has been about writing the tests, the scoring algorithm … basically the things that we actively want to be focused on!

So you’ve made use of containers. How much disruption do you expect their development to engender over the coming years?

Since I’m personally more on the “dev” end of devops, I see how striking it is that containers restore focus on actual development for my peers. So, I’m mostly excited by the innovation that software engineers will be focusing on instead of the issues that containers now take care of for them.

Of course, it will be very hard to measure which of those innovations were able to exist because containers were involved; but they will be innovations in virtually every corner of the tech industry, so that’s really exciting!

What effect do you think containers are going to have on the delivery of enterprise IT?

I think one takeaway from the very specific HolbertonCloud use case is that cases where code can be run trivially in production are getting rare, and one needs guarantees that only containers can bring efficiently.

Also, a lot of modern architectures fulfil needs with systems made up of more and more micro-services, since we now have enough hindsight to see the positive outcomes for their resilience. Each micro-service may have different requirements, and may therefore best be built with different technologies, so managing a growing set of different software configurations is getting increasingly relevant. Considering the positive outcomes, this trend will only keep growing, making the need for containers keep growing as well.

You’re delivering a keynote at Container World. What’s the main motivation for attending?

I’m tremendously excited by the stellar line-up! We’re all going to get amazing insight from many different and relevant perspectives; it’s going to be very enlightening!

The very existence of Container World is exciting too: it’s crazy how far containers have come over the span of just a few years.

Click here to learn more about Container World (February 16 – 18, 2016 Santa Clara Convention Center, USA)

IoT comes to the CES: opening up the home to the sharing economy

One of the most intriguing corners of this year’s CES is the dedicated SuperSession on ‘IoT Business Strategies: Partnerships for the Sharing Economy’. After all, while almost anyone in Las Vegas this January will be able to tell you that IoT will (surely) have a huge part to play in the future of consumer tech, very few people will be able to tell you exactly how.

The main current consumer thrust of IoT, for example, remains home automation, and specifically security. Yet there is often little really inspiring about this proposed application of IoT. In part, this is because it arguably fails to offer us anything really new. A secure home is a secure home is a secure home, however this is achieved, and if home automation currently offers greater control and security, it does so at significantly greater expense.

Much more interesting is the USP of home automation innovator August. Co-Founder & CEO Jason Johnson, who’ll be appearing at the SuperSession panel in Vegas next month, took the time to tell us precisely what distinguishes August’s approach to home automation from that of all the other contending companies.

“We definitely make products that fall under the security category,” he says. “But we have a kind of unique philosophy.  We’ve looked at the front door, and asked, if you can give control over that part of the home in a new way, what could that do for consumers?”

August concluded that the answer to this question lay in combining home automation with the booming sharing economy in particular, and ecommerce in general – both of which an automated front door could make much more seamless and better integrated into users’ lives.

“Traditionally the lock on our doors has been designed to keep people out. We have a philosophy that, if we can make a really good access system, a different kind of lock and security for the front door, it could be used not just to keep people out but to let them in – a kind of different paradigm to what a lock is. Our vision is, if we do that really well, then when I get home from work tonight, my dog will have been walked, my groceries delivered, my UPS packages delivered, my house cleaned – maybe there’s fresh flowers on my dining room table, my dry cleaning has been delivered and it’s hanging in my closet, my dirty clothes have been taken away.”

The idea behind August is that, for all of those service providers requiring access to the home to deliver (a requirement presently resulting in a chaos of keys, calls and clashing schedules), instant, temporary access could be granted the second the arrangement is made. Johnson offers an example from personal experience.

“I have a vacation rental home up in Napa, this little tiny shack,” he says. “I made it available on Airbnb and right away I had to deal with the keys. So, first, we had to hide a key somewhere on the property. Later, of course, I started issuing keys from within the August app. And you can do that. You go to the app, you type in the person’s name, their phone number, the days, the hours they have access and I issue the keys from the app and they show up and can get access to the house.”

However, the experience became that much more seamless (and therefore satisfying) following a software integration between the two services. “Now, literally as I’m talking to you, someone could be doing an Airbnb booking for my place: and the August app will automatically provision a temporary key to that guest. I’ve done nothing.”
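
Johnson doesn’t describe the integration’s internals, but the flow he outlines is easy to picture: a booking confirmation arrives as an event, and a key valid only for the stay is provisioned automatically. Here is a hypothetical Python sketch of that flow — the event fields and key store are invented for illustration and do not reflect Airbnb’s or August’s real APIs:

    from datetime import datetime

    # Hypothetical in-memory key store; a real integration would call the
    # lock vendor's cloud API instead of appending to a list.
    issued_keys = []

    def on_booking_event(event: dict) -> dict:
        """Provision a temporary key covering the dates of a confirmed booking.

        `event` mimics what a rental platform's webhook might send; the field
        names are invented, not Airbnb's or August's actual schema.
        """
        key = {
            "guest": event["guest_name"],
            "phone": event["guest_phone"],
            "lock_id": event["lock_id"],
            "valid_from": datetime.fromisoformat(event["check_in"]),
            "valid_until": datetime.fromisoformat(event["check_out"]),  # expires on its own
        }
        issued_keys.append(key)
        return key

    # A booking confirmation arrives and a key is issued with no host action:
    on_booking_event({
        "guest_name": "Jane Doe", "guest_phone": "+1-555-0100",
        "lock_id": "front-door",
        "check_in": "2016-02-19T15:00:00", "check_out": "2016-02-21T11:00:00",
    })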

The opportunity for such a provision to facilitate e-commerce per se is striking.

“One of the things that causes us most to think twice about ordering something online is the challenge of ‘how am I going to physically get that?’ At our office, we have a lot of employees that get packages delivered here, and they stack up, and then they’ve got to haul the packages home on the bus, or they ride a bicycle and have to haul the packages home on their bikes. So people think twice about ordering things online! Nobody wants to come home and find that little sticker on the wall saying missed delivery.”

You could be forgiven for thinking that, indeed, home automation and the internet of services look made for one another; technologies often seem to complement one another in this way. It is presumably this manner of symbiosis that will allow IoT to flourish in the years to come and to offer consumers new experiences. Objects will not merely be connected and that’s it – rather, through that connectivity, new combinations and opportunities will come to light.

There will be few more explicit examples of this approach on display at this year’s CES than at the ‘IoT Business Strategies’ SuperSession. Attendance is certainly a key part in August’s plans for 2016.

“The idea of a smart lock and a smart video doorbell is still a new concept. The challenge for us in 2016 – and starting at CES – is to move into the mainstream. How do you get, not just early tech adopters, but mainstream consumers to embrace these technologies and put them in our homes? That’s what we need to do over the course of 2016.”

Click here for more information about the ‘IoT Business Strategies: Partnerships for the Sharing Economy’ SuperSession at CES, Las Vegas, January 7 2016

The IoT in Palo Alto: connecting America’s digital city

Palo Alto is not your average city. Established by the founder of Stanford University, it was the soil from which Google, Facebook, Pinterest and PayPal (to name a few) sprang forth. Indeed, Palo Alto has probably done more to transform human life in the last quarter century than any other city. So, when we think of how the Internet of Things is going to affect life in the coming decades, we can be reasonably sure where much of the expected disruption will originate.

All of which makes Palo Alto a great place to host the first IoT Data Analytics & Visualization event (February 9 – 11, 2016). Additionally fitting: the event is set to be kicked off by Dr. Jonathan Reichental, the city’s Chief Information Officer. Reichental is the man entrusted with the hefty task of ensuring the city is as digital, smart and technologically up-to-date as a place should be that has been called home by the likes of Steve Jobs, Mark Zuckerberg, Larry Page and Sergey Brin.

Thus far, Reichental’s tenure has been a great success. In 2013, Palo Alto was credited with being the number one digital city in the US, and has made the top five year upon year – in fact, it so happens that, following our long and intriguing telephone interview, Reichental is looking forward to a small celebration to mark its latest nationwide ranking.

BCN: Jonathan, you’ve been Palo Alto’s CIO now for four years. What’s changed most during that time span?

Dr Jonathan Reichental: I think the first new area of substance would be open government. I recognise open government’s been a phenomenon for some time, but over the course of the last four years, it has become a mainstream topic that city and government data should be easily available to the people. That it should be machine readable, and that an API should be made available to anyone that wants the data. That we have a richer democracy by being open and available.

We’re still at the beginning however. I have heard that there are approximately 90,000 public agencies in the US alone. And every day and week I hear about a new federal agency or state or city of significance who are saying, ‘you can now go to our data portal and you can access freely the data of the city or the public agency.’ The shift is happening but it’s got some way to go.

Has this been a purely technical shift, or have attitudes had to evolve as well?

I think if you kind of look at something like cloud, cloud computing and cloud as a capability for government – back when I started ‘cloud’ was a dirty word. Many government leaders and government technology leaders just weren’t open to the option of putting major systems off-premise. That has begun to shift quite positively.

I was one of the first to say that cloud computing is a gift to government. Cloud eliminates the need to have all the maintenance that goes with keeping systems current and keeping them backed up and having disaster recovery. I’ve been a very strong proponent of that.

Then there’s social media – government has fully embraced that now, having been reluctant early on. Mobile is beginning to emerge, though it’s still very nascent. Here in Palo Alto we’re trying to make all services that make sense accessible via smartphone. I call it ‘city in a box.’ Basically, bringing up an app on the smartphone, you should be able to interact with government – get a pet license, pay a parking fee, pay your electrical bill. Everything should really be right there on the smartphone; you shouldn’t need to go to City Hall for many things any more.

The last thing I’d say is there has been an uptake in community participation in government. Part of it is it’s more accessible today, and part of it is there’s more ways to do so, but I think we’re beginning also to see the fruits of the millennial generation – the democratic shift in people wanting to have more of a voice and a say in their communities. We’re seeing much more in what is traditionally called civic engagement. But ‘much more’ is still not a lot. We need to have a revolution in this space for there to be significant change to the way cities operate and communities are effective.

Palo Alto is hosting the IoT Data Analytics & Visualization event in February. How have you innovated in this area as a city?

One of the things we did with data is make it easily available. Now we’re seeing a community of people in the city and beyond, building solutions for communities. One example of that is a product called Civic Insight. This app consumes the permit data we make available and enables users to type in an address and find out what’s going on in their neighbourhood with regard to construction and related matters.

That’s a clear example of where we didn’t build the thing, we just made the data available and someone else built it. There’s an economic benefit to this. It creates jobs and innovation – we’ve seen that time and time again. We saw a company build a business around Palo Alto releasing our budget information. Today they are called OpenGov, and they sell the solution to over 500 cities in America, making it easy for communities to understand where their taxpayer dollars are being spent. That was born and created in Palo Alto because of what we did making our data available.
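
As an illustration of what consuming such a feed looks like, here is a minimal Python sketch in the spirit of the Civic Insight example; the endpoint URL and field names are invented, as the city’s actual portal and schema are not described here:

    import json
    from urllib.request import urlopen

    # Hypothetical endpoint: real city open-data portals expose similar JSON
    # APIs, but this URL and its field names are invented for illustration.
    PERMITS_URL = "https://data.example-city.gov/permits.json"

    def open_permits_for(address: str) -> list:
        """Return open construction permits filed for a street address."""
        with urlopen(PERMITS_URL) as response:
            permits = json.load(response)
        return [p for p in permits
                if p.get("address", "").lower() == address.lower()
                and p.get("status") == "open"]

A civic app like the one described can then render those records on a map or a timeline without the city ever having to build the app itself.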

Now we get to today, and the Internet of Things. We’re still – like a lot of folks, especially in the government context – defining this. It can be as broad or as narrow as you want. There’s definitely a recognition that when infrastructure systems can begin to share data with each other, we can get better outcomes.

The Internet of Things is obviously quite an elastic concept, but are there areas you can point to where the IoT is already very much a reality in Palo Alto?

The clearest example I can give of that today is our traffic signal system here in the city. A year and a half ago, we had a completely analogue system, not connected to anything other than a central computer, which would have created a schedule for the traffic signals. Today, we have a completely IP-based traffic system, which means it’s basically a data network. So we have enormous new capability.

For example, we can have schedules that are very dynamic. When schools are being let out, traffic signals work one way; at night they can work another way. You can have very granular information. Next, you can start to have traffic signals communicate with each other. If there is a long strip of road and, five traffic signals down, there is some congestion, all the other traffic signals can dynamically change to try and make the flow better.
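
A toy model makes the coordination idea concrete. In this sketch — an invented illustration, not Palo Alto’s actual control logic — the congested signal extends its green phase to flush its queue, while signals upstream shorten theirs to meter the traffic feeding it:

    # Toy coordination sketch along one corridor of connected signals.
    DEFAULT_GREEN = 30  # seconds of green per cycle

    def coordinate(queues, congested):
        """queues[i] = vehicles waiting at signal i; congested = its index."""
        greens = []
        for i, q in enumerate(queues):
            if i == congested:
                greens.append(DEFAULT_GREEN + min(q, 30))   # flush the backlog
            elif i < congested:
                greens.append(max(DEFAULT_GREEN - 10, 10))  # meter upstream flow
            else:
                greens.append(DEFAULT_GREEN)
        return greens

    # Five signals; congestion reported five signals down the corridor:
    print(coordinate([3, 4, 5, 6, 22], congested=4))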

It goes even further than this. Now we can start to take that data – recording, for example, the frequency and volume of vehicles, as well as weather, and other ambient characteristics of the environment – and we can start to send this to the car companies. Here at Palo Alto, almost every car company has their innovation lab. Whether it’s Ford, General Motors, Volkswagen, BMW, Google (who are getting into the car business now) – they’re all here and they all want our data. They’re like: ‘this is interesting, give us an API, we’ll consume it into our data centres and then we’ll push into cars so maybe they can make better decisions.’

You have the Internet of Things, you’ve got traffic signals, cloud analytics solutions, APIs, and cars as computers and processors. We’re starting to connect all these related items in a way we’ve never done before. We’re going to follow the results.

What’s the overriding ambition would you say?

We’re on this journey to create a smart city vision. We don’t really have one today. It’s not a product or a service, it’s a framework. And within that framework we will have a series of initiatives that focus on things that are important to us. Transportation is really important to us here in Palo Alto. Energy and resources are really important: we’re going to start to put sensors on important flows of water so we can see the amount of consumption at certain times but also be really smart about leak detection, potentially using little sensors connected to pipes throughout the city. We’re also really focused on the environment. We have a chief sustainability officer who is putting together a multi-decade strategy around what PA needs to do to be part of the solution around climate change.

That’s also going to be a lot about sensors, about collecting data, about informing people and creating positive behaviours. Public safety is another key area. Being able to respond intelligently to crimes, terrorism or natural disasters. A series of sensors again sending information back to some sort of decision system that can help both people and machines make decisions around certain types of behaviours.

How do you expect this whole IoT ecosystem to develop over the next decade?

Bill Gates has a really good saying on this: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”  It’s something that’s informed me in my thinking. I think things are going to move faster and in more surprising ways in the next ten years for sure: to the extent that it’s very hard to anticipate where things are headed.

We’re disrupting the taxi business overnight, the hotel business, the food business. Things are happening at lightning speed. I don’t know if we have a good sense of where it’s all headed. Massive disruption across all domains, across work, play, healthcare, every sort of part of our lives.

It’s clear that – I can say this – ten years from now won’t be the same as today. I think we’ve yet to see the full potential of smart phones – I think they are probably the most central part of this ongoing transformation.

I think we’re going to connect many more things than we’re saying right now. I don’t know what the number will be: I hear five billion, twenty billion in the next five years. It’s going to be more than that. It’s going to become really easy to connect. We’ll stick a little communication device on anything. Whether it’s your key, your wallet, your shoes: everything’s going to be connected.

Palo Alto and the IoT Data Analytics & Visualization event look like a great matchup. What are you looking forward to about taking part?

It’s clearly a developing area and so this is the time when you want to be acquiring knowledge, networking with some of the big thinkers and innovators in the space. I’m pleased to be part of it from that perspective. Also from the perspective of my own personal learning and the ability to network with great people and add to the body of knowledge that’s developing. I’m going to be kicking it off as the CIO for the city.

Containers aren’t new, but ecosystem growth has driven development

Containers are getting a fair bit of hype at the moment, and February 2016 will see the first ever dedicated container conference take place in Silicon Valley in the US. Here, Business Cloud News talks to Kyle Anderson, lead developer at Yelp, to learn about the company’s use of containers, and whether they will ultimately live up to all the hype.

Business Cloud News: “What special demands does Yelp’s business put on its internal computing?”

Kyle Anderson: “I wouldn’t say they are very special. In some sense our computing demands are boring. We need standard things like capacity, scaling, and speed. But boring doesn’t quite cut it, and if you can turn your boring compute needs into something that is a cut above the status quo, it can become a business advantage.”

BCN: “And what was the background to building your own container-based PaaS? What was the decision-making process there?”

KA: “Building our own container-based PaaS came from a vision that things could be better if they were in containers and could be scheduled on-demand.

“Ideas started bubbling internally until we decided to “just build it” with manager support. We knew that containers were going to be the future, not VMs. At the same time, we evaluated what was out there and wrote down what it was that we wanted in a PaaS, and saw the gap. The decision-making process there was just internal to the team, as most engineers at Yelp are trusted to make their own technical decisions.”

BCN: “How did you come to make the decision to open-source it?”

KA: “Many engineers have the desire to open-source things, often simply because they are proud of their work and want to share it with their peers.

“At the same time, management likes open-source because it increases brand awareness and serves as a recruiting tool. It was a natural progression for us. I tried to emphasise that it needed to work for Yelp first, and after one and a half years in production, we were confident that it was a good time to announce it.”

BCN: “There’s a lot of hype around containers, with some even suggesting this could be the biggest change in computing since client-server architecture. Where do you stand on its wider significance?”

KA: “Saying it’s the biggest change in computing since client-server architecture is very exaggerated. I am very anti-hype. Containers are not new, they just have enough ecosystem built up around them now, to the point where they become a viable option for the community at large.”

Container World is taking place on 16 – 18 February 2016 at the Santa Clara Convention Center, CA, USA.