All posts by Thomas Campbell

IBM: “The level of innovation is being accelerated”

Dr. Angel Diaz joined the research division of IBM in the late nineties, where he helped co-author many of the web standards we enjoy today. Nowadays, he’s responsible for all of IBM’s cloud and mobile technology, as well as the architecture for its ambient cloud. Here, ahead of his appearance at Container World (February 16 – 18, Santa Clara Convention Center, CA) later this month, BCN caught up with him to find out more about the tech giant’s evolving cloud strategy.

BCN: How would you compare your early days at IBM, working with the likes of Tim Berners-Lee, with the present?

Dr. Angel Diaz: Back then, the industry was focused on developing web standards for a very academic purpose, in particular the sharing of technical information. IBM had a strategy around accelerating adoption and increasing skill. This resulted in a democratization of technology, by getting developers to work together in open source and standards. If you fast forward to where we are now with cloud, mobile, data, analytics and cognitive, you see a clear evolution of open source.

The aperture of open source development and ecosystems has grown to include users and is now grounded on solid open governance and meritocracy models. What we have built is an open cloud architecture, starting with an open IaaS based on OpenStack, an open PaaS with Cloud Foundry and an open container model with the Open Container Initiative and Cloud Native Computing Foundation. When you combine an open cloud architecture with open APIs defined by the Open API Initiative, applications break free. I have always said that no application is an island – these technologies make it so.

What’s the ongoing strategy at IBM, and where do containers come into it?

It’s very much hybrid cloud. We’ve been leveraging containers to help deliver hybrid applications and accelerate development through DevOps, so that people can transform and improve their business processes. This is very similar to what we did in the early days of the web – better business processes mean better business. At the end of the day – the individual benefits. Applications can be tailored to the way we like to work, and the way that we like to behave.

A lot of people in the container space say: wow, containers have been around a long time, why are we all interested in this now? Well, it’s gotten easier to use, open communities have rallied around it, and it provides a very nice way of marrying the concepts of operations and service-oriented architecture, which the industry missed in the 2000s.

What does all this innovation ultimately mean for the ‘real world’?

It’s not an exact analogy, but if we remember the impact of HTML and JavaScript – they allowed almost anyone to become a webmaster. That led to the Internet explosion. If you look at where we are now, at what we’re doing with cloud: that stack of books you need to go and buy has been reduced; the concept count of things you need to know to develop an application, the level of sophistication of what you need to know in order to build an application, scale an application, secure an application, is being reduced.

So what does that do? It increases participation in the business process, in what you end up delivering. Whether it’s human facing or whether it’s an internal business process, it reduces that friction and it allows you to move faster. What’s starting to happen is the level of innovation is being accelerated.

And how do containers fit into this process? 

Previously there was this strict line: you develop software and then operate it and make tweaks, but you never really fundamentally changed the architecture of the application. Because of the ability to quickly stand up containers, to quickly iterate, etc., people are changing their architectures because of operations and getting better operations because of it. That’s where the microservices notion comes in.

And you’ll be talking at Container World. What message are you bringing to the event?

My goal is to help people take a step back and understand the moment we’re in, because sometimes we all forget that. Whether you’re struggling with security in a Linux kernel or trying to define a microservice, you can forget what it is you’re trying to accomplish.

We are in a very special moment: the digital disruption that’s occurring, and the container technology we’re building here, allow much quicker iteration on the business process. That’s one dimension. The second is what IBM is doing, not just in our own implementation of containers but in the open source world, to help democratize the technology, so that the level of skill and the number of people who build on this grow.

AWS: “We view open source as a companion”

In one of the last installments of our series marking the upcoming Container World (February 16 – 18, Santa Clara Convention Center, CA, USA), BCN talks to Deepak Singh, General Manager of Amazon EC2 Container Service, AWS.

Business Cloud News: First of all – how much of the container hype is justified would you say?

Deepak Singh: Over the last 2-3 years, starting with the launch of Docker in March 2013, we have seen a number of AWS customers adopt containers for their applications. While many customers are still early in their journey, we have seen AWS customers such as Linden Labs, Remind, Yelp, Segment, and Gilt Group all adopt Docker for production applications. In particular, we are seeing enterprise customers actively investigating Docker as they start re-architecting their applications to be less monolithic.

How is the evolution of containers influencing the cloud ecosystem?

Containers are helping people move faster towards architectures that are ideal for the AWS cloud. For example, one of the common patterns we have seen with customers using Docker is to adopt a microservices architecture. This is especially true for our enterprise customers who see Docker as a way to bring more applications onto AWS.

What opportunities does this open up to AWS?

For us, it all comes down to customer choice. When our customers ask us for a capability, we listen. They come to us because they want something done the Amazon way: easy to use, easy to scale, lower cost, and where they don’t have to worry about the infrastructure running behind it.

As mentioned, many of our customers are adopting containers and they expect AWS to support them. Over the past few years we have launched a number of services and features to make it easier for customers to run Docker-based applications. These include Docker support in AWS Elastic Beanstalk and the Amazon EC2 Container Service (ECS). We also have a variety of certified partners that support Docker and AWS and integrate with various AWS services, including ECS.
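To give a rough idea of what that looks like in practice (this is an illustrative sketch, not AWS documentation), running a Docker image as an ECS task via the boto3 SDK takes only a couple of calls; the region, cluster name, task family and image below are placeholders, not anything AWS prescribes.

```python
import boto3

# Illustrative sketch: register a Docker image as an ECS task definition and run it.
ecs = boto3.client("ecs", region_name="us-east-1")

ecs.register_task_definition(
    family="web-demo",
    containerDefinitions=[{
        "name": "web",
        "image": "nginx:latest",   # any Docker image pushed to a registry
        "memory": 256,
        "cpu": 128,
        "portMappings": [{"containerPort": 80, "hostPort": 80}],
    }],
)

# Start one copy of the task on an existing cluster of container instances.
ecs.run_task(cluster="demo-cluster", taskDefinition="web-demo", count=1)
```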

What does the phenomenon of open source mean to AWS? Is it a threat or a friend?

We view open source as a companion to AWS’s business model. We use open source and have built most AWS services on top of open source technology. AWS supports a number of open source applications, either directly or through partners. Examples of open source solutions available as AWS services include Amazon RDS (which supports MySQL, Postgres, and MariaDB), Amazon Elastic MapReduce (EMR), and Amazon EC2 Container Service (ECS). We are also an active member of the open source community. The Amazon ECS agent is available under an Apache 2.0 license, and we accept pull requests and allow our customers to fork our agent as well. AWS contributes code to Docker (e.g. the CloudWatch Logs driver), and was a founding member of the Open Container Initiative, which is a community effort to develop specifications for container runtimes.

As we see customers asking for services based on various open source technologies, we’ll keep adding those services.

You’ll be appearing at Container World this February. What do you think the biggest discussions will be about?

We expect customers will be interested in learning how they can run container-based applications in production, what the most popular use cases are, and what the latest innovations in this space look like.

Betting on the cloud

A long-time expert on enterprise IT and cloud platforms, Dan Scholnick (General Partner, Trinity Ventures) has the distinction of having been Docker’s first venture investor. BCN spoke to him to find out the secrets to being a top-level IT investor.

Know your stuff: Scholnick has a technical background, with a computer science degree from Dartmouth College. After this he worked at Wily Technology with the legendary Lew Cirne, who went on to be the founder and CEO of New Relic. At Wily, Scholnick built the first version of the company’s application performance management product.

All this gave Scholnick a natural appreciation for products and technologies that get used in the data centre as core infrastructure. It was partly this understanding that alerted him to the potential significance of Docker’s predecessor, dotCloud.

Know how to spot talent: The other factor was that he could recognise dotCloud founder Solomon Hykes as a technology visionary. “He had a better understanding and view of how infrastructure technology was changing than almost anyone we had met,” says Scholnick.

Of course, dotCloud didn’t turn out as expected. “It turns out we were wrong about PaaS, but we were right about the containers. Fortunately for all of us involved in the company, that container bet ended up working out.”

Know when the future is staring you in the face: When Scholnick invested in dotCloud, containers had been around for quite a long time. But they were very difficult to use. “What we learned through the dotCloud experience was how to make containers consumable. To make them easier to consume, easier to use, easier to manage, easier to operate. That’s really what Docker is all about, taking this technology that has actually been around, is great technology conceptually but has historically been very hard to use, and make it usable.”

The rest is IT history. Arguably no infrastructure technology in history has ever taken off and gained mass adoption as quickly as Docker.

“To me, the thing that’s really stunning is to see the breadth and depth of Docker usage throughout the ecosystem,” says Scholnick. “It’s truly remarkable.”

Know what’s next: When BCN asked Scholnick what he thought the next big thing would be in the cloud native movement, he points to an offshoot of Docker and containers: microservices. “I think we’re going to see massive adoption of microservices in the next 3-5 years and we’re likely going to see some big companies built around the microservices ecosystem,” he says. “Docker certainly has a role to play in this new market: Docker is really what’s enabling it.”

Keeping in touch with real-world uses of containers is one of the reasons Scholnick will be attending and speaking at Container World (February 16 – 18, 2016, Santa Clara Convention Center).

“As a board member at Docker and as an investor in the ecosystem, it’s always good to hear the anecdotal information about how people are using Docker – as well as what pieces they feel are missing that would help them use containers more effectively. That’s interesting to me because it points to problems that are opportunities for Docker to solve, or opportunities for new start-ups that we can fund.”

Click here to download the Container World programme

Cloud academy: Rudy Rigot and his new Holberton School

Business Cloud News talks to Container World (February 16 – 18, 2016, Santa Clara Convention Center, USA) keynote Rudy Rigot about his new software college, which opens today.

Business Cloud News: Rudy, first of all – can you introduce yourself and tell us about your new Holberton School?

Rudy Rigot: Sure! I’ve been working in tech for the past 10 years, mostly in web-related stuff. Lately, I’ve worked at Apple as a full-stack software engineer for their localization department, which I left this year to found Holberton School.

Holberton School is a 2-year community-driven and project-oriented school, training software engineers for the real world. No classes, just real-world hands-on projects designed to optimize their learning, in close contact with volunteer mentors who all work for small companies or large ones like Google, Facebook, Apple, … One of the other two co-founders is Julien Barbier, formerly the Head of Community, Marketing and Growth at Docker.

Our first batch of students started last week!

What are some of the challenges you’ve had to anticipate?

Since we’re a project-oriented school, students are mostly graded on the code they turn in, which they push to GitHub. Some of this code is graded automatically, so we needed to be able to run each student’s code (or each team’s code) automatically in a fair and equal way.

We needed to get information on the “what” (what is returned in the console), but also on the “how”: how long does the code take to run? How many resources are being consumed? What is the return code? Also, since Holberton students are trained on a wide variety of languages, how do you ensure you can grade a Ruby project, and later a C project, and later a JavaScript project, etc. with the same host while minimizing issues?

Finally, we had to make sure that a student can commit code that is as malicious as they want: we can’t have a human check it before running it, and it should only break their own program, not the whole host.

So how on earth do you negotiate all these?

Our project-oriented training concept is new in the United States, but it’s been successful for decades in Europe, and we knew that the European schools, who built their programs before containers became mainstream, typically run the code directly on a host system that has all of the software they need installed, and then simply run a chroot before running the student’s code. This didn’t solve all of the problems, while containers did in a very elegant way; so we took the container road!

HolbertonCloud is the solution we built to that end. It fetches a student’s code on command, then runs it based on a Dockerfile and a series of tests, and finally returns information about how that went. The information is then used to compute a score.

What’s amazing about it is that by using Docker, building the infrastructure has been trivial; the hard part has been about writing the tests, the scoring algorithm … basically the things that we actively want to be focused on!
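As a rough illustration of the pattern Rigot describes (not HolbertonCloud’s actual code), a grader can run each submission inside a throwaway container and capture both the “what” and the “how”; the image name, resource limits and timeout below are assumptions.

```python
import subprocess
import time

def grade_submission(image, command, timeout=30):
    """Run a student's code in a disposable container and report how it went.

    `image` is assumed to have been built beforehand from the project's
    Dockerfile; the limits and timeout here are illustrative.
    """
    start = time.time()
    try:
        proc = subprocess.run(
            ["docker", "run", "--rm",
             "--memory", "128m",      # cap memory so hostile code only hurts itself
             "--network", "none",     # no network access while grading
             image, *command],
            capture_output=True, text=True, timeout=timeout,
        )
        return {
            "stdout": proc.stdout,                 # the "what": console output
            "return_code": proc.returncode,        # the "how": exit status
            "seconds": time.time() - start,        # the "how": wall-clock time
        }
    except subprocess.TimeoutExpired:
        return {"stdout": "", "return_code": None, "seconds": timeout}

# e.g. grade_submission("holberton/c-project:student42", ["./run_tests.sh"])
```

Because each run happens in a container that is destroyed afterwards, a malicious submission can at worst break its own sandbox rather than the grading host, which is exactly the guarantee described above.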

So you’ve made use of containers. How much disruption do you expect their development to engender over the coming years?

Since I’m personally more on the “dev” end of DevOps, I see how striking it is that containers restore focus on actual development for my peers. So, I’m mostly excited by the innovation that software engineers will be focusing on instead of focusing on the issues that containers are taking care of for them.

Of course, it will be very hard to measure which of those innovations were able to exist because containers are involved; but that also makes them innovations touching virtually every corner of the tech industry, so that’s really exciting!

What effect do you think containers are going to have on the delivery of enterprise IT?

I think one takeaway from the very specific HolbertonCloud use case is that cases where code can be run trivially in production are getting rare, and one needs guarantees that only containers can bring efficiently.

Also, a lot of modern architectures fulfil their needs with systems made of more and more microservices, since we now have enough hindsight to see the positive outcomes for resilience. Each microservice may have different requirements and therefore be best built with a different technology, so managing a growing set of different software configurations is becoming increasingly relevant. Considering the positive outcomes, this trend will only keep growing, making the need for containers keep growing as well.

You’re delivering a keynote at Container World. What’s the main motivation for attending?

I’m tremendously excited by the stellar line-up! We’re all going to get amazing insight from many different and relevant perspectives, which is going to be very enlightening!

The very existence of Container World is exciting too: it’s crazy how far containers have come over the span of just a few years.

Click here to learn more about Container World (February 16 – 18, 2016, Santa Clara Convention Center, USA)

IoT comes to the CES: opening up the home to the sharing economy

One of the most intriguing corners of this year’s CES is the dedicated SuperSession on ‘IoT Business Strategies: Partnerships for the Sharing Economy’. After all, while almost anyone in Las Vegas this January will be able to tell you that IoT will (surely) have a huge part to play in the future of consumer tech, very few people will be able to tell you exactly how.

The main current consumer thrust in IoT, for example, remains home automation, and specifically security. Yet there is often little that is really inspiring about this proposed application of IoT. In part, this is because it arguably fails to offer us anything really new. A secure home is a secure home is a secure home, however this is achieved, and if home automation currently offers greater control and security, it does so at significantly more expense.

Much more interesting is the USP of home automation innovator August. Co-founder and CEO Jason Johnson, who’ll be appearing at the SuperSession panel in Vegas next month, took the time to tell us precisely what distinguishes August’s approach to home automation from all the other contending companies.

“We definitely make products that fall under the security category,” he says. “But we have a kind of unique philosophy. We’ve looked at the front door, and asked, if you can give control over that part of the home in a new way, what could that do for consumers?”

August concluded that the answer to this question lay in the combination of home automation with the booming sharing economy in particular and e-commerce in general – both of which an automated front door could make much more seamless and better integrated into users’ lives.

“Traditionally the lock on our doors has been designed to keep people out. We have a philosophy that, if we can make a really good access system, a different kind of lock and security for the front door, it could be used not just to keep people out but to let them in – a kind of different paradigm to what a lock is. Our vision is, if we do that really well, then when I get home from work tonight, my dog will have been walked, my groceries delivered, my UPS packages delivered, my house cleaned – maybe there’s fresh flowers on my dining room table, my dry cleaning has been delivered and it’s hanging in my closet, my dirty clothes have been taken away.”

The idea behind August is that, for all of those service providers requiring access to the home to deliver (a requirement presently resulting in a chaos of keys, calls and clashing schedules), instant, temporary access could be granted the second the arrangement is made. Johnson offers an example from personal experience.

“I have a vacation rental home up in Napa, this little tiny shack,” he says. “I made it available on Airbnb and right away I had to deal with the keys. So, first, we had to hide a key somewhere on the property. Later, of course, I started issuing keys from within the August app. And you can do that. You go to the app, you type in the person’s name, their phone number, the days, the hours they have access and I issue the keys from the app and they show up and can get access to the house.”

However, the experience became that much more seamless (and therefore satisfying) following a software integration between the two services. “Now, literally as I’m talking to you, someone could be doing an Airbnb booking for my place: and the August app will automatically provision a temporary key to that guest. I’ve done nothing.”
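To sketch what that kind of integration might look like (the endpoint, lock id and field names below are hypothetical stand-ins, not August’s or Airbnb’s real APIs), a booking webhook could provision a time-limited key automatically:

```python
from flask import Flask, request
import requests

app = Flask(__name__)

# Hypothetical endpoint and identifiers, for illustration only.
LOCK_API = "https://api.example-smartlock.com/v1"
LOCK_ID = "front-door-napa"

@app.route("/booking-webhook", methods=["POST"])
def booking_created():
    booking = request.get_json()
    # Issue a temporary key valid only for the guest's stay.
    requests.post(
        f"{LOCK_API}/locks/{LOCK_ID}/keys",
        json={
            "guest_name": booking["guest_name"],
            "phone": booking["guest_phone"],
            "valid_from": booking["check_in"],
            "valid_until": booking["check_out"],
        },
        timeout=10,
    )
    return "", 204
```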

The opportunity for such a provision to facilitate e-commerce per se is striking.

“One of the things that causes us most to think twice about ordering something online is the challenge of, ‘how am I going to physically get that?’ At our office, we have a lot of employees that get packages delivered here, and they stack up, and then they’ve got to haul the packages home on the bus, or they ride a bicycle and have to haul the packages home on their bikes. So people think twice about ordering things online! Nobody wants to come home and have that little sticker on the wall saying missed delivery.”

You could be forgiven for thinking that, indeed, home automation and the internet of services look made for one another. Indeed, technologies often seem to complement one another. It is presumably this manner of symbiosis that will allow IoT to flourish in the years to come, to offer consumers new experiences. Objects will not merely be connected and that’s it – rather, through that connectivity, new combinations and opportunities come to light.

There will be few more explicit examples of this approach on display at this year’s CES than at the ‘IoT Business Strategies’ SuperSession. Attendance is certainly a key part in August’s plans for 2016.

“The idea of a smart lock and a smart video doorbell is still a new concept. The challenge for us in 2016 – and starting at CES – is to move into the mainstream. How do you get, not just early tech adopters, but mainstream consumers to embrace these technologies and put them in our homes? That’s what we need to do over the course of 2016.”

Click here for more information about the ‘IoT Business Strategies: Partnerships for the Sharing Economy’ SuperSession at CES, Las Vegas, January 7, 2016

How Silicon Valley is disrupting space

We tend to think of the Space Industry as quintessentially cutting edge. As such it feels awfully strange to hear somebody compare it to the pre-Uber taxi industry – nowadays the definition of an ecosystem ripe for seismic technological disruption.

Yet comparing the two is exactly what Sean Casey (Founder and Managing Director of the Silicon Valley Space Centre) is doing, during a phone conversation ahead of his appearance at February’s IoT Data Analytics & Visualization event in Palo Alto.

“With all Silicon Valley things there’s kind of a standard formula that involves disruptive technologies and large markets. Uber’s that way. Airbnb is the same,” says Casey. “Space is dominated by a bunch of large companies, making big profits from the government and not really interested in disrupting their business. The way they’re launching their rockets today is the same way they’ve been doing it over the last forty years. The reliability has increased, but the price hasn’t come down.”

Nowadays, however, a satellite needn’t cost hundreds of millions of dollars. On the contrary, costs have even come down to as little as $150,000. Talk about economising! “Rather than spending hundreds of millions of dollars on individual satellites, we can fly hundreds of satellites at a greatly reduced cost and mitigate the risk of a single failure,” Casey says. In addition, he explains that these satellites have tremendous imaging and communications capabilities – technology leveraged from a very everyday source. “The amount of processing power that you can fly in a very small satellite comes from a tremendous processing power that we all have in our cell phones.”

Entrepreneur Elon Musk was one of the first to look at this scenario, founding SpaceX. “Maybe he was bringing some new technology to the table,” says Casey, “but he’s basically just restructured his business to make launch costs cheaper.”

However, due perhaps in part to the historical proximity of the US government and the Space Industry, regulatory opposition to newcomers has been particularly strident. It is a fact that clearly irritates Casey.

“Elon Musk has had to fight regulatory obstructions put up by people in Washington that said we want to keep you out of the business – I mean, how un-American is that? We’re supposed to be a capitalist country that embraces new opportunity and change. Get a grip! That stuff is temporary, it’s not long term. The satellite industry is often reluctant to fly new technologies because they don’t think they can sell that approach to their government customers.”

Lower prices, meanwhile, open the door to new customers and new use cases – often moving hand-in-hand with developments in analytics. This brings us to perhaps the most interesting aspect of a very interesting discussion. There are, on the one hand, a number of immediately feasible use cases that come to Casey’s mind – analysing the flow of hospital visits to anticipate epidemics, for example, not to mention a host of economic usages, such as recording and analysing shipping, resources, harvests and more…

On the other hand, while these satellites will certainly offer clients a privileged vantage point from which to view and analyse the world (we don’t refer to the ‘bird’s eye view’ for nothing), precisely what discoveries and uses will be discovered up there in the coming years remains vague – albeit in a tantalising sort of way.

“It’s one of those things that, if you’ve never looked at it, if you’ve never had that data before, you kind of don’t know what you’re going to find. After this is all played out, you’ll see that this was either a really big step forward or it was kind of a bust and really didn’t amount to anything. It’s sort of like asking the guys at Twitter to show that their company’s going to be as big as it became after they’d done their Series A financing – because that’s where these satellite companies are. Most of them are Series A, some of them are Series B – SpaceX is a lot further on.”

One thing that looks certain is that Silicon Valley is eyeing up space as its Final Frontier. From OneWeb and O3b founder Greg Wyler’s aspiration to connect the planet, to Google’s acquisition of Skybox and Monsanto’s acquisition of Climate Corp – plus a growing number of smaller investments in space-focussed start-ups, not to mention the aforementioned SpaceX and Amazon’s more overt investment in rocket science – capitalism is coming to the cosmos.

Sean Casey will be appearing at IoT Data Analytics & Visualization (February 9 – 11, 2016, Crowne Plaza Palo Alto). Click here to register.

The IoT in Palo Alto: connecting America’s digital city

Palo Alto is not your average city. Established by the founder of Stanford University, it was the soil from which Google, Facebook, Pinterest and PayPal (to name a few) have sprung forth. Indeed, Palo Alto has probably done more to transform human life in the last quarter century than any other city. So, when we think of how the Internet of Things is going to affect life in the coming decades, we can be reasonably sure where much of the expected disruption will originate.

All of which makes Palo Alto a great place to host the first IoT Data Analytics & Visualization event (February 9 – 11, 2016). Additionally fitting: the event is set to be kicked off by Dr. Jonathan Reichental, the city’s Chief Information Officer. Reichental is the man entrusted with the hefty task of ensuring the city is as digital, smart and technologically up-to-date as a place called home by the likes of Steve Jobs, Mark Zuckerberg, Larry Page and Sergey Brin should be.

Thus far, Reichental’s tenure has been a great success. In 2013, Palo Alto was credited with being the number one digital city in the US, and has made the top five year upon year – in fact, it so happens that, following our long and intriguing telephone interview, Reichental is looking forward to a small celebration to mark its latest nationwide ranking.

BCN: Jonathan, you’ve been Palo Alto’s CIO now for four years. What’s changed most during that time span?

Dr Jonathan Reichental: I think the first new area of substance would be open government. I recognise open government’s been a phenomenon for some time, but over the course of the last four years, it has become a mainstream topic that city and government data should be easily available to the people. That it should be machine readable, and that an API should be made available to anyone that wants the data. That we have a richer democracy by being open and available.

We’re still at the beginning however. I have heard that there are approximately 90,000 public agencies in the US alone. And every day and week I hear about a new federal agency or state or city of significance saying, ‘you can now go to our data portal and access freely the data of the city or the public agency.’ The shift is happening but it’s got some way to go.

Has this been a purely technical shift, or have attitudes had to evolve as well?

I think if you kind of look at something like cloud, cloud computing and cloud as a capability for government – back when I started ‘cloud’ was a dirty word. Many government leaders and government technology leaders just weren’t open to the option of putting major systems off-premise. That has begun to shift quite positively.

I was one of the first to say that cloud computing is a gift to government. Cloud eliminates the need to have all the maintenance that goes with keeping systems current and keeping them backed up and having disaster recovery. I’ve been a very strong proponent of that.

Then there’s social media – government has fully embraced that now, having been reluctant early on. Mobile is beginning to emerge though it’s still very nascent. Here in Palo Alto we’re trying to make all services that make sense accessible via smart phone. I call it ‘city in a box.’ Basically, by bringing up an app on the smart phone you should be able to interact with government – get a pet license, pay a parking fee, pay your electrical bill: everything should really be right there on the smartphone, and you shouldn’t need to go to City Hall for many things any more.

The last thing I’d say is there has been an uptake in community participation in government. Part of it is that it’s more accessible today, and part of it is that there are more ways to do so, but I think we’re also beginning to see the fruits of the millennial generation – the democratic shift in people wanting to have more of a voice and a say in their communities. We’re seeing much more of what is traditionally called civic engagement. But ‘much more’ is still not a lot. We need to have a revolution in this space for there to be significant change to the way cities operate and communities are effective.

Palo Alto is hosting the IoT Data Analytics & Visualization event in February. How have you innovated in this area as a city?

One of the things we did with data is make it easily available. Now we’re seeing a community of people in the city and beyond, building solutions for communities. One example of that is a product called Civic Insight. This app consumes the permit data we make available and enables users to type in an address and find out what’s going on in their neighbourhood with regard to construction and related matters.

That’s a clear example of where we didn’t build the thing, we just made the data available and someone else built it. There’s an economic benefit to this. It creates jobs and innovation – we’ve seen that time and time again. We saw a company build a business around Palo Alto releasing our budget information. Today they are called OpenGov, and they sell the solution to over 500 cities in America, making it easy for communities to understand where their taxpayer dollars are being spent. That was born and created in Palo Alto because of what we did making our data available.
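The pattern behind an app like Civic Insight is simple enough to sketch: fetch the city’s published permit data and filter it by address. The URL and field names below are hypothetical stand-ins for whatever the real portal exposes.

```python
import requests

# Hypothetical open-data endpoint and schema, for illustration only.
PERMITS_URL = "https://data.example-paloalto.gov/api/permits.json"

def permits_near(address):
    """Return recent building permits filed for a given address."""
    resp = requests.get(PERMITS_URL, params={"address": address}, timeout=10)
    resp.raise_for_status()
    return [
        {"type": p["permit_type"], "status": p["status"], "filed": p["filed_date"]}
        for p in resp.json()
    ]

# print(permits_near("250 Hamilton Ave"))
```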

Now we get to today, and the Internet of Things. We’re still – like a lot of folks, especially in the government context – defining this. It can be as broad or as narrow as you want. There’s definitely a recognition that when infrastructure systems can begin to share data with each other, we can get better outcomes.

The Internet of Things is obviously quite an elastic concept, but are there areas you can point to where the IoT is already very much a reality in Palo Alto?

The clearest example I can give of that today is our traffic signal system here in the city. A year-and-a-half ago, we had a completely analogue system, not connected to anything other than a central computer, which would have created a schedule for the traffic signals. Today, we have a completely IP based traffic system, which means it’s basically a data network. So we have enormous new capability.

For example, we can have schedules that are very dynamic. When schools are being let out, traffic signals are one way; at night they can be another way; you can have very granular information. Next you can start to have traffic signals communicate with each other. If there is a long strip of road and five traffic signals down there is some congestion, all the other traffic signals can dynamically change to try and make the flow better.

It goes even further than this. Now we can start to take that data – recording, for example, the frequency and volume of vehicles, as well as weather, and other ambient characteristics of the environment – and we can start to send this to the car companies. Here at Palo Alto, almost every car company has their innovation lab. Whether it’s Ford, General Motors, Volkswagen, BMW, Google (who are getting into the car business now) – they’re all here and they all want our data. They’re like: ‘this is interesting, give us an API, we’ll consume it into our data centres and then we’ll push into cars so maybe they can make better decisions.’

You have the Internet of Things, you’ve got traffic signals, cloud analytics solutions, APIs, and cars as computers and processors. We’re starting to connect all these related items in a way we’ve never done before. We’re going to follow the results.
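To make that data flow concrete, a car maker’s backend might simply poll a city feed of the kind Reichental describes and fold the readings into its own systems. The URL and record fields below are hypothetical, purely illustrative of the shape such an API could take.

```python
import requests

# Hypothetical feed URL and record shape, for illustration only.
FEED_URL = "https://traffic.example-paloalto.gov/api/signals"

def poll_signals():
    """Yield one normalised record per traffic signal in the city feed."""
    for signal in requests.get(FEED_URL, timeout=10).json():
        yield {
            "intersection": signal["id"],
            "vehicles_per_minute": signal["volume"],
            "phase": signal["phase"],            # current light state
            "observed_at": signal["timestamp"],
        }

# for record in poll_signals():
#     print(record)
```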

What’s the overriding ambition would you say?

We’re on this journey to create a smart city vision. We don’t really have one today. It’s not a product or a service, it’s a framework. And within that framework we will have a series of initiatives that focus on things that are important to us. Transportation is really important to us here in Palo Alto. Energy and resources are really important: we’re going to start to put sensors on important flows of water so we can see the amount of consumption at certain times but also be really smart about leak detection, potentially using little sensors connected to pipes throughout the city. We’re also really focused on the environment. We have a chief sustainability officer who is putting together a multi-decade strategy around what PA needs to do to be part of the solution around climate change.

That’s also going to be a lot about sensors, about collecting data, about informing people and creating positive behaviours. Public safety is another key area. Being able to respond intelligently to crimes, terrorism or natural disasters. A series of sensors again sending information back to some sort of decision system that can help both people and machines make decisions around certain types of behaviours.
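As an illustration of the leak-detection idea mentioned above (not the city’s actual system), a first cut could compare each pipe sensor’s overnight flow, when legitimate use is lowest, against its own recent baseline; the thresholds here are assumptions.

```python
from statistics import mean

def flag_possible_leaks(last_night, history, factor=3.0, floor=5.0):
    """last_night: {sensor_id: litres/hour observed overnight}
    history: {sensor_id: [litres/hour on previous nights]}
    Flags sensors whose overnight flow never drops back to its usual baseline."""
    leaks = []
    for sensor_id, flow in last_night.items():
        baseline = mean(history.get(sensor_id) or [0.0])
        if flow > max(baseline * factor, floor):   # floor avoids flagging noise
            leaks.append(sensor_id)
    return leaks

# flag_possible_leaks({"pipe-42": 120.0}, {"pipe-42": [4.0, 6.0, 5.0]})  -> ["pipe-42"]
```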

How do you expect this whole IoT ecosystem to develop over the next decade?

Bill Gates has a really good saying on this: “We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”  It’s something that’s informed me in my thinking. I think things are going to move faster and in more surprising ways in the next ten years for sure: to the extent that it’s very hard to anticipate where things are headed.

We’re disrupting the taxi business overnight, the hotel business, the food business. Things are happening at lightning speed. I don’t know if we have a good sense of where it’s all headed. Massive disruption across all domains, across work, play, healthcare, every sort of part of our lives.

It’s clear that – I can say this – ten years from now won’t be the same as today. I think we’ve yet to see the full potential of smart phones – I think they are probably the most central part of this ongoing transformation.

I think we’re going to connect many more things than we’re saying right now. I don’t know what the number will be: I hear five billion, twenty billion in the next five years. It’s going to be more than that. It’s going to become really easy to connect. We’ll stick a little communication device on anything. Whether it’s your key, your wallet, your shoes: everything’s going to be connected.

Palo Alto and the IoT Data Analytics & Visualization event look like a great matchup. What are you looking forward to about taking part?

It’s clearly a developing area and so this is the time when you want to be acquiring knowledge, networking with some of the big thinkers and innovators in the space. I’m pleased to be part of it from that perspective. Also from the perspective of my own personal learning and the ability to network with great people and add to the body of knowledge that’s developing. I’m going to be kicking it off as the CIO for the city.

BT discusses its interests in South East Asia

Nothing better reflects the way the Cloud is changing the traditional parameters of telecom operators than BT’s presence at this morning’s Cloud South East Asia keynotes in Kuala Lumpur.

Thanaraj Kanagalingam, BT’s regional solutions director, joined the likes of SingTel, Telekom Malaysia and MDeC in presenting at the well-attended conference, where he set out BT’s strategic presence in the region, itself a microcosm of the UK’s most recognisable telco’s increasingly global strategy and reach.

In South East Asia specifically, BT has been active in the networked IT business for a number of years, via acquisitions of local players such as Frontline (a Singapore-based IT consulting and services company), which have driven it towards an ICT and telecommunications convergence play.

Subsequently BT has extended itself into the contact centre business – with very strong links to the airline industry – and is now moving into what it calls ‘the Cloud of Clouds’ – positioning itself through partnerships with existing Cloud service providers, and providing a new array of digital services to enterprise customers.

“It’s all about network connectivity,” explained Kanagalingam, answering questions after his keynote. “A lot of enterprises here want to go out to a global market, so when they establish themselves in China, in India, and so forth, they need to have that connectivity. BT provides this from a network perspective, from a telco perspective. We partner with local partners in each of these regions but at the same time leverage our traditional framework.”

A focal point of BT’s global appeal is security (a topic that has predictably dominated numerous discussions at Cloud South East Asia). Specifically, the telco looks to draw on its strengths to provide more secure connectivity to enterprise customers. “For us,” explains Kanagalingam, “service is encompassing hybrid intelligent network, world class leading security coupled with PAYG cloud computing solution.”

AWS: examine fine print in data transfer legislation

In a week that has seen the European Court of Justice rule the Safe Harbour agreement on data transfer invalid, the significance of data transfer legislation in South East Asia has been under discussion at Cloud South East Asia.

Answering audience questions following his Cloud South East Asia keynote this morning, Blair Layton, Head of Database Services for Amazon Web Services, argued that legislation restricting data transfer is not always as cast-iron as it appears.

Acknowledging that such legal concerns were indeed “very legitimate,” and that there were certainly countries with stringent legal provisions that formed an obvious barrier to the adoption of cloud services such as Amazon Web Services, Layton nonetheless stressed that it was always worth examining the relevant legislation “in more detail.”

“What we’ve found in some countries is that, even though the high-level statement might be that data has to reside in one country, what you find in the fine print is that it actually says, ‘if you inform users then it is fine to move the data,’” he told delegates. “Also, for sensitive data you think you may not be able to move – because of company controls, board level concerns etc. – we can have many discussions about that. For instance, if you just want to move data for back-up and recovery, you can encrypt that on the premise, maintain the keys on premise, and shift that into the cloud for storage.”
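A minimal sketch of the pattern Layton describes – encrypt on premise, keep the keys on premise, ship only ciphertext to cloud storage – assuming the Python cryptography library and boto3; the bucket and file names are illustrative.

```python
import boto3
from cryptography.fernet import Fernet

# Generate and keep the key on premise; it never leaves the building.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a backup locally, then ship only the ciphertext to cloud storage.
with open("backup-2016-02-01.sql", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

s3 = boto3.client("s3")
s3.put_object(
    Bucket="example-offsite-backups",
    Key="backup-2016-02-01.sql.enc",
    Body=ciphertext,
)

# Recovery reverses the steps: fetch the object, decrypt with the on-premise key.
# plaintext = fernet.decrypt(s3.get_object(Bucket=..., Key=...)["Body"].read())
```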

In the same session, Layton, when not extolling the impressive scope and effectiveness of Amazon Web Services in the South East Asian region and beyond, discussed other reasons for the arguable disparity between the evident regional interest in cloud services, and the actual uptake of them.

“There are different cultures in different countries, and they have different levels of interest in technology. For example, you’ll see that… people in Singapore are very conservative compared to the Taiwanese. In other countries their IT is not as mature and they’re not as willing to try new things, and that’s simply cultural.”

Advancing the cloud in South East Asia

Today is the first day of Cloud South East Asia in Kuala Lumpur, and the attendance alone testifies to the enthusiasm and curiosity around cloud development in the region in general and in Malaysia in particular.

One great authority on the topic is the chair of today’s event, Mike Mudd, MD at Asian Policy Partners LLC. Following keynotes from the likes of Amazon Web Services and the Asia Cloud Computing Association, Business Cloud News sat down with Mudd to discuss the significance of cloud computing standards in the region, something touched upon by a number of speakers.

BCN: Hi Mike. It was pointed out today that there is a slight disparity between the enthusiasm for the cloud in South East Asia, and the pace of actual adoption in the region. What would you say the big impediments are?

Michael Mudd: Well there’s the general one, which is what I’ve described as the ‘trusted cloud’. This encompasses two things. One is security, and the other is privacy. The other issue, however, is that only really half of the region here has adequate data protection rules. Some have them on the books but they’re either not enforced, they’re enforced laxly, or they are only applicable to the private sector, and not applicable to government. This is quite distinct from privacy laws in, say, Europe, where they go across all sectors.

In addition, in certain countries, they’re trying to say that you cannot send any personally identifiable information across borders. This is important when it comes to financial information: banks, insurance, stock exchange, this type of thing, as well as healthcare.

And are regional governments taking up the cloud in general?

Forward looking governments are. Singapore, Hong Kong to a certain degree – but there’s not an idea of a ‘cloud first’ policy yet. It’s still very much ‘hug my server, build my data centre’, etc.

From the point of view of the regulators, particularly the financial services, to do their job they’ve got to be able to audit. And one of the things they consider important to that is being able to physically enter premises if required. Certain jurisdictions want to see servers. If the data is in the cloud, then that too is an issue, and something that has to be addressed.

Do you think that the new Trans-Pacific Partnership could provide a way out of this impasse?

What has been drafted in the TPP, to my understanding (though we’ve still got to see the details), is wording which will enable, or should enable, cross-border data flows to work far more easily. Again, it was only signed two days ago so we don’t know the exact words. (Like all trade negotiations they’re done in confidence – people complain they’re done in secrecy, but all are done in the same way.)

Why is this so important?

From the point of view of cloud computing, this is new. Most trade agreements deal with traditional things. Agriculture being the first trading product, manufacturing the second, the third being services, but the fourth one is the new one: trading data, trading information, flowing across borders.

It actually goes right back to the very beginning. Information’s always been important for trade: being able to have a free flow of information. I’m not talking about security or government: that kind of thing is always sensitive and will always be treated separately as it should be, but commercial information is very important. It’s the reason your ATM card works here as well as in London. That’s a cross border data flow of information!

Standards are only just emerging. We obviously have technical standards – their objective is to enable interoperability between disparate machines. Those kinds of standards have been around a long time – they’re based on industry protocols etc. What are starting to come up now are management standards, standards coming out very specifically for cloud.