Category Archives: containers

What did BCN readers say last week?

Over the past week, we took the opportunity to gauge the opinion of the BCN readership on industry trends and issues through a number of polls. Here’s what we found out:

Will Microsoft’s lawsuit be successful? 58% say no

For the most part, Microsoft’s lawsuit has been kept out of the headlines. This is unlikely to indicate the whole episode is unimportant to the industry; more likely, the story has been overshadowed by the ongoing saga between Apple and the FBI.

In any case, Microsoft filed a lawsuit against the US government, citing the First and Fourth Amendments with regard to government agencies using secrecy orders to access its customers’ emails or records. From Microsoft’s perspective, the company should have the right to tell customers the government is accessing their data, except in exceptional circumstances. The government disagrees.

While the tech giant has taken it upon itself to fight the good fight alone, BCN readers are a bit more sceptical about the venture’s chances of success. Only 42% believe Microsoft’s lawsuit will be successful, though this is a question which is unlikely to be answered for a substantial period of time. Any decision will be appealed by the losing party, dragging out any rulings or changes in government practice.

When will containers hit the mainstream? 21% say right now

Containers are one of the hottest trends of 2016. We recently ran a buzzword-buster article discussing not only what containers actually are, but more importantly what their value to the enterprise is. Since then there have been numerous announcements focused on the technology, from Microsoft to Red Hat to Juniper, indicating containers are starting to gain some traction.

But how much of the press is a smoke-screen and how much is reality? In short, it’s looking quite positive.

Cloud took a healthy amount of time to be trusted and understood by the mainstream market, and maybe it is this longer adoption time which has accelerated containers as a technology. 21% of BCN readers said they are already using the technology in a meaningful way in their business, 50% believe they will be within the next one to two years, and only 29% said longer than three years.

Who is the best innovator in the cloud industry? 75% still say AWS

Last week AWS launched a host of new features at the AWS Chicago Summit, ranging from new security features, tools which simplify the movement of data around an organization’s cloud, and platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, to testing features and authentication services.

Although this is the first major update from AWS in some time, Google and Microsoft have been feverishly bolstering their offerings over the last six months, with new hires, new features and new acquisitions. Industry insiders have even told us at BCN that AWS could be seen to be sitting back too much, offering Google and Microsoft the opportunity to improve their own standing and make up ground on the number one player in the cloud space.

BCN readers do not agree, however. 75% believe AWS is still far and away the industry leader, 10% believe AWS, Google and Microsoft are all on par, while 15% believe innovation has faltered at AWS and the rest of the industry is catching up fast.

Is DevOps mainstream? 48% say no

DevOps is another of the buzzwords which has floated over from 2015 into 2016. However, as buzzwords go, few have captured the attention of the industry in the same manner. Such is the prominence of DevOps, it seems as though every company is now a DevOps specialist, DevOps expert or DevOps-oriented organization.

In fact, it isn’t only vendors who have adopted DevOps; pretty much every enterprise decision maker has DevOps on their lips too. The main concern here is that the definition of DevOps can be lost on certain organizations. Yes, there are practitioners of the methodology, though there are also a host of people who have taken up the phrase without fully understanding the implications, or the means to implement such an idea.

And it would appear BCN readers also agree with that assumption. Despite DevOps being one of the most used words in the cloud industry, only 52% of our readers believe DevOps has hit the mainstream market.

Juniper boosts security capabilities with two new product offerings

Juniper Networks has launched a number of new cloud and virtualised service offerings as part of its software-defined secure networks framework.

The new offerings include a containerised virtual firewall called cSRX and a multi-core version of the Juniper Networks vSRX. The company claims the new vSRX version is ten times faster than its nearest competitor and creates new possibilities for agile, flexible virtual firewalls, while cSRX is the industry’s first containerised firewall offering.

“As the security landscape continues to evolve, it is more important than ever to work together to combat cyber threats,” said Kevin Walker, Security CTO at Juniper Networks. “These key additions to our security portfolio will further our Software-Defined Secure Networks vision and greatly benefit our customers. Our products provide the best opportunity to create secure networks through policy, detection and enforcement. We are excited to be releasing the most flexible firewall solutions in the market and continue to showcase our commitment to bringing SDSN to organisations across the globe.”

Juniper believes the faster vSRX offering and the scalability of the containerised cSRX, combined with the higher density of services on the Intel Xeon processor family, will increase an organization’s capability to detect threats.

“Juniper Networks is delivering significant scale and total cost of ownership advantages to its customers with the new cSRX, which fundamentally changes how security is deployed and illustrates the power of Software-Defined Secure Networks to provide a holistic network protection paradigm,” said Mihir Maniar, VP of Security Product Management at Juniper Networks. “Moreover, with the addition of our 100 Gbps vSRX, our security portfolio is further advancing the industry’s highest performing virtual firewall.”

Red Hat bets on OpenStack, DevOps and Containers in new product offerings

Red Hat has announced the general availability of Red Hat Cloud Suite and Red Hat OpenStack Platform 8, leaning on wider DevOps trends within the industry.

The company claims the new offerings will help enterprise organizations bridge the gap between development and operations teams at the scale of cloud computing, and successfully implement a DevOps business model.

“Everyone is now aware that Uber doesn’t own cars and Facebook doesn’t generate its own content; this is nothing new, but it does highlight the digital disruption which is taking place in the industry,” said Radhesh Balakrishnan, General Manager, OpenStack at Red Hat. “These disruptions are impacting decisions on infrastructure within the organization, but also what kind of development methodology gets adopted. Customers are demanding an agile infrastructure and a DevOps model to ensure they can reduce time to market and accelerate innovation within their own organization.

“When you generally look at the CIO agenda, the need to be more responsive to business needs is a priority within almost every organization by default. Given that they are viewing DevOps as a means to facilitate the change in thinking and culture, DevOps is here now and it’s not a fad which the industry has grabbed onto.

“Even internally, we have been aggressive in embracing DevOps. Our oldest business is Enterprise Linux, and security updates are an area of key value to our customers. Heartbleed was a huge issue for our customers 12 months ago, and since we are following the DevOps methodology, we were not only able to provide a patch, but we also pushed out a tool which customers can use to see if they are now compliant. None of this would have been possible without DevOps, so we are seeing the benefits internally as well.”

Red Hat is currently pinning its ambitions on the growth of OpenStack and the belief it will become the operating system of choice for cloud infrastructure and the data centres of the future. The company backed the growth of Linux in a similar fashion, effectively riding that wave to $2 billion in annual sales, and is now placing the same bet on OpenStack and its adoption throughout the industry.

The launch is based on OpenStack Kilo, the release which came out last year, and combines Red Hat’s cloud, DevOps and container offerings in a single cloud suite, within a private cloud environment. Keeping with the theme of ‘openness’, the tools will also be available as individual products should customers want to work with other offerings as well.

Building on another industry trend, Red Hat has also prioritized containers as a technology for its service offering.

“Containers are probably the most attractive technology we at Red Hat have seen in years. Every large customer we have wants to have a conversation around containers,” said Balakrishnan. “We’re including OpenShift in the cloud suite, a service offering designed from the ground up on Docker (for container images) and Kubernetes (for orchestration). We are excited about the fact that we are one of the first in the industry to be bringing container technology to the mainstream.

“Containers are one of the biggest priority areas for us as a company, so much so that we include container technology in our Enterprise Linux offering. It’s pervasive both in our technology and in our customers’ minds.”

Microsoft enters the containers race

Microsoft has cashed in on one of the industry’s trending technologies, with the announcement of the general availability of the Azure Container Service.

The Microsoft container service, initially announced in September 2015 and released for public preview in February, is built on open source technology and offers a choice between the DC/OS and Docker Swarm orchestration engines.

“I’m excited to announce the general availability of the Azure Container Service; the simplest, most open and flexible way to run your container applications in the cloud,” said Ross Gardler, Senior Program Manager at Microsoft, on the company’s blog. “Organizations are already experimenting with container technology in an effort to understand what they mean for applications in the cloud and on-premises, and how to best use them for their specific development and IT operations scenarios.”

While the growth of container technology has been well documented in recent months, a number of industry commentators have been concerned about the understanding of the technology within enterprise organizations themselves. A recent survey from the Cloud & DevOps World event highlighted that 74% of respondents agreed with the statement “everyone has heard of containers, but no-one really understands what containers are.”

Aside from confusion surrounding the definition and use cases of containers, the Microsoft team believes the growth of the technology is being stunted by management and orchestration challenges. While the technology offers organizations numerous benefits, traditional means of managing such technologies have proven ineffective.

“Azure Container Service addresses these challenges by providing simplified configurations of proven open source container orchestration technology, optimized to run in the cloud,” said Gardler. “With just a few clicks you can deploy your container-based applications on a framework designed to help manage the complexity of containers deployed at scale, in production.”

Alongside the availability announcement, Microsoft has also joined a new open source DC/OS project, enabling customers to use Mesosphere’s Data Center Operating System to orchestrate their container projects. The project brings together the expertise of more than 50 partners to drive usability within the software-defined economy.

The Docker Swarm version ensures any Docker-compliant tooling can be used with the service. Azure Container Service provides a ‘Docker native’ solution using the same open source technologies as Docker’s Universal Control Plane, allowing customers to upgrade as and when required.

Head in the clouds: Four key trends affecting enterprises

Cloud is changing the way businesses function and has provided a new and improved level of flexibility and collaboration. Companies worldwide are realising the cloud’s capability to generate new business models and promote sustainable competitive advantage, and the impact is becoming very apparent: a Verizon report recently revealed that 69 per cent of businesses that have used cloud have put it to use to significantly reengineer one or more of their business processes. It’s easy to see why there’s still so much hype around cloud. We’ve heard so much about cloud computing over the last few years that you could be forgiven for thinking it is now universally adopted, but the reality is that we are still only scratching the surface: cloud is still very much in a period of growth and expansion.

Looking beyond the horizon

At present, the majority of corporate cloud adoption is around Infrastructure-as-a-Service (IaaS) and Software-as-a-Service (SaaS) offerings such as AWS, Azure and Office 365. These services offer cheap buy-in and a relatively painless implementation process, which remains separate from the rest of corporate IT. Industry analyst Gartner says IaaS spending is set to grow 38.4 per cent over the course of 2016, while worldwide SaaS spending is set to grow 20.3 per cent over the year, reaching $37.7 billion. However, the real promise of cloud is much more than IaaS, PaaS or SaaS: it is a transformative technology moving compute power and infrastructure between on-premise resources, private cloud and public cloud.
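As a quick sanity check on those Gartner numbers, the growth rate and the year-end total together imply the size of the prior-year SaaS market. This is a back-of-envelope sketch using only the figures quoted above, not Gartner’s own breakdown:

```python
# Implied 2015 SaaS base from the Gartner figures quoted above:
# 20.3% growth over the year, reaching $37.7bn by the end of 2016.
saas_2016_bn = 37.7
growth_rate = 0.203
implied_2015_bn = saas_2016_bn / (1 + growth_rate)
print(f"Implied 2015 worldwide SaaS spend: ~${implied_2015_bn:.1f}bn")
```

In other words, the quoted growth rate implies a worldwide SaaS market of roughly $31 billion in 2015.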

As enterprises come to realise the true potential of cloud, we’ll enter a period of great opportunity for enterprise IT, but there will be plenty of adoption-related matters to navigate. Here are four big areas enterprises will have to deal with as cloud continues to take the world by storm:

  1. Hybrid cloud will continue to dominate

Hybrid cloud will rocket up the agenda, as businesses and providers alike continue to realise that there is no one-size-fits-all approach to cloud adoption. Being able to mix and match public and private cloud services from a range of different providers enables businesses to build an environment that meets their unique needs more effectively. To date, this has been held back by interoperability challenges between cloud services, but a strong backing for open application programming interfaces (APIs) and multi-cloud orchestration platforms is making it far easier to integrate cloud services and on-premise workloads alike. As a result, we will continue to see hybrid cloud dominate the conversation.

  2. Emergence of iPaaS


The drive towards integration of on-premise applications and cloud is giving rise to Integration Platform as a Service (iPaaS). Cloud integration remains a daunting task for many organizations, but iPaaS is a cloud-based integration solution that is slowly and steadily gaining traction within enterprises. With iPaaS, users can develop integration flows that connect applications residing in the cloud or on premise, and deploy them without installing any hardware or software. Although iPaaS is relatively new to the market, categories of iPaaS vendors are beginning to emerge, including ecommerce/B2B integration and cloud integration. With integration challenges still a huge issue for enterprises using cloud, demand for iPaaS is only set to grow over the coming months.

  3. Containers will become reality

To date, a lot of noise has been made about the possibilities of container technology, but in reality its use has yet to fully kick off. That’s set to change as household-name public clouds such as Amazon, Microsoft and Google are now embracing containers; IBM’s Bluemix offering in particular is set to make waves with its triple-pronged Public, Dedicated and Local delivery model. With a wave of momentum building for application and OS technology manufacturers to ride, it will now become increasingly realistic for them to construct support services around container technology. This does present a threat to the traditional virtualisation approach, but over time a shift in hypervisors is on the cards and container technology can only improve from this point.

  4. Cloud will be used for Data Resiliency/Recovery services

With cloud storage prices coming down drastically and continuous improvements being made to cloud gateway platforms, the focus is set to shift to cloud-powered backup and disaster recovery services. We are in an age where everything is being offered ‘as a service’; the idea of cloud-powered on-demand usability suits backup and disaster recovery services very well because they do not affect the immediate production data. As such, this should be an area where cloud use will dramatically increase over the next year.

With all emerging technologies, it takes time to fully figure out what they actually mean for enterprises, and these four cloud trends reflect that. In reality, we’re only just getting started with cloud. Now that they understand how it works, the time has come for enterprises to turn the screw and begin driving even more benefit from it.

Written by Kalyan Kumar, Chief Technologist at HCL Technologies.

Will containers change the world of cloud?

The rise of containers as a technology has been glorious and confusing in equal measure. While touted by some as the saviour of developers, and by others as the end of VMs, the majority simply don’t understand containers as a concept or a technology.

In the simplest of terms, containers let you pack more computing workloads onto a single server. In theory, that means you can buy less hardware, build or rent less data centre space, and hire fewer people to manage that equipment.

“In the earlier years of computing, we had dedicated servers, which later evolved with virtualisation,” says Giri Fox, Director of Technical Services at Rackspace. “Containers are part of the next evolution of servers, and have gained large media and technologist attention. In essence, containers are the lightest way to define an application and to transport it between servers. They enable an application to be sliced into small elements and distributed on one or more servers, which in turn improves resource usage and can even reduce costs.”
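To make Fox’s point about “the lightest way to define an application” concrete, here is what such a definition can look like in Docker’s Dockerfile format. This is a minimal, hypothetical sketch (the base image and file names are illustrative), not an example from Rackspace:

```dockerfile
# A minimal, illustrative Dockerfile: the whole application definition in a few lines.
# Reuse a shared base image; there is no guest operating system to install or boot.
FROM python:3-slim
# Copy in the application code (hypothetical file name).
COPY app.py /app/app.py
# What to run when the container starts.
CMD ["python", "/app/app.py"]
```

The same file can be built into an image and run on any server with a container runtime, which is what makes the resulting application so easy to transport between servers.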

There are some clear differences between containers and virtual machines, though. Linux containers give each application its own isolated environment in which to run, but multiple containers share the host server’s operating system. Since you don’t have to boot an operating system, you can create containers in seconds, not minutes as with virtual machines. They are faster, require less memory, offer higher-level isolation and are highly portable.

“Containers are more responsive and can run the same task faster,” adds Fox. “They increase the velocity of application development, and can make continuous integration and deployment easier. They often offer reduced costs for IT; testing and production environments can be smaller than without containers. Plus, the density of applications on a server can be increased which leads to better utilisation.

“As a direct result of these two benefits, the scope for innovation is greater than with previous technologies. This can facilitate application modernisation and allow more room to experiment.”

So the benefits are pretty open-ended: speed of deployment, flexibility to run anywhere, no more expensive licences, greater reliability and more opportunity for innovation.

Which all sounds great, doesn’t it?

That said, a recent survey from the Cloud & DevOps World team brought out some very interesting statistics, first and foremost around understanding of the technology. 76% of respondents agreed with the statement “Everyone has heard of containers, but no-one really understands what containers are”.

While containers have the potential to be the next big thing in the cloud industry, unless those in the ecosystem understand the concept and perceived benefits, it is unlikely to take off.

“Containers are evolving rapidly and present an interesting runtime option for application development,” says Joe Pynadath, GM of EMEA for Chef. “We know that with today’s distributed and lightweight apps, businesses, whether they are new start-ups or traditional enterprises, must accelerate their capabilities for building, testing, and delivering modern applications that drive revenue.

“One result of the ever-greater focus on software development is the use of new tools to build applications more rapidly and it is here that containers have emerged as an interesting route for developers. This is because they allow you to quickly build applications in a portable and lightweight manner. This provides a huge benefit for developers in speeding up the application building process. However, despite this, containers are not able to solve the complexities of taking an application from build through test to production, which presents a range of management challenges for developers and operations engineers looking to use them.”

There is certainly potential for containers within the enterprise environment, but as with all emerging technologies there is a certain level of confusion as to how they will integrate within the current business model, and how the introduction will impact the IT department on a day-to-day basis.

“Some of the questions we’re regularly asked by businesses looking to use containers are: ‘How do you configure and tune the OS that will host them? How do you adapt your containers at run time to the needs of the dev, test and production environments they’re in?’” comments Pynadath.

“While containers allow you to use discovery services or roll your own solutions, the need to monitor and manage them in an automated way remains a challenge for IT teams. At Chef, we understand the benefits containers can bring to developers and are excited to help them automate many of the complex elements that are necessary to support containerized workflows in production.”

Vendors are confident that the introduction of containers will drive further efficiency and speed within the industry, though we’re yet to see a firm commitment from the mass market to demonstrate the technology will take off. Early-adopter uptake is promising, and there are case studies to demonstrate the much-lauded potential, but it’s still early days.

In short, containers are good, but most people just need to learn what they are.

Containers and microservices starting to enter mainstream market – survey

A recent survey from NGINX highlighted that containers and microservices are two buzzwords starting to enter the mainstream market as companies target daily software releases.

While daily software releases are the ultimate goal within the industry, 70% of respondents highlighted that they are currently releasing new code only once a week, with only 28% reaching the target. Barriers cited were a lack of automation tools; a constant trade-off between code quality and expected speed of delivery; and a lack of central management, accountability and collaboration tools.

Containers now seem to be entering the mainstream, as 69% of respondents said they were either actively using or investigating the technology. Of the 20% using containers in production, more than a third are running more than 80% of their workloads on containers, and more than half use them for mission-critical applications. The technology is also creating a new buyer audience for vendors: 74% of respondents said developers, rather than heads of departments or managers, were responsible for choosing development and delivery tools.
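Taken together, those figures put a rough floor on how deep adoption runs across the whole survey. A quick back-of-envelope calculation, treating “more than a third” and “more than half” as lower bounds (an assumption about how the survey results are worded):

```python
# Lower-bound arithmetic on the NGINX survey figures quoted above.
in_production = 0.20                  # share of all respondents running containers in production
heavy_users = in_production / 3       # at least a third of those run 80%+ of workloads on containers
mission_critical = in_production / 2  # at least half use containers for mission-critical apps
print(f"Heavy container users: at least {heavy_users:.1%} of all respondents")
print(f"Mission-critical users: at least {mission_critical:.1%} of all respondents")
```

So even on a conservative reading, roughly one in fifteen respondents already runs the bulk of their workloads in containers, and around one in ten trusts them with mission-critical applications.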

Microservices tell a slightly different story: while adoption levels are similar, at approximately 70% currently using or investigating, the trend is weighted more towards small and medium organizations than the blue chips. Of the larger organizations, 26% are researching, 36% are currently using microservices in development or production, and 38% aren’t using them at all.

The survey also demonstrated that AWS continues to dominate market share, accounting for 49%. Despite Google and Microsoft Azure grabbing headlines recently with a number of new client wins, acquisitions and product releases, the market seemingly still favours AWS, with respondents highlighting an accessible price point as one of the most important factors when selecting a cloud provider.

Continuous integration and continuous delivery are becoming development best practices: 27% of respondents would now consider their organization to have a mature practice for continuous integration and delivery. At the other end of the scale, roughly a third said they were keen to move forward with continuous integration and delivery, but the necessary level of collaboration or understanding is not yet widespread in their organizations.

While the survey does demonstrate that cloud-first technologies such as containers are moving beyond the early-adopter stage, it will be some time before such technologies become commonplace in large-scale organizations, where the wheels are slower to turn. Like the cloud business model, containers and microservices seem to offer small and medium-sized operations the opportunity to compete with larger organizations’ budgets through technology innovation, agility and speed of deployment.

Docker launches DDC to support ‘container as a service’ offering

Container company Docker has announced Docker Datacenter, along with the new concept of ‘containers as a service’, in a bid to extend its cloud-based technology to customer sites.

The Docker Datacenter (DDC) resides on the customer’s premises and gives them a self-service system for building and running applications across multiple production systems while under operations controls.

It has also announced the general availability of Docker Universal Control Plane, a service that had been in beta testing since November 2015, and which underpins the running of containers as a service (CaaS).

The advantage of the DDC is that it creates a native environment for the lifecycle management of Dockerized applications. Docker claims that 12 Fortune 500 companies have been beta testing the DDC, along with smaller companies in a range of industries.

Since every company has different systems, tools and processes, the DDC was designed to work with whatever clients have got and adjust to their infrastructure without making them recode their applications, explained Docker spokesman Banjot Chanana on the Docker website. Networking plugins, for example, can be massively simplified if clients use Docker to define how app containers network together: they can choose from any number of providers for the underlying network infrastructure, rather than having to tackle the problem themselves. Similarly, connecting to an internal storage infrastructure is a lot easier. Application programming interfaces provided by the on-site ‘CaaS’ allow developers to move stats and logs in and out of logging and monitoring systems more easily.

“This model enables a vibrant ecosystem to grow with hundreds of partners,” said Chanana, who promised that Docker users will have much better options for their networking, storage, monitoring and workflow automation challenges.

Docker says its DDC is integrated with Docker’s commercial Universal Control Plane and Trusted Registry software. It achieved this with open source Docker projects Swarm (orchestration), Engine (container runtime), Content Trust (security) and Networking. Docker and its partner IBM provide dedicated support, product engineering teams and service level agreements.

Exponential Docker usage shows container popularity

Adoption of Docker’s containerisation technology has entered a period of explosive growth, with usage numbers nearly doubling in the last three months, according to its latest figures.

An announcement on the company blog reports that Docker has now served 2 billion ‘pulls’ of images. In November 2015 the figure stood at 1.2 billion pulls; the Docker Hub from which these images are pulled was only launched in March 2013.

Docker’s invention, a software-defined, self-contained file system that encapsulates all the elements of a server in microcosm (code, runtime, system tools and system libraries), has whetted the appetite of developers in the age of the cloud.

In January 2016, Docker users pulled images nearly 7,000 times per minute, four times the run rate of a year earlier. In that one month Docker saw the equivalent of 15% of its total transactions from the past three years.

The number of ‘pulls’ is significant because each of these transactions indicates that a Docker engine is downloading an image to create containers from it. Development teams use Docker Hub to publish and use containerised software, and automate their delivery. The fact that two billion pulls have now taken place indicates the technology’s popularity, and the rate of growth over the last three months shows that this variation on virtualisation is still gaining momentum.
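The January run rate can be checked against the cumulative total with some simple arithmetic. This is a rough sketch: it assumes the “nearly 7,000 per minute” figure held for the whole 31-day month:

```python
# Rough check of the Docker Hub pull figures quoted above.
pulls_per_minute = 7_000
january_minutes = 31 * 24 * 60          # minutes in January 2016
january_pulls = pulls_per_minute * january_minutes
cumulative_pulls = 2_000_000_000
share = january_pulls / cumulative_pulls
print(f"January 2016: ~{january_pulls / 1e6:.0f} million pulls, "
      f"~{share:.0%} of the two billion total")
```

That works out at just over 300 million pulls for the month, in line with the roughly 15% share quoted above once you allow for the per-minute figure being rounded.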

There are currently over 400,000 registered users on Docker Hub. “Our users span from the largest corporations, to newly-launched startups, to the individual Docker enthusiast and their number is increasing every day,” wrote Docker spokesman and blog author Mario Ponticello.

Around a fifth of Docker’s two billion pulls come from its 93 ‘Official Repos’, a curated set of images from Docker’s partners, including NGINX, Oracle, Node.js and CloudBees. Docker’s security-monitoring service Nautilus maintains the integrity of the Official Repos over time.

“As our ecosystem grows, we’ll be adding single-click deployment and security scanning to the Docker platform,” said Ponticello.

A RightScale study in January 2016 found that 17% of enterprises now have more than 1,000 virtual machines in the public cloud (up four percentage points in a year), while private clouds are showing an even stronger appetite for virtualisation techniques, with 31% of enterprises running more than 1,000 VMs, up from 22% in 2015.

IBM: “The level of innovation is being accelerated”

Dr. Angel Diaz joined the research division of IBM in the late nineties, where he helped co-author many of the web standards we enjoy today. Nowadays, he’s responsible for all of IBM’s cloud and mobile technology, as well as the architecture for its ambient cloud. Ahead of his appearance at Container World (February 16-18, Santa Clara Convention Center, CA) later this month, BCN caught up with him to find out more about the tech giant’s evolving cloud strategy.

BCN: How would you compare your early days at IBM, working with the likes of Tim Berners-Lee, with the present?

Dr. Angel Diaz: Back then, the industry was focused on developing web standards for a very academic purpose, in particular the sharing of technical information. IBM had a strategy around accelerating adoption and increasing skill. This resulted in a democratization of technology, by getting developers to work together in open source and standards. If you fast forward to where we are now with cloud, mobile, data, analytics and cognitive, you see a clear evolution of open source.

The aperture of open source development and ecosystems has grown to include users, and is now grounded on solid open governance and meritocracy models. What we have built is an open cloud architecture: an open IaaS based on OpenStack, an open PaaS with Cloud Foundry, and an open container model with the Open Container Initiative and the Cloud Native Computing Foundation. When you combine an open cloud architecture with open APIs defined by the Open API Initiative, applications break free. I have always said that no application is an island; these technologies make it so.

What’s the ongoing strategy at IBM, and where do containers come into it?

It’s very much hybrid cloud. We’ve been leveraging containers to help deliver hybrid applications and accelerate development through DevOps, so that people can transform and improve their business processes. This is very similar to what we did in the early days of the web: better business processes mean better business. At the end of the day, the individual benefits. Applications can be tailored to the way we like to work, and the way that we like to behave.

A lot of people in the container space say: wow, containers have been around a long time, why are we all interested in this now? Well, the technology has become easier to use, open communities have rallied around it, and it provides a very nice way of marrying the concepts of operations and service-oriented architecture, which the industry missed in the 2000s.

What does all this innovation ultimately mean for the ‘real world’?

It’s not an exact analogy, but remember the impact of HTML and JavaScript: they allowed almost anyone to become a webmaster, and that led to the Internet explosion. If you look at where we are now with cloud, that stack of books you need to go and buy has been reduced; the concept count, the level of sophistication of what you need to know in order to build an application, scale an application, secure an application, is being reduced.

So what does that do? It increases participation in the business process, in what you end up delivering. Whether it’s human facing or whether it’s an internal business process, it reduces that friction and it allows you to move faster. What’s starting to happen is the level of innovation is being accelerated.

And how do containers fit into this process? 

Previously there was this strict line: you developed software, then operated it and made tweaks, but you never really fundamentally changed the architecture of the application. Because of the ability to quickly stand up containers, to quickly iterate and so on, people are changing their architectures because of operations, and getting better operations because of it. That’s where the microservices notion comes in.

And you’ll be talking at Container World. What message are you bringing to the event?

My goal is to help people take a step back and understand the moment we’re in, because sometimes we all forget that. Whether you’re struggling with security in a Linux kernel or trying to define a micro service, you can forget what it is you’re trying to accomplish.

We are in a very special moment: it’s about the digital disruption that’s occurring, and the container technology we’re building here allows much quicker iteration on the business process. That’s one dimension. The second is what IBM is doing, not just in our own implementation of containers but in the open source world, to help democratize the technology, so that the level of skill and the number of people who build on this grow.