Category Archives: containers

Docker bolsters security capabilities with Security Scanning launch

Docker has announced the general availability of its Security Scanning product, an offering formerly known as Project Nautilus.

The service, which is available as an add-on to Docker Cloud private repositories and for Official Repositories on Docker Hub, streamlines software compliance procedures by providing customers with a security profile of all their Docker images. It sits alongside Docker Cloud and automatically triggers a series of events as soon as an image is pushed to a repository, producing a complete security profile of the image itself.

“Docker Security Scanning conducts binary level scanning of your images before they are deployed, provides a detailed bill of materials (BOM) that lists out all the layers and components, continuously monitors for new vulnerabilities, and provides notifications when new vulnerabilities are found,” said Docker’s Toli Kuznets on the company’s blog.

“The primary concerns of app dev teams are to build the best software and get it to their customer as fast as possible. However, the software supply chain does not stop with developers, it is a continuous loop of iterations, sharing code with teams and moving across environments. Docker Security Scanning delivers secure content by providing deep insights into Docker images along with a security profile of its components. This information is then available at every stage of the app lifecycle.”

The offering splits each Docker image into its respective layers and components, and evaluates the risk associated with each one. Identified risks are checked against the CVE databases, linked to the specific layer and/or component, and monitored on an ongoing basis.
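The layer-by-layer approach described above can be sketched in a few lines. This is a toy model, not Docker’s actual implementation or API: the image structure, the component tuples and the mock CVE feed are all illustrative assumptions.

```python
# Toy model of an image security scan: an image is a dict of layers, each
# layer a list of (package, version) components, checked against a mock CVE
# feed. All names here are illustrative, not Docker's actual data model.

MOCK_CVE_FEED = {
    ("openssl", "1.0.1f"): ["CVE-2014-0160"],  # Heartbleed
    ("bash", "4.2"): ["CVE-2014-6271"],        # Shellshock
}

def scan_image(image):
    """Return a bill of materials (BOM) with any known CVEs per component."""
    bom = []
    for layer_id, components in image.items():
        for name, version in components:
            cves = MOCK_CVE_FEED.get((name, version), [])
            bom.append({"layer": layer_id, "component": name,
                        "version": version, "cves": cves})
    return bom

image = {
    "layer-1": [("openssl", "1.0.1f"), ("zlib", "1.2.8")],
    "layer-2": [("bash", "4.2")],
}

report = scan_image(image)
flagged = [entry for entry in report if entry["cves"]]
print(len(flagged))  # 2 vulnerable components, each tied to its layer
```

Because every finding carries its layer and component, a new CVE published later only requires re-checking the affected (package, version) pairs, which is the ongoing-monitoring behaviour the article describes.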

New vulnerabilities found during this ongoing monitoring are reported against the relevant CVE database entry, and all other software associated with that component or package is then re-assessed, improving software compliance across the board. Docker believes the offering can enhance software compliance and general risk management not only at deployment, but throughout the lifecycle of the software itself.

“With this information, IT teams can proactively manage software compliance requirements by knowing what vulnerabilities impact what pieces of software, reviewing the severity of the vulnerability and making informed decisions on a course of action,” said Kuznets.

The offering is now available to all customers, with Docker currently offering a three-month free trial.


Container adoption hindered by skills gap – survey

New research from Shippable has highlighted that the use of containers is increasing within the North American market, though the current skills gap is proving to be a glass ceiling for the moment.

Just over half of the respondents to the survey said they were currently using containers in production, and 14% confirmed they were using the technology in the development and testing stages. A healthy 89% believe the use of containers in their organization will increase over the next 12 months.

“Our research and personal experience shows that companies can experience exponential gains in software development productivity through the use of container technology and related tools,” said Avi Cavale, CEO at Shippable. “Companies are realizing the productivity and flexibility gains they were expecting, and use of container technology is clearly on the rise. That said, there are still hurdles to overcome. Companies can help themselves by training internal software teams and partnering with vendors and service providers that have worked with container technology extensively.”

Among those not currently using the technology, a lack of in-house skills was listed as the main reason, though the survey also highlighted that security is still a concern, that the ROI of the technology is unproven, and that some companies’ infrastructure is simply not designed to work with containers.

While awareness of containers has risen relatively steadily, a number of reports have highlighted that an unhealthy proportion of IT professionals do not understand how to use the technology, or what the business case is. The results here indicate progress has at least been made in understanding the use case: 74% of those using the technology are now shipping new software at least 10% faster, and 8% are shipping more than 50% faster than before.

“In the earlier years of computing, we had dedicated servers which later evolved with virtualisation,” says Giri Fox, Director of Technical Services at Rackspace. “Containers are part of the next evolution of servers, and have gained large media and technologist attention. In essence, containers are the lightest way to define an application and to transport it between servers. They enable an application to be sliced into small elements and distributed on one or more servers, which in turn improves resource usage and can even reduce costs.

“Containers are more responsive and can run the same task faster. They increase the velocity of application development, and can make continuous integration and deployment easier. They often offer reduced costs for IT; testing and production environments can be smaller than without containers. Plus, the density of applications on a server can be increased which leads to better utilisation.

“As a direct result of these two benefits, the scope for innovation is greater than its previous technologies. This can facilitate application modernisation and allow more room to experiment.”

The survey also showed that while Docker may be one of the foremost names in the container world, this has not translated through to all aspects of usage. The most popular registry is Google Container Registry at 54%, followed by Amazon EC2 Container Registry on 45%, with Docker Hub in third place at 34%. Public cloud was also the most popular platform, accounting for 31% of respondents: 52% of developers said they’re running containerized applications on Google Compute Engine, while 49% are running on Microsoft Azure and 43% on Amazon Web Services.

While containers continue to grow in popularity throughout the industry, the survey highlights that the technology is not quite there yet. North America could be seen as more of a trend setter than Europe and the rest of the world, and given that usage has only just tipped past 50%, there may still be some way to go before the technology can be considered mainstream. The results are positive, but there is still work to do.

What did BCN readers say last week?

Over the past week, we took the opportunity to gauge the opinion of the BCN readership on industry trends and issues through a number of polls. Here’s what we found out:

Will Microsoft’s lawsuit be successful? 58% say no

For the most part, Microsoft’s lawsuit has been kept out of the headlines. This is unlikely to mean the whole episode is unimportant to the industry; more probably, the story has been overshadowed by the ongoing saga between Apple and the FBI.

In any case, Microsoft filed a lawsuit against the US government, citing the First and Fourth Amendments with regard to government agencies using secrecy orders to access its customers’ emails or records. From Microsoft’s perspective, the company should have the right to tell customers when the government is accessing their data, except in exceptional circumstances. The government disagrees.

While the tech giant has taken it upon itself to fight the good fight alone, BCN readers are rather more sceptical about the venture’s chances. Only 42% believe Microsoft’s lawsuit will be successful, though this is a question unlikely to be answered for a substantial period of time: any decision will be appealed by the losing party, dragging out any rulings or changes in government practice.

When will containers hit mainstream? 21% say right now

Containers are one of the hottest trends of 2016. We recently ran a buzzword-buster article discussing not only what containers actually are, but more importantly what their value to the enterprise actually is. Since then there have been numerous announcements focused on the technology, from Microsoft to Red Hat to Juniper, indicating containers are starting to gain some traction.

But how much of the press is a smoke-screen and how much is reality? In short, it’s looking quite positive.

Cloud took a healthy amount of time to be trusted and understood by the mainstream market, and it may be this longer adoption curve which has accelerated containers as a technology. 21% of BCN readers said they are already using the technology in a meaningful way in their business, 50% believe they will be within the next 1-2 years, and only 29% said longer than three years.

Who is the best innovator in the cloud industry? 75% still say AWS

Last week AWS launched a host of new features at the AWS Chicago Summit, ranging from new security features, tools which simplify the movement of data around an organization’s cloud, and platforms for automatically deploying and running apps on Amazon’s cloud infrastructure, to testing features and authentication services.

Although this is the first major update from AWS in some time, Google and Microsoft have been feverishly bolstering their offerings over the last six months with new hires, new features and new acquisitions. Industry insiders have even told us at BCN that AWS could be seen to be sitting back too much, offering Google and Microsoft the opportunity to improve their own standing and make up ground on the number one player in the cloud space.

BCN readers do not agree, however. 75% believe AWS is still far and away the industry leader, 10% believe AWS, Google and Microsoft are all on par, while 15% believe innovation has faltered at AWS and the rest of the industry is catching up fast.

Is DevOps mainstream? 48% say no

DevOps is another buzzword which has floated over from 2015 into 2016. As buzzwords go, though, few have captured the attention of the industry in the same manner. Such is the prominence of DevOps, it seems as though every company is now a DevOps specialist, DevOps expert or DevOps-orientated organization.

In fact, it isn’t only vendors who have adopted DevOps; pretty much every enterprise decision maker has DevOps on their lips too. The main concern here is that the definition of DevOps appears lost on certain organizations. Yes, there are genuine practitioners of the methodology, but there are also a host of people who have taken up the phrase without fully understanding its implications or the means to implement such an idea.

And it would appear BCN readers agree with that assessment. Despite DevOps being one of the most used words in the cloud industry, only 52% of our readers believe DevOps has hit the mainstream market.

Juniper boosts security capabilities with two new product offerings

Juniper Networks has launched a number of new cloud and virtualised service offerings as part of its software-defined secure networks framework.

The new offerings include a new containerised virtual firewall called cSRX and a multi-core version of the Juniper Networks vSRX. The company claims the new vSRX version is ten times faster than its nearest competitor and creates new possibilities for agile, flexible virtual firewalls, while cSRX is the industry’s first containerized firewall.

“As the security landscape continues to evolve, it is more important than ever to work together to combat cyber threats,” said Kevin Walker, Security CTO at Juniper Networks. “These key additions to our security portfolio will further our Software-Defined Secure Networks vision and greatly benefit our customers. Our products provide the best opportunity to create secure networks through policy, detection and enforcement. We are excited to be releasing the most flexible firewall solutions in the market and continue to showcase our commitment to bringing SDSN to organisations across the globe.”

Juniper believes the faster vSRX offering and the scalability of the containerized cSRX, combined with the higher density of services on the Intel Xeon processor family, will increase an organization’s capability to detect threats.

“Juniper Networks is delivering significant scale and total cost of ownership advantages to its customers with the new cSRX, which fundamentally changes how security is deployed and illustrates the power of Software-Defined Secure Networks to provide a holistic network protection paradigm,” said Mihir Maniar, VP of Security Product Management at Juniper Networks. “Moreover, with the addition of our 100 Gbps vSRX, our security portfolio is further advancing the industry’s highest performing virtual firewall.”

Red Hat bets on OpenStack, DevOps and Containers in new product offerings

Red Hat has announced the general availability of Red Hat Cloud Suite and Red Hat OpenStack Platform 8, leaning on wider DevOps trends within the industry.

The company claims the new offerings will assist enterprise organizations in bridging the gap between development and operations teams at the scale of cloud computing, and successfully implement a DevOps business model.

“Everyone is now aware that Uber doesn’t own a car or Facebook doesn’t generate its own content; this is nothing new, but it does highlight the digital disruption which is taking place in the industry,” said Radhesh Balakrishnan, General Manager, OpenStack at Red Hat. “These disruptions are impacting decisions on infrastructure within the organization, but also what kind of development methodology gets adopted. Customers are demanding an agile infrastructure and a DevOps model to ensure they can reduce time to market and accelerate innovation within their own organization.

“When you generally look at the CIO agenda, the need to be more responsive to business needs is a priority within almost every organization by default. Given that they are viewing DevOps as a means to facilitate the change in thinking and culture, DevOps is here now and it’s not a fad which the industry has grabbed onto.

“Even internally, we have been aggressive in embracing DevOps. Our oldest business is Enterprise Linux, and security updates are an area of key value to our customers. Heartbleed was a huge issue for our customers 12 months ago, and since we follow the DevOps methodology, we were not only able to provide a patch, but also pushed out a tool which customers can use to check they are now compliant. None of this would have been possible without DevOps, so we are seeing the benefits internally as well.”

Red Hat is currently pinning its ambitions on the growth of OpenStack and the belief that it will become the operating system of choice for cloud infrastructure and the data centres of the future. The company backed the growth of Linux in a similar fashion, effectively riding that wave to $2 billion in annual sales, and is now placing the same bet on OpenStack and its adoption throughout the industry.

The launch is based on OpenStack Kilo, the release which came out last year, and combines Red Hat’s cloud, DevOps and container offerings in a single cloud suite within a private cloud environment. Keeping with the theme of ‘openness’, the tools will also be available as individual products should customers want to work with other offerings as well.

Building on another industry trend, Red Hat has also prioritized containers as a technology for its service offering.

“Containers are probably the most attractive technology we at Red Hat have seen in years. Every large customer we have wants to have a conversation around containers,” said Balakrishnan. “We’re including OpenShift in the cloud suite, which is a service offering which was designed from the ground up on Docker (for container image) and Kubernetes (for orchestration layer). We are excited about the fact that we are one of the first in the industry to be bringing container technology to mainstream.

“Containers are one of the biggest priority areas for us as a company, so much so that we include container technology in our Enterprise Linux offering. It’s pervasive both in our technology as well as in our customer minds.”

Microsoft enters the containers race

Microsoft has cashed in on one of the industry’s trending technologies, with the announcement of the general availability of the Azure Container Service.

The Microsoft container service, initially announced in September 2015 and released for public preview in February, is built on open source technology and offers a choice between the DC/OS and Docker Swarm orchestration engines.

“I’m excited to announce the general availability of the Azure Container Service; the simplest, most open and flexible way to run your container applications in the cloud,” said Ross Gardler, Senior Program Manager at Microsoft, on the company’s blog. “Organizations are already experimenting with container technology in an effort to understand what they mean for applications in the cloud and on-premises, and how to best use them for their specific development and IT operations scenarios.”

While the growth of container technology has been well documented in recent months, a number of industry commentators have been concerned about the understanding of the technology within enterprise organizations themselves. A recent survey from the Cloud & DevOps World event highlighted that 74% of respondents agreed with the statement “everyone has heard of containers, but no-one really understands what containers are.”

Aside from confusion surrounding the definition and use case of containers, the Microsoft team believes the growth of the technology is being stunted by management and orchestration challenges. While the technology offers organizations numerous benefits, traditional means of managing such technologies have proven ineffective.

“Azure Container Service addresses these challenges by providing simplified configurations of proven open source container orchestration technology, optimized to run in the cloud,” said Gardler. “With just a few clicks you can deploy your container-based applications on a framework designed to help manage the complexity of containers deployed at scale, in production.”

Alongside the availability announcement, Microsoft has also joined a new open source DC/OS project, enabling customers to use Mesosphere’s Data Center Operating System to orchestrate their container projects. The project brings together the expertise of more than 50 partners to drive usability within the software-defined economy.

The Docker Swarm version ensures any Docker-compliant tooling can be used with the service. Azure Container Service provides a ‘Docker native’ solution using the same open source technologies as Docker’s Universal Control Plane, allowing customers to upgrade as and when required.

Head in the clouds: Four key trends affecting enterprises

Cloud is changing the way businesses function and has provided a new and improved level of flexibility and collaboration. Companies worldwide are realising the cloud’s capability to generate new business models and promote sustainable competitive advantage, and the impact is becoming very apparent: a Verizon report recently revealed that 69 per cent of businesses that have used cloud have used it to significantly re-engineer one or more of their business processes. It’s easy to see why there’s still so much hype around cloud. We’ve heard so much about cloud computing over the last few years that you could be forgiven for thinking it is now universally adopted, but the reality is that we are still only scratching the surface: cloud is still very much in a period of growth and expansion.

Looking beyond the horizon

At present, the majority of corporate cloud adoption is around Infrastructure-as-a-Service (IaaS) and Software-as-a-Service (SaaS) offerings such as AWS, Azure, Office 365 and Salesforce.com. These services offer cheap buy-in and a relatively painless implementation process, which remains separate from the rest of corporate IT. Industry analyst Gartner says IaaS spending is set to grow 38.4 per cent over the course of 2016, while worldwide SaaS spending is set to grow 20.3 per cent over the year, reaching $37.7 billion. However, the real promise of cloud is much more than IaaS, PaaS or SaaS: it’s a transformative technology moving compute power and infrastructure between on-premise resources, private cloud and public cloud.

As enterprises come to realise the true potential of cloud, we’ll enter a period of great opportunity for enterprise IT, but there will be plenty of adoption-related matters to navigate. Here are four big areas enterprises will have to deal with as cloud continues to take the world by storm:

  1. Hybrid cloud will continue to dominate

Hybrid cloud will rocket up the agenda, as businesses and providers alike continue to realise that there is no one-size-fits-all approach to cloud adoption. Being able to mix and match public and private cloud services from a range of different providers enables businesses to build an environment that meets their unique needs more effectively. To date, this has been held back by interoperability challenges between cloud services, but a strong backing for open application programming interfaces (APIs) and multi-cloud orchestration platforms is making it far easier to integrate cloud services and on-premise workloads alike. As a result, we will continue to see hybrid cloud dominate the conversation.
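The mix-and-match idea above can be made concrete with a small sketch. This is an illustrative placement policy of my own, not any vendor’s orchestration API: workloads tagged as sensitive stay on the private cloud, everything else can run on public cloud.

```python
# Minimal sketch of hybrid-cloud placement: route each workload to a public
# or private environment based on a simple data-sensitivity rule. The
# provider names and the policy itself are illustrative assumptions.

def place_workload(workload):
    """Sensitive data stays on the private cloud; the rest bursts to public."""
    if workload["sensitive"]:
        return "private-cloud"
    return "public-cloud"

workloads = [
    {"name": "customer-db", "sensitive": True},
    {"name": "web-frontend", "sensitive": False},
    {"name": "batch-analytics", "sensitive": False},
]

placements = {w["name"]: place_workload(w) for w in workloads}
print(placements["customer-db"])   # private-cloud
print(placements["web-frontend"])  # public-cloud
```

A real multi-cloud orchestration platform layers cost, latency and compliance rules on top of this, but the core decision — per-workload placement across heterogeneous providers — is the same.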

  2. Emergence of iPaaS

The drive towards integration of on-premise applications and cloud is giving rise to Integration Platform as a Service (iPaaS). Cloud integration remains a daunting task for many organizations, but iPaaS is a cloud-based integration solution that is slowly and steadily gaining traction within enterprises. With iPaaS, users can develop integration flows that connect applications residing in the cloud or on premise, and deploy them without installing any hardware or software. Although iPaaS is relatively new to the market, categories of iPaaS vendor are beginning to emerge, including ecommerce/B2B integration and cloud integration. With integration challenges still a huge issue for enterprises using cloud, demand for iPaaS is only set to grow over the coming months.
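An integration flow of the kind iPaaS products let you declare can be sketched as a tiny pipeline. The connector names, record shapes and field mapping below are hypothetical, purely to illustrate the pattern of moving records between a cloud app and an on-premise one.

```python
# Illustrative sketch of an iPaaS-style integration flow: pull records from
# a source application, apply a field mapping, and push them to a sink.
# "CRM" and "ERP" are stand-ins for any two systems being connected.

def run_flow(source, mapping, sink):
    """Copy records from `source` into `sink`, renaming fields per `mapping`."""
    for record in source:
        sink.append({mapping.get(k, k): v for k, v in record.items()})
    return sink

crm_records = [{"full_name": "Ada Lovelace", "mail": "ada@example.com"}]
field_mapping = {"full_name": "name", "mail": "email"}  # CRM -> ERP schema

erp_inbox = run_flow(crm_records, field_mapping, [])
print(erp_inbox[0]["email"])  # ada@example.com
```

The appeal of iPaaS is that flows like this are defined and deployed in the platform itself, with no hardware or software to install on either end.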

  3. Containers will become reality

To date, a lot of noise has been made about the possibilities of container technology, but in reality its use has yet to fully take off. That’s set to change as household-name public clouds such as Amazon, Microsoft and Google are now embracing containers; IBM’s Bluemix offering in particular is set to make waves with its triple-pronged Public, Dedicated and Local delivery model. With a wave of momentum building for application and OS technology manufacturers to ride, it will become increasingly realistic for them to construct support services around container technology. This does present a threat to the traditional virtualization approach, but over time a shift in hypervisors is on the cards and container technology can only improve from this point.

  4. Cloud will be used for Data Resiliency/Recovery services

With cloud storage prices coming down drastically and continuous improvements being made to cloud gateway platforms, the focus is set to shift to cloud-powered backup and disaster recovery services. We are in an age where everything is being offered ‘as a service’; the idea of cloud-powered on-demand usability suits backup and disaster recovery services very well because they do not affect the immediate production data. As such, this should be an area where cloud use will dramatically increase over the next year.

As with all emerging technologies, it takes time to fully figure out what they actually mean for enterprises, and these four cloud trends reflect that. In reality we’re only just getting started with cloud; now that they understand how it works, the time has come for enterprises to turn the screw and begin driving even more benefit from it.

Written by Kalyan Kumar, Chief Technologist at HCL Technologies.

Will containers change the world of cloud?

The rise of containers as a technology has been glorious and confusing in equal measure. While touted by some as the saviour of developers, and by others as the end of VMs, the majority simply don’t understand containers as a concept or a technology.

In the simplest of terms, containers let you pack more computing workloads onto a single server and in theory, that means you can buy less hardware, build or rent less data centre space, and hire fewer people to manage that equipment.
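The packing argument is easy to put numbers on. The figures below are illustrative assumptions (a ~1 GB guest OS per VM, 512 MB per application, a 32 GB host), not benchmarks; the point is simply that removing the per-instance guest OS changes the arithmetic dramatically.

```python
# Back-of-the-envelope sketch of container density: VMs each carry a guest
# OS, containers share the host kernel, so the same host packs many more
# application workloads. All figures are illustrative assumptions.

HOST_RAM_MB = 32 * 1024      # a hypothetical 32 GB host
APP_MB = 512                 # memory footprint of one application workload
VM_OS_OVERHEAD_MB = 1024     # guest OS overhead per VM (containers: none)

vms_per_host = HOST_RAM_MB // (APP_MB + VM_OS_OVERHEAD_MB)
containers_per_host = HOST_RAM_MB // APP_MB  # no per-instance guest OS

print(vms_per_host)         # 21
print(containers_per_host)  # 64
```

On these toy numbers the host runs roughly three times as many workloads, which is the "buy less hardware, rent less data centre space" claim in miniature.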

“In the earlier years of computing, we had dedicated servers which later evolved with virtualisation,” says Giri Fox, Director of Technical Services at Rackspace. “Containers are part of the next evolution of servers, and have gained large media and technologist attention. In essence, containers are the lightest way to define an application and to transport it between servers. They enable an application to be sliced into small elements and distributed on one or more servers, which in turn improves resource usage and can even reduce costs.”

There are some clear differences between containers and virtual machines, though. Linux containers give each application its own isolated environment in which to run, but multiple containers share the host server’s operating system. Since you don’t have to boot an operating system, you can create containers in seconds rather than the minutes a virtual machine takes. They are faster, require less memory, offer higher-level isolation and are highly portable.

“Containers are more responsive and can run the same task faster,” adds Fox. “They increase the velocity of application development, and can make continuous integration and deployment easier. They often offer reduced costs for IT; testing and production environments can be smaller than without containers. Plus, the density of applications on a server can be increased which leads to better utilisation.

“As a direct result of these two benefits, the scope for innovation is greater than its previous technologies. This can facilitate application modernisation and allow more room to experiment.”

So the benefits are pretty open-ended: speed of deployment, flexibility to run anywhere, no more expensive licenses, greater reliability and more opportunity for innovation.

Which all sounds great, doesn’t it?

That said, a recent survey from the Cloud & DevOps World team brought out some very interesting statistics, first and foremost on understanding of the technology. 76% of respondents agreed with the statement “Everyone has heard of containers, but no-one really understands what containers are”.

While containers have the potential to be the next big thing in the cloud industry, unless those in the ecosystem understand the concept and perceived benefits, it is unlikely to take off.

“Containers are evolving rapidly and present an interesting runtime option for application development,” says Joe Pynadath, GM of EMEA for Chef. “We know that with today’s distributed and lightweight apps, businesses, from new start-ups to traditional enterprises, must accelerate their capabilities for building, testing, and delivering modern applications that drive revenue.

“One result of the ever-greater focus on software development is the use of new tools to build applications more rapidly and it is here that containers have emerged as an interesting route for developers. This is because they allow you to quickly build applications in a portable and lightweight manner. This provides a huge benefit for developers in speeding up the application building process. However, despite this, containers are not able to solve the complexities of taking an application from build through test to production, which presents a range of management challenges for developers and operations engineers looking to use them.”

There is certainly potential for containers within the enterprise environment, but as with all emerging technologies there is a certain level of confusion as to how they will integrate within the current business model, and how the introduction will impact the IT department on a day-to-day basis.

“Some of the questions we’re regularly asked by businesses looking to use containers are: how do you configure and tune the OS that will host them? How do you adapt your containers at run time to the needs of the dev, test and production environments they’re in?” comments Pynadath.

“While containers allow you to use discovery services or roll your own solutions, the need to monitor and manage them in an automated way remains a challenge for IT teams. At Chef, we understand the benefits containers can bring to developers and are excited to help them automate many of the complex elements that are necessary to support containerized workflows in production.”

Vendors are confident that the introduction of containers will drive further efficiencies and speed within the industry, though we’re yet to see a firm commitment from the mass market to demonstrate the technology will take off. The early adopter uptake is promising, and there are case studies to demonstrate the much lauded potential, but it’s still early days.

In short, containers are good, but most people just need to learn what they are.

Containers and microservices starting to enter mainstream market – survey

A recent survey from NGINX has highlighted that containers and microservices are two buzzwords starting to enter the mainstream market as companies target daily software releases.

While daily software releases are the ultimate goal within the industry, 70% of respondents said they are currently releasing new code only once a week, with only 28% reaching the target. Barriers cited included a lack of automation tools, a constant trade-off between code quality and expected speed of delivery, and a lack of central management, accountability and collaboration tools.

Containers are now seemingly beginning to enter the mainstream, as 69% of respondents said they were either actively using or investigating the technology. Of the 20% using containers in production, more than a third are running more than 80% of their workloads on containers, and more than half use them for mission-critical applications. The technology is also creating a new buyer audience for vendors: 74% of respondents said developers, rather than heads of departments or managers, were responsible for choosing development and delivery tools.
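It is worth composing those survey figures: the heavy users are a fraction of a fraction. Taking "more than a third" as a floor, the arithmetic below works out what share of all respondents run more than 80% of their workloads on containers.

```python
# Composing the survey percentages: what share of ALL respondents run >80%
# of their workloads on containers? The 20% and "more than a third" figures
# are from the survey; treating a third as a floor is our simplification.

in_production = 0.20   # respondents using containers in production
heavy_users = 1 / 3    # "more than a third" of those, taken as a floor

share_of_all = in_production * heavy_users
print(round(share_of_all * 100, 1))  # 6.7 -> roughly 7% of all respondents
```

So for all the buzz, only around one respondent in fifteen is running containers at real scale, which squares with the article’s view that the mainstream moment is near but not yet here.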

Microservices tell a slightly different story: while adoption levels are similar, at approximately 70% currently using or investigating, the trend is weighted more towards small and medium organizations than the blue chips. Of the larger organizations, 26% are researching and 36% are currently using microservices in development or production, while 38% aren’t using them at all.

The survey also demonstrated that AWS continues to dominate market share, accounting for 49%. Despite Google and Microsoft Azure grabbing headlines recently with a number of new client wins, acquisitions and product releases, the market seemingly still favours AWS, with respondents highlighting an accessible price point as one of the most important factors when selecting a cloud provider.

Continuous integration and continuous delivery are becoming development best practices: 27% of respondents would now consider their organization to have a mature practice for continuous integration and delivery. At the other end of the scale, roughly a third said they were keen to move forward with continuous integration and delivery, but the necessary level of collaboration or understanding is not yet widespread in their organizations.

While the survey does demonstrate that the adoption of cloud-first technologies such as containers is moving beyond the early-adopter stage, it will be some time before such technologies become commonplace in large-scale organizations, where the wheels are slower to turn. Like the cloud business model, containers and microservices seem to be offering small and medium-sized operations the opportunity to compete with larger organizations’ budgets through technology innovation, agility and speed of deployment.

Docker launches DDC to support ‘container as a service’ offering

Container company Docker has announced a Docker Data Center along with the new concept of ‘containers as a service’ in a bid to extend its cloud based technology to customer sites.

The Docker Datacenter (DDC) resides on the customer’s premises and gives them a self-service system for building and running applications across multiple production systems while remaining under operations controls.

It has also announced the general availability of Docker Universal Control Plane, a service that had been in beta testing since November 2015, which underpins the running of containers as a service (CaaS).

The advantage of the DDC is that it creates a native environment for the lifecycle management of Dockerized applications. Docker claims that 12 Fortune 500 companies have been beta testing the DDC, along with smaller companies in a range of industries.

Since every company has different systems, tools and processes, the DDC was designed to work with whatever the client has and adjust to their infrastructure without forcing them to recode their applications, explained Docker spokesman Banjot Chanana on the Docker website. Networking plugins, for example, can be massively simplified if clients use Docker to define how app containers network together: they can choose from any number of providers to supply the underlying network infrastructure, rather than having to tackle the problem themselves. Similarly, connecting to an internal storage infrastructure is a lot easier. Application programming interfaces provided by the on-site ‘CaaS’ allow developers to move stats and logs in and out of logging and monitoring systems more easily.
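The plugin idea described above — the application defines how containers connect, while interchangeable backends supply the underlying network — can be sketched as a simple interface. The class and provider names here are hypothetical illustrations, not Docker’s actual plugin API.

```python
# Sketch of a pluggable-networking design: the app declares its topology,
# and whichever provider is plugged in realises the links. Names are
# hypothetical, illustrating the pattern rather than Docker's real API.

class NetworkProvider:
    """Interface a networking backend would implement."""
    def connect(self, container_a, container_b):
        raise NotImplementedError

class OverlayProvider(NetworkProvider):
    """One interchangeable backend; others could wire VLANs, SDN, etc."""
    def __init__(self):
        self.links = []

    def connect(self, container_a, container_b):
        self.links.append((container_a, container_b))
        return f"{container_a}<->{container_b} via overlay"

def wire_app(topology, provider):
    """Apply an app-defined topology using whichever backend is plugged in."""
    return [provider.connect(a, b) for a, b in topology]

provider = OverlayProvider()
results = wire_app([("web", "api"), ("api", "db")], provider)
print(len(provider.links))  # 2 links realised by the chosen backend
```

Swapping `OverlayProvider` for another implementation changes the infrastructure without touching the application’s topology, which is the "work with whatever the client has" property the article attributes to the DDC.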

“This model enables a vibrant ecosystem to grow with hundreds of partners,” said Chanana, who promised that Docker users will have much better options for their networking, storage, monitoring and workflow automation challenges.

Docker says its DDC is integrated with Docker’s commercial Universal Control Plane and Trusted Registry software. It achieved this with open source Docker projects Swarm (orchestration), Engine (container runtime), Content Trust (security) and Networking. Docker and its partner IBM provide dedicated support, product engineering teams and service level agreements.