10 Things to Know About Docker

It’s possible that containers and container management tools like Docker will be the single most important thing to happen to the data center since the mainstream adoption of hardware virtualization in the 90s. In the past 12 months, the technology has matured beyond powering large-scale startups like Twitter and Yelp and found its way into the data centers of major banks, retailers and even NASA. When I first heard about Docker a couple of years ago, I started off as a skeptic. I blew it off as skillful marketing hype around an old concept of Linux containers. But after incorporating it successfully into several projects at Spantree, I am now a convert. It’s saved my team an enormous amount of time, money and headaches and has become the underpinning of our technical stack.

If you’re anything like me, you’re often time-crunched and may not have a chance to check out every shiny new toy that blows up on GitHub overnight. So this article is an attempt to quickly impart 10 nuggets of wisdom that will help you understand what Docker is and why it’s useful.

Docker is a container management tool.

Docker is an engine designed to help you build, ship and execute application stacks and services as lightweight, portable and isolated containers. The Docker engine sits directly on top of the host operating system. Its containers share the kernel and hardware of the host machine with roughly the same overhead as processes launched directly on the host machine.
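To make that concrete, here is a minimal sketch, assuming a Linux host with the Docker engine installed and the public nginx image available (the container name "web" is an arbitrary choice for illustration):

    # Launch an nginx container in the background
    docker run -d --name web nginx

    # Because containers share the host kernel, the container's nginx
    # processes appear in the host's ordinary process table
    ps aux | grep nginx

    # Stop and remove the container when done
    docker stop web && docker rm web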

But Docker itself isn’t a container system; it merely piggybacks on the container facilities already baked into the OS, such as LXC on Linux. These facilities have existed in operating systems for many years, but Docker provides a much friendlier image management and deployment system on top of them.
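To give a feel for that image workflow, here is a hedged sketch (the image name "myapp" and the choice of base image are arbitrary for illustration). A short Dockerfile describes the image:

    # Dockerfile: layer nginx on top of a stock Ubuntu base image
    FROM ubuntu:14.04
    RUN apt-get update && apt-get install -y nginx
    CMD ["nginx", "-g", "daemon off;"]

And two commands build it and run it as an isolated container:

    # Build the image, then run it with host port 8080 mapped to container port 80
    docker build -t myapp .
    docker run -d -p 8080:80 myapp

The same image can be pushed to a registry with docker push and pulled onto any other host running the Docker engine, which is what makes these containers portable.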

 

Docker is not a hardware virtualization engine.

When Docker was first released, many people compared it to virtual machine hypervisors like VMware, KVM and VirtualBox. While Docker solves a lot of the same problems and shares many of the same advantages as hypervisors, it takes a very different approach. Virtual machines emulate hardware. In other words, when you launch a VM and run a program that hits disk, it’s generally talking to a “virtual” disk. When you run a CPU-intensive task, those CPU commands need to be translated to something the host CPU understands. All these abstractions come at a cost: two disk layers, two network layers, two process schedulers, even two whole operating systems that need to be loaded into memory. These limitations typically mean you can only run a few virtual machines on a given piece of hardware before you start to see an unpleasant amount of overhead and churn. On the other hand, you can theoretically run hundreds of Docker containers on the same host machine without issue.

All that being said, containers aren’t a wholesale replacement for virtual machines. Virtual machines provide a tremendous amount of flexibility in areas where containers generally can’t. For example, if you want to run a Linux guest operating system on top of a Windows host, that’s where virtual machines shine.

 

Download the whitepaper to read the rest of the list of 10 Things You Need to Know About Docker.

Whitepaper by Cedric Hurst, Principal at Spantree

[slides] Accelerating DevOps with @Actifio | @DevOpsSummit #DevOps #Docker #Containers #Microservices

Providing the needed data for application development and testing is a huge headache for most organizations. The problems are often the same across companies – speed, quality, cost, and control. Provisioning data can take days or weeks every time a refresh is required. Using dummy data leads to quality problems. Creating physical copies of large data sets and sending them to distributed teams of developers eats up expensive storage and bandwidth resources. And all of these copies proliferating across the organization can lead to inconsistent masking and exposure of sensitive data.

But some organizations are adopting a new method of data management for DevOps that is delivering transformational business outcomes in faster time to market, lower costs, and greater control.
In his session at DevOps Summit, Brian Reagan, Managing Director of Blackthorne Consulting Group, an Actifio company, reviewed the core concepts of using data virtualization to power DevOps, including Central Administration by Operations, Self-Service for developers and testers, and Automating Data Masking for enhanced control of sensitive information.
He shared real life case studies and provided a practical perspective on the benefits and considerations for these projects.


Microsoft buys FieldOne in field service management software play

Microsoft has acquired FieldOne to strengthen its Dynamics CRM offering

Microsoft has acquired field service management provider FieldOne Systems in a move aimed at complementing its Dynamics CRM customer service capabilities.

The cloud-based field service management software is already built on Microsoft technology on both the back end and the front end (Dynamics CRM), making integration with Office 365 somewhat more straightforward than it would otherwise be.

“Their industry-leading solution specializes in delivering a full set of capabilities that include work order management, automated scheduling, asset contract, inventory and procurement management, workflow capabilities and mobile collaboration – providing enterprises with a comprehensive modern field service solution,” explained Bob Stutz, corporate vice president of Microsoft Dynamics CRM.

“FieldOne is a great fit for Dynamics CRM adding to our extensive customer service capabilities – which includes chat, knowledge management and self-service functionality from Parature which we acquired in January of 2014.  Like Parature, FieldOne is offered to customers as a cloud service. It’s built on Microsoft technology for fast integration, it already works great with other Microsoft productivity offerings like Office 365 and SharePoint, and has cross-platform capabilities meaning it can work on different devices enhancing the mobile experience which is so critically important in field service management.”

Microsoft said the FieldOne acquisition is a “major step” towards helping it round off its customer service software portfolio. The move is reminiscent of a similar acquisition made last year by Oracle, when the database and ERP giant bought TOA Technologies and rolled it into its Service Cloud offering.

The Question Has Been Answered. SDN Is Secure | @CloudExpo #Cloud #SDN

Software-Defined Networking (SDN) is one of the most interesting developments in networking to emerge in the last decade. The ability to establish a simplified infrastructure and leverage software to dynamically modify existing flow characteristics has the potential to address many concerns around hardware costs, faster service provisioning, and greater configuration control across diverse networks. However, concerns about security – or a lack of information about it – remain a key inhibitor to SDN adoption in today’s rapidly evolving data centers and connected wide area network environments.


Google joins OpenStack to build bridges between public and private clouds

Google has joined the OpenStack Foundation, a big sign of support for the open source software organisation

Google has officially signed up to sponsor the OpenStack Foundation, the first of the big three – Google, Microsoft and AWS – to formally throw its weight behind the open source cloud orchestration software. Analysts believe the move will improve support for Linux containers across public and private cloud environments.

Google has already set to work with pure-play OpenStack software vendor Mirantis on integrating Kubernetes with OpenStack, a move the company said would help bolster its hybrid cloud capabilities.

While the company has had some engineers partnering with the Foundation on Magnum and Murano, container-focused toolsets baked into the open source platform, Google said it plans to significantly bolster the engineering resource it devotes to getting Linux containers – and particularly its open source scheduling and deployment platform Kubernetes – integrated with OpenStack.

The formal sign of support from such a big incumbent in the cloud space is a big win for OpenStack.

“We are excited about becoming active participants in the OpenStack community,” said Craig McLuckie, product manager at Google. “We look forward to sharing what we’ve learned and hearing how OpenStack users are thinking about containers and other technologies to support cloud-native apps.”

Mark Collier, chief operating officer of the OpenStack Foundation said: “OpenStack is a platform that frees users to run proven technologies like VMs as well as new technologies like containers. With Google committing unequaled container and container management engineering expertise to our community, the deployment of containers via proven orchestration engines like Kubernetes will accelerate rapidly.”

Although Google has a long history of open sourcing some of the tools it uses to stand up its own cloud and digital services like search, it hasn’t always participated in open source forums per se.

In a sense, Kubernetes marked a departure from Google’s previous trajectory, and as Ovum’s lead software analyst Laurent Lachal explained to BCN, the company seems to be focusing on containers as a means of building a bridge between private and public clouds.

“Google knows that it needs to play nice with cloud platforms like OpenStack and VMware, two platforms that are primarily private cloud-centric, if it wants to get workloads onto its public cloud,” he explained.

“Joining OpenStack is exactly that – a means to building a bridge between private and public clouds, and supporting containers within the context of OpenStack may be both a means of doing that and generating consensus around how best to support containers in OpenStack, something that could also work in its favour.”

“There’s also a big need for that kind of consensus. Currently, everyone wants to join the containers initiatives in the open source project but there isn’t much backing for one particular way of delivering the container-related features users need,” he added.

[slides] Storage for Docker Containers By @OnModulus | @DevOpsSummit #DevOps #Docker #Containers #Microservices

Learn how to solve the problem of keeping files in sync between multiple Docker containers.
In his session at the 16th Cloud Expo, Aaron Brongersma, Senior Infrastructure Engineer at Modulus, discussed using rsync, GlusterFS, EBS and BitTorrent Sync. He broke down the tools that are needed to help create a seamless user experience.
In the end, can we have an environment where we can easily move Docker containers, servers, and volumes without impacting our applications? He shared his results so you can decide for yourself.
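For the single-host case, Docker’s own volume mechanisms already cover the basics; the sketch below (container names are made up for illustration) shares one data volume between two containers, while tools like rsync, GlusterFS, EBS and BitTorrent Sync come into play once that data has to stay in sync across multiple hosts.

    # Create a container whose only job is to own a data volume at /data
    docker run -d --name datastore -v /data busybox tail -f /dev/null

    # Two application containers mount that same volume via --volumes-from,
    # so files written by one are immediately visible to the other
    docker run -d --name app1 --volumes-from datastore nginx
    docker run -d --name app2 --volumes-from datastore nginx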


Log-level Usage Reports By @TrevParsons | @DevOpsSummit #DevOps #Docker #Microservices

Log data provides the most granular view into what is happening across your systems, applications, and end users. Logs can show you where the issues are in real-time, and provide a historical trending view over time. Logs give you the whole picture.
Separating the signal from the noise is one of the biggest challenges when dealing with machine-generated log data today, and doing so has generally required deep technical expertise. However, once you find that signal, it can be massively useful and can help you make business decisions with a big impact. Today Logentries is announcing log-level usage reports, which is one more way we are striving to do the hard work so you don’t have to!


After support expires, many organisations still cling to Windows Server 2003


Figures from the Cloud Industry Forum (CIF) reveal that more than half (51%) of organisations polled with between 21 and 200 employees, and almost three quarters (72%) of firms with more than 200 employees, are still reliant on Windows Server 2003, despite Microsoft’s support having expired on July 14.

Naturally, this may not be the most surprising news you will read today. Back in April, this publication spoke with hybrid IT provider Zynstra and cloud storage provider CTERA Networks and concluded many organisations were leaving it until the last minute to finalise data migration plans. As Nick East, Zynstra CEO, argued: “The most expensive upgrade strategy is the one that you do in crisis – you’ve done it after the event, so you really don’t have any options.”

Small businesses have been steadily migrating away from WS2003, the CIF argues, and the key factor in larger organisations’ reluctance to move is complexity. Jon Seddon, head of product at Outsourcery, a founder member of the CIF, said: “When we consider how integral Windows Server 2003 has been to businesses’ IT for the past decade, and the layers that have built up on the operating system during that time, the task of moving away from it can be a daunting one.”

Seddon echoed East’s ‘don’t bury your head in the sand’ ethos when he added: “It’s somewhat understandable that the proportion of larger organisations still using Windows Server 2003 hasn’t shown much movement in the run up to the end of support deadline. But doing nothing is clearly not an option, and those still using the operating system past [the] deadline face significant risks to the security of their data, their productivity and the ability to remain competitive.”

The research, which polled 250 UK IT decision makers, found that a quarter of organisations with fewer than 20 employees have upgraded from WS2003 in the past year. Some 44% of those organisations claim to still use the outdated server operating system in some capacity, down from 58% in 2014.

In May, the CIF predicted that adoption of cloud services would increase as more firms moved away from WS2003, arguing that by early 2016, 86% of UK-based firms would formally use at least one cloud service. Back then, 58% of companies polled overall by the advisory board were using the server, down only 2% from the year before.

For cloud service providers, the CIF offers a word of caution, noting that they need to show businesses they can be trusted with the most intimate of corporate data. East argued similarly, saying the software industry should “do a better job” of making the software lifecycle more transparent for companies. Organisations do not need to migrate data away from WS2003 in one hit, he added; one of his customers took a “divide and conquer” approach, keeping one particularly tricky application running on Server 2003 in a virtualised machine while the rest was migrated.