Docker in DevOps Ecosystem
August 2018
Short introduction to DevOps and Continuous Delivery
Before I begin on this topic, allow me to brief you about DevOps and Continuous Delivery, which have been widely adopted by many organizations to overcome some of the challenges faced while delivering products.
“DevOps is a methodology widely used in software development to improve the communication, cooperation, collaboration, integration, and automation between developers and the operations team, thereby maximizing efficiency and predictability”
Gone are the days when developers and operations (and IT) teams worked in environments insulated from each other. It is time for organizations to embrace new processes and methodologies that bring individual teams together and adapt to a changing IT world that is moving towards automation in almost everything.
DevOps was conceptualized at Flickr®, where the team needed to make some 10 deployments per day to meet highly demanding business requirements. DevOps involves various stakeholders following defined processes as part of Continuous Delivery (CD), a software engineering practice that depends heavily on Continuous Integration (CI) and Continuous Testing (CT), and that helps teams produce and deliver valuable, reliable software in short cycles.
Now, you must be wondering why I briefed you about DevOps and then moved on to Continuous Delivery before getting to Docker. Well, note that to put any process or engineering practice into effect in software development, one needs a strong set of tools. Yes, you are absolutely right - Docker is a tool that complements DevOps by automating the configuration and deployment aspects.
Organizations should embrace CD in their delivery approach to stay competitive in the market
“However, it is easier said than done.” Organizations go through a lot of pain points, including (but not limited to):
Maintaining seamless communication between development and operations teams as the velocity of deployments increases
Shipping releases at the pace expected in today’s demanding world
Developers who do not understand the complexity of the production environment, and operations teams who sometimes do not understand the internals of the software that is shipped to them
Evaluating tools and fitting them into the processes adopted in the deployment pipeline
Many more…
So what is Docker?
Docker is an open source platform (licensed under the Apache 2.0 license) that allows developers to package an application with all of its dependencies (including libraries and binaries) into a standardized unit for software development.
Docker lets developers ship their code in a self-contained runtime environment, so their applications can easily move between environments.
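As a minimal sketch of what such a package looks like (the application, its files, and the base image here are all hypothetical), a Dockerfile declares the application and its dependencies in one place:

    # Dockerfile for a hypothetical Python web application
    FROM python:3.6
    WORKDIR /app
    # install dependencies first so Docker can cache this layer between builds
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    # copy the application code and declare how the container starts
    COPY . .
    CMD ["python", "app.py"]

Building this file produces an image that runs the same way on any host with Docker installed.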
Yes, it is natural to compare Docker containers with virtual machines. Here is the real difference:
Docker offers virtualization at the operating system level: containers run as user-space processes on top of the host OS kernel, which makes them lightweight and fast, whereas each virtual machine boots a full guest operating system on a hypervisor.
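A quick way to see this kernel sharing in action (assuming Docker is installed and can pull the public alpine image from Docker Hub):

    uname -r                          # kernel version reported by the host
    docker run --rm alpine uname -r   # the same kernel version inside a container

Both commands print the same kernel release, because the container has no kernel of its own.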
Docker, with its lightweight container virtualization platform combined with workflows and tooling, can help you manage and deploy applications. This same lightweight nature of containerization pays off in several ways (a minimal workflow is sketched after this list):
Packaging applications and their dependencies into Docker containers
Distributing and shipping those containers to teams for further development and testing
Deploying those applications into production environments, whether on premises or in the cloud
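Each of these steps maps to a Docker CLI command. A minimal build-ship-run workflow might look like this (the image name, registry, and port are hypothetical):

    docker build -t myorg/myapp:1.0 .            # package the app and its dependencies into an image
    docker push myorg/myapp:1.0                  # ship the image to a registry such as Docker Hub
    docker run -d -p 8080:8080 myorg/myapp:1.0   # run it on any Docker host, on premises or in the cloud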
Docker Architecture
[Figure: Docker architecture. Source: https://docs.docker.com/get-started/overview/]
Docker has a client-server architecture with the following components (a short example of how they interact follows the list):
Docker Daemon - It runs on the host machine. Users do not interact with the daemon directly, but through the Docker client.
Docker Client - It is the primary user interface to Docker, in the form of the docker binary. It accepts commands from the user and communicates back and forth with the Docker daemon.
Docker Engine - This comprises further components:
Docker images - A Docker image is a read-only template. Images are used to create Docker containers. Docker provides a simple way to build new images or update existing images, or you can download Docker images that other people have already created. Docker images are the build component of Docker.
Docker registries - Docker registries hold images. These are public or private stores from which you upload or download images. The public Docker registry is called Docker Hub, and it provides a huge collection of existing images for your use. Docker registries are the distribution component of Docker.
Docker containers - Docker containers are similar to a directory. A Docker container holds everything that is needed for an application to run. Each container is created from a Docker image. Docker containers can be run, started, stopped, moved, and deleted. Each container is an isolated and secure application platform. Docker containers are the run component of Docker.
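To see these pieces working together (hello-world is a real public image on Docker Hub):

    docker pull hello-world   # the client asks the daemon to fetch an image from the registry
    docker images             # list the images held locally (the build component)
    docker run hello-world    # the daemon creates and runs a container from the image (the run component)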
Docker has two major components:
Docker - Open source container virtualization platform
Docker Hub - Software-as-a-Service platform for sharing and managing Docker images
There is a dedicated community and wide adoption across the IT industry, which strongly supports the standardization of some of the elements surrounding development and deployment using Docker. It is also supported by cloud platforms including Amazon Web Services®, Google Cloud®, Microsoft Azure®, and Red Hat®, which are now working to create open container standards.
“This standardization should ideally eliminate the vendor lock-in that has been seen as a major burden when embracing any cloud platform”
Why does Docker matter for DevOps?
Docker can be used to solve some real-world organizational problems, enabling organizations to ship better software. Let’s take a couple of use cases that are quite common in any software development effort.
Evolving Application Architecture
Gone are the days when applications used to be simple and lightweight, with one application per server and a database instance running either locally or remotely. Application architecture has evolved from standalone to distributed: applications are horizontally scaled across multiple instances and rely on distributed database instances, polyglot databases, third-party web services, identity management, and more. With this growing complexity, it has become best practice to build complex distributed applications from independent microservices, gradually moving away from the monolith. This has invariably increased the complexity of dependencies and created unique deployment scenarios, which in turn has made operations management even more difficult.
“Docker makes it easy to build, deploy, and manage applications and their dependencies, and makes them portable across platforms, enabling them to run on premises as well as in the cloud”
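As a sketch of what this looks like in practice (the services, images, and ports here are hypothetical), a Docker Compose file can describe a small set of microservices and their dependencies in one place:

    # docker-compose.yml for a hypothetical two-service application
    version: "3"
    services:
      web:
        build: .              # built from the application's own Dockerfile
        ports:
          - "8080:8080"
        depends_on:
          - db
      db:
        image: postgres:10    # official PostgreSQL image from Docker Hub
        environment:
          POSTGRES_PASSWORD: example

Running docker-compose up then builds and starts both services together, with the dependency graph handled by the tool rather than by hand.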
Provisioning Development Environments
Providing dedicated environments to dev and QA teams has been a nightmare for operations teams, who must manage infrastructure overheads across production and non-production environments alike. Several interim approaches were tried, from shared environments to virtualization, but each brought additional complexity. Cloud computing then made it easy to host a development environment: with cheap on-demand resources available, running DEV and QA environments in the cloud became the de facto standard for organizations.
“Docker containers share the same kernel and core operating system files, which makes them lightweight and extremely fast. Using Docker to manage containers makes it easier to build distributed systems by allowing applications to run on a single machine or across many virtual machines with ease”
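Containers push this further: a whole development environment can be a single disposable command. As a sketch (node:8 is an official Node.js image on Docker Hub; the paths and port are examples):

    # mount the current project into a container that has the toolchain,
    # expose the app's dev port, and drop into a shell
    docker run -it --rm -v "$PWD":/src -w /src -p 3000:3000 node:8 bash

When the shell exits, the container is removed, and the host stays clean.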
Conclusion
Docker is a tool that plays an important role in the DevOps ecosystem by simplifying and automating the infrastructure dependencies that applications and services need. This enables organizations to achieve Continuous Delivery and Continuous Deployment, leading to short release cycles, faster time to market, and a competitive edge in a fast-evolving world. It also eliminates much of the operational complexity involved, resulting in overall cost savings for an organization.