<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=278116885016877&amp;ev=PageView&amp;noscript=1">

, , , ,

Dec 22, 2020 | 3 Minute Read

Managing Docker Containers Using Ansible

Introduction

Containers have taken over DevOps processes for good. These optimized, highly efficient standard units of software empower developers to package, ship, and run their applications quickly and reliably.

Containers work by isolating software from its environment, ensuring standardization by separating application dependencies from the underlying infrastructure. However, a large number of containers quickly becomes hard to keep track of and needs active management. Let's dig into the core of these two tools and learn how to use them together.

Docker and Ansible

Docker is a platform for packaging software into containers: standardized, executable bundles that include the application along with its own libraries and configuration files. Containers are isolated from one another and communicate through well-defined channels.

Ansible, an automation tool from Red Hat, aims to simplify tasks like configuration management, application deployment, and intra-service orchestration.

With its simple architecture, Ansible provides a robust set of features for writing automation scripts: you describe tasks in YAML, in files called playbooks, and execute them against the environments of your choice. Its many built-in modules and plugins make it easy to perform complex tasks with simple YAML code.
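For example, a minimal playbook that uses the built-in ping module to check connectivity looks like this (the host group name below is illustrative):

```yaml
# ping.yml: a minimal illustrative playbook
- name: Check connectivity to the web servers
  hosts: webservers          # group defined in your inventory
  gather_facts: false
  tasks:
    - name: Ping every host in the group
      ansible.builtin.ping:
```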

Why automate Docker containers with Ansible?

Automating Docker containers in your environment with Ansible leads to:

 

Flexibility

Building a container with an Ansible playbook gives you the flexibility to reproduce your environment not only as a Docker image but also in other formats, such as Vagrant boxes, on a cloud instance of your choice or on bare-metal servers.

 

Auditability

Using Ansible gives you a simple, repeatable, defined state for your containers that can easily be audited for potential vulnerabilities, and it records who deployed which container, with what code, where, and when.

 

Ubiquity

Ansible deploys not just the containers but also the non-containerized components around them, such as storage, databases, and networking. Ansible therefore covers the full environment, from local machines to entire cloud infrastructures.

What you need

To make this combination work on your system, you'll need the following before you begin:

  • Docker installed on the local system
  • Ansible 2.4 or higher installed and running on your local machine
  • Passwordless SSH connection between your local machine and remote server
  • A user with sudo privileges

Note: Using the Docker modules requires the Docker SDK for Python on the host running Ansible. You will need version 1.7.0 or higher of the SDK installed, for either Python 2.7 or Python 3.
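If the SDK is missing, it can usually be installed with pip (assuming pip points at the Python interpreter that Ansible uses):

pip install docker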

Process

Once you have all the prerequisites set up on your system, you can begin running playbooks. We are following a specific directory structure to manage our Dockerfiles, playbooks, inventory, and so on. You can organize your code in any structure that suits you, but it is recommended to follow Ansible best practices.
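For reference, the layout used in this example looks roughly like this; only the files shown later in the post are taken from the example, and everything else is indicative:

```
.
├── nginx/
│   └── Dockerfile            # builds the NGINX image
├── inventory/
│   ├── dev/
│   │   └── host.yml          # dev hosts and dev-specific variables
│   ├── stage/
│   └── prod/
└── playbook.yml              # builds the image and deploys the containers
```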

When it comes to managing different environments like dev, staging, and production, it's good to differentiate environment-specific properties within your inventory files; for example, you might use different ports for development, staging, and production.

So rather than writing a separate Docker Compose file for each environment, you can reuse the same code, whether a playbook or a role, across every environment.

Run the Ansible Playbook

We have a sample playbook and Dockerfile for creating an NGINX image and pushing it to Docker Hub. The playbook then creates containers to deploy a sample Drupal application on a remote server, driven by Ansible from our local system.

./nginx/Dockerfile
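A minimal sketch of such a Dockerfile, assuming the stock nginx base image and an illustrative site configuration, could be:

```dockerfile
# ./nginx/Dockerfile: minimal sketch of a custom NGINX image
FROM nginx:latest

# Copy an illustrative server configuration and static content
COPY default.conf /etc/nginx/conf.d/default.conf
COPY html/ /usr/share/nginx/html/

EXPOSE 80
```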

playbook.yml
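A condensed sketch of what the playbook might contain is shown below; apart from the mysql_root_pass variable described later, the task details, image names, and variable names are assumptions, and the docker_image/docker_container modules ship with the community.docker collection on newer Ansible releases:

```yaml
# playbook.yml: condensed sketch of the build-and-deploy play
- name: Build the NGINX image and deploy the Drupal stack
  hosts: dev
  become: true

  tasks:
    - name: Install Docker and its dependencies (repository setup omitted)
      package:
        name: docker-ce
        state: present

    - name: Build the NGINX image and push it to Docker Hub
      docker_image:
        name: "{{ docker_hub_user }}/nginx-custom"
        build:
          path: ./nginx
        source: build
        push: true

    - name: Create the MySQL container
      docker_container:
        name: mysql
        image: mysql:5.7
        env:
          MYSQL_ROOT_PASSWORD: "{{ mysql_root_pass }}"

    - name: Create the Drupal container
      docker_container:
        name: drupal
        image: drupal:latest
        ports:
          - "{{ drupal_port }}:80"
        links:
          - mysql
```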


./inventory/dev/host.yml
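A sketch of what this inventory file could contain, with placeholder host details and a truncated vault payload, is:

```yaml
# ./inventory/dev/host.yml: sketch of the dev inventory
all:
  children:
    dev:
      hosts:
        dev-server:
          ansible_host: 192.0.2.10       # placeholder address
          ansible_user: deploy           # placeholder remote user
      vars:
        drupal_port: 8080                # dev-specific port
        docker_hub_user: example         # placeholder Docker Hub account
        mysql_root_pass: !vault |
          $ANSIBLE_VAULT;1.1;AES256
          6231336561323437...
```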

 

This is a very simple Dockerfile to build an image for an NGINX web server. Under ./inventory, we have inventory files for all environments, such as dev, stage, and prod. In the ./inventory/dev/host.yml inventory file, we have our host defined along with some variables specific to the dev environment, since these variables can take different values in other environments. Also note that the mysql_root_pass variable is encrypted using Ansible Vault.

Now, to run our playbook on our dev server, we can use this command:

ansible-playbook -i inventory/dev playbook.yml --ask-vault-pass

You will be prompted for the vault password you used to encrypt your secret. This playbook will install the Docker package and its dependencies on the dev server and will create containers to set up your Drupal application along with the MySQL database.
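For reference, a value like mysql_root_pass can be encrypted ahead of time with ansible-vault; the plaintext below is just an example:

ansible-vault encrypt_string 'S3cretPass' --name 'mysql_root_pass'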

This was a simple playbook to build an image and create some Docker containers. You can also orchestrate and manage more complex containerized environments with Ansible, such as Docker Swarm based setups. The same code can be reused to duplicate a similar setup in any other environment; you just need to make sure the required values are updated in the environment-specific inventory file.
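For example, assuming a matching inventory exists under ./inventory/prod, the same playbook can target production with:

ansible-playbook -i inventory/prod playbook.yml --ask-vault-pass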

Docker has emerged as a great business tool for the DevOps industry; here's a beginner's guide to Docker and Kubernetes if you want to dig deeper into the technology.

 
About the Author

Jayati Kataria, Axelerant Alumni

