4 Jan 2025
10 min read

What is Docker: Docker basics

Docker: Simplifying tech integration by bridging development, deployment, and scaling across all platforms.
Sanjay Senthilkumar
Founder & CEO

Introduction to Docker

Docker is a platform that enables the encapsulation of software applications within units known as containers. These containers can be visualized as packages that encompass all the necessary components an application requires to execute seamlessly in any environment.

This concept differs from virtual machines, which emulate an entire computer system. While virtual machines replicate hardware, containers share the host operating system's kernel, resulting in more efficient and faster execution of applications.

Docker is composed of three primary elements:

  • The Dockerfile: This is a script that outlines the steps to create an image.
  • The Image: This is a static snapshot of the application along with its dependencies.
  • The Container: This is the active instance of the application.

Docker’s versatility allows developers to build, deploy, and manage applications across various platforms, including personal computers, web-based platforms, or distributed systems. This flexibility enhances productivity and ensures consistent performance across different environments.
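To make this flow concrete, here is a minimal shell sketch, assuming a Dockerfile already exists in the current directory (the image name hello-app is a placeholder):

# Build an image from the Dockerfile in the current directory
docker build -t hello-app .

# Start a container from that image; --rm removes it when it exits
docker run --rm hello-app

The Dockerfile drives the build, the build produces the image, and docker run turns the image into a running container.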


Docker Basics

Docker terminology

Dockerfile: A text file that contains instructions for building a Docker image.

Image: A read-only template with instructions for creating a Docker container. It’s a snapshot of a container.

Container: A runnable instance of an image. You can create, start, stop, move, or delete a container using the Docker API or CLI.

Docker Hub: A cloud-based registry service where you can link to code repositories, build your images and test them, store manually pushed images, and link to Docker Cloud.

Docker Compose: A tool for defining and running multi-container Docker applications. With Compose, you use a YAML file to configure your application’s services.

Docker Swarm: A native clustering and scheduling tool for Docker. Swarm allows IT administrators and developers to create and manage a virtual system of Docker nodes and schedule containers.

Volume: A specially designated directory within one or more containers that bypasses the Union File System to provide several useful features for persistent or shared data.

Service: In a distributed application, different pieces of the app are called “services.” For example, if you imagine a video sharing site, it probably includes a service for storing application data in a database, a service for video transcoding in the background after a user uploads something, a service for the front-end, and so on.

Node: In the context of Docker, a node is a physical or virtual machine that is running Docker Engine.

Stack: A stack is a group of interrelated services that share dependencies, and can be orchestrated and scaled together. A single stack is capable of defining and coordinating the functionality of an entire application (though very complex applications may want to use multiple stacks).
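Several of these terms come together in everyday commands. As a brief sketch of working with volumes, for example (the volume name app-data and the mount path are placeholders):

# Create a named volume
docker volume create app-data

# Mount it into a container so data survives container removal
docker run -d --name web -v app-data:/var/lib/data nginx

# Show the volume's metadata
docker volume inspect app-data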

Understanding Dockerfile

A Dockerfile is a text-based document filled with a sequence of instructions to construct a Docker image. It acts as the architectural plan for creating Docker images and is vital to Docker’s functionality.

The Dockerfile uses a small, specialized instruction set: it declares the base image to start from, any supplementary software to install, and any configuration changes to apply. When the docker build command is run, Docker reads the Dockerfile and executes its instructions in order, producing an image.
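For instance, a build might look like this; the tag my-image is a placeholder, and -f is only needed when the Dockerfile is not at the default location:

# Build and tag an image from the Dockerfile in the current directory
docker build -t my-image:latest .

# Point at a Dockerfile stored elsewhere
docker build -t my-image:latest -f docker/Dockerfile .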

Understanding Docker Image

A Docker image serves as a read-only blueprint: a set of instructions and filesystem layers from which containers are instantiated on the Docker platform. Images offer an efficient way to package applications together with pre-configured server environments, and they can be kept private or shared publicly with other Docker users. Understanding images is the natural first step for Docker novices.
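Once images exist locally, the Docker CLI can list and inspect them. A short sketch (my-image is a placeholder name):

# List images available on this machine
docker images

# Show the layer-by-layer history of an image
docker history my-image:latest

# Print detailed image metadata as JSON
docker image inspect my-image:latest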


Using Docker Hub

Docker Hub is a cloud-based repository service provided by Docker for finding and sharing container images with your team and the Docker community. It’s where you can push or pull Docker images, making it a centralized resource for Docker images.

Here’s how you can use Docker Hub:

  1. Create a Docker Hub Account: To start using Docker Hub, you need to create an account on the Docker Hub website.
  2. Search for Docker Images: Once you have an account, you can search for Docker images that have been uploaded by other users. You can find images for different operating systems, databases, or applications.
  3. Pull Docker Images: After finding an image you want to use, you can pull it to your local machine using the docker pull command followed by the name of the image.
  4. Push Docker Images: If you’ve created your own Docker image, you can push it to Docker Hub. This allows other users to pull and use your image. To push an image, use the docker push command followed by the name of the image.
  5. Manage Docker Images: Docker Hub also allows you to manage your Docker images. You can create repositories to organize your images and control who has access to them.

Docker Hub is not just a repository for images but also a place where you can build, store, and distribute your images. It’s a fundamental tool for anyone working with Docker and containerization technology.
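A typical Docker Hub round trip from the command line looks roughly like this, where your-username and my-app are placeholders for a real account and repository:

# Authenticate against Docker Hub
docker login

# Pull a public image
docker pull ubuntu

# Tag a locally built image for your repository
docker tag my-app your-username/my-app:1.0

# Push the tagged image to Docker Hub
docker push your-username/my-app:1.0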


Writing a basic Dockerfile

A Dockerfile is a script composed of text-based instructions, which serves as a blueprint for creating a Docker image. Here’s an example of a Dockerfile for a Node.js application:

# Use the official Node.js 14 image from Docker Hub as the base image
FROM node:14

# Set the working directory in the Docker image
WORKDIR /usr/src/app

# Copy package.json and package-lock.json to the working directory
COPY package*.json ./

# Install the application dependencies using npm
RUN npm install

# Copy the rest of the application source code to the working directory
COPY . .

# Expose port 8080 to have it mapped by the Docker daemon
EXPOSE 8080

# Define the command to run the application
CMD ["npm", "start"]

Here’s a breakdown of the Dockerfile:

  • FROM node:14: This instruction sets the base image for the Docker image. In this case, it’s using the official Node.js 14 image from Docker Hub.
  • WORKDIR /usr/src/app: This instruction sets the working directory inside the Docker image. All subsequent instructions operate within this directory.
  • COPY package*.json ./: This instruction copies the package.json and package-lock.json files from your project to the Docker image.
  • RUN npm install: This instruction installs the dependencies defined in the package.json file.
  • COPY . .: This instruction copies the remaining project files into the Docker image.
  • EXPOSE 8080: This instruction informs Docker that the container listens on the specified network port at runtime.
  • CMD ["npm", "start"]: This instruction provides defaults for executing the container. In this case, it’s starting the application.

This Dockerfile is a basic example.
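To try it out, you would build and run the image roughly as follows (the tag node-app and the host-side port are example choices):

# Build the image from this Dockerfile
docker build -t node-app .

# Run it, mapping container port 8080 to port 8080 on the host
docker run -p 8080:8080 node-app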

Building a Docker Image from a Dockerfile

Create a Dockerfile

A Dockerfile is a text-based document that houses a sequence of instructions Docker utilizes to construct an image. It essentially serves as the blueprint for your application.

Define the Base Image

FROM node:14

Within the Dockerfile, you’ll need to specify the base image you’ll be working from using the FROM command. This could be a language-specific image like node:14 or python:3.7, or an OS-specific image like ubuntu:18.04.

Specify Dependencies

RUN npm install

Next, you’ll employ the RUN command to install any dependencies your application requires. For a Node.js application, this might be executed with npm install.

Copy Application Files

COPY . /app

The COPY command is utilized to transfer files from your local filesystem into the image. For instance, COPY . /app would copy all the files in your current directory to the /app directory in the image.

Define the Command

CMD ["node", "app.js"]

Finally, the CMD command is employed to specify what should be executed when a container is initiated from the image. For a Node.js application, this might be CMD ["node", "app.js"].

Build the Image

docker build -t my-app .

With the Dockerfile in place, you can now construct the image. Navigate to the directory containing the Dockerfile and execute docker build -t my-app . (the trailing dot sets the build context to the current directory). This instructs Docker to build an image using the Dockerfile in the current directory, and to tag the image as my-app.
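Once the build finishes, you can confirm the image exists and start a container from it (the container name my-app-test is a placeholder):

# Verify the new image appears in the local image list
docker images my-app

# Start a container from the freshly built image
docker run -d --name my-app-test my-app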

This build process is fundamental to working with Docker.

Running a Docker container

Obtain the Docker Image: The initial step in executing a Docker container is to confirm you possess the appropriate Docker image. If the image isn’t locally available, you can retrieve it from Docker Hub or another Docker image registry using the command:

docker pull <image_name>

For instance, to obtain an Ubuntu image, you would use:

docker pull ubuntu

Initiate the Docker Container: Once the image is accessible, you can initiate the container. This is achieved using the command:

docker run -it -d --name <container_name> <image_name>

This command generates and starts a new container in the background. For example, to run the Ubuntu image in a container named ‘my_container’, you would use:

docker run -it -d --name my_container ubuntu

Engage with the Active Container: If you need to engage with the active container, you can utilize the docker exec command. The command:

docker exec -it <container_ID_or_name> /bin/bash

will launch a bash shell within the container that you can interact with. For instance, to engage with ‘my_container’, you would use:

docker exec -it my_container /bin/bash

Remember to substitute <image_name>, <container_name>, and <container_ID_or_name> with your specific values.

This procedure enables you to run any application encapsulated in a Docker container, ensuring uniformity across various environments.
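A few more lifecycle commands round out day-to-day container work, using the my_container example from above:

# List running containers (add -a to include stopped ones)
docker ps

# View a container's output
docker logs my_container

# Stop, then remove, the container
docker stop my_container
docker rm my_container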

Docker Compose

Understanding Docker Compose

Docker Compose is a tool engineered to simplify the orchestration of multi-container Docker applications. It employs a YAML file to configure the application’s services, enabling developers to initiate all services with a single command.

The primary role of Docker Compose is to streamline the management of multiple Docker containers. This proves particularly advantageous for applications composed of multiple microservices, as it facilitates each service to operate in isolation, within its own container.

By leveraging Docker Compose, developers can define an entire multi-container application within a single file, and then launch the application with a single command. This not only simplifies the development process but also enhances the portability of the application and eases its deployment.


Writing a docker-compose.yml file

A docker-compose.yml file is a YAML file that defines the services, networks, and volumes for a Docker application. Here’s an example of a docker-compose.yml file for a Node.js application:

version: '3'
services:
  app:
    build: .
    volumes:
      - .:/usr/src/app
    ports:
      - '3000:3000'
    command: npm start
    depends_on:
      - db
  db:
    image: mongo
    volumes:
      - mongodb_data_container:/data/db
volumes:
  mongodb_data_container:

Here’s a breakdown of the docker-compose.yml file:

  • version: '3': This line specifies the version of the Docker Compose file format.
  • services:: This section describes the services to run.
  • app: and db:: These are the names of the services.
  • build: .: This line tells Docker to build the Docker image for the app service using the Dockerfile in the current directory.
  • volumes:: This line specifies which directories to mount for persistent data storage.
  • .:/usr/src/app: This line mounts the current directory (where your Node.js application resides) to /usr/src/app in the container.
  • ports:: This line specifies which ports to expose. In this case, it’s mapping port 3000 inside the Docker container to port 3000 on the host machine.
  • command: npm start: This line is the command that starts your Node.js application.
  • depends_on:: This line specifies that the app service depends on the db service.
  • image: mongo: This line specifies that the db service should use the mongo image from Docker Hub.
  • mongodb_data_container:/data/db: This line creates a named volume for MongoDB data persistence.
  • volumes: at the root level is used to declare named volumes.

This docker-compose.yml file is a basic example.
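Starting this stack takes a single command, run from the directory containing the file (newer Docker versions also accept docker compose without the hyphen):

# Build the app image if needed and start both services;
# logs from both containers stream to the terminal
docker-compose up --build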

Running Multi-Container Applications

With Docker Compose

Define the Application

Start by outlining your application’s services in a docker-compose.yml file. Each service symbolizes a container. You’ll need to specify the image to use, the ports to expose, any volumes to mount, and other configuration specifics for each service.

Build the Services

Once your docker-compose.yml file is set up, navigate to the directory containing the file and use the docker-compose build command to construct all the services. This command interprets the docker-compose.yml file and builds Docker images for each service.

Run the Application

After building the services, use the docker-compose up command to initiate the application. This command starts all the services as defined in the docker-compose.yml file.

Interact with the Application

With the application running, you can interact with it just as you would if it were running outside of Docker. If you’ve exposed any ports in your docker-compose.yml file, you can access the services via these ports.

Stop the Application

When you’re done, use the docker-compose down command to stop and remove all the services.

Running multi-container applications is a common scenario in the development and deployment of modern web services. Docker Compose simplifies this process by allowing the definition of multi-container applications using a YAML file.
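Put together, the steps above map to a short command sequence; this sketch assumes the docker-compose.yml from the previous section, where the service is named app:

# Build images for all services
docker-compose build

# Start the application in the background
docker-compose up -d

# Follow the logs of the app service
docker-compose logs -f app

# Open a shell inside the running app container
docker-compose exec app /bin/bash

# Stop and remove all services
docker-compose down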

Advanced Topics in Docker

Docker is a powerful platform with a wide range of applications. Here are some advanced topics that can enhance your understanding and usage of Docker:

  1. Docker Networking: Docker allows for custom networks to be created, providing better control over how containers communicate with each other. This feature is crucial for managing the communication paths between different services (a brief sketch follows this list).
  2. Data Management in Docker: Understanding volumes and bind mounts in Docker is essential for persistent data storage and sharing data among containers. This knowledge is key to managing data within and across Docker containers.
  3. Docker Compose: Docker Compose is a tool for defining and running multi-container Docker applications. It uses a YAML file to configure the application’s services, simplifying the process of managing multi-container applications.
  4. Docker Swarm: Docker Swarm is a native clustering and scheduling tool for Docker. It allows IT administrators and developers to create and manage a virtual system of Docker nodes and schedule containers, making it easier to manage large-scale Docker deployments.
  5. Docker Secrets: Docker Secrets is a secrets management tool specifically designed for Docker Swarm. It allows you to securely store and manage sensitive information, enhancing the security of your Docker applications.
  6. Optimizing Docker Images: Techniques for reducing Docker image sizes, such as multi-stage builds, can lead to more efficient deployment and faster startup times. This is important for optimizing the performance of your Docker applications.
  7. Continuous Integration/Continuous Deployment (CI/CD) with Docker: Docker can be integrated with popular CI/CD tools, which helps automate the testing and deployment of your applications. This integration is key to implementing modern DevOps practices.
  8. Security in Docker: Understanding the security features of Docker, such as user namespaces, seccomp profiles, and capabilities, can help you run containers securely. This knowledge is crucial for maintaining the security of your Docker applications.

These topics delve into the more complex aspects of Docker and can help you leverage Docker’s full potential in your projects. As you continue to explore Docker, these advanced topics will provide deeper insights and open up new possibilities for using Docker.
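As a taste of the first topic, here is a hedged sketch of a user-defined network; the network and container names are placeholders, and nginx and mongo stand in for real services:

# Create a user-defined bridge network
docker network create backend

# Attach two containers to it; user-defined networks provide DNS,
# so each container can reach the other by its name
docker run -d --name api --network backend nginx
docker run -d --name db --network backend mongo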


Further Resources for Learning

Here are some concise recommendations for Docker learning resources:

  • “Docker: Up & Running” by Karl Matthias and Sean P. Kane
  • “The Docker Book” by James Turnbull
  • “Docker in Action” by Jeff Nickoloff and Stephen Kuenzli
  • “Docker in Practice” by Ian Miell and Aidan Hobson Sayers
  • “Using Docker” by Adrian Mouat
  • “Mastering Docker” by Russ McKendrick and Scott Gallagher

These books cover a range of topics from Docker basics to advanced use cases. Happy reading! 😊
