Docker is the most popular containerization software, but not everyone uses it efficiently. If you don’t follow Docker best practices, you may leave your apps vulnerable to security issues or performance problems.

Here are some best practices you can adopt to use Docker features effectively. These measures improve security and help you create maintainable Dockerfiles.


1. Use Official Docker Images

When containerizing your application, you must use a Docker image. You can build an image with custom configuration or use Docker’s official images.

Building your own images requires you to handle all the configuration yourself. For example, to build an image for a Node.js application, you must download Node.js and its dependencies. The process is time-consuming and may not result in the correct configuration.


Docker recommends you use an official Node.js image, which comes with all the correct dependencies. Official images have better security measures, are lightweight, and are tested in various environments. You can find them on Docker's official images page.

2. Use Specific Versions of Docker Image

Usually, when you pull an official image without specifying a version, you get the one with the latest tag, which represents the most recently updated version of that image. Every time you build a container from that image, you may get a different version than you got in the previous build.

Building with different Docker image versions can cause unpredictable behavior in your application. The versions may clash with other dependencies and eventually cause your app to fail.


Docker recommends that you pull and build using images of a specific version. Official images also have documentation and cover the most common use cases.

For example, instead of docker pull alpine, use docker pull alpine:3.18.3. Docker will pull that specific version, which you can then use in successive builds, reducing errors in your application. You can find the available versions of an image on its official Docker page, under Supported tags and respective Dockerfile links.
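The same principle applies inside a Dockerfile: pin the base image to an exact tag. A minimal sketch, using the Alpine version mentioned above:

```dockerfile
# Pin the base image to an exact version instead of relying on "latest"
FROM alpine:3.18.3
```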


3. Scan Images for Security Vulnerabilities

How can you determine whether an image you want to build with has security vulnerabilities? By scanning it. You can scan Docker images using the docker scan command. The syntax is as follows:
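The general shape of the command, per Docker's CLI documentation, is:

```shell
docker scan [OPTIONS] IMAGE
```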

You must first log in to Docker before you can scan an image.
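Logging in is a single command, which prompts you for your Docker Hub credentials:

```shell
docker login
```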


Then, scan the specific image you want to check:
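For example, to scan the pinned Alpine image from earlier:

```shell
docker scan alpine:3.18.3
```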

A tool called Snyk scans the image, listing any vulnerabilities according to their severity. You can see the type of each vulnerability and links to information about it, including how to fix it. The scan results tell you whether the image is secure enough for your application.

4. Use Small-Sized Docker Images

When you pull a full-sized Docker image, it comes bundled with system utilities. These extras increase the image size with tools you don't need.

Large Docker images take up storage space and can slow down the runtime of containers. They also have a greater possibility of security vulnerabilities.

You can reduce the size of Docker images by using Alpine images. Alpine images are lightweight and come with only the necessary tools. They reduce storage space and make your application run faster and more efficiently.

You will find an Alpine version for most of the official images on Docker. Here’s an example of Alpine versions for PostgreSQL:
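For instance, you can pull the Alpine variant of the PostgreSQL image; the tag below is illustrative, so check the image's Supported tags list for current versions:

```shell
docker pull postgres:16-alpine
```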

5. Optimize Caching Image Layers

Each command in a Dockerfile represents a layer on the image. The layers have different utilities and perform various functions. If you look at the official images on Docker Hub, you will see the instructions used to create them.

The Dockerfile includes everything you need to create the image. It's one of the reasons why many developers prefer Docker to virtual machines.

Here’s the structure of an example Alpine image:
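A minimal sketch of a Dockerfile for a Node.js app on an Alpine base (the names and versions are illustrative); each instruction creates a layer:

```dockerfile
# Base image layer
FROM node:20-alpine

# Working directory layer
WORKDIR /app

# Dependency manifest layer: changes rarely
COPY package*.json ./

# Dependency installation layer: cached while the manifests are unchanged
RUN npm install

# Application files layer: changes most often, so it comes last
COPY . .

# Default command to run the container
CMD ["node", "index.js"]
```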

When you build your application based on an image, you’re adding more layers to the image. Docker runs instructions on a Dockerfile from top to bottom, and if a layer changes, Docker has to rebuild subsequent layers.

The best practice is to arrange your Dockerfile from the least changing files to those that change most often. The instructions that don’t change, like installation, can be at the top of the file.

When you change a file, Docker builds from the changed files and caches the unchanged files above it. Therefore, the process runs faster.

Consider a typical Node.js Dockerfile: if there's a change only in the application files, Docker rebuilds from the layer that copies them in; it doesn't have to install the npm packages again.

Building from cached layers runs faster than rebuilding all the layers afresh. Caching also speeds up pushing and pulling images to and from Docker Hub.

6. Use a .dockerignore File

When building an image using a Dockerfile, you may wish to keep certain information private. Some files and folders may be part of the project, but you don’t want to include them in the build process.

Using a .dockerignore file reduces the image size considerably. This is because the building process only includes the necessary files. It also helps to keep files private and to avoid exposing secret keys or passwords.

The .dockerignore file is a file you create in the same folder as your Dockerfile. It's a text file, much like a .gitignore file, that lists the files and folders you want to exclude from the build process.

Here’s an example:
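A minimal sketch of a .dockerignore file (the entries are illustrative):

```
# Exclude dependencies that the build installs itself
node_modules

# Keep secrets out of the image
.env

# Exclude version-control data and logs
.git
*.log
```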

7. Use the Principle of the Least Privileged User

By default, Docker runs containers as the root user, but this is bad practice. If there's a vulnerability in one of the containers, attackers can use it to gain access to the Docker host.

To avoid this scenario, create a dedicated user and group. You can set the required permissions for the group to safeguard sensitive information. If a user gets compromised, you can delete them without exposing the whole project.

Here’s an example showing how to create a user and set their permissions:
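A sketch of the Dockerfile instructions on an Alpine base (the user, group, and directory names are illustrative):

```dockerfile
# Create a dedicated system group and user
RUN addgroup -S appgroup && adduser -S appuser -G appgroup

# Give the new user ownership of the application directory
RUN chown -R appuser:appgroup /app

# Drop root privileges for subsequent instructions and at runtime
USER appuser
```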

Some base images come with a non-root user already created. You can switch to one of those users instead of running with root permissions.

Why You Should Adopt Docker’s Best Practices

Best practices are a great way to reduce vulnerabilities and write cleaner code. There are many best practices that you can apply to each Docker feature you use.

A well-organized project also integrates more easily with orchestration tools like Kubernetes. You can start with the practices outlined in this article and adopt more as you learn Docker.