Manage your Node App using Docker Like a Pro

Lessons learned working as a DevOps engineer

Ankit Jain
Bits and Pieces

--

Docker

Docker is one of the most popular technologies among developers. It is widely adopted at the enterprise level because its containerization features, among many others, greatly increase productivity across a project's lifecycle, from development to shipping.

I currently work as a DevOps engineer, so I have had plenty of opportunities to get my hands dirty with Docker. It is awesome when used in an optimized way; otherwise it can quickly become overwhelming. In this post, we'll learn about Docker and how to make the best use of it.

What is Docker?

Docker is a program that performs operating-system-level virtualization, also known as "containerization", and uses a layered filesystem for shipping images.

Ahh, that's hard to understand with all the jargon packed into one definition. In simpler words, Docker is a tool that lets you easily create, deploy, and run applications using containers that are independent of the host OS. A container packages an application service or function together with all of the libraries, configuration files, dependencies, and other parts it needs to operate. Each container shares the kernel of one underlying operating system.

Now you might be wondering how Docker is different from virtual machines. Read the complete article on Docker vs Virtual Machines. I am also sharing the link to my presentation from my talk on Docker, to get acquainted with Docker containers, images, and their features.

Tip: Share reusable components with Bit. Then you can easily find and use them in every app you’re building, and share with your team. Try it.

Create a Node.js project

Create a small Node.js project and dockerize it.

Let’s see the package.json for the project —

package.json
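
A minimal package.json for this project might look like the following; the package name, version, and exact express version are illustrative, but the start script is what the Dockerfile will run later —

{
  "name": "node-app",
  "version": "1.0.0",
  "description": "A small Express app managed with Docker",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "dependencies": {
    "express": "^4.16.0"
  }
}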

Run npm install to install the project dependencies. It will also create a package-lock.json file. Now let's create an index.js file.

const express = require('express');
const app = express();
app.get('/', (req, res) => {
  res.send('The best way to manage your Node app using Docker\n');
});
app.listen(3000);
console.log('Running on http://localhost:3000');
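
Before dockerizing anything, we can quickly verify that the app works on its own (assuming Node.js is installed locally) —

# Start the app, then hit it from a second terminal
$ node index.js
$ curl http://localhost:3000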

Create a Dockerfile

In the previous step, we created the project we want to dockerize. Now comes the most important part of any Docker project.

A Dockerfile is a text file that contains all the commands required to build a Docker image for an app. Each command in the Dockerfile acts as a filesystem layer that can be cached. In object-oriented programming, we have classes that are used to create instances, i.e. objects. Similarly, Docker images are used to create instances, i.e. Docker containers. Let's see this through code —

Create an empty Dockerfile.

touch Dockerfile

As this is a Node.js project, we need Node installed in our container. So the first thing to do is pull the Node image from Docker Hub in our Dockerfile. We will be using the LTS version of Node.js (node:8 at the time of writing).

FROM node:8

Now, we will create a directory to contain our code. If we want an app directory other than /app, we can pass an argument to Docker at build time, hence the ARG instruction. Read more about ARG.

# Directory
ARG APP_DIR=app
RUN mkdir -p ${APP_DIR}
WORKDIR ${APP_DIR}

Since we are using the Node image, Node.js and npm come pre-installed, so we only need to install the project dependencies. With the --production flag (or when the NODE_ENV environment variable is set to production), npm will not install the modules listed in devDependencies.

# Install dependencies
COPY package*.json ./
RUN npm install
# For production
# RUN npm install --production

We copy only the package*.json files rather than the complete project files. Since only RUN, COPY, and ADD instructions create layers, this lets us take advantage of cached Docker layers: as long as package*.json hasn't changed, the npm install layer can be reused the next time we build the image instead of being rebuilt. This saves a lot of time during image builds when working on big projects with a lot of npm modules.

Now, we will copy the project files into the current working directory. We use COPY instead of ADD; COPY is recommended in most cases, as ADD has some extra features (like local-only tar extraction and remote URL support) that are rarely needed. Read more about ADD and COPY.

# Copy project files
COPY . .

Docker containers are isolated, which means we can't directly interact with our app without exposing the port on which it is running. The EXPOSE instruction informs Docker that the container listens on the specified network ports at runtime. Read more about EXPOSE.

# Expose running port
EXPOSE 3000

Now that we have added our app and its dependencies to the directory, let's add the instruction to run it. The CMD instruction sets the default command and/or parameters, which can be overridden from the command line when the container runs. Read more about CMD.

# Run the project (the start script is defined in package.json)
CMD ["npm", "start"]

Our Dockerfile will now look like this-

Dockerfile
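
Assembled from the snippets above, the complete file reads —

FROM node:8

# Directory
ARG APP_DIR=app
RUN mkdir -p ${APP_DIR}
WORKDIR ${APP_DIR}

# Install dependencies
COPY package*.json ./
RUN npm install
# For production
# RUN npm install --production

# Copy project files
COPY . .

# Expose running port
EXPOSE 3000

# Run the project (the start script is defined in package.json)
CMD ["npm", "start"]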

Build the image

We are done with the Dockerfile and have defined all the instructions required to build the image, and from it our container. Let's build the image by running this command.

# docker build --build-arg <build-arguments> -t <user-name>/<image-name>:<tag-name> <path-to-build-context>
$ docker build --build-arg APP_DIR=var/app -t ankitjain28may/node-app:V1 .

Since we added an ARG in our Dockerfile, we pass the argument here. If we don't pass it, it will take the default value, i.e. app.

# Let's check the image
$ docker images

The output will be similar to this —

Docker images

Run the image

We have built the image from our Dockerfile. Now we will run the image to create an instance, i.e. a container, with this command.

# docker run -p <external-port>:<exposed-port> -d --name <container-name> <user-name>/<image-name>:<tag-name>
$ docker run -p 8000:3000 -d --name node-app ankitjain28may/node-app:V1

# Let's check the container
$ docker ps

The output will be similar to this —

Docker Containers

Everything is done except testing. Our container named node-app is exposed on port 8000, so open a browser and visit localhost:8000 to check whether it is running. We can also make a curl request for the same.

# Testing
$ curl -i localhost:8000

The output will be similar to this —

localhost:8000
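
Since the handler in index.js simply sends back a fixed string, the response should look roughly like this (most headers omitted) —

HTTP/1.1 200 OK
X-Powered-By: Express
...

The best way to manage your Node app using Docker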

We can create as many containers as we want from the above image. We can also push this image to the Docker Hub registry so that other developers/users can make use of this project. This makes it easy to ship and deploy projects.
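
For example, pushing the image looks like this (assuming you are logged in to the matching Docker Hub account) —

# Log in to Docker Hub, then push the tagged image
$ docker login
$ docker push ankitjain28may/node-app:V1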

Best practices and Guidelines to follow

Here are some of the best practices and guidelines that should be followed to make the best use of Docker and build lightweight images.

1- Always create a .dockerignore file.

We should always create a .dockerignore file so that we can exclude files and directories that are not required when building the image. This reduces the build context, which in turn reduces the build time and the final image size. The file supports exclusion patterns similar to .gitignore. It is also recommended to add the .git directory to .dockerignore, as the .git folder can grow quite large over time (especially during development with multiple branches), which ultimately inflates the Docker image, and having the .git directory inside the image/container is useless anyway. A sample file is shown below.
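
A typical .dockerignore for a Node.js project might look like this (illustrative; adjust to your project) —

node_modules
npm-debug.log
.git
.gitignore
Dockerfile
.dockerignore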

2- Use Docker multi-stage builds.

Let’s take an example: we are building a project for an enterprise, using a lot of npm packages, and each npm module also installs its own dependencies. Installing them takes time while building the image (which we can cache, as mentioned above), but the final image ends up large, which isn’t good. In enterprise products we generally use webpack or similar tools to produce a static production build, so we don’t need the npm modules in the final image (in the case of front-end projects).

One way of doing this is —

# Install dependencies
COPY package*.json ./
RUN npm install --production
# Production build
COPY . .
RUN npm run build:production
# Remove npm modules
RUN rm -rf node_modules

But this won’t reduce the image size: Docker uses a layered filesystem, so every RUN, ADD, and COPY instruction is itself a layer that is kept in the image, and removing node_modules in a later layer doesn’t shrink the earlier ones. So what we can do instead is —

# Add complete project
COPY . .
# Install dep, Make build and remove dependencies
RUN npm install --production && npm run build:production && rm -rf node_modules

Here we have just one RUN instruction that installs the dependencies, creates the build, and removes node_modules, so the final size is definitely reduced since no modules remain. However, every time we build the image it will spend a lot of time installing dependencies, because this combined layer can no longer be cached effectively. It works, but it is not efficient, so instead we can use a Docker multi-stage build. Let’s see this in detail.

Suppose I have a front-end project with a lot of dependencies, and we are using webpack to build the static files. We can leverage a Docker multi-stage build to reduce the final image size.

# First stage: install dependencies and create the production build
FROM node:8 AS build

# Directory
RUN mkdir /app && mkdir /src
WORKDIR /src
# Install dependencies
COPY package*.json ./
RUN npm install
# For production
# RUN npm install --production
# Copy project files and build the files
COPY . .
RUN npm run build:production

# Second stage: only the build output ships in the final image
FROM node:alpine
WORKDIR /app
# Copy the build files from the first stage (the webpack output lands in /src/build)
COPY --from=build /src/build ./build
# Serve the static files with any lightweight server; `serve` is one illustrative option
RUN npm install -g serve
EXPOSE 3000
CMD ["serve", "-l", "3000", "build"]

This way, the final image produced is very light compared to the previous one: the final stage is based on the node:alpine image, which is itself very small, and it contains only the build output rather than all of the npm modules. Here we can see both base images —

Node Images

We can see the size of both images; node:alpine is way lighter than node:8.

3- Use Docker Cache

Always try to use Docker’s layer caching to reduce image build time, as we did above with the package*.json files, but don’t use it blindly.

Suppose we are installing packages in a Dockerfile based on the ubuntu:16.04 image.

FROM ubuntu:16.04
RUN apt-get update && apt-get install -y \
curl \
package-1 \
.
.

Now, when we build this, it takes a lot of time to update and then install if we have a large number of packages. So we might think to take advantage of Docker’s caching and split the instructions like this —

FROM ubuntu:16.04
RUN apt-get update
RUN apt-get install -y \
curl \
package-1 \
.
.

Now, when we build the image the first time, it runs perfectly since nothing is cached yet. But suppose we later need to install another package, say package-2, so we add it like this —

FROM ubuntu:16.04
RUN apt-get update
RUN apt-get install -y \
curl \
package-1 \
package-2 \
.
.

package-2 either won’t get installed or won’t be the latest version, but why? When Docker reaches the RUN apt-get update instruction, it finds no change in the instruction text, so it reuses the cached layer (with an outdated package index). It does detect the change in the apt-get install instruction and re-runs it, so we end up with old versions of packages, or some installs may even fail because the index is stale. That is why apt-get update && apt-get install should always be run in a single RUN layer. Caching is really great, but it leads to problems like this if we don’t take care of such things.
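
With update and install combined in a single RUN instruction, adding package-2 changes that instruction, so Docker invalidates the cache and runs both steps together again —

FROM ubuntu:16.04
RUN apt-get update && apt-get install -y \
curl \
package-1 \
package-2 \
.
.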

4- Minimize the number of layers

It is always recommended to minimize the number of layers, as each layer is itself a filesystem delta in Docker, so fewer layers generally means a smaller Docker image.

Using the Docker multi-stage build feature is a great way to minimize both the number of layers and the size of an image.

Conclusion

With this, I would like to conclude this article, “The best way to manage your Node app using Docker”. It is not only for Node.js projects; these best practices and guidelines should be followed in any Docker project to maximize efficiency and minimize the problems associated with Docker.

In this article, we first looked at what Docker is and how to dockerize a Node.js project. We also learned some best practices and guidelines for using Docker effectively. I hope this article helped you understand Docker and its standards. Read more about Docker and the best practices for writing Dockerfiles.

Feel free to comment and ask me anything. You can follow me on Twitter and Medium. Thanks for reading! 👍
