Docker containers have become a popular open source standard for developing, packaging, and operating applications at scale. There are a few key benefits to using Docker:
Docker containers provide a reliable way to gather your application components and package them together into one build artifact. This matters because modern applications are composed of many pieces: not only code, but also dependencies, binaries, and system libraries.
With Docker you can write one Dockerfile that describes the application. For example:
# Build stage: install dependencies, compiling any native modules
FROM node:9 AS build
WORKDIR /srv
ADD package.json .
RUN npm install

# Runtime stage: copy the installed dependencies into a slim image
FROM node:9-slim
WORKDIR /srv
COPY --from=build /srv .
ADD . .
EXPOSE 3000
CMD ["node", "index.js"]
This file is a script that describes how to set up a build environment, fetch the application's Node.js dependencies from npm and install them (compiling any native dependencies along the way), and then package a slim container image containing only the final build product, ready for delivery to any machine that needs to run the container.
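As a sketch of how such a Dockerfile is typically used (the image tag my-node-app is an arbitrary name chosen for illustration):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t my-node-app .

# Run it in the background, mapping the exposed port 3000 to the host
docker run -d -p 3000:3000 my-node-app
```

The build step runs both stages, but only the slim final stage ends up in the shipped image.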
Because a Docker container carries all of its dependencies with it, you can take the container anywhere and have the application run reliably. Whether it is a local development laptop, an on-premises data center, or a cloud provider, the container will run there.
Docker containers improve efficiency by providing a lightweight isolation model. Unlike heavier virtual machines, many small Docker containers can run on a single machine; it isn't uncommon to fill an EC2 instance with 10-20 Docker containers.
This helps you get more efficient usage of the cloud resources you are paying for. Rather than paying for a large EC2 instance and getting only 10-20% utilization out of it, you can pack many application containers onto the instance and reach 70-80% utilization.
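One way to pack containers safely is to give each one an explicit resource budget, so no single container can starve its neighbors. A minimal sketch using Docker's standard resource flags (the my-node-app tag and the specific limits are illustrative, not prescriptive):

```shell
# Cap each container at a quarter of a CPU and 256 MiB of memory,
# so roughly 10-20 such containers fit on a modest instance
docker run -d --cpus 0.25 --memory 256m my-node-app
```

Sizing these limits against real measurements of the application is what lets you push instance utilization up without overcommitting.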