What is Docker?
- Definition of Docker
Docker is an open-source tool that uses containers to make it easier to create, deploy, and run applications. Containers package everything a developer's application needs, such as libraries and other dependencies, into a single unit. By using Docker's container system, a developer doesn't have to worry about software compatibility on every machine or system: when a container starts, all the attributes and settings are already set up.
Docker can be visualized as something like a virtual machine, but it is not one. Docker lets the application share the same Linux kernel as the host system it runs on, and provides all the system packages the application needs without forcing the host system to install those dependencies itself. This reduces the application's memory footprint and gives the computer better performance.
Important Components of Docker
- Docker Client and Daemon
The Docker client provides the communication channel between Docker users and the Docker daemon. Users type commands into the CLI, and the client sends them as API requests to the server. The server in this case is the Docker daemon, which interacts with the operating system and performs the actual work. The daemon constantly listens for requests: to use Docker, you enter a docker command, the client forwards it to the daemon, and the daemon carries it out.
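To make the client/daemon split concrete, here is a small sketch of CLI commands (it assumes a Docker daemon is installed and running on your machine); each command below is sent by the client to the daemon as an API request:

```shell
# Reports the version of both the client (CLI) and the server (daemon)
docker version

# Asks the daemon to list the containers it is currently running
docker ps
```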
- Docker Image
A Docker image is a read-only template that contains the instructions and dependencies for a Docker container. The template is written in a plain-text file called a Dockerfile (this is Docker's own instruction format, not YAML; YAML, which originally stood for Yet Another Markup Language, is used by related tools such as Docker Compose). The image built from the Dockerfile is then hosted in a Docker registry or on Docker Hub. A Docker image is made of several layers, and each layer depends on the layer below it. Image layers are created by executing the commands in the Dockerfile, and each layer is read-only. Building starts from the base layer, which contains the base image and its operating system; after that, the remaining instructions are executed layer by layer until the finished image is produced. Four common Dockerfile instructions are FROM, COPY, RUN, and CMD: FROM selects the base image, COPY adds files from your repository into the image, RUN executes a command while building the image, and CMD specifies which command to run within the container when it starts.
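As an illustrative sketch of these instructions (the file name app.py and the Flask dependency are assumptions for the example, not part of the article), a minimal Dockerfile where each instruction creates one layer might look like:

```dockerfile
# Base layer: the base image and its operating system
FROM python:3.9-slim

# COPY adds a file from the build context into the image (new layer)
COPY app.py /app/app.py

# RUN executes a command at build time (new layer)
RUN pip install flask

# CMD specifies the command to run when the container starts
CMD ["python", "/app/app.py"]
```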
- Docker Registry
The Docker registry is where all the images are hosted and distributed from. A repository in the registry contains a collection of related Docker images that have been built. The commands you use to interact with a registry are push and pull: use the push command to upload an image you've built on your local machine to the registry, and use the pull command to download an image from the registry, whether that registry is Docker Hub or your own private registry.
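As a sketch of the push/pull workflow, assuming a Docker Hub account named myuser and an image named myapp (both hypothetical names for this example):

```shell
# Tag a locally built image with the repository name the registry expects
docker tag myapp myuser/myapp:1.0

# Upload the tagged image to the registry (Docker Hub in this case)
docker push myuser/myapp:1.0

# On another machine, download the same image from the registry
docker pull myuser/myapp:1.0
```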
- Docker Container
A Docker container is a package of an application and all of its dependencies, bundled together. A container can be visualized as an isolated environment in which an application works, much like a virtual machine, although it is not one. A Docker container is created from a Docker image, and the command to run that image is the run command.
Here is an example of the command (note that docker run takes an image name, from which the container is created):
$ docker run <image_name>
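Putting image and container together, a typical sketch of the workflow (the names myapp and myapp-container are assumptions for this example) is:

```shell
# Build an image from the Dockerfile in the current directory
docker build -t myapp .

# Create and start a container from that image, giving the container a name
docker run --name myapp-container myapp
```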
How to implement Docker?
In this implementation, I want to explain using Docker for a web application hosted on Heroku. Note that the web framework being used is Django, so we need to set up the environment for Django by installing dependencies with the pip command from the requirements file. Heroku's hosting already handles container orchestration itself, so only a Dockerfile needs to be made in the repository.
First, set up your GitLab configuration to include a Docker image. The purpose of this step is so that GitLab reads your Docker image file and can run it while the pipeline is running. You can configure your Docker settings in the .gitlab-ci.yml file so you don't have to do it manually. Here is an example of a Docker configuration in a .gitlab-ci.yml file to deploy automatically.
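The original article did not include the file itself, so the following is a hedged sketch of what such a .gitlab-ci.yml might look like; the dpl deployment tool, the variable names HEROKU_APP_NAME and HEROKU_API_KEY, and the branch name are assumptions, not taken from the article:

```yaml
# Sketch of a .gitlab-ci.yml that deploys to Heroku automatically.
# HEROKU_APP_NAME and HEROKU_API_KEY are assumed to be set as
# CI/CD variables in the GitLab project settings.
image: docker:latest

services:
  - docker:dind        # Docker-in-Docker, so the job can run docker commands

stages:
  - deploy

deploy:
  stage: deploy
  image: ruby:2.7      # dpl is distributed as a Ruby gem
  script:
    - gem install dpl
    - dpl --provider=heroku --app=$HEROKU_APP_NAME --api-key=$HEROKU_API_KEY
  only:
    - master           # deploy only when the master branch changes
```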
Secondly, you must make a Dockerfile listing the dependencies that are needed in the Docker image of your application. First, specify the base image with the FROM instruction; because the project is a Django web application, the base image is Python. Second, specify the environment variables used by the application with the ENV instruction. Third, install all the dependencies needed in the Docker container with the RUN instruction. Fourth, open a certain port with the EXPOSE instruction so clients can reach the application. Last, after all the setup is stated, run the server with the CMD instruction.
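The steps above can be sketched as a Dockerfile; the project name myproject, the gunicorn server, and the Python version are assumptions for this example, not details from the article:

```dockerfile
# FROM: Python base image, since this is a Django application
FROM python:3.9-slim

# ENV: environment variable so Python logs are not buffered
ENV PYTHONUNBUFFERED=1

WORKDIR /app

# RUN: install the dependencies listed in the requirements file
# (copying requirements.txt first lets Docker cache this layer)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# EXPOSE: open a port so users can reach the application
# (Heroku injects its own $PORT at runtime; 8000 is a local default)
EXPOSE 8000

# CMD: start the server when the container runs
CMD ["sh", "-c", "gunicorn myproject.wsgi --bind 0.0.0.0:${PORT:-8000}"]
```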
What are the benefits of using Docker?
- Increased performance
- Flexibility across operating systems
- Easy updates when dependencies change
- Support for continuous integration
- Isolation from irrelevant dependencies
This is my article for my Individual Review for PPL course at the Faculty of Computer Science, University of Indonesia.