Containerization is a powerful process that simplifies application deployment and management across various environments. One of the popular containerization platforms is Docker, which allows you to package applications along with their dependencies and configurations for seamless execution.
For those new to this concept, let me break it down. Imagine you've created a fantastic application or website and want your friends to try it out. However, when they run it, they encounter issues, while it works perfectly on your machine. Frustrating, right?
Well, think of it like sharing a delicious recipe with your friend instead of sending them the prepared dish. Similarly, Docker lets you create a compact image of your application using a Dockerfile, making it easy to share and run on any machine.
With Docker, you can confidently develop, deploy, and share your applications, ensuring consistency and efficiency across different systems. It's a game-changer for developers and a key tool in modern software development. So, let Docker revolutionize how you manage and share your applications, making collaboration a breeze!
What is Docker?
Docker is a tool that helps containerize applications. In this process, we create an image of the application and run that image in containers. Through such a container, we can access the application running inside it.
Docker provides its own commands to create, run, stop, list, and perform other operations on containers built from images. Beyond these basics, it also handles tasks such as mapping ports between the container and the host.
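As a sketch of these everyday commands (the `nginx` image and the container name `web` here are just examples):

```shell
# Start a container from an image in the background (-d),
# mapping host port 8080 to the container's port 80
docker run -d --name web -p 8080:80 nginx

# List running containers, then stop and remove ours
docker ps
docker stop web
docker rm web

# List the images stored locally
docker images
```

These commands assume a running Docker daemon; `docker run` will pull the image automatically if it isn't already present locally.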
It is one of the most widely used tools among DevOps engineers, and it delivers the application as a complete bundle with its code files and required dependencies at the correct versions.
How does Docker ensure delivery of the application?
As we've discussed before, sending just the code files may not work. Instead, creating an image of the application and running it helps the application start without runtime errors on any machine.
What is an image?
A Docker image serves as the blueprint for your application, and it's built from a plain-text file with its own simple syntax. Within this file, you define the base image, all the necessary dependencies and their versions, file paths, and the commands required to start your application. We save this file as a "Dockerfile."
To bring your application to life, you use the "docker build" command, which reads the Dockerfile and creates an image. If there are no errors in the Dockerfile, the image is built and the application will run successfully. However, it's important to note that when the image runs, it doesn't remain in its image form. Instead, Docker starts a "container" that executes the image.
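As a minimal sketch, assuming a small Python app whose entry point is `app.py` (the file names and the `my-app` tag are hypothetical):

```shell
# Write a minimal Dockerfile (shown here via a heredoc for illustration)
cat > Dockerfile <<'EOF'
FROM python:3.12-slim
WORKDIR /app
COPY . .
CMD ["python", "app.py"]
EOF

# Build an image from the Dockerfile in the current directory,
# then start a container that executes that image
docker build -t my-app:1.0 .
docker run my-app:1.0
```

Notice the distinction from the text above: `docker build` produces the image, while `docker run` is what turns that image into a running container.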
Containers act as platforms or environments in which these images can run flawlessly. This distinction is crucial to grasp, but once you understand it, Docker becomes much simpler to work with.
In my upcoming blog series about Docker, I'll walk you through everything you need to know, from the basics to more advanced topics. So, don't worry if it seems overwhelming now – you'll soon become a Docker expert!
Virtual Machines?
It helps to have some prior knowledge of virtual machines, so let me give you a quick introduction in case you don't know what they are.
You own your device and store all your images, code files, and other data locally on it. You use the device's keyboard and other peripherals to enter data and get work done. But what if you could use someone else's computer from your own, virtually?
It's like operating your friend's computer from your desktop, without touching your friend's peripherals to provide input or perform other operations.
A virtual machine is similar to this idea: it is like having a second computer running on your desktop. A portion of the hardware resources is allocated to it, and both machines use the hardware seamlessly.
Usually, a physical machine consists of its hardware resources and a host operating system. To run virtual machines, a hypervisor is added on top, and each VM then carries its own guest operating system alongside its share of the hardware.

VMs are like having a machine virtually; we can use one just as we use our desktop.
Containers:
These virtual machines can be created on AWS using EC2 instances with an operating system such as Amazon Linux, RedHat, or Windows. The server itself is made up of hardware, a host OS, and the hypervisor layer.
Moreover, each VM carries its own guest OS and hardware allocation in addition to the services the server provides. As a result, VMs consume more memory on the server and generally take more time to start and stop. If any interruption occurs, restarting a virtual machine can take quite a while.
To solve this problem, containers come into action. The major difference between containers and VMs is that containers don't need a separate allocation of hardware resources, a guest OS, and so on.
They simply use the host OS and other host services to provide exactly what virtual machines provide. The advantage is that containers are lightweight: they can be started or stopped instantly and consume less memory.
Hopefully this comparison gives you an idea of virtual machines and containers.
Docker Hub?
When we work in an organization, images are used multiple times: to install an application on multiple servers, to update it, and so on. Many people also work with the Docker image of the same application.
It would be difficult to build the image on everyone's machines, servers, and other platforms from the Dockerfile. So what we can do is push the Docker image to Docker Hub, where the image gets stored; others can then easily pull it with a single command and work with it on their own platforms.
Docker Hub is like GitHub: where GitHub stores code files in a repository, Docker Hub is a platform where we store all our application images in a registry.
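The push/pull workflow looks roughly like this (the username `yourname` and the `my-app` image tag are placeholders for your own):

```shell
# Log in to Docker Hub, tag the local image under your username,
# and push it to the registry
docker login
docker tag my-app:1.0 yourname/my-app:1.0
docker push yourname/my-app:1.0

# On any other machine, pull and run the same image
docker pull yourname/my-app:1.0
docker run yourname/my-app:1.0
```

The `docker tag` step matters: Docker Hub expects the image name to be prefixed with your account name, just as a GitHub repository lives under your account.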
Conclusion
We've learnt what Docker is, how it is useful in the industry, and what images, containers, and Docker Hub are.
As part of this Introduction to DevOps series, I believe this much information about Docker will help you understand the other tools used in the DevOps culture, and I hope this blog helped you understand the basic concepts of Docker.
If you have any queries, please don't hesitate to drop them in the comment section, and share this with your network if you feel it will help them learn about Docker.
In the upcoming posts of this DevOps series, we'll cover many concepts regarding Docker and other tools, up to an advanced level.