What is Docker?
Containers, isolation, and why they matter
The core idea
Docker is a platform that packages applications into containers: lightweight, isolated environments that include everything required to run a piece of software (code, runtime, libraries, and configuration). Because containers bundle their own dependencies, a container that works on your laptop behaves the same way on a staging server or in a production cloud, anywhere Docker is installed. This consistency largely eliminates the classic "it works on my machine" failure mode.
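As a sketch of what "packaging everything" looks like in practice, here is a minimal, hypothetical Dockerfile for a small Python application. The file names (`app.py`, `requirements.txt`) and the base image tag are illustrative assumptions, not part of this lesson's example:

```dockerfile
# Hypothetical Dockerfile: one portable unit bundling runtime,
# libraries, code, and configuration (names are illustrative).

# Runtime
FROM python:3.12-slim

# Libraries
COPY requirements.txt .
RUN pip install -r requirements.txt

# Application code
COPY app.py .

# Configuration
ENV APP_ENV=production

# Command to run when the container starts
CMD ["python", "app.py"]
```

Building this image (`docker build -t myapp .`) and then running it (`docker run myapp`) reproduces the same environment on any host with Docker installed.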
The shipping container analogy
Before standardized shipping containers, transporting goods between ships, trains, and trucks was chaotic. Every cargo type required individual handling. Shipping containers solved this by standardizing the unit of transport. Once goods were sealed inside, the container itself moved — not the goods individually. Docker works the same way for software. Instead of moving loose files and dependencies between environments, the entire application is packaged into one portable unit that runs anywhere.
Containers vs. virtual machines
Virtual machines emulate entire computer systems: each VM runs a full guest OS, with its own kernel, on top of a virtualized hardware layer. That is flexible, but resource-heavy and slow to start. Containers instead share the host operating system's kernel and isolate applications at the process level. They don't include a full OS, so they start in milliseconds and use far fewer resources. This lightweight design is why containers became the default for modern cloud-native infrastructure.
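One consequence of kernel sharing is easy to observe: a container reports the same kernel version as its host, because it is an isolated process rather than a separate machine. A quick check, assuming Docker is installed and can pull the small `alpine` image:

```shell
# Kernel version on the host
uname -r

# Kernel version inside a container: same value, because the
# container shares the host kernel rather than booting its own
docker run --rm alpine uname -r
```

A VM, by contrast, would report whatever kernel its guest OS ships with.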
```shell
# Pull and run the hello-world image
docker run hello-world

# What Docker does:
# 1. Looks for the image locally
# 2. Downloads it from Docker Hub if not found
# 3. Creates a container from the image
# 4. Runs the container, prints a message, exits

# List all containers (including stopped ones)
docker ps -a
```
Key terms
- Image: a read-only package containing an application and its dependencies; containers are created from images.
- Container: a running, isolated instance of an image that shares the host's kernel.
- Docker Hub: the default public registry Docker downloads images from when they aren't found locally.
- Kernel: the core of the host operating system, shared by every container on the machine.