5 Things You Need to Know About Containerized Applications
What do you think of when you hear the word “containers”? Do you think of big cargo ships full of containers crossing the ocean, moving goods to and from ports around the world? Each one of those containers is carrying goods of some sort. It might be the newest 4K TV or some other new technology, or it might just be boxes full of clothes. No matter what they’re carrying, the containers are pulled off the ship one by one, placed on a trailer chassis, latched in, and moved to a holding spot or to their final destination.
This process is nothing new, but it has evolved over time. Think back just a couple of hundred years: containers then were most likely wooden and couldn’t be too big, because a couple of men had to be able to move them around. As the world became more industrialized, however, we built larger ships that could carry more, and large machines now do much of the heavy lifting for us.
But enough about shipping containers. What does any of that have to do with technology?
Well, technology containers really aren’t much different from shipping containers. They have also been around for a while, but recently there has been a lot of development and innovation. Have you ever heard of BSD jails, or Unix/Linux chroot? These are examples of OS (operating system) virtualization that function as a type of container. Wikipedia cites 1979 as the year chroot was added to Unix and 1982 as the year it was added to BSD. [1] So our friend the “container” has been around for a while. But if containers have been around so long, why all the press recently? Let’s walk through 5 things you should know about containers and why they are gaining momentum.
1) Containers speed up development
How so? Straight from Mark Russinovich on the Microsoft Azure blog:
“From a developer’s desktop to a testing machine to a set of production machines, a Docker [a well-known containerization technology company] image can be created that will deploy identically across any environment in seconds.”[2]
This is a big advantage. I can’t count how many times a developer has told me, “It worked on my machine, but for some reason it’s not working in test or production.” Of course, their local machine was not configured like the test environment, which runs a server OS with certain security requirements, nor was it identical to the production environment, which was set up, configured, and hardened to run the application at scale. These differences are what cause new applications to fail and what significantly slow down application development.
Containers give us the ability to hand developers a standardized local environment that behaves the same way as test and production, eliminating those mismatches. Developers can share that environment with colleagues for continued development, then move it into test and on to production without issue.
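What does that look like in practice? Here’s a minimal sketch, assuming Docker is installed; the script, image name, and tag below are throwaway placeholders purely for illustration:

# Describe the environment and the app in a Dockerfile
cat > Dockerfile <<'EOF'
FROM alpine
COPY hello.sh /hello.sh
CMD ["/bin/sh", "/hello.sh"]
EOF

echo 'echo "Hello from inside a container"' > hello.sh

# Build the image once on the developer's machine...
docker build -t myapp:1.0 .

# ...and run the exact same image on a laptop, a test server, or production
docker run --rm myapp:1.0

Because the image bundles the application together with everything it needs, “it worked on my machine” and “it works in production” finally mean the same thing.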
2) Containers provide better resource utilization
Why is this? Compared to hardware virtualization (think of any of the most popular hypervisors), containers require fewer resources to provision and operate. With traditional hardware virtualization you have server hardware and a hypervisor (host OS), which consumes CPU, memory, and storage and runs its own services that require resources. On top of that, you deploy virtual machines, each of which uses CPU, memory, and storage and runs its own guest OS and services that also require resources. For each VM we deploy, rinse and repeat. As you can see, the more VMs we deploy, the more resource overhead we create.
With containers, we have the server hardware and a host OS running some type of virtualization (containerization) service, along with its own services that require resources. On top of this we deploy containers. These containers, however, use only the resources required by the application we deploy in each container. We eliminate the per-VM guest OS installation, along with the CPU, memory, and storage it consumes.
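To put a rough, admittedly unscientific number on that difference, Docker’s built-in stats command shows a running container’s footprint. The container name and memory cap below are placeholders for illustration, and the example assumes Docker and the public nginx image are available:

# Start a small web server container with an explicit memory cap
docker run -d --name web --memory 128m nginx

# Watch its live CPU and memory usage -- typically a handful of megabytes,
# versus the gigabytes a full guest OS inside a VM would reserve
docker stats web

# Clean up when finished
docker rm -f web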
3) Docker
Docker created a toolset, packaging model, and deployment process that makes using containers on any Linux host easy, with support for Windows Server 2016 not far behind. Through containers, Docker has given developers a standardized local environment for development. A developer can then share their containers (and thus the application) with others, move them into test, and ultimately into production with ease and reliability.
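As a sketch of that desktop-to-production flow (the registry, team, and image names here are made up for illustration), the Docker CLI workflow looks roughly like this:

# The developer builds and tags the image locally
docker build -t registry.example.com/myteam/myapp:1.0 .

# Push it to a shared registry so colleagues, test, and production can all pull it
docker push registry.example.com/myteam/myapp:1.0

# Any other host then runs the exact same image
docker pull registry.example.com/myteam/myapp:1.0
docker run -d registry.example.com/myteam/myapp:1.0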
4) Containers are not synonymous with Docker
Though Docker may be the most well known and has helped containers go mainstream, it isn’t the only game in town. You can run Linux containers (LXC/LXD) on your favorite Linux distribution, or look at another container company such as CoreOS, as shown in the sketch below. And only time will tell how Cisco will use its acquisition of ContainerX to get into the container game.
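For example, on a distribution with LXD installed, launching a plain Linux container takes only a couple of commands (the image alias and container name below are just examples):

# Launch a container from a public Ubuntu image
lxc launch ubuntu:16.04 demo

# Open a shell inside it
lxc exec demo -- /bin/bash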
5) Containers aren’t all grins and giggles
Why do I say this? Well, just like any other technology, containers come with advantages and disadvantages. On the positive side, containers can truly help speed up development, which enables faster time to market. And isn’t that what every business wants? To get its product out the door faster, more securely, and at lower cost? But that ability may come with some issues (or, as I used to tell my team, “opportunities to excel”). I’m not saying these “opportunities to excel” will halt the adoption of containers, nor am I saying they should. On the contrary, containers offer IT professionals a way to provide value to the business while protecting its interests. Because just like with anything else, we need to be able to help the business make informed decisions.
To learn more about containers join us at our 4th annual OktoberTekfest! Or if you aren’t able to make it to TekFest, stay tuned and we’ll follow up this post with more on containers and those “opportunities to excel.”
[1] https://en.wikipedia.org/wiki/Chroot#History
[2] https://azure.microsoft.com/en-us/blog/containers-docker-windows-and-trends/