Odds are, in the past two or three years, your IT department has heard a great deal about containers. The most agile small to midsize businesses (SMBs) or enterprises may already be deploying containers. As far as application development and IT infrastructure go, Linux containers are about as buzzy as this kind of technology gets.
We've already explained how the modular application architecture of microservices is helping development and IT teams work more efficiently, while reducing the cost and complexity of adding new features and functionality. From a technology standpoint, containers are the catalyzing agent of that DevOps equation. They're the convenient package through which DevOps and IT teams can quickly and consistently pass an application's code, configurations, and dependencies back and forth.
But what does that actually mean for your business? I spoke with enterprise IT solutions and open-source software company Red Hat to find out. This explainer will lay out not only what containers are and how they work but the different ways in which—once you understand the technology—your organization can use containerized deployments atop your data center or cloud infrastructure to deliver quality software faster.
At their most basic level, Linux containers are aptly named for the metal shipping containers to which they're so often equated. Whether it's on a freight ship, a cargo train, or the back of a big rig truck, the container itself is the same uniform vessel for transporting goods. Lars Herrmann, General Manager of the Integrated Solutions Business Unit at Red Hat, oversees the company's Linux container technology. Herrmann said that businesses should simply look at containers as a new unit of work.
"Containers are all about agility," Herrmann said. "In a complex organization, it's about assigning responsibilities along with the freedom to deliver features. And containers give you this technology to keep it all together while still managing your responsibility for security, availability, regulatory compliance—all the stuff that matters."
Image credit: Twistlock
In this way, the homogeneity of containers makes them easy-to-use building blocks. They're small, pluggable units upon which you can build a microservices architecture that accounts for operational efficiency and version control. At the same time, they give DevOps and IT teams granular control over how they deploy infrastructure resources. Herrmann also pointed out that containers are fundamentally an operating system (OS) technology.
"Containers take the operating system and slice it into two pieces," explained Herrmann. "On one hand, you get the work unit for the application, which contains application code and dependencies in a way that can be optimized by the DevOps teams, and [gives] them autonomy and control to make decisions when they want to. They no longer have to wait for other teams.
"The other piece is the operating system kernel. The OS kernel and container payload provide support for the resources and primitives you want available, like storage, networking, and security. Because containers are an OS technology, you can run them anywhere, be it virtual hosts or a public cloud. That hybrid quality lets you manage any application in any environment using the same technology while still empowering DevOps teams."
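Herrmann's "work unit"—application code plus its dependencies—is exactly what a container image captures. As a rough sketch (the base image, file names, and start command here are hypothetical, not from any specific deployment), a Dockerfile for a small Python service might look like this:

```dockerfile
# Base layer: a minimal OS userland; the kernel itself comes from the host.
FROM python:3.9-slim

WORKDIR /app

# Dependencies are declared and installed inside the image,
# so the same image runs identically on any host.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Application code travels in the same package.
COPY app.py .

# The single process the container runs.
CMD ["python", "app.py"]
```

Building this file produces one immutable image that the DevOps team controls end to end—the autonomy Herrmann describes—while the host OS kernel underneath stays the operations team's responsibility.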
Containers are also not the same thing as virtualization. Herrmann explained that containers and virtualization are complementary forces. Virtualization emulates a virtual hardware environment to run various software stacks; it provides what's called an abstraction layer to give that cloud-computing environment flexibility over how applications and data are structured and deployed. So, upon a single OS kernel, you can then run multiple isolated servers or instances. Containers are those instances.
"There is still a lot of confusion around conflating containers with virtualization," said Herrmann. "Virtualization solves a different problem, and we think containers and virtualization complement each other very nicely. Virtualization provides abstraction and emulation and, with containers, you get a similar kind of abstraction but without the emulation. Together, they give you no measurable overhead and a ton of operational efficiency, but it can be tough to separate the two."
A Quick Breakdown of the Container Landscape
The DevOps and agile principles we're talking about with regard to containers aren't new; they date back to the concept of service-oriented architecture (SOA), which is detailed in our microservices explainer above. But the modern Linux container came into its own when Docker changed the game. Docker is a few different things but, first and foremost, it's a wildly popular open-source technology, developed by the Docker project in 2013, designed for packing, shipping, and running any application as a lightweight container.
Docker is among several open-source projects that are helping to shape the technology and the space. Kubernetes, originally developed by Google and now managed by the Cloud Native Computing Foundation, is an open-source system for automating container deployment, scaling, and management. Docker and Kubernetes are the two powerhouse open-source projects that hold the most sway over the development of the technology. Still, there are dozens of others, along with efforts such as Red Hat's Project Atomic (for combined Docker/Kubernetes stacks) and the Linux Foundation's Open Container Initiative, which aims to create open industry standards around containers. For Docker, it was Docker images that set the development world on fire. "Containers had all been putting services on the same node until Docker introduced the notion of image-based deployment," said Herrmann.
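To make the image-based deployment Herrmann describes concrete: once an application is packaged as an image, an orchestrator such as Kubernetes takes over the deployment, scaling, and management work. A minimal Deployment manifest (the service name, image path, and port below are hypothetical placeholders) might look like this:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-service
spec:
  replicas: 3                  # Kubernetes keeps three identical containers running
  selector:
    matchLabels:
      app: my-service
  template:
    metadata:
      labels:
        app: my-service
    spec:
      containers:
        - name: my-service
          image: registry.example.com/my-service:1.0   # the container image to deploy
          ports:
            - containerPort: 8080
```

Handing this manifest to the cluster (for example, with `kubectl apply -f deployment.yaml`) is the whole deployment step: Kubernetes pulls the image, starts the replicas, and replaces any that fail.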
Image: The Docker Survey, 2016
Docker is also a startup (founded in 2010 as dotCloud) that has raised more than $180 million in venture capital (VC) funding. The company offers a suite of enterprise Container-as-a-Service (CaaS) tools for Docker deployments in data centers and private clouds. Of course, when it comes to enterprise container management, Docker isn't alone in the space. Red Hat offers its own enterprise CaaS suite of developer tools across its Red Hat Enterprise Linux (RHEL), OpenShift, and JBoss products. Lately, more and more big-name tech companies have also been getting in on the action. Over the summer, Samsung bought Joyent along with Triton, its CaaS platform. Oracle acquired StackEngine late last year, and Cisco recently acquired enterprise Docker startup ContainerX.
Finally, there are the cloud giants. Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) have all integrated built-in container orchestration and management tools into their respective cloud infrastructure-as-a-service (IaaS) platforms. In a matter of a few short years, the container space has gotten quite crowded.
What Business Problems Can Containers Solve?
When it comes to implementing modern application architectures and DevOps principles within an enterprise, containers are the answer to a number of problems. Particularly when an organization is entrenched in legacy technology and traditional development policies, containers offer an easily integrated underlying platform that can smooth the transition and ease the burden on the IT department.
"Right now, we see containerization as the most practical way to introduce cloud, DevOps, and microservices into your environment. Containers integrate naturally with the technologies you already have," said Herrmann.
Rich Sharples, Senior Director of Product Management for Middleware at Red Hat, said it's all about delivering quality software at a faster cadence. All companies are looking to put out software faster to compete in their own markets, and that pressure often falls on an overworked IT department. Sharples said containers are a way to produce applications and services that can be changed faster—be it adding a new feature or a critical security fix—while maintaining quality. He also talked about containerized infrastructure as the bridge to get enterprises ready for microservices.
"We have this design principle: We can't afford to leave any applications behind," said Sharples. "We're in this wonderful new world of DevOps and agile software development. But enterprises can't rewrite all their applications in order to join the party. How do we move them toward these new ideas?
"Investing in platforms like containers is a way to ensure the organization is ready to start building out something like microservices. Microservices and containers together is where the real power is. There's nothing interesting about a single microservice; it's only in plurality that you see this cooperating network made up of discrete chunks of functionality."
Image credit: Docs.Docker.com
When thinking about investing in and adopting container technology, Sharples said it's not just a technology decision. He explained that, for a successful transition to a DevOps practice that incorporates containers and microservices, you'll need the architecture, the underlying platforms, and agile processes in place.
"This is not just a technology decision," said Sharples. "You need to think about whether your organization is ready, whether you have particular software delivery problems you need to solve, and understand what the business drivers look like around automation and DevOps. Understand your key requirements, look at different projects' needs, and then decide what combination of cloud, application architecture, and container technology can make it happen."
Herrmann gave enterprise IT departments three pieces of advice when looking at how containers fit into their organization:
1. Get Started
According to Herrmann, the combination of DevOps, agile, containers, and microservices is not just a technology change in isolation. "It's a journey that leads to pretty significant transformation in how your business will operate," he said. "My first piece of advice is to get started because your competitors are. Letting early adopters put the pieces together is the wrong strategy because you may not catch up."
2. Comprehensive Vision
Herrmann advised that you approach containers from a more holistic viewpoint. "Pick your most important goal," he said. "The ability to deliver software faster is a great starting point. Based on that one objective, think about how you bring your organization into the process, and structure this work without risk and churn for your existing operations."
3. Choose Your Platforms and Partners
Many enterprises are reaching the point where they're constrained with legacy architecture, processes, and platforms, Herrmann pointed out. "You can't change the architecture without thinking about the platforms you want to rely on," he said. "Then the question is: who do I work with? Who do I talk to? Our recommendation is to look for companies who can help not just with the technology problems but manage the transformation on all these dimensions: tech, process, all the way to organization. When dealing with cloud, DevOps, containers, and microservices all together, you want to rely on an ecosystem that can help you deliver success in a short period of time and spare you from dead-ends."