
Mastering the Portability of Workload Containerization

Posted by Dwight Hawkins on Nov 17, 2020 1:36:16 PM

When I hear the word container, I think of a place to put my roasted turkey and provolone on wheat sandwich. The term also reminds me of family road trips when we drive alongside railroad tracks and see trains carrying shipping containers barreling along the tracks.

In this age of technology, a very different kind of container has emerged. Rick Stewart, chief software technologist at DLT Solutions, a Tech Data company, explains how the portability of workload containerization is making waves in the technology industry.

The Situation

Imagine you are a technologist, and your boss hands you a task: create a service that is available to anyone, whenever they want it. The service must be accessible from many devices, anywhere on the globe, and must support an unknown number of users at any one time. After you stop laughing and make a snarky comment about updating your resume, you begin refining this request into more exact requirements.

Containers are a reasonable approach to these kinds of aggressive asks, which are fast becoming standard operational requirements.

So, what are containers? Containers are a form of operating system virtualization, through which applications run in isolated user spaces that all share the same operating system (OS). They are the hottest trend in service development and delivery, whether as part of your data center innovation or your cloud migration strategy.
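For a concrete feel of what that isolation means in practice, here is a minimal sketch using the Docker SDK for Python (the image and command are placeholder choices, and the example assumes a local Docker daemon is running). The container gets its own isolated user space, yet it reports the same kernel as the host, because containers share the host operating system rather than booting their own:

    import docker  # Docker SDK for Python: pip install docker

    client = docker.from_env()  # connect to the local Docker daemon

    # Run a throwaway container. It gets its own isolated filesystem and
    # process space, but it reuses the host's OS kernel rather than booting
    # a full guest operating system the way a VM would.
    output = client.containers.run(
        "alpine:latest",   # small placeholder image
        "uname -r",        # print the kernel release visible inside the container
        remove=True,       # clean up the container when it exits
    )
    print("Kernel seen inside the container:", output.decode().strip())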

Google Search alone launches thousands of containers every second, which adds up to roughly two billion every week. Google supports searches from locations all around the world, and activity in each data center rises and falls with the time of day and with events affecting that part of the world. Containers are one of the secrets behind the speed and smooth operation of the Google Search engine, and examples like that only spur the growing interest in containers.

How Containers Work

If you're wondering how containers might fit your innovation strategy, a brief background on how this technology evolved is warranted. Information technologists are continually innovating to make more efficient use of computer resources. A significant development was virtualization: making multiple virtual machines (VMs) from one physical computer. VMs make more efficient use of hardware resources and are the basis for cloud computing, whether that's on-premise (deployed and running from within the confines of the organization) or via a cloud service provider. If you can dynamically spin up more instances of an application to service user requests, you can ensure service availability and user satisfaction.

Containers are like VMs in that they provide application isolation and allow you to run more applications and services on a single machine. However, VMs consume more computer resources than containers, because each VM is its own operating system. Containers are "lightweight": while many dozens of VMs can be put on a single host server, each running an application with its own operating system, hundreds or even thousands of containers can run on the same host server.

Additionally, containers allow you to increase capacity for new computing jobs in a split second. A container can be created much faster than a VM, because a VM must first retrieve a 10-20 gigabyte (GB) operating system image from storage. The workload in a container uses the host server's operating system kernel, avoiding that step. On average, containers can boot up in one-twentieth of a second.
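As a rough, hedged sketch of that startup speed (again with the Docker SDK for Python; actual timings vary by host, and the placeholder image is assumed to be already pulled so that an image download does not dominate the measurement):

    import time
    import docker

    client = docker.from_env()

    start = time.perf_counter()
    # Creating and starting a container skips retrieving a multi-gigabyte OS
    # image; only the application process has to come up on the shared kernel.
    container = client.containers.create("alpine:latest", "sleep 5")
    container.start()
    elapsed = time.perf_counter() - start

    print(f"Container created and started in {elapsed:.3f} seconds")
    container.stop()
    container.remove()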

Another significant difference between VMs and containers is that VMs cannot be ported (run somewhere else) seamlessly across data centers or cloud providers. Containers are more efficient at creating workload bundles that are portable from host to host and cloud to cloud. They provide a stable foundation for moving workloads around multi-cloud and hybrid-cloud infrastructures without significant rewrites of an application's codebase (the collection of source code used to build a particular software system, component or application).
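Because the container image bundles the application with everything it needs, the same artifact can simply be pointed at a different host. Here is a minimal sketch, again with the Docker SDK for Python and two hypothetical daemon endpoints (the host names and image name are made up for illustration; a real setup would also use TLS):

    import docker

    # Hypothetical endpoints: an on-premise server and a cloud VM.
    ON_PREM = "tcp://onprem-host.example.com:2375"
    CLOUD = "tcp://cloud-host.example.com:2375"

    for endpoint in (ON_PREM, CLOUD):
        client = docker.DockerClient(base_url=endpoint)
        # The identical image runs unchanged on either host -- no rewrite of
        # the codebase, only a different target for the same workload bundle.
        output = client.containers.run("myorg/my-service:1.0", remove=True)
        print(endpoint, "->", output.decode().strip())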

This foundation gives organizations the ability to determine the best environment in which to deploy workloads that meet their stakeholders' needs. In theory, the use of containers means an organization can procure less hardware, build or rent less data center space, use fewer cloud resources, and hire fewer people to manage those resources. Despite these perks, containers alone are not a complete solution; a container management system is necessary to organize and manage them.

Container Management

To be considered enterprise-grade, a container environment needs to be managed and orchestrated across the organization's resources. Container orchestration is all about managing the life cycles of containers, especially in large, dynamic environments.

Service delivery teams use container orchestration to control and automate many tasks, including the following (a brief code sketch of what this looks like in practice follows the list):

  • Provisioning and deployment of containers
  • Redundancy and availability of containers
  • Scaling up or removing containers to spread application load evenly across host infrastructure
  • Movement of containers between hosts for any reason
  • Allocation of resources between containers
  • Exposure of services running in a container to the outside world
  • Load balancing and service discovery between containers
  • Health monitoring of containers and hosts
  • Configuration of an application in relation to the containers running it
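To make a few of those tasks concrete, here is a minimal sketch using the official Kubernetes Python client (the kubernetes package); the deployment name, namespace, and image are hypothetical placeholders. A single declarative Deployment object covers provisioning, redundancy (three replicas), and the configuration the orchestrator is expected to maintain:

    from kubernetes import client, config

    config.load_kube_config()  # reads your local kubeconfig (e.g. ~/.kube/config)

    # Declarative desired state: "keep three replicas of this container running."
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="my-service"),
        spec=client.V1DeploymentSpec(
            replicas=3,  # redundancy and availability: three copies of the workload
            selector=client.V1LabelSelector(match_labels={"app": "my-service"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "my-service"}),
                spec=client.V1PodSpec(
                    containers=[
                        client.V1Container(
                            name="my-service",
                            image="myorg/my-service:1.0",  # placeholder image
                            ports=[client.V1ContainerPort(container_port=8080)],
                        )
                    ]
                ),
            ),
        ),
    )

    apps = client.AppsV1Api()
    apps.create_namespaced_deployment(namespace="default", body=deployment)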

Container Platforms

Modern container management and orchestration platform software allows IT professionals to deploy multiple containers across several hosts, whether on-premise, on bare metal (servers where software runs directly on the hardware rather than inside a host operating system), via private or public clouds, or all of the above. Organizations use these platforms to increase the scalability and functionality of applications by adding containers and connecting them to information about repositories and networks. They can also improve container security by setting requirements for accessing containers and keeping components separated.

Container management platforms may include orchestration features, but many container orchestration solutions function as a complement to the management platform. Container orchestration or management software is used for managing workload containers and services, facilitating both declarative configuration (the desired end state – in other words, describing what the program should do instead of how it will do it) and automation.
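As a small, hedged illustration of what declarative configuration means in practice (using the Kubernetes Python client and the same hypothetical Deployment as in the earlier sketch): you do not issue step-by-step commands to add or remove individual containers; you update the desired end state and let the orchestrator work out how to converge on it.

    from kubernetes import client, config

    config.load_kube_config()
    apps = client.AppsV1Api()

    # Declarative scaling: state the new desired end state (five replicas) and
    # the orchestrator decides what to do -- which containers to start or stop,
    # where to schedule them, and how to spread the load across hosts.
    apps.patch_namespaced_deployment(
        name="my-service",     # hypothetical Deployment from the earlier sketch
        namespace="default",
        body={"spec": {"replicas": 5}},
    )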

Pursuing a container strategy without a management platform and an orchestration platform is a fool's errand. When embarking on the use of containers, a comprehensive approach must include both, because managing containers at scale simply cannot be done manually.

As Rick explained, when your boss asks you to complete what seems an impossible task, you can respond without a laugh or a snicker (at least not out loud!). You merely need the information about what the service needs "to do." Knowing that technical solutions are available and feasible to address operational needs will make your day easier — and ultimately can help make your task possible.

Visit DLT Solutions for more information on the ‘Portability of Workload Containerization,’ and other solutions for your enterprise.  

 

About the author

Dwight Hawkins is a Marketing Communications Specialist for Tech Data in Tempe, Arizona. In his role, Dwight works on several external communications initiatives, including the Authority blog, web content, an external newsletter for Tech Data channel partners, and executive and presentation content – all with a focus on thought leadership and brand equity. He also hosts a video interview series called “Tech Data Flyby.” Before joining Tech Data, Dwight served for over 20 years in the United States Air Force, garnering experience in public affairs, video and radio broadcasting, and strategic communication. 

Tags: kubernetes, workload containerization, Portability of workload containerization, DLT Solutions, Chief Software Technologist