A Persistent Approach to Containers Pays Off
Containers enable TGEN to package application code with underlying system primitives in a way that allows them to run on any type of physical or virtual machine.
IT organizations that specialize in the management of high-performance computing (HPC) environments have always had little patience for virtualization. Anything that gets between their applications and the server is generally viewed as unneeded overhead. But the new exception to that rule appears to be lightweight containers such as Docker.
A case in point is the Translational Genomics Research Institute (TGEN), which is making use of containers to facilitate collaboration across the research community in which it participates.
TGEN CIO James Lowey says the research organization has embraced containers to make it much simpler for the worldwide clinical labs it works with to share models. Containers enable the institute to package application code with underlying system primitives in a way that allows them to run on any type of physical or virtual machine, whether on-premises or in a cloud.
Lowey says that when TGEN shares a model or other application code via a container, it immediately becomes reusable. "Researchers can substitute their own data and go," says Lowey. "That also makes it easier to reproduce results."
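As a concrete sketch of that "substitute their own data and go" workflow, a model and its system-level dependencies might be packaged in a container image roughly like the one below. This is an illustrative Dockerfile, not TGEN's actual setup; the script name, tool choices, and mount paths are all hypothetical:

```dockerfile
# Hypothetical example: package an analysis script together with the
# system tools it depends on, so any collaborating lab runs the
# identical environment regardless of its own infrastructure
FROM python:3.11-slim

# System-level dependencies the model needs (illustrative choice)
RUN apt-get update && apt-get install -y --no-install-recommends \
        samtools \
    && rm -rf /var/lib/apt/lists/*

# The model code itself (hypothetical script name)
COPY run_model.py /app/run_model.py

# Collaborators bind-mount their own data at /input and collect
# results from /output -- no changes to the image required
ENTRYPOINT ["python", "/app/run_model.py", "--input", "/input", "--output", "/output"]
```

Because the image pins the full software stack, a lab that pulls it and mounts a local data directory reproduces the original environment exactly, which is what makes results reproducible across institutions.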
The challenge, says Lowey, is that most of the applications running across those HPC environments are stateful. Because genomics involves working with a patient's DNA, each patient participating in a trial can generate over 6 terabytes of data. Given the number of patients participating in TGEN trials, the amount of data already exceeds five petabytes, and all of it needs to be retained for anywhere from five to 25 years, depending on the nature of the trial.
Lowey says he chose to work with Portworx, a provider of container data services, whose layer of storage software makes it feasible for a container, regardless of where and how it was created, to access data stored locally on a network-attached storage (NAS) server or a storage area network (SAN), or in a public cloud.
In effect, Portworx provides the layer of persistent storage that makes it possible for application code packaged in a container to access data wherever it's stored. As part of that service layer, Portworx provides a Global File Namespace to keep track of where data resides, and it also enables IT organizations to encrypt data.
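In Kubernetes terms, that kind of persistence layer is typically consumed through a StorageClass and a PersistentVolumeClaim that a containerized workload mounts. The sketch below uses the in-tree Portworx provisioner with illustrative names, replica counts, and sizes; it is a minimal assumption-laden example, not TGEN's actual configuration:

```yaml
# Hypothetical sketch: a StorageClass backed by Portworx, plus a
# claim that an analysis pod could mount for persistent trial data
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: px-replicated
provisioner: kubernetes.io/portworx-volume
parameters:
  repl: "2"            # keep two replicas of each volume
  secure: "true"       # encrypt the volume at the storage layer
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: trial-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: px-replicated
  resources:
    requests:
      storage: 500Gi   # illustrative; real trials run to petabytes
```

A pod that references the `trial-data` claim gets the same volume back wherever it is rescheduled, which is what lets a stateful analysis container move between hosts without copying the underlying data.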
Containers Enable Collaboration
As TGEN collaborates with more organizations, containers have been an effective means of enabling that collaboration because no two clinical labs are standardized on the same IT infrastructure, Lowey explains. Obviously, not every research facility is going to be standardized on Portworx either. But within TGEN, the combination of containers and Portworx significantly reduces the need to duplicate massive amounts of clinical data. That, in turn, results in much more efficient use of IT infrastructure in an HPC environment that continually seeks to maximize utilization.
Compared to most enterprise IT organizations, TGEN is much further along in its transition to containers.
Most of the containers in the enterprise today are deployed on virtual machines, rather than on the bare-metal servers employed by TGEN. The reason is that most enterprise IT organizations don't have the tooling in place to manage containers on bare-metal servers. Deploying containers on virtual machines allows those organizations to take advantage of the portability enabled by containers without having to rip and replace their existing IT management platforms.
However, Charles King, principal analyst for Pund-IT, reports that enterprise IT organizations are coming up to speed rapidly on containers. "Containers are very hot across the enterprise," he says.
King says that it's only a matter of time before more enterprise IT organizations start to make use of containers to replace virtual machines altogether. An IT organization might be able to deploy 20 or more containers on top of a virtual machine, but a bare-metal server that doesn't have resources being consumed by virtual machines should be able to host hundreds of containers.
In a lot of use cases, the cost efficiency of containers running on bare-metal servers quickly becomes too compelling to ignore—especially when organizations no longer need to license commercial virtual machine software from vendors such as VMware.
Naturally, that level of transition might take another year or two to play out across the enterprise. But at this rate, a fundamental shift in how applications are deployed across the enterprise is all but inevitable.