Enterprises want Docker. It’s on many 2016 roadmaps and has become the tech darling of startups and financial services conglomerates alike, notwithstanding its extreme youth.
Despite common perceptions, enterprises don’t need to reach the promised land of a full “DevOps transformation” to start using Docker. They don’t need to have a microservices model or a fleet of full-stack engineers. In fact, Docker is a good fit for enterprises that are in the thick of a multi-year IT transformation and can actually help big teams implement DevOps best practices more quickly. Hybrid cloud is the goal of nearly half of enterprises, most of which are in the process of some kind of DevOps tool chain adoption. Both are messy processes. Enterprises are hiring cloud consultants, consolidating data centers, breaking down barriers between engineering teams, and migrating new applications to Amazon Web Services (AWS) or other public clouds.
Mastering the hybrid cloud
Despite the supposed flexibility benefits of hybrid clouds, it is quite an engineering feat to manage security and scalability across multiple complex systems. Internal dependencies, network complications, and huge on-premises database clusters burden the vast majority of an enterprise’s applications. The idea of moving an application from one cloud to another “seamlessly” is laughable; for most enterprises, cloud bursting is a pipe dream.

This is where Docker fills a critical gap. The top reason enterprises are using Docker is to help them deploy across multiple systems, migrate applications, and remove manual reconfiguration work. Because application dependencies are built into containers, Docker containers significantly reduce interoperability concerns. Docker works equally well on bare metal servers, virtual machines, AWS instances, and so on. As a result, an application that runs well in a test environment built on a public cloud instance will run exactly the same in a production environment in an on-premises private cloud, and applications that run on bare metal servers can also run in production on any public cloud platform.
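As a sketch of how dependencies travel with the container, consider a minimal Dockerfile. The application, base image, and file names here are illustrative assumptions, not taken from the article:

```dockerfile
# Illustrative Dockerfile: everything the app needs is declared here,
# so the resulting image runs the same on bare metal, a VM, a private
# cloud, or an AWS instance.
FROM python:3-slim                    # pin the runtime inside the image
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # bake dependencies into the image
COPY . .
CMD ["python", "app.py"]              # same entry point in every environment
```

Because the image carries its own runtime and dependencies, there is nothing to reconfigure by hand when it moves between clouds.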
Accelerating a DevOps culture
This is good news for enterprises looking to push a DevOps culture transition forward. The DevOps movement is really about moving faster and consuming fewer resources. Enabling developers to provision Docker containers, run tests against them, and deploy to production in minutes is cost-efficient and eliminates a developer’s worst enemy: manual system configuration work. Docker is also a good fit for evolving enterprises because they are usually the most skittish about vendor lock-in. Container standardization makes it that much easier to move across clouds operated by multiple vendors.
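The provision-test-deploy loop described above can be sketched as a short command sequence. The image name, registry address, and test command are hypothetical, and this assumes a Dockerfile and test suite already exist:

```shell
# Build an image for the feature under test (name is hypothetical)
docker build -t myapp:feature-x .

# Run the test suite inside a throwaway container
docker run --rm myapp:feature-x ./run_tests.sh

# If tests pass, tag and push to a (hypothetical) registry,
# then deploy the exact same image to production
docker tag myapp:feature-x registry.example.com/myapp:feature-x
docker push registry.example.com/myapp:feature-x
```

Because the image that passed the tests is the image that ships, no manual system configuration happens between environments.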
Testing as security features mature
However, there are still some major hurdles to clear. Enterprises are rightly concerned about Docker security in hybrid environments. Containers may resemble virtual machines, but they have vastly different implications for system segregation, log aggregation, and monitoring. Enterprise applications often have strict governance procedures that require extensive logging and monitoring, and quite simply, there is no mature orchestration tool that monitors security across multiple Docker clusters. Most monitoring tools on the market have no view of transient instances in public clouds, let alone of sub-virtual-machine entities.

In the event of a security threat, Docker containers currently require a lot of manual security patching. Docker allows you to update your base image, but developers must then manually ensure that the updated base image is actually running in each container. Some form of image inheritance is necessary before Docker is ready for mission-critical enterprise applications.

For enterprises that require multi-tenancy to isolate multiple clients’ environments, Docker is simply not an option: all containers share the same kernel in the same kernel space, which is not equivalent to separate VMs under a hypervisor. Enterprises with sophisticated backup tools may also find that containers add an extra layer of difficulty to getting data shipped on time and to the right places.

Docker is quite possibly the answer to enterprises’ challenges in hybrid cloud, but it is also a brand-new technology that still lacks many of the orchestration and security monitoring tools enterprises need to run it in production. Now is the time for enterprises to investigate Docker, get their applications running in hybrid test environments, and learn their pain points, but probably not yet the time to run Docker clusters in production. Docker is still quite new in Vietnam, but it has become a hot trend and one of the most popular and interesting open-source technologies there.
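The base-image patching gap can be made concrete with a sketch (image and registry names are hypothetical): patching the base image does not update images already built from it, so every downstream image must be rebuilt and every running container recreated.

```shell
# Patch and republish the (hypothetical) base image
docker build -t registry.example.com/base:latest base/
docker push registry.example.com/base:latest

# Images built FROM that base do not pick up the fix automatically.
# Each one must be rebuilt with --pull to fetch the patched base...
docker build --pull -t registry.example.com/app:latest app/
docker push registry.example.com/app:latest

# ...and each running container must be stopped and recreated
docker rm -f app
docker run -d --name app registry.example.com/app:latest
```

Until tooling automates this propagation, every layer of image inheritance is a manual step in an enterprise's patching runbook.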
In July 2015, FPT Software helped organize the first Docker conference in Vietnam, called DockerDay. More than 160 people registered to attend, and participants shared stories about how Docker was being applied in their enterprise environments. To keep pace with the container trend, the Cloud team at FPT Software (FSB) is building a product called Citus™ Containerization, a tool that simplifies and speeds up deployment and migration in cloud application development. Using container technology, Citus™ Containerization creates lightweight, portable, self-sufficient containers from any application, and also provides a Dockerfile editor and automated deployment. With this tool, FPT Software not only demonstrates its cloud-technology capabilities but also gives developers and enterprises a practical way to adopt containers in their development processes. Source: VentureBeat, LogicWork, Docker, FPT.
– KienNH1 –