Making mistakes is part of life. But that doesn’t mean making choices blindly, in life or in technology, without considering whether others have trodden the same path before. Every generation of technology — disruptive or not — brings with it a similar set of challenges to the one before, only in a different form. One of my colleagues in sales says, half-jokingly, that if you want to know where the next set of problems with a new technology will come from, pull out an architectural diagram of a 1960s mainframe: it provides a map for which part of the system needs to be built next. It’s an apt comparison; after all, new technology is no panacea. It merely shifts the need for innovation to another area.
I remember when VMware and virtualization burst onto the scene in the late 1990s. Back then, I was a systems administrator trying to modernize our data centers. My bosses fell in love with virtualization before anyone really understood best practices for the new technology, or who the standard-bearers in the space would be. They insisted that we virtualize our servers at a ratio of 8-to-1, without understanding that hitting a figure on a vendor’s data sheet did not equate to actual business value. And any company that has been in business for more than a decade has a high level of heterogeneity in its IT infrastructure: mainframes, ancient UNIX servers, and desktops in the data center are embarrassing, but they are a reality. Imposing new technology as an all-in general practice ignores this fact.
Sweeping technology initiatives often fail to address the real question: what value will they deliver for our customers?
A very similar form of technology myopia has descended on the marketplace today around containerization. Containers have startups, incumbent vendors, and end users in a frenzy, all espousing the technology as a one-size-fits-all solution. But we don’t yet know how containers’ value proposition will ultimately be defined, or what solutions will win the day.
A recent Gartner report reinforces this skepticism, saying “Industry hype is excessive and is focused too narrowly on a handful of products in the container… market.”
Companies need to consider the real value to customers when embarking on technology initiatives. Otherwise they are just cool engineering projects that often waste time and money. Gartner states, “Enterprises need to understand the capabilities of container orchestration tools, as well as how they fit into an overall data center tooling strategy, because container orchestration tooling alone will not support the production operationalization of containers at scale.”
While container technology has been around for some time — and Docker certainly has made revolutionary inroads by making containers easy to use for the average developer — there are still a number of unknowns about this latest generation of technology. So let’s learn from previous innovation cycles and avoid repeating their mistakes as we adopt products from this one. Here are some tips drawn from the past:
There is a strong business case for containers. But companies should ensure they have a high level of operational maturity, a culture of innovation, and an agile, DevOps-friendly environment to ensure success with container adoption.
Gartner suggests enterprises “capture and prioritize your container orchestration requirements, embed container deployment use case requirements in your overall data center architecture and scrutinize your organization’s ability to deploy, integrate and support the requisite container-related tooling.”
If your organization is considering containers, following these best practices will drive success and ensure you avoid the technology adoption mistakes that have plagued previous generations. Make sure you adopt containers in a way that delivers positive business outcomes, instead of creating yet another mess for your IT teams to clean up later.