The Doppler Quarterly Fall 2019 | Page 57

Every decade or so a new IT infrastructure model barges onto the scene and changes the way organizations use technology. Client/server architectures put computing resources in the back room and doled them out to users in the 1990s. Virtual machines (VMs) created the ability to emulate one computer's resources on another in the 2000s. Cloud hit big in the 2010s, helping companies become more agile and cost focused. Now that we're entering a new decade, what model will dominate the conversation? Based on current trends and expert forecasts, it's clear that the 2020s will be defined by containers and microservices.

Containers, of course, aren't brand new. Some describe the technology as another name for VM partitioning, which dates back to the 1960s. Google introduced a container cluster management system in 2003. Docker popularized the concept with the introduction of its container platform in 2013. Gartner forecasts that half of all companies will use some kind of container technology by 2020. But the emphasis organizations are putting on the technology today is increasing to the point where they're making it a critical part of their overall transformation process.

Companies we talk to are embarking on different journeys – some to the public cloud, some to a hybrid cloud environment, others embracing a blend of hybrid IT. They all have different goals and different timelines. What they have in common is a respect for the value containers and microservices can provide in helping them streamline their IT processes.

Getting containers right is critical to success in any transformational journey. There are many key facets to creating a sound container strategy and questions to answer along the way. How do containers actually work? How does data fit into an overall container plan? How do you secure containers against outside threats? And what's the end goal of a container strategy? What does "good" look like?