Some toolchain leaders are emerging, but mind the cultural and other details to successfully implement a DevOps strategy.

Guest Commentary

January 18, 2018

5 Min Read
Status Report: DevOps Tools

CIOs realize that, to speed the launch of new business applications, they must abandon the slow, rigid, old-school waterfall style of application development and adopt DevOps. This philosophy promises shorter development cycles and faster delivery through the use of agile software-creation tools and the melding of development and operations teams.

That's a tall order, and it becomes especially daunting when IT executives first encounter the head-spinning world of DevOps.

Unfamiliar terms include toolchains and pipelines; new concepts include containers, microservices, and serverless computing. And that's to say nothing of the dizzying variety of open source tools with names like Puppet and Chef.

"A big chunk of the Fortune 500 has acknowledged that change needs to happen," said Chris Riley, DevOps analyst at FixateIO a provider of testing and documentation services, "but it's been implemented by very few."

The good news is that the somewhat chaotic world of DevOps tools gained a measure of order in 2017, with DevOps automation and orchestration tools like Docker, Jenkins, Kubernetes, Chef and Puppet emerging as those preferred by Corporate America for their DevOps toolchains.

Toolchains help automate and manage the various steps in the complex process of building, testing, deploying, and monitoring new applications.
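
To make that concrete, here is a minimal sketch in Python of the kind of automation a toolchain provides: it runs a build, test, and deploy stage in order and halts on the first failure. The specific commands (a Docker build, a pytest run, a kubectl apply) are illustrative stand-ins, not a recommended stack.

```python
import subprocess
import sys

# Illustrative stage commands; a real toolchain would wire these to tools
# such as Jenkins, Docker, Kubernetes, Chef, or Puppet.
STAGES = [
    ("build",  ["docker", "build", "-t", "myapp:latest", "."]),
    ("test",   ["pytest", "--maxfail=1"]),
    ("deploy", ["kubectl", "apply", "-f", "deployment.yaml"]),
]

def run_pipeline():
    """Run each stage in order and stop the toolchain on the first failure."""
    for name, command in STAGES:
        print(f"--- {name} stage ---")
        result = subprocess.run(command)
        if result.returncode != 0:
            sys.exit(f"Stage '{name}' failed; halting.")

if __name__ == "__main__":
    run_pipeline()
```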

 "When I first started, people were struggling to get toolchains right," said Peter Varhol, a longtime technology evangelist and currently director of proactive strategy at Kanda Software, a custom app-development firm. "The DevOps workflow today is fairly well settled."

The idea that enterprises no longer need to cobble together a DevOps toolchain from scratch is a huge leap forward.

There's more to DevOps than tools

Unfortunately, picking the right tools for each part of the process is only one piece of the puzzle. There is also the organizational aspect of merging developers and operational folks, which Varhol said is critical to the success of an organization's DevOps efforts.

"Distributed teams are with us whether we like it or not," he said. The issue is how to make them work. For Varhol, it comes down to "soft" skills. Companies need to start small and pay close attention to the composition and temperament of DevOps teams, he recommended. It's all about breaking down silos and finding people who are receptive to a culture change.

Getting a small DevOps team up and running is a great step in the right direction. But there's another key issue, according to experts who argue that DevOps is more than just a set of tools or an organizational structure -- it's a mindset. "It’s not the thing you do, it is a principle that guides the thing you do," said Riley.

Riley's recommendation is that companies not get too hung up on individual tools, many of which might not even be around two or three years from now. "Companies make the mistake of not figuring out how to orchestrate the pipeline before choosing any sort of tools."

Far more important, he said, is for companies to first create a software delivery pipeline: an overarching workflow system that enables applications to be built, tested, deployed, monitored, and improved in a continuous, automated process, regardless of the specific tools used. Companies, Riley said, should plan for the scenario in which they can drop new tools into the pipeline two years from now without breaking anything.
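
One way to picture such a pipeline is as a fixed sequence of stages behind a common interface, so the tool bound to any one stage can be swapped later without disturbing the rest. The sketch below is hypothetical Python, not any vendor's API; the stage names and stand-in implementations are assumptions for illustration.

```python
from typing import Callable, Dict

# Each stage is a named slot in the pipeline; the function bound to a slot
# can be replaced later (a new build tool, a new test runner) without
# touching the surrounding workflow.
Stage = Callable[[str], None]

def make_pipeline(stages: Dict[str, Stage]) -> Callable[[str], None]:
    def run(app: str) -> None:
        for name in ("build", "test", "deploy", "monitor"):
            print(f"Running stage: {name}")
            stages[name](app)
    return run

# Stand-in implementations; real ones would invoke Jenkins, Docker,
# Kubernetes, and so on. Replacing one entry leaves the pipeline intact.
pipeline = make_pipeline({
    "build":   lambda app: print(f"building {app}"),
    "test":    lambda app: print(f"testing {app}"),
    "deploy":  lambda app: print(f"deploying {app}"),
    "monitor": lambda app: print(f"monitoring {app}"),
})

pipeline("orders-service")
```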

The good news here is that start-ups like Electric Cloud and XebiaLabs, as well as established enterprise software vendors like CA (Automic) and IBM (UrbanCode), have introduced pipeline automation tools.

Looking ahead: More cohesion possible, but not a sure thing

DevOps analysts agree that 2018 could be a watershed year in which much of the confusion is swept away. This will likely happen through a combination of industry consolidation -- reducing the number of players -- and large vendors swooping in with all-encompassing solutions.

Varhol said the traditional IT software vendors have sensed an opportunity and might soon deliver a comprehensive DevOps solution that could offer an alternative to today's point products. He predicted that big names like IBM and Oracle could be jumping into the mix.

He said that an increasing number of enterprises want to do DevOps, but also believe that picking the right tools is "beyond our competency." A larger solution put forth by a major IT vendor could be an exciting development in the DevOps world.

Two other mega-players could also reshape the DevOps landscape in 2018 – Amazon and Microsoft. Since the cloud is the preferred environment for DevOps activities, it only makes sense that the major cloud vendors would be thinking about ways to support DevOps.

Both Amazon and Microsoft have introduced serverless offerings – AWS Lambda and Microsoft Azure Functions. The way it works is that developers deploy code directly to Lambda, for example, and Amazon takes full responsibility for where the code runs. For customers, there is no hardware consideration at all, Riley said. He predicts an uptick in serverless computing adoption in 2018.
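
To make the serverless model concrete, here is a minimal, hypothetical AWS Lambda handler in Python. The event fields and the API Gateway-style response are illustrative assumptions; the point is that the function body is all the developer writes, with no server to provision.

```python
import json

def handler(event, context):
    """Entry point that AWS Lambda invokes; 'event' carries the request
    payload. Where and on what hardware this runs is Amazon's concern."""
    name = event.get("name", "world")  # hypothetical request field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```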

Just a Bit of Advice

There is no one-size-fits-all blueprint for DevOps because every company has a different application stack, different databases, different architectures, and so on. Thus every company needs to figure out a path to DevOps that works for them. Here are some suggestions.

  1. Find a pain point for the business. This could be an application delivery bottleneck or a specific problem; solve that one problem using DevOps thinking, DevOps methodologies, and agile development tools.

  2. Once that's successfully accomplished, move on to the next problem. Always look for ways to automate processes, to link tools with each other, and to make sure the solution fits into your overarching delivery pipeline.

  3. Over time, keep looking for better ways to do things in a never-ending iterative process. "This is a living thing, this is ongoing, there is no one right way," said Chris Riley, DevOps analyst at FixateIO.

Neal Weinberg is an independent technology journalist based in Franklin, Mass. He previously was executive features editor for NetworkWorld.

About the Author(s)

Guest Commentary

The InformationWeek community brings together IT practitioners and industry experts with IT advice, education, and opinions. We strive to highlight technology executives and subject matter experts and use their knowledge and experiences to help our audience of IT professionals in a meaningful way. We publish Guest Commentaries from IT practitioners, industry analysts, technology evangelists, and researchers in the field. We are focusing on four main topics: cloud computing; DevOps; data and analytics; and IT leadership and career development. We aim to offer objective, practical advice to our audience on those topics from people who have deep experience in these topics and know the ropes. Guest Commentaries must be vendor neutral. We don't publish articles that promote the writer's company or product.
