Understanding Fact from Fiction When Moving Legacy to Cloud - InformationWeek


Commentary
1/6/2020 07:00 AM
Cameron Jenkins, EVP, Modern Systems, an Advanced company


Here are three truths and one lie leaders should know before moving to the cloud.

While they might power mission-critical aspects of a business, traditional data centers and legacy monolithic applications are often brittle, old, complex, and tightly integrated. Few can argue with the business and technical benefits of shifting these workloads from traditional data centers to cloud-based infrastructure to remain agile and competitive in today's landscape.

Image: Gorodenkoff - stock.adobe.com

But even though the benefits of the end state might be obvious, the act of moving to a cloud environment can be a tricky process. For CIOs, CTOs, and IT leaders, one of the most important steps is developing a firm understanding of what is fact and what is fiction about this kind of transition, to ensure it is done correctly.

Here are three truths and one lie leaders should know before moving to the cloud.

Truth: Not all workloads should become cloud-native microservices

Cloud-native might be a logical design goal for newly developed cloud workloads, but in some cases there is no need to distill complete legacy functionality down into an independent set of loosely coupled microservices. Sometimes the complexities involved in architecting, managing, scaling, and monitoring highly transactional, atomic workloads make them better suited to a monolithic application. Since such workloads are also updated less frequently, they don't need continuous delivery models supported by their own teams.

Each strategy for a monolith will depend on key business needs and may call for methods like re-hosting "as is" with little to no change, or refactoring to Java or C# to optimize for certain cloud capabilities such as increased elasticity and availability. The key here is not to rush down one path. Rather, companies should take a tailored approach, coupled with roadmaps that outline specific goals for different applications based on their individual requirements. Deciding which capabilities to decouple, and then moving toward a cloud-native microservices architecture from there, will prove far more productive and effective.
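Decoupling capabilities one at a time is commonly implemented with a strangler-fig pattern: a routing facade sends each request either to the legacy monolith or to a newly extracted service. The sketch below illustrates the idea in miniature; all names in it (the "billing" capability, the handler strings) are hypothetical examples, not anything described in the article.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class StranglerFacade {
    // Handlers keyed by capability; anything not yet migrated
    // falls through to the legacy monolith.
    private final Map<String, Function<String, String>> routes = new HashMap<>();
    private final Function<String, String> legacy;

    public StranglerFacade(Function<String, String> legacyHandler) {
        this.legacy = legacyHandler;
    }

    // As each capability is extracted, register its new service here.
    public void migrate(String capability, Function<String, String> handler) {
        routes.put(capability, handler);
    }

    public String handle(String capability, String request) {
        return routes.getOrDefault(capability, legacy).apply(request);
    }

    public static void main(String[] args) {
        StranglerFacade facade = new StranglerFacade(req -> "legacy:" + req);
        // Only "billing" has been extracted so far; everything else stays put.
        facade.migrate("billing", req -> "billing-service:" + req);
        System.out.println(facade.handle("billing", "invoice-42"));  // billing-service:invoice-42
        System.out.println(facade.handle("inventory", "sku-7"));     // legacy:sku-7
    }
}
```

Because the facade owns all routing decisions, each capability can be cut over (or rolled back) independently, which is what makes the tailored, per-application roadmap above workable in practice.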

Truth: You can minimize risk through a combined top-down and bottom-up assessment

When embarking on a transformational journey, all lines of business and stakeholders must be aligned throughout the entire process, and key parties should not be siloed from one another. This requires ongoing conversations about the shift from the onset, and proactively addressing the associated cultural and operational changes that will impact different teams. A top-down and bottom-up analysis should then be executed in tandem, a combination that, done correctly, has been shown to reduce scope by 40-70%.

The top-down analysis can be conducted through workshops such as event storming and domain-driven design (DDD), which allow the future state to be shaped by describing the business and how events flow through the system. The way legacy functionality is used has most likely evolved over the years, so incorporating UX work to build out specific service use cases is also critical.

A bottom-up analysis offers a comprehensive picture of application components and the interrelationships between them. This kind of insight can significantly reduce the cost and complexity of any future effort by isolating unused components, highlighting potential roadblocks, and focusing attention on the areas that need it. It also exposes the legacy application's design and the anatomy of its source code, which is key to eliminating design weaknesses so that the future-state architecture doesn't inherit them.

Truth: The move to cloud is best accomplished as an incremental journey

Through the top-down and bottom-up assessments, organizations can also create strategic roadmaps that outline ways to drive ROI at each incremental step. It's important here to consider the different levels of maturity (i.e., cloud-ready, cloud-optimized, and cloud-native) to determine the best approach. Although cloud-native is often the end goal for monolithic applications, a logical first step is to convert code into cloud-native languages like Java or C#. By doing so, an organization can eliminate its dependency on the mainframe and target a cloud-ready containerized environment.
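What "convert code into cloud-native languages" looks like in miniature: a fixed-point batch calculation of the kind a legacy COBOL paragraph might perform, re-expressed as plain Java so it can run in any container with a JVM, with no mainframe runtime required. The business rule here (simple monthly interest on a balance) is an illustrative assumption, not an example from the article.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class InterestBatch {
    // BigDecimal preserves the exact decimal arithmetic that legacy
    // packed-decimal (COBOL COMP-3) fields rely on; double would not.
    static BigDecimal monthlyInterest(BigDecimal balance, BigDecimal annualRate) {
        return balance.multiply(annualRate)
                      .divide(BigDecimal.valueOf(12), 2, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        BigDecimal balance = new BigDecimal("10500.00");
        BigDecimal rate = new BigDecimal("0.036"); // 3.6% annual rate
        System.out.println(monthlyInterest(balance, rate)); // prints 31.50
    }
}
```

Once the logic runs as ordinary JVM code like this, packaging it into a container image is routine, which is what makes the cloud-ready containerized environment a realistic first milestone.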

In cloud-optimized environments, workloads are optimized further to provide scalability at the container level, while a few high-value capabilities are replaced with new microservices functionality. From there, an organization can continue optimizing at its own pace, moving incrementally toward a cloud-native environment, though some organizations might never transform the entire monolith.

Lie: There’s no reason to worry about operations and infrastructure

Around 40% of the move from legacy to cloud is typically focused on application source code and data conversion, with 40-50% spent on testing and 10-20% on the design, implementation, and management of target operations and hardware infrastructure. Because the legacy environment has well-established operational and infrastructure standards and processes already in place, the target platform should never be an afterthought. As part of this, companies must ensure they are operationally ready first and have the skilled resources needed to support new continuous delivery processes. By building out the target environment as part of the incremental journey, teams will have more time to adjust before any microservices-driven projects start.

An astounding 80% of global corporate data today lives in or comes from mainframes running technology that is often 50 years old. While the move to the cloud has given organizations a way to shed the growing burden of managing these systems, and a new approach to remaining innovative, agile, flexible, and competitive, many have only just scratched the surface of embarking on and effectively completing their journeys. Since one of the biggest hurdles is developing and implementing the right approach, it's up to IT and technology leaders to take the time to tailor their strategies and ensure they will actually drive the right business results now and for years to come.

As Executive Vice President of Modern Systems -- an Advanced company, Cameron Jenkins oversees sales, marketing, technology products and solutions on a global level. Prior to joining Modern Systems, he served as executive director & global practice lead for the Application Modernization division of Dell Services.  
