Trends Transforming a CIO’s Approach to Legacy Modernization

It has been almost 17 years since Y2K, yet many core applications still run on mainframe and AS/400 systems, written in COBOL, Natural, PL/1, REXX, RPG and even Assembler. By now, many of those applications have seen at least one attempt at modernization, often through technologies that convert and consolidate the text-only green-screen transactions into a slightly more visually appealing grey-screen that supports basic web interface constructs and, in some cases, even mobile access. The same is true for applications written in the 1990s to early 2000s in languages such as FoxBase/FoxPro, Smalltalk and even early versions of Java.

In other cases, legacy applications have been replaced with packaged software or Software-as-a-Service solutions, or migrated to less expensive hardware running the same code on a COBOL runtime provided by Micro Focus (in a potential merger with HPE). Additionally, legacy applications have been rewritten in a language and database that the company running the application had more recent experience with, or felt would be easier to maintain for the next 20-25 years (often J2EE or .Net with some form of SQL for storage). Decoupled interfaces and Service Oriented Architectures (SOA) became popular during this first wave of transitions, and often marked the first time a company invested in an enterprise integration platform (EAI or ESB).

However, companies that are still running applications built on those legacy languages and databases are likely exposing themselves to the following risk factors:

· A shrinking pool of resources that can maintain or upgrade those applications

· Long development cycles that prevent making rapid changes to support business demands

· High infrastructure and licensing costs

· Unsupported or obsolete hardware and software technologies that have become single points of failure

Over the past 15 years as an IT consultant, I have assisted numerous companies in migrating legacy applications to J2EE, .Net and software packages, mostly with technologies provided by well-known platform providers on internally hosted infrastructure. These efforts were largely successful and predictable because we knew what the end users wanted: essentially the same thing they had before, but with higher quality, lower costs, improved user experiences and more rapid changes.

During this time, it was relatively easy to architect the new solution because it looked a lot like the previous solution. One form of structured data became a new form of structured data, and application functions became services that performed largely the same functionality. End users knew what to expect and were happy to get better reports and changes on a monthly basis instead of quarterly or even annually. That expectation is now gone.
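To make that pattern concrete, a typical first-wave modernization simply wrapped an existing transaction behind a service interface without changing the business logic. The sketch below, in the JAX-WS style of that J2EE era, is illustrative only: the service, method and helper names are hypothetical, not taken from any actual engagement.

```java
import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

// Hypothetical example: a green-screen-era balance inquiry
// re-exposed as a SOAP web service. Only the interface is
// modernized; the underlying logic stays the same.
@WebService
public class AccountBalanceService {

    @WebMethod
    public double getBalance(String accountNumber) {
        // In a real migration this would delegate to the ported
        // COBOL routine or its J2EE rewrite.
        return legacyBalanceLookup(accountNumber);
    }

    private double legacyBalanceLookup(String accountNumber) {
        // Placeholder standing in for the migrated legacy logic.
        return 0.0;
    }

    public static void main(String[] args) {
        // Publish the service on a local endpoint for testing.
        Endpoint.publish("http://localhost:8080/balance",
                new AccountBalanceService());
    }
}
```

Because the service boundary mirrors the old transaction boundary, the target architecture could be drawn before the project even started, which is precisely what made those efforts predictable.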

Users of systems today no longer know what they want, nor should they. The days of architecting for a known future state are over. As an architect, I used to ask for the business requirements before I began my architecture activities: “Tell me what you want the system to do and I’ll architect it.” “Tell me what your business model is and I’ll incorporate it.” Now my clients are saying, “How can I predict the future? How can I get an architecture that can evolve and accommodate things I can’t even imagine being possible one year or ten years from now?” In July of this year, Capgemini CTO Ron Tolido wrote an article entitled “Enterprise Architects need to become Platform Masters: the art of Jon Snow Architecture,” in which he describes the need for architects to follow the “I don’t know” architecture approach. Here are nine trends that are fueling these insecurities:

1. Disruptive business models (Amazon, Uber, Netflix)

2. Open source technologies becoming mainstream and acceptable (Apache, Mongo, JBoss)

3. Cloud computing becoming mainstream and accepted as secure (Amazon, Azure)

4. The Internet of Things creating Big Data (data from inexpensive sensors, wearables, medical devices, cell phones, connected devices/automobiles)

5. Changes in how identities are managed (blockchain)

6. Changes in how payments are made (Apple Pay, Bitcoin)

7. Increased use of Robotic Process Automation (UiPath, Kofax, Pega) to automate workflow

8. The ability to perform real-time advanced analytics (Artificial Intelligence, Cognitive Computing, Machine Learning)

9. The use of tools to deploy application changes at will (DevOps, Continuous Integration, Continuous Delivery)

And this is just a partial list of the changes and challenges facing an architect charged with deciding how to modernize an application today. So what is the solution? Stop trying to solve it. You can’t. It’s like trying to predict the stock market. Instead, I suggest letting someone else think about it for you. That’s where GE Predix comes in.

Check back next week for a post on how we’re leveraging Predix for legacy modernization.

This post is part of a series on Predix, an Industrial Internet platform created by GE to turn real-time operational data into actionable insights. As a member of the GE Digital Alliance Program – an ecosystem of global systems integrators, independent software vendors, telecommunications service providers, and technology providers – Capgemini provides a range of services for Predix from strategy and application development through implementation. To help business leaders understand how Predix can help with their Industrial Internet journey, Capgemini has developed a series of blogs on the technology behind Predix, and its applicability in various sectors. To learn more, email me at steven.rogers@capgemini.com.