Legacy Replacement with Low-Code


Posted on: February 7, 2019 by Kees Kranenburg

We all know that there is a lot of legacy in our applications today. Most of today’s legacy applications are business-critical. They are voluminous, complex, and poorly documented. Their technologies date from the period in which they were popular: Cobol, PL/1, Oracle*Forms, PowerBuilder, Progress, C, C++, Visual Basic, and so on. Core business applications are based on these ancient technologies, hindering innovation and increasing technical debt. The resources to maintain them are scarce: those who built the applications have left the company or retired.

A recent survey shows that 67% of organizations see legacy applications as the Achilles’ heel of their multi-cloud strategy.

Issues in migrating legacy applications

In this context, many attempts are made to replace legacy applications or to cut their costs. One option is migrating the application by lifting and shifting it to a cheaper (hosting) platform. But with this option the application itself does not change. Its code base stays the same: a Cobol application remains a Cobol application; only the underlying platform changes. So the issues of a poorly documented application and scarce resources to maintain it remain.

Another option is migrating the code base, partly automated, replacing the programming language with a more current one, e.g., Cobol to Java. In this way, maintenance of the application can be done by developers with modern technology skills. But the application architecture is still as it was designed ten years ago, and knowledge of the application and the way it works is still lacking.

According to a recent publication in a Dutch IT magazine, organizations are facing more and more problems in migrating their legacy applications, despite the market experience one would expect to exist by now.

The most popular option is replacement by a standard package. For a long time, package-based solutions were preferred over bespoke development; an "SAP, unless" policy was the doctrine for many CIOs. But recently, some large-scale ERP implementations have failed. One CIO explained: “An ERP implementation cannot last seven years. The pace of change has accelerated in many industries, retail and distribution is not immune. ERP Systems have to cope with the pace of change.”

Go Against The Tide

Times are changing, and rapidly. Today, several organizations follow a “Go Against the Tide” policy to gain a competitive edge by preferring bespoke development on low-code platforms.

Their business case is based on the high productivity of low-code software development. A study by KPMG indicates productivity gains of 30% to 50% in coding, 40% in testing, 75% in deployment, and 50% to 75% in maintenance. Project experiences show a productivity increase of 5 to 10 times compared to Java or .NET development. This extremely high productivity makes the business case positive.
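
As a rough illustration of that business case, consider a back-of-the-envelope calculation. The baseline effort figures below are hypothetical; only the percentage gains come from the KPMG study cited above (taking the midpoint where a range is given).

# Back-of-the-envelope sketch of the low-code business case.
# Baseline person-days per phase are a hypothetical example, not data from this article.
baseline_days = {"coding": 400, "testing": 200, "deployment": 50, "maintenance_per_year": 300}

# Midpoints of the quoted productivity gains: 30-50% coding, 40% testing,
# 75% deployment, 50-75% maintenance.
gain = {"coding": 0.40, "testing": 0.40, "deployment": 0.75, "maintenance_per_year": 0.625}

low_code_days = {phase: days * (1 - gain[phase]) for phase, days in baseline_days.items()}

for phase, days in low_code_days.items():
    print(f"{phase}: {baseline_days[phase]:.0f} -> {days:.0f} person-days")

total_before = sum(baseline_days.values())
total_after = sum(low_code_days.values())
print(f"total: {total_before:.0f} -> {total_after:.0f} person-days "
      f"({1 - total_after / total_before:.0%} less effort)")

Even with these conservative per-phase percentages the effort roughly halves; with the 5-to-10-times figure from project experience the reduction is larger still. Either way, the savings dominate the business case.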

Despite my plea for rebuilding legacy applications with low-code, it is no piece of cake. Knowledge capture is by far the largest hurdle to tackle, as many legacy applications are poorly documented and their original developers are gone. To overcome this issue, application mining lends a helping hand.

The basis for further development

Application mining tools analyze the application code. A typical application mining tool includes a code analyzer for almost every programming language. The tools produce ‘technical documentation’, e.g., the entity-relationship diagram, a high-level architecture view, the database access view, and the transaction/workflow view. After a model-to-model transformation, these diagrams and views can be imported into the low-code platform. Some low-code platforms are equipped with a ‘model API’ to import models from other sources; others have a built-in application mining feature to directly incorporate the DNA of the legacy application into the platform.
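
As a toy illustration of the idea, a crude miner for a single Cobol record could look like the sketch below. The copybook, the PIC-to-type mapping, and the JSON output format are all made up for this sketch and do not reflect any specific mining tool or model API.

import json
import re

# Hypothetical illustration of application mining: extract a crude entity
# model from a Cobol copybook and emit it as JSON that a low-code platform's
# import mechanism could, in principle, consume.
COPYBOOK = """
       01  CUSTOMER-RECORD.
           05  CUST-ID            PIC 9(8).
           05  CUST-NAME          PIC X(40).
           05  CUST-BALANCE       PIC S9(7)V99.
"""

def parse_copybook(text):
    """Turn level-01/05 copybook items into a simple entity description."""
    entity = {"name": None, "attributes": []}
    for line in text.splitlines():
        m01 = re.match(r"\s*01\s+([\w-]+)\.", line)
        m05 = re.match(r"\s*05\s+([\w-]+)\s+PIC\s+(\S+)\.", line)
        if m01:
            entity["name"] = m01.group(1)
        elif m05:
            name, pic = m05.groups()
            # Very rough mapping from PIC clauses to logical types.
            kind = "string" if pic.startswith("X") else ("decimal" if "V" in pic else "integer")
            entity["attributes"].append({"name": name, "type": kind})
    return entity

print(json.dumps(parse_copybook(COPYBOOK), indent=2))

In practice the mined model covers hundreds of entities plus the workflow and database-access views, and is handed to the platform through its model API rather than printed.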

These models are the basis for further development. Following a Scrum approach with sprints of two to three weeks, the data model, the business logic, and the user interface are further developed and refined. Alignment with today’s business processes and requirements results in a revised workflow and a renewed user experience.

Successful cases of rebuilding legacy applications with low-code are not hard to find. Some of them are public and published as success stories on the low-code suppliers’ websites. Atos rebuilt a Java-based asset management & planning system with low-code in 14 months. A C#/.NET and mainframe-based financial transaction system was rebuilt in just 10 months. Both are mission- and business-critical systems.

These examples, and others, show that the use of low-code platforms goes beyond developing a simple mobile app or web page. But above all, with a positive business case in hand, rebuilding legacy applications on a low-code platform should be considered as a way to enable digital transformation and gain a competitive edge.



About Kees Kranenburg

Solution Lead Low-code Platforms
Kees Kranenburg is a Distinguished Expert in the Applications domain at Atos. His field of play is software development and application management, and the organization, processes, methods, and tools needed to professionalize them. Through consultative selling he has brought AMS strategy and innovation into Application Management engagements. His focus areas are low-code platforms, application development and management, and outsourcing. He is a member of the Advisory Council of the University of Arnhem and Nijmegen and a lecturer at the University of Amsterdam. Kees is the author of the books “Model-based Application Development” and “Managing a Software Factory”.
