
How Tried-and-Tested COBOL Underpins Banking & Insurance Digital Transformations

7 min. read

Legacy platforms that run COBOL remain a mainstay across a large swath of private and governmental organizations, especially in the banking and insurance sectors. They should also continue to play a key role in many banks’ and insurance firms’ digital transformations.

All told, over 80% of all in-person transactions rely on COBOL, and the language constitutes the IT backbone at 43% of all banks, according to a Reuters study. Almost 90% of Fortune 500 companies integrate COBOL into their IT systems, which helps explain why 85% of all business transactions use COBOL, according to the U.S.-based consulting firm COBOL Cowboys.

Indeed, there has long been talk of the death of the mainframe and of midrange computers such as IBM’s AS/400 (now System i) that run COBOL. However, these workhorses of the financial world carry on doing what they are meant to do — and it is very unlikely they will be replaced any time soon. In fact, sales of IBM mainframe MIPS continue to grow (although this does include the sale of Linux-based Z Servers as well as incremental extra MIPS for COBOL work).

Considering that COBOL accounts for the majority of all code in use today for business applications — according to COBOL Cowboys — it is in the interest of companies to seek ways to capitalize on their investments in COBOL environments in order to accelerate their digital transformations. 


Figure 1: Grace Brewster Murray Hopper working with UNIVAC I, the first commercial electronic computer that also ran COBOL.
Photo: Unknown (Smithsonian Institution), CC BY 2.0 via Wikimedia Commons.

The Right COBOL Stuff

The insurance and banking industries are good examples of traditional sectors in which firms often have mainframes running COBOL code that dates back decades. Instead of reinventing the wheel by replacing these workhorse mainframes, many banks and insurance firms have been compelled to integrate these servers into data-intensive transactional workloads distributed across multicloud and on-premises environments.

However, DevOps teams face a few challenges when seeking to integrate their COBOL applications into their operations in a way that preserves agility and, in many cases, sees digital transformations through to completion. Today, IT infrastructure is being upgraded to ensure users in remote locations have real-time, as-needed access to data housed on legacy mainframes running COBOL.

With data being the lifeblood of an organization, it is not good practice to have such mainframes operating as discrete, siloed systems. There must be good levels of integration at the data and process levels.

For the financial industry, the siren call of migrating away from COBOL has not been answered with much success. Many firms sit atop systems that have been running since the 1960s and have served well as system-of-record solutions, with only minor changes over that time. Moving such code directly from a legacy environment to a new one is not viable, of course. One alternative has been to rewrite the applications in a more modern language, such as Python, so that they can run on modernized infrastructure where low latency and fast data throughput are essential.

Over the past decade or so, there has been a lot of focus on enabling existing COBOL-based systems to operate within highly complex, disparate environments. IBM itself has a dedicated mainframe modernization team tasked with working with customers on different approaches to make mainframes a peer part of the overall IT platform. The System i team has been similarly tasked with ensuring that those servers can likewise act as peer systems.

An ecosystem has emerged around “Z” and “i” modernization — particularly when it comes to COBOL-based code. IBM offers zAPI — a means of opening up mainframe-based applications with open application programming interfaces (APIs) such that other systems can make standardized calls to the mainframe environment to gain access to process flows and data.
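Once such an API layer is in place, any modern service can reach mainframe-held data with an ordinary HTTP call. The sketch below is a minimal illustration of that pattern, assuming a hypothetical gateway host, endpoint path, and JSON payload shape; the real URL, record structure, and authentication depend entirely on what your API layer exposes.

```python
# Minimal sketch: calling a REST API exposed over a mainframe application.
# The host, path, and JSON shape below are hypothetical placeholders, not a
# documented endpoint; substitute whatever your gateway actually exposes.
import json
import urllib.request

BASE_URL = "https://mainframe-gateway.example.com/api/v1"  # hypothetical gateway

def get_policy(policy_id: str) -> dict:
    """Fetch a policy record from a COBOL-backed application via a REST call."""
    req = urllib.request.Request(
        f"{BASE_URL}/policies/{policy_id}",
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(get_policy("POL-12345"))
```

The point of the design is that the caller neither knows nor cares that the record ultimately comes from a COBOL program; the API gateway handles the translation.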

Companies such as Micro Focus have moved to include COBOL as part of a modern development approach; COBOL can now be included in Agile and DevOps processes using Micro Focus tooling.

However, the main focus must be around ensuring that all data systems can be seen as being part of the same overall data fabric. This does, however, raise some issues. The first decision that must be made is whether to clone the data as a separate RDBMS, making inquiries easier using standard SQL syntax. If this is done, should the COBOL data source then be tied into the RDBMS at a constant synchronization level, maintaining data veracity between the two environments?  If not, what level of non-synchronicity should be allowed, and what impact will this have on the business? That constant data synchronization will also have resource overheads — and the need for an RDBMS in the first place will require additional servers, software licenses, and so on.
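To make the synchronization question concrete, the hypothetical sketch below compares the latest commit timestamps on both sides against a business-defined lag tolerance. Both fetch functions are placeholders for however each environment actually reports its state, and the threshold is an assumed value, not a recommendation.

```python
# Minimal sketch: measuring replication lag between a COBOL data source and
# its RDBMS clone. Both fetch functions are hypothetical stand-ins for however
# each environment actually exposes its latest-commit timestamp.
from datetime import datetime, timezone

MAX_LAG_SECONDS = 30  # assumed business-defined tolerance for non-synchronicity

def latest_source_commit() -> datetime:
    # Placeholder: e.g., read a journal timestamp from the mainframe side.
    return datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)

def latest_clone_commit() -> datetime:
    # Placeholder: e.g., SELECT MAX(updated_at) from the cloned RDBMS table.
    return datetime(2024, 1, 1, 11, 59, 45, tzinfo=timezone.utc)

lag = (latest_source_commit() - latest_clone_commit()).total_seconds()
if lag > MAX_LAG_SECONDS:
    print(f"WARNING: clone is {lag:.0f}s behind source; exceeds tolerance")
else:
    print(f"Clone within tolerance ({lag:.0f}s behind source)")
```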

Agglomerated Data


Figure 2: Proposed Digital Integration Hub (DIH) Architecture

As a possible solution, emerging Digital Integration Hub (DIH) architectures are being adopted as the diversity and volume of data organizations must handle grow faster than their existing technology can keep up with. The need for rapid analysis of data (particularly systems-of-record data) to make informed decisions means that organizations need a single place to find all the data pertinent to their requirements. If relevant data can be placed in one data store, it can be accessed in a standardized manner.


Figure 3: Proposed Data Replication within a DIH.

By holding the data in a low-latency data store (such as an in-memory data grid), results can be gleaned very rapidly, even in real time. DIHs are fast becoming a means of bringing together diverse data across systems, not just standardized RDBMSs and modern NoSQL databases, but also “legacy” systems of record on mainframe and AS/400 (System i) infrastructures that have been difficult to integrate in the past.
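The read path this enables is simple: queries hit the in-memory store first and fall back to the system of record only on a miss. The sketch below is a minimal stand-in, with a plain dictionary playing the role of the data grid and fetch_from_mainframe a hypothetical placeholder for a call into the COBOL back end.

```python
# Minimal sketch of a DIH-style read path: serve queries from a low-latency
# in-memory store, falling back to the system of record only on a miss.
# The dict stands in for an in-memory data grid; fetch_from_mainframe is a
# hypothetical placeholder for a call into the COBOL system of record.

grid: dict[str, dict] = {}  # stand-in for the in-memory data grid

def fetch_from_mainframe(account_id: str) -> dict:
    # Placeholder for a comparatively slow call into the legacy back end.
    return {"account_id": account_id, "balance": 1042.17}

def read_account(account_id: str) -> dict:
    record = grid.get(account_id)          # fast path: in-memory lookup
    if record is None:                     # miss: go to the system of record
        record = fetch_from_mainframe(account_id)
        grid[account_id] = record          # populate the grid for next time
    return record

print(read_account("ACCT-001"))  # first call hits the mainframe stand-in
print(read_account("ACCT-001"))  # second call is served from the grid
```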

When it comes to COBOL-based environments, enabling suitable APIs to access the DIH frees up COBOL DevOps team members, who can then focus on ways to add business value, since the DIH manages integrating and replicating data from COBOL applications.

A DIH typically uses Change Data Capture (CDC) for near real-time synchronization. CDC identifies only the data that has changed within a dataset, allowing high-throughput, ultra-low-latency synchronizations to be triggered to update the DIH.
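A minimal sketch of that pattern follows, assuming an illustrative change journal (an in-memory SQLite table here) and a dictionary standing in for the DIH store. The key idea is that only entries beyond the last-applied sequence number are read and applied, never the full dataset.

```python
# Minimal sketch of Change Data Capture: poll a change journal for entries
# past the last-applied sequence number and apply only those to the DIH store.
# The in-memory SQLite journal and dict-based DIH are illustrative stand-ins.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE journal (seq INTEGER PRIMARY KEY, key TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO journal (key, value) VALUES (?, ?)",
    [("ACCT-001", "open"), ("ACCT-002", "open"), ("ACCT-001", "frozen")],
)

dih_store: dict[str, str] = {}
last_seq = 0  # highest journal sequence already applied to the DIH

def poll_changes() -> None:
    """Apply only the journal entries newer than the last-applied sequence."""
    global last_seq
    rows = conn.execute(
        "SELECT seq, key, value FROM journal WHERE seq > ? ORDER BY seq",
        (last_seq,),
    ).fetchall()
    for seq, key, value in rows:
        dih_store[key] = value  # upsert the changed record into the DIH
        last_seq = seq

poll_changes()
print(dih_store, "last_seq =", last_seq)
```

In a production DIH the journal would be the source system’s change log and the apply step a write into the grid, but the sequence-tracking logic is the same.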

Old Guard

COBOL has formed, and should continue to form, a large slice of the infrastructure backbone of many organizations’ digital transformations. It is also obviously highly unlikely that COBOL will be the lingua franca in which tomorrow’s applications are created. Still, COBOL applications should continue to represent a major component of microservices-connected on-premises and multicloud environments, with shared and agglomerated data in DIHs. The idea is thus to make do with this legacy infrastructure running COBOL by integrating it with a modern one.

It is also important to keep in mind that the COBOL community is shrinking as core COBOL developers retire. So, while infrastructure running COBOL remains in use, there are few COBOL support staff coming through. Indeed, the name of the firm COBOL Cowboys was inspired by “Space Cowboys,” the film starring and directed by Clint Eastwood about a group of elderly aerospace engineers called back to duty because only they have the skills to fix a Soviet-era satellite.

At the end of the day, DevOps teams orchestrating a digital transformation for their organization are obviously not going to earmark investments to develop new COBOL applications while they struggle to maintain and manage their existing COBOL code. While it is tempting to imagine eventually phasing out all COBOL applications and replacing them with modern applications that run on today’s infrastructure, whether on-premises or in multicloud environments, this cannot happen overnight. It will take many years, maybe even decades, before COBOL dies.

A DIH architecture allows organizations to gradually modernize their infrastructure and applications while reducing risk, all the while enabling new digital applications and services that consume data from multiple systems of record, whether COBOL-based or from other sources. This is what will provide the critical infrastructure for an organization’s digital transformation.

