Transforming Mainframes with AI: Optimising Migration and Expanding AI Capabilities
Mainframe systems have played a crucial role in the insurance industry, serving as the foundation for critical operations. These systems excel at managing enormous volumes of data and supporting essential business applications. Mainframes are highly regarded for their reliability, scalability, and impressive computing capabilities, making them invaluable for large-scale data processing tasks. Their ability to handle substantial workloads while maintaining data integrity has made them vital for organisations running complex, mission-critical applications. IBM reports that over two-thirds of the Fortune 100 companies still rely on the IBM Z systems family [1].
With the rapid advancement of technology, the limitations of traditional mainframe systems have become increasingly evident. These systems often struggle to keep up with modern business requirements, impeding agility, scalability, and cost-effectiveness. Unlike the cloud, which can draw on distributed computing power from multiple locations and servers, mainframes are constrained by the computing power within their own hardware. They also integrate poorly with new applications and are rarely the preferred choice for developing new ones. While there is a growing trend towards modernisation and cloud adoption, retiring the mainframe is not a viable option for many insurers.
Artificial Intelligence (AI) has recently experienced remarkable growth and adoption. AI algorithms and machine learning models can analyse massive amounts of data, detect patterns, and provide predictions or recommendations. This transformative technology has profoundly impacted the insurance industry as well: by automating processes, enhancing decision-making, and improving overall efficiency, AI enables insurers to drive significant advancements.
By combining the strengths of both technologies, organisations can unlock new opportunities and achieve greater efficiency in their operations. The robustness and reliability of mainframe systems, coupled with AI's analytical capabilities and predictive power, can lead to significant advancements in data processing, decision-making, and overall business performance. This synergy between the mainframe and AI presents a promising avenue for organisations looking to optimise their computing infrastructure while benefiting from both technologies.
Legacy Application Modernisation
The migration of legacy applications from the mainframe to modern platforms can be a complex and resource-intensive undertaking. However, organisations can simplify this process by harnessing the power of AI. AI plays a significant role in facilitating mainframe modernisation by automating intricate tasks and improving efficiency. Generative AI (Gen AI) is valuable for code conversion, retro-documentation of existing code, and testing, making it easier to modernise legacy applications developed in technologies such as COBOL. This significantly reduces the time and effort required for migration, enabling a smooth transition while preserving data integrity.
The market offers many tools to facilitate the migration of mainframe applications to the cloud. Hyperscalers such as AWS and Google provide advanced tools that harness AI and Gen AI capabilities to evaluate applications and assist in refactoring and migration. IBM, a leader in the field, offers its own tooling, including watsonx™ Code Assistant for Z, a product specifically designed to accelerate the mainframe application lifecycle and streamline modernisation efforts.
These tools provide comprehensive support throughout the entire application development lifecycle. They begin with application discovery and analysis capabilities, enabling developers to gain insights into their existing mainframe applications. The tools also offer code explanations, helping developers understand their code's intricacies and identify areas for improvement. Automated code refactoring is another key feature, allowing developers to selectively refactor elements of their applications and automate the process, saving valuable time and effort. Additionally, the tools provide code optimisation advice, helping developers enhance the performance and efficiency of their code. Finally, developers can use generative AI to transform their code into a modern programming language, as illustrated below.
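As a rough illustration of this kind of workflow, the sketch below prompts a general-purpose chat-completion model to explain a small COBOL paragraph and propose a Java equivalent. It is a minimal sketch only: the OpenAI Python client stands in for purpose-built products such as watsonx Code Assistant for Z (which expose their own interfaces), and the model name and COBOL snippet are invented for illustration.

```python
# Minimal sketch: a generic chat-completion LLM used to explain and convert COBOL.
# Assumptions: the OpenAI client is a stand-in for any Gen AI code assistant;
# the model name and COBOL paragraph below are invented for illustration.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

COBOL_SNIPPET = """
       COMPUTE WS-PREMIUM = WS-BASE-RATE * WS-RISK-FACTOR.
       IF WS-PREMIUM > WS-MAX-PREMIUM
           MOVE WS-MAX-PREMIUM TO WS-PREMIUM
       END-IF.
"""

def explain_and_convert(cobol_source: str) -> str:
    """Ask the model to document the COBOL paragraph and propose a Java equivalent."""
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical model choice
        messages=[
            {"role": "system",
             "content": "You modernise legacy COBOL. Explain the code, then convert it to Java."},
            {"role": "user", "content": cobol_source},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(explain_and_convert(COBOL_SNIPPET))
```

In practice, output of this kind is treated as a first draft that developers review, test, and refine, rather than as production-ready code.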
The Computing Power of the Mainframe
Mainframes have long been recognised for their reliability, scalability, and processing power, making them the backbone of critical business operations. Despite being considered outdated, mainframes possess unique characteristics that make them well-suited for AI applications.
Here are some reasons why mainframes are ideal for AI:
- Vast Accumulation of Valuable Data: Mainframes have accumulated extensive amounts of valuable data, often stored in structured formats. This data can serve as a rich resource for training AI models (see the sketch after this list). By leveraging it, organisations can enhance the accuracy and performance of their AI algorithms, enabling more informed decision-making and predictive capabilities.
- High-Volume Transaction Handling and Complex Computations: Mainframes are designed to handle high-volume transactions and complex computations. This processing power is crucial for AI workloads that require intensive calculations, such as training deep learning models or running complex algorithms. Organisations can accelerate AI training and inference by utilising the mainframe's processing capabilities, reducing time-to-insights.
- Robust Security and Compliance Features: Mainframes have a long-standing reputation for robust security and compliance features. This is particularly important when dealing with sensitive data in AI applications. By reusing mainframes for AI, organisations can leverage their built-in security measures, ensuring data privacy, regulatory compliance, and protection against cyber threats. The latest IBM z16 system features quantum-safe cryptography designed to safeguard current systems and client data against future quantum-era threats.
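To make the first point concrete, the sketch below trains a simple fraud-scoring model on a fixed-width transaction extract of the kind typically unloaded from mainframe datasets. It is a minimal sketch under stated assumptions: the file name, column layout, and fraud label are invented, and pandas with scikit-learn stand in for whichever tooling an organisation actually uses.

```python
# Minimal sketch: training a model on a structured, fixed-width mainframe extract.
# Assumptions: the file name, column layout, and fraud_flag label are invented.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Fixed-width layout, as often produced when unloading mainframe datasets to flat files.
COLSPECS = [(0, 10), (10, 18), (18, 26), (26, 27)]
NAMES = ["policy_id", "premium", "claim_amt", "fraud_flag"]

df = pd.read_fwf("policy_extract.txt", colspecs=COLSPECS, names=NAMES)

X = df[["premium", "claim_amt"]]
y = df["fraud_flag"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(f"Hold-out accuracy: {model.score(X_test, y_test):.3f}")
```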
Bringing AI to where the data resides is a more efficient approach. Reusing existing mainframes for AI initiatives can also be a cost-effective strategy: by repurposing their mainframes and leveraging existing hardware and software resources, organisations can avoid significant investments in new infrastructure.
The latest IBM Telum processor is designed for real-time, large-scale fraud detection and other emerging use cases that rely on deep learning inference. It is the first IBM processor to feature on-chip acceleration for AI inferencing while a transaction is taking place. According to an article published in the Wall Street Journal titled "Mainframes Find New Life in AI Era" [2], IBM plans to incorporate traditional AI capabilities and large language models (LLMs) in the next version of its mainframe system, the IBM Z series. This development aims to revitalise mainframes by enabling them to run AI applications.
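As a rough illustration of this in-transaction inference pattern, the sketch below scores each payment against a pre-trained fraud model before it is authorised. It is an illustration only: the ONNX model file, feature vector, and threshold are assumptions, and ONNX Runtime here stands in for the Z-specific, hardware-accelerated inference stacks that would actually exploit Telum's on-chip accelerator.

```python
# Minimal sketch: low-latency fraud scoring inside the transaction path.
# Assumptions: a hypothetical "fraud_model.onnx" that returns a single fraud score
# per transaction; ONNX Runtime stands in for Z-specific accelerated inference stacks.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("fraud_model.onnx")
INPUT_NAME = session.get_inputs()[0].name

def fraud_score(features: np.ndarray) -> float:
    """Return the model's fraud score for one transaction's feature vector."""
    outputs = session.run(None, {INPUT_NAME: features.astype(np.float32).reshape(1, -1)})
    return float(outputs[0].ravel()[0])

def authorise(features: np.ndarray, threshold: float = 0.9) -> bool:
    # Approve the transaction only if the fraud score stays below the threshold.
    return fraud_score(features) < threshold

# Example: score a single (invented) transaction feature vector.
print(authorise(np.array([120.50, 3.0, 0.0, 1.0])))
```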
The concept of 'AI in a box' is gaining traction, particularly in countries like China. This approach is favoured by governments and companies that value data privacy and have reservations about storing data on public clouds or relying on publicly available data. By incorporating AI capabilities into a self-contained system, 'AI in a box' provides an attractive solution that effectively addresses these concerns.
As organisations grapple with updating their IT infrastructure, mainframe systems continue to play a crucial role. Both paths, migrating workloads off the mainframe and bringing AI to the mainframe itself, should be weighed when formulating the overall modernisation strategy.
References
1. https://www.ibm.com/history/eserver-zseries
2. https://www.wsj.com/articles/mainframes-find-new-life-in-ai-era-1e32b951?mod=hp_minor_pos5
Tile Image courtesy: GenAI