May 26, 2020

How the Cloud Rains on the Mainframe: A Story of Modernization

Posted by Steve Cardella

The financial services community has a long and storied history of computer-controlled record keeping. For many years, IBM mainframes and their applications were the solution of choice for large-scale, online transactional processing systems. While these mainframe systems served users’ transactional needs well, they did not keep pace with their users’ analytical needs. Organizations found mainframe-sourced data expensive, inaccessible and difficult to use.

Mainframe vendors charge a premium for their systems and their operation. The storage and processing power required for large-scale analytics is cost prohibitive compared to modern, cloud-based tools. Mainframes have a fixed capacity, so customers must pay for a system based on peak demand. In contrast, Azure has a pay-as-you-go model, allowing customers to scale systems up or down according to their needs.


Mainframe systems pose many problems to IT professionals:

  • Some mainframes require specialized knowledge to access and extract data for analysis in modern tools.
  • Users might need to use antiquated console development tools and query environments to analyze data.
  • Mainframe databases use obscure, coded names for tables and columns.

Modernizing to Azure gives organizations easy data access, user-friendly interfaces, and more convenient development tools. Moreover, it closes the potential skill gap as users and developers of all levels of experience are empowered to extract and analyze data.

Azure also allows organizations to leverage powerful self-service business intelligence tools and cutting-edge AI platforms that provide a competitive advantage. Self-service tools like Power BI allow users to easily organize and present data without developer support. In fact, it allows out-of-the-box analytics and visualizations that are virtually impossible to recreate with legacy tools.

Who Wants EBCDIC in a Unicode World?

At BlueGranite, one of our clients was struggling with this very issue. They only had indirect access to their mainframe data, meaning they could only access the data encapsulated in thousands of preformatted reports. Unfortunately, those reports often didn’t give them exactly what they wanted. To compensate, some users had created in-house applications that would scrape the data they needed from these report outputs.
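To see why the heading's question matters: mainframe data is typically stored in an EBCDIC codepage rather than ASCII or Unicode, so even reading raw bytes off the host requires a translation step. Here's a minimal, hypothetical sketch using Python's standard library; cp037 (EBCDIC US/Canada) is one common codepage, though the actual codepage in any given shop depends on the mainframe's configuration.

```python
# Hypothetical illustration: the client's actual data and codepage are
# not shown in this post. cp037 is a common EBCDIC codepage that
# Python's standard codecs support out of the box.

record = b"\xc8\x85\x93\x93\x96\x6b\x40\xe6\x96\x99\x93\x84"  # EBCDIC bytes
text = record.decode("cp037")  # translate EBCDIC -> Unicode string
print(text)  # -> Hello, World
```

Modern tools expect UTF-8 by default, so a forgotten decode step like this is a classic source of garbled mainframe extracts.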

Obviously, the existing reports were not ideal for meeting the demands of today’s data-driven world. To solve this problem, BlueGranite first designed a solution to continuously import these thousands of report files into a SQL database. But we soon realized that with such varied data, a traditional ETL approach would require constantly maintaining ETL code. Something else was needed: the situation called for a metadata-driven approach.

With a few bits of metadata maintained by the client, we designed a solution that allowed them to dynamically modify their data export process without developers or custom code. Then, based on that metadata, the solution would automatically create everything necessary to import new or updated reports, which dramatically reduced the need for maintenance.
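The post doesn't show the solution's actual metadata schema, but the core idea can be sketched. Assume, hypothetically, that each report's metadata lists its field names and fixed-width positions; then one generic parser handles every report, and onboarding a new or changed report layout means editing metadata, not code.

```python
# Hypothetical sketch of a metadata-driven import. The report name,
# field names, and column positions below are illustrative, not the
# client's real metadata.

REPORT_METADATA = {
    "daily_balances": [          # report name -> (field, start, end) layout
        ("account_id", 0, 10),
        ("branch", 10, 14),
        ("balance", 14, 24),
    ],
}

def parse_report(report_name, lines):
    """Parse fixed-width report lines using only the metadata layout."""
    layout = REPORT_METADATA[report_name]
    return [
        {name: line[start:end].strip() for name, start, end in layout}
        for line in lines
    ]

sample = ["0000012345NYC 0001500.25"]
print(parse_report("daily_balances", sample))
```

Because the parser is driven entirely by the metadata table, the maintenance burden shifts from developers rewriting ETL code to the client updating a few rows of configuration.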

For the users, the data could now easily be reached by querying a much cheaper, more modern database. The technically savvy, like those who had built the data-scraping applications, were happy to pull the data from there and filter, slice, and dice it to their hearts’ content.

That’s great for the technically savvy, but what about those who live and die in Excel?

For the legions of Excel crunchers, an Excel front-end seemed most appropriate. This solution delivered the data where the analysts felt most comfortable. With easy access to the data, they too can filter, slice, and dice it to their hearts’ content.

Scaling and Looking Forward

Our first iteration used on-prem tools to prove out the technology stack and user tools. While we were building that first iteration, the client’s network and security team vetted Azure’s modern data platform services for compliance with their standards and procedures. We demonstrated that the technology stack and user tools met their complex reporting needs, and that it eliminated time-consuming developer turnaround. When it was time to scale the solution, we wanted to use Azure’s elastic scalability and pay-as-you-go model, as it required fewer upfront costs than scaling out an on-premises solution.

Throughout this period, the security and networking teams had grown more comfortable with Azure. Azure had also introduced new security features that met the client’s requirements for private networking. With these factors in mind, it made sense to attempt a migration to the cloud. Despite the intricacies of tight security measures and a new ETL architecture, the client’s new Azure solution offers nearly unlimited scaling and the beginnings of a data lake to boot. It also laid the foundation for a Power BI implementation that will enable even richer self-service BI solutions in the future.

How Can We Help?

With a solution that helps our client reduce maintenance, improve access to information, and lower costs, their business is in a better, more sustainable place. In times when the future is less certain, two things become increasingly critical:

  1. Clear visibility and attribution of resources for data & analytics workloads
  2. Elasticity to scale up and down according to need while minimizing overspend

At BlueGranite, we can help you gain that insight with our highly trained team of data professionals and implement cost-conscious, scalable solutions. To see how we can help you optimize your data infrastructure, get in touch today!


About The Author

Steve Cardella

Steve is a Senior Consultant at BlueGranite with over 10 years of experience in Business Intelligence and Database Development. He specializes in data integration and distributed computing platforms. In particular, he enjoys working with modern data platform tools, including Azure Databricks, Data Factory, and Synapse Analytics. Steve has worked with a variety of industries, including financial services, retail, and manufacturing.
