iTech-Ed Ltd

Mainframe, Cloud, and the Post-Pandemic IT Landscape


Monday, 1 March 2021


With so much value locked up in data, often held on tape, Model9 wrote in this year’s Arcati Mainframe Yearbook about how to leverage and monetize that data using the cloud.

The year ahead, one more step into what promises to be a tumultuous decade for mainframes, is full of opportunities to do groundwork and lay foundations. Our customers tell us two things, loud and clear: they know they need to do more to leverage their vast store of mainframe data, and they continue to feel squeezed by the costs and limitations of traditional storage options.

Put another way, they know the mainframe is a solid, reliable core for their business, but they recognize competing priorities and opportunities, most of which involve leveraging the cloud for agility, faster and better analytics, and potentially more cost-effective storage.

All of these trends were becoming visible in 2019, but the double hammer blows of Covid disruptions and lockdowns, along with the secondary economic impacts, have powered them to prominence.

While some organizations have found ways to eliminate the mainframe altogether, we believe there is still a strong case for most enterprises to have access to a dedicated processing powerhouse that keeps mission-critical activities running and aligns fully with corporate priorities. But when it comes to the data stewardship occurring within the mainframe environment, there are two closely related problems.

First and foremost, business units, divisions, and even individual contributors are “voting with their feet” – putting more and more data and more and more analytics in the cloud. It’s easy, simple, and can often produce immediate and valuable business results. But this independence limits the potential for cross-fertilization between the new and the old, the central and the peripheral, and between what’s core and what’s potentially just short-term. It is past time to relink those worlds, enhancing data quality for both and making the mainframe even more relevant.

Second, historic dependence on tape for long-term and archive storage, as well as backup and recovery, results in heavy, ongoing investments in hardware, software, and the people to manage them. It also means that those petabytes of data, not to mention active data that is similarly walled off by proprietary forms and formats, are difficult for new cloud-based tools to reach.

A recent report from leading analyst firm Gartner predicts that, “by 2025, 35% of data center mainframe storage capacity for backup and archive will be deployed on the cloud to reduce costs and improve agility, which is an increase from less than 5% in 2020.” That’s momentous!

Cloud is no longer an experiment or a theory. It is a substantial part of the world’s information infrastructure. Even more than the changes wrought in the mainframe world by personal computers and client-server computing a generation ago, cloud promises tectonic change, but change that can benefit mainframe operations. Indeed, unlike mid-range computers and PCs in the past, the cloud shouldn’t be perceived as a threat to the mainframe but rather as a platform for integration and collaboration.

Mainframe organizations should consider where they stand with regard to existing technology, what their business needs and opportunities are, and the available paths forward. That decision-making is the first step on what must become a journey.

Organizations face a continuing challenge in the stranglehold of tape and proprietary data formats, which are built around sequential read-write cycles that date back to 9-track tape. With this technology, both routine data management tasks and more ambitious data movements put an enormous processing drain on the mainframe itself and lead to increased MIPS costs.

The solution is to eliminate the direct costs of acquisition, maintenance, and management and instead deal directly with the data. That means mastering data movement, so that data can be used where it is needed and stored where it is most cost-effective. This step can provide immediate dividends, even if you plan to go no further.

Modern, flexible storage technologies for the mainframe mean you are no longer confined to a narrow list of choices. They can also provide the connectivity that enables movement between the mainframe and the new storage platform. With this enabler, you can be the master of your own data.

The next challenge is achieving real integration of mainframe data with other sources of data in the cloud.

The solution is to take advantage of Extract-Load-Transform (ELT) technology instead of complex, old, slow, compute-intensive ETL approaches. ELT can leverage processing capabilities within the mainframe but outside the general-purpose CPU (eg zIIP engines), together with TCP/IP, to rapidly extract mainframe data and load it into a target. There, transformation to any desired format can occur economically and flexibly. The net result is more cost-effective and generally faster than ETL.
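To make the ordering difference concrete, here is a minimal, illustrative Python sketch. It is not Model9’s implementation, and the function names, record layout, and staging area are hypothetical; the point is simply that ETL transforms records before loading, while ELT lands the raw extract first and defers transformation to the target environment.

```python
# A minimal, hypothetical sketch of the ordering difference between ETL and ELT.
# All names are illustrative; this is not Model9's implementation.

def etl(records, transform, load):
    """ETL: transform every record before loading, paying the processing cost up front."""
    load([transform(r) for r in records])

def elt(records, load, transform):
    """ELT: land the raw records first, then transform them in the target environment."""
    landed = load(records)                     # eg write the raw extract to cloud storage
    return [transform(r) for r in landed]      # transformation uses cloud compute, not the mainframe

# Toy usage: fixed-width "mainframe" records become dictionaries only after they have landed.
raw_records = ["0001SMITH     ", "0002JONES     "]
staging_area = []                              # stand-in for a cloud staging bucket

def load_to_staging(records):
    staging_area.extend(records)
    return staging_area

parsed = elt(raw_records,
             load_to_staging,
             lambda r: {"id": r[:4], "name": r[4:].strip()})
print(parsed)
```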

Building on your movement and transformation capability can help you engage better with cloud applications when and where it makes sense. It is an ideal way to move secondary storage, archive, and even backup data to the cloud, and then transform it to universal formats.
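As one possible illustration of the movement step, assuming the backup or archive data set has already been unloaded to a file and that Amazon S3 is the chosen object store, the transfer itself can be as simple as the following sketch (the bucket, key, and file names are hypothetical):

```python
# Hypothetical sketch: push an unloaded mainframe backup image to cloud object storage.
# Assumes AWS S3 as the target and that credentials are already configured in the environment.
import boto3

def archive_to_cloud(local_path: str, bucket: str, key: str) -> None:
    """Upload one unloaded data set image to an S3 bucket."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)    # the transfer itself runs over TCP/IP

if __name__ == "__main__":
    # Example values only; substitute your own bucket and data set image.
    archive_to_cloud("payroll.backup.G0001V00.dump",
                     "my-mainframe-archive",
                     "backups/payroll/G0001V00")
```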

Liberated from the complexities of the mainframe, this data can be the crucial difference between seeing the full business picture and getting the full business insights, or missing out entirely. It can also provide a powerful addition to a data lake, or be exposed to the latest agile analytical tools, potentially delivering real competitive advantage to the enterprise at modest cost. And all of this without adversely impacting traditional mainframe operations, since ELT can move data in either direction as needed. Transformation can apply to any type of mainframe data, including VSAM, sequential, and partitioned data sets, which can be converted to standard formats such as JSON and CSV.
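For a flavour of what that transformation can look like once the data has landed in the cloud, here is a minimal sketch, assuming a simple fixed-width record layout (the field positions, lengths, and file names are invented for the example), that decodes EBCDIC (code page 037) records and writes them out as both JSON and CSV:

```python
# Hypothetical sketch: convert fixed-width EBCDIC records to JSON and CSV in the cloud.
# The record layout (4-byte id, 10-byte name, 8-byte amount) is an assumed example.
import csv
import json

RECORD_LENGTH = 22

def parse_record(raw: bytes) -> dict:
    text = raw.decode("cp037")                 # EBCDIC (US/Canada) to Unicode
    return {
        "id": text[0:4].strip(),
        "name": text[4:14].strip(),
        "amount": text[14:22].strip(),
    }

def convert(input_path: str, json_path: str, csv_path: str) -> None:
    with open(input_path, "rb") as f:
        data = f.read()
    records = [parse_record(data[i:i + RECORD_LENGTH])
               for i in range(0, len(data), RECORD_LENGTH)]
    with open(json_path, "w") as jf:
        json.dump(records, jf, indent=2)
    with open(csv_path, "w", newline="") as cf:
        writer = csv.DictWriter(cf, fieldnames=["id", "name", "amount"])
        writer.writeheader()
        writer.writerows(records)
```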

Once mainframe data is no longer locked in a silo, it can be leveraged and monetized for new business purposes, right in the cloud, in ways not possible in a traditional tape or VTL environment.

The final challenge for some organizations, for example those that evolved to a highly decentralized operational model, is that a central, on-premises mainframe may no longer deliver benefits and may, in fact, add to latency, cost, and inflexibility.

The solution for these organizations is to recognize that data is the only non-negotiable element. If the data is already in the cloud, or if you can get it there, you can grow more and more native capability in the cloud, ranging from operational applications to analytics. Replicating or matching traditional mainframe functions with cloud-based functionality is challenging but achievable. As the most substantial step in a cloud journey, it is necessarily the most complex, especially when organizations decide to rewrite their existing applications for the cloud. But ELT and the liberation of data from mainframe silos can lay the groundwork and provide a solid basis for finding a workable path beyond the mainframe. In particular, moving historical data from voluminous and expensive tape infrastructure to the cloud permits consideration of a post-mainframe future, if so desired.

Complete migration to the cloud is not for all organizations. But for some it offers an opportunity to transform and grow in new ways.

By enabling the mainframe to play more effectively in the data business value game, cloud connectivity and cloud capabilities can make the mainframe an even more valuable and sustainable platform for the organization. Indeed, these options have the potential to make the mainframe even more integral to an organization’s future.

You can read the full article from Model9 here.

If you need anything written, contact Trevor Eddolls at iTech-Ed.