We are witnessing an era of exponential data growth. According to Statista, global data creation is projected to grow from 64.2 zettabytes in 2020 to over 180 zettabytes by 2025, and this is just the beginning. It is also an era in which data is considered the 'new gold'.
However, for many organisations, data ceases to be an asset and becomes a liability when it is not managed properly.
Organisations are often challenged by an uncontrolled proliferation of large data silos, frequently locked in legacy applications and mushrooming across the operational estate. Very often, these data silos are not integrated or managed properly, leading to:
- Poor data insights (this is how data ceases to become an asset)
- Increased risk of data non-compliance (this is how data becomes a liability)
- Increased cost and risk of maintaining these data silos often locked in legacy applications
When an organisation grappling with these issues undergoes a major change, such as a merger and acquisition (M&A) or a strategic shift to the cloud, it is often forced into expensive application rationalisation and data consolidation exercises that are prone to delays and cost overruns.
So, how can an organisation overcome these challenges, consolidate its data and make it available for better e-discoverability?
In this blog I will explore a few common scenarios that have challenged our clients and how data archiving can address these issues to become a key pillar of our clients' enterprise architecture and vision. The image below gives an overview of a few common data archiving scenarios and the business outcomes they enable.
Legacy application decommissioning
Organisations often have to reluctantly bear the cost and risk of maintaining legacy applications. This is because they hold data that is valuable to the business and may be subject to strict data regulations that require it to be preserved and accessible to support business, legal and regulatory compliance.
However, the OPEX of maintaining such legacy applications is a stubborn, immovable line item that refuses to leave an organisation's annual IT budget. This OPEX includes:
- Application licence and infrastructure costs
- Cost of managing the data backups and disaster recovery
- Expensive skilled resources required to support these legacy applications (often out of vendor support)
Alongside the stubborn OPEX, maintaining a legacy application carries risks, including:
- Being forced to maintain software that doesn't offer an upgrade path or, even worse, whose vendor no longer exists
- Unable to search and/or access data in a timely manner
- Unable to secure the application adequately due to shortcomings in legacy products
- Diminishing skills in the market to support the legacy applications and the hosted infrastructure
Data archiving enables an organisation to decommission its legacy applications by securely archiving legacy data to a compliant, secure platform with real-time GUI access. An organisation can thus eliminate the risks of maintaining legacy applications and release the stubborn OPEX for other, more progressive IT initiatives.
Application data load shedding
Applications maintaining large volumes of data often tend to suffer from performance issues and strain the wider live operational estate they operate in. This not only affects end user productivity in general, but also burdens IT departments with the rising cost and pain of managing such applications and their data.
It is not uncommon to find that a large proportion of an application's data starts its life as very active information, accessed frequently to support live operations. Over time the demand for some of this data gradually fades and it becomes infrequently accessed immutable records. So, in effect, these applications are managing a mountain of infrequently accessed data (think of an iceberg!) that strains the live operational estate, resulting in poor user experience and productivity.
In an engagement with a public service organisation, we helped relieve our client’s live operational estate by archiving the ‘mountain’ of infrequently accessed application data whilst retaining access and applying appropriate retention and security controls. The solution enabled our client to shed ‘archive ready’ data thereby improving application performance, reducing IT costs and improving user experience and productivity.
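A simple rule of thumb makes the 'iceberg' pattern above concrete: records not accessed within a defined window become candidates for archiving. Below is a minimal sketch of that idea; the 365-day threshold and the record fields are illustrative assumptions, not details of the engagement described.

```python
from datetime import date, timedelta

# Illustrative threshold: data untouched for a year is 'archive ready'.
ARCHIVE_AFTER = timedelta(days=365)

def partition_records(records, today):
    """Split records into live vs archive-ready by last access date."""
    live, archive_ready = [], []
    for rec in records:
        if today - rec["last_accessed"] > ARCHIVE_AFTER:
            archive_ready.append(rec)
        else:
            live.append(rec)
    return live, archive_ready

# Hypothetical example records
records = [
    {"id": 1, "last_accessed": date(2020, 1, 10)},
    {"id": 2, "last_accessed": date(2023, 6, 1)},
]
live, archive_ready = partition_records(records, today=date(2023, 7, 1))
```

In practice the access-date signal would come from application audit logs or database metadata, and the threshold would be set per data class in line with retention policy.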
Accelerated cloud migration
Many organisations' journeys to the cloud have been expedited by the recent COVID pandemic. However, the data migration part of the journey is proving to be time consuming, difficult and costly due to the complexities involved in moving large volumes of data onto expensive operational cloud platforms.
An organisation's need to secure and maintain accessibility to data doesn't necessarily translate into migrating all of it onto expensive storage platforms. I have observed, in past engagements, that the volume of 'archive ready' data is usually much larger than the 'migration ready' data. So, archiving large volumes of data not only simplifies migrations but also ensures the data endures at a significantly lower cost while remaining easily accessible.
Hence, there is tremendous merit in identifying early on what really needs to be migrated to support live operations and what can be archived as it will help simplify and accelerate an organisation’s data migration journey to the cloud.
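The merit of separating 'migration ready' from 'archive ready' data early on can be shown with a back-of-the-envelope calculation. The volumes, archive fraction and per-TB prices below are assumptions chosen purely for illustration, not vendor quotes.

```python
# Rough cost comparison: migrate everything to hot cloud storage
# versus archiving infrequently accessed data to a cold tier.
# All figures are illustrative assumptions.

TOTAL_TB = 100            # total data volume to move
ARCHIVE_FRACTION = 0.7    # assumed share that is 'archive ready'
HOT_COST_PER_TB = 23.0    # assumed hot-tier price, $/TB/month
COLD_COST_PER_TB = 2.0    # assumed archive-tier price, $/TB/month

# Option 1: migrate everything to the operational cloud platform
migrate_all = TOTAL_TB * HOT_COST_PER_TB

# Option 2: migrate only live data; archive the rest to a cold tier
tiered = (TOTAL_TB * (1 - ARCHIVE_FRACTION) * HOT_COST_PER_TB
          + TOTAL_TB * ARCHIVE_FRACTION * COLD_COST_PER_TB)

print(f"Migrate everything hot: ${migrate_all:,.0f}/month")
print(f"Archive-first approach: ${tiered:,.0f}/month")
```

Even with these rough assumptions, the archive-first approach cuts the ongoing storage bill by well over half, which is why classifying data before migration pays off.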
Application rationalisation and data consolidation
When an organisation undergoes significant events such as M&A, it often triggers a need for widescale application rationalisation and data consolidation with a view to streamlining the estate and meeting the organisation's overall M&A strategy and objectives.
Such a large-scale application rationalisation and data consolidation programme also presents a great opportunity for an organisation to become more environmentally sustainable by decommissioning redundant applications and consolidating disparate legacy data silos into a low-cost unified archival platform.
However, such rationalisation and consolidation programmes need to be carefully managed to ensure legal, business and regulatory compliance are understood and met when applications are decommissioned, and the data is consolidated.
By introducing data archiving early in the process, it is possible to accelerate an organisation’s journey towards meeting its M&A objectives.
Introducing an archiving capability into an organisation’s enterprise architecture enables the following outcomes:
- Reduces the risk and cost involved in maintaining legacy applications
- Simplifies and accelerates cloud migrations
- Improves application performance and reduces operational overheads
- Enables better e-discoverability of legacy data by consolidating various disparate data silos
- Assures legal, business and regulatory compliance
Ajay Padmanabhan is a Managing Solution Architect in the Insights and Data team at Capgemini in the UK. He has over 18 years of experience in the field of information & content management and has worked across a variety of clients to help deliver their business and information objectives.