The new Basel IV requirements are putting a strain on financial institutions. In the previous round of BCBS-driven requirements, only three banks out of 30 had passed muster with six months left to go. Thus far, none of those significant institutions – some of which are classified as global systemically important banks – has fully implemented the BCBS 239 principles, even though the implementation deadline passed at the start of 2016. This blog addresses the three main data-management issues facing the banking industry.
What are the reasons to look after your data?
There are three reasons why Basel IV is the perfect opportunity to become a data-driven enterprise. First, the BCBS is not only asking for more granular data; it is also limiting banks' freedom to apply internal regulatory calculations. Second, the ECB is getting closer to banks with its thorough inspections and reviews. Room to manoeuvre on calculations is shrinking, and data quality is taking its place. Better data pays off: with better data, capital can be saved and the regulatory capital claim lowered.
A third reason is that, because of the need to improve CET 1 capital, cost reduction remains a primary requirement. Banks must not only invest more in data; they also need to improve the efficiency with which they use it. The banking industry still relies on legacy systems and integrations and therefore needs constant ad-hoc solutions to provide the right data to the regulator. With insufficient data management, banks have to maintain bigger margins on their capital floors and risk-weighted assets (RWAs). That is why banking is no longer just about money but also about data. This comes down to three generic data-management problems.
What are the main data issues which banks are facing today?
Data inconsistency: The transition toward an agile institution is not complete within the financial services industry. The IT landscape is still built and set up in silos, which means that banks have not matured in their data consolidation. Financial institutions need to resolve inconsistencies between branches, entities, and group-level (head) offices in order to improve the efficiency with which data can be compiled when the regulator requires it. Inconsistent data requires additional analysis and modifications to ensure consistent reporting to the regulator. Not only does this increase the cost of any project involving business analysts; inconsistent data also undermines reporting accuracy, forcing banks to keep bigger reserves. Some banks have started data integration and data quality projects, but it will take years to see returns. If, as a bank, you have not started yet, you had better hurry.
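To make the reconciliation problem concrete, here is a minimal sketch of a consistency check between entity-level and group-level figures. All names, fields, and numbers are illustrative assumptions, not any bank's actual data model.

```python
# Hypothetical sketch: reconciling exposures reported by branches against
# the group-level (head office) view. Field names are illustrative only.

def reconcile(entity_records, group_records, tolerance=0.01):
    """Return counterparty IDs whose branch-level exposures do not sum
    to the group-level figure within the given tolerance."""
    entity_totals = {}
    for rec in entity_records:
        cid = rec["counterparty_id"]
        entity_totals[cid] = entity_totals.get(cid, 0.0) + rec["exposure"]

    mismatches = []
    for rec in group_records:
        cid = rec["counterparty_id"]
        if abs(entity_totals.get(cid, 0.0) - rec["exposure"]) > tolerance:
            mismatches.append(cid)
    return mismatches

# Illustrative data: CP2 is inconsistent between branches and group level.
branches = [
    {"counterparty_id": "CP1", "exposure": 100.0},
    {"counterparty_id": "CP1", "exposure": 50.0},
    {"counterparty_id": "CP2", "exposure": 75.0},
]
group = [
    {"counterparty_id": "CP1", "exposure": 150.0},
    {"counterparty_id": "CP2", "exposure": 80.0},
]
print(reconcile(branches, group))  # -> ['CP2']
```

Every mismatch the check surfaces is a record that would otherwise need manual analysis before it could be reported to the regulator.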
Data understanding: Understanding one's own data might sound like a given. However, with ever-more granular data elements populating existing and newly created data sets, it becomes increasingly arduous. A data element in one system can play a completely different role in another; and where data elements do mean the same thing, they need to share the same format, structure, and quality. To ensure that the data within an organization is not only known but also understood, data governance is required across all departments, branches, and business roles. If the data is not understood – or worse, not known – the chance of incorrect reporting increases, and additional resources are constantly required to ensure that the right data flows in the right direction. This can be addressed by setting up data governance tools such as data dictionaries, common data models, and data quality frameworks.
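A data dictionary paired with format rules is the simplest of those governance tools. The sketch below is one possible shape, assuming a dictionary of per-element descriptions and format patterns; the element names and rules are invented for illustration.

```python
# Hypothetical sketch of a minimal data dictionary with per-element
# format rules. Element names and patterns are illustrative assumptions.
import re

DATA_DICTIONARY = {
    "lei": {
        "description": "Legal Entity Identifier (20 uppercase alphanumerics)",
        "pattern": r"^[A-Z0-9]{20}$",
    },
    "exposure_eur": {
        "description": "Exposure amount in EUR, max two decimals",
        "pattern": r"^\d+(\.\d{1,2})?$",
    },
}

def validate(record):
    """Check one record against the dictionary; return (field, problem) pairs."""
    issues = []
    for field, spec in DATA_DICTIONARY.items():
        value = record.get(field)
        if value is None:
            issues.append((field, "missing"))
        elif not re.match(spec["pattern"], str(value)):
            issues.append((field, "bad format"))
    return issues

print(validate({"lei": "529900T8BM49AURSDO55", "exposure_eur": "1200.50"}))  # -> []
print(validate({"lei": "too-short"}))  # -> [('lei', 'bad format'), ('exposure_eur', 'missing')]
```

The point is not the regex details but that the rules live in one shared place, so every department validates an element the same way.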
Data availability: New regulatory requirements focus on receiving the required data on an ad-hoc basis. Both internal and external data requirements are growing at a high pace and demand that all data be available across the entire organization. On top of this, banks are required to show the lineage of this data: Where did it come from? What modifications were made to the raw data? What was the logic driving those modifications? The classic approach, with IT functioning in silos and supporting business demand as and when required, does not meet such regulatory demands. When data flows across an entire organization, its origins are essential to demonstrating its lineage. Many financial institutions will find themselves struggling to provide full data lineage, made even more difficult by legacy systems. Hence, a fully integrated data-management approach across the organization is necessary, and it is crucial to maintain the lineage reports in a way that preserves their history.
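One way to answer those lineage questions is to carry an audit trail with each data point as it is transformed. The following is a minimal sketch under assumed names; real lineage tooling records far more, but the structure is the same: every step appends who changed what, when, and why.

```python
# Hypothetical sketch: a data point that carries its own lineage trail.
# Structure and field names are illustrative, not a regulatory standard.
from datetime import datetime, timezone

def with_lineage(value, source):
    """Wrap a raw value with an initial lineage entry naming its origin."""
    return {
        "value": value,
        "lineage": [{
            "step": "origin",
            "source": source,
            "at": datetime.now(timezone.utc).isoformat(),
        }],
    }

def transform(item, step_name, func, logic):
    """Apply func to the value and append a lineage entry recording
    which step ran and the logic driving the modification."""
    entry = {
        "step": step_name,
        "logic": logic,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    return {"value": func(item["value"]), "lineage": item["lineage"] + [entry]}

raw = with_lineage(1_000_000, source="core-banking-system")
converted = transform(raw, "fx_conversion",
                      lambda v: round(v * 1.1, 2),
                      logic="USD->EUR at assumed rate 1.1")
print(converted["value"])                          # -> 1100000.0
print([e["step"] for e in converted["lineage"]])   # -> ['origin', 'fx_conversion']
```

Because each transformation appends rather than overwrites, the full history survives, which is exactly what the regulator asks to see.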
Data management, along with internal and regulatory reporting, is key to the successful management of banks. Readily available data increases the efficiency of internal data gathering and can thus increase the speed at which banks implement new initiatives. If banks want to minimize their capital floors and free up the maximum amount of available funds, data management must become a priority. Banking is no longer just about money; it is also about data.
Source: ECB, Report on the Thematic Review on effective risk data aggregation and risk reporting, May 2018.