Data Management in ALM is Crucial

As financial institutions embrace asset liability management to ensure customer and regulatory confidence, data management practices and technology step up in importance.

Kaz Kazmi, Director, Baringa

August 6, 2023

4 Min Read

Uncertainty in the banking sector has raised concerns about how financial institutions operate and demonstrate resilience in such times. Numerous analyses and post-mortems have examined the factors that led to the downfall of two prominent banking institutions. These include insufficient governance and controls, flawed management of their balance sheets, and, ultimately, the impact of market forces on their operations.

Over the past decade, larger institutions underwent the most stringent regulatory changes and, therefore, have not seen as much turmoil as their mid-tier counterparts. One key tenet of those regulatory expectations is a high bar for data management, especially for requirements related to capital and liquidity management.

Asset liability management (ALM) and related financial risk management disciplines -- such as liquidity, interest rate risk, and capital management -- have required extensive investments in data infrastructure to ensure high-quality, granular data is available to measure, monitor, and make timely decisions in day-to-day balance sheet management.

There are a number of data challenges in ALM:

  • Data quality

  • Data integration

  • Data analysis

  • Reporting and decision support

The good news is that data management principles have advanced significantly, informed by lessons learned in the years since the 2008 financial crisis.

Enabling Data Management in ALM

The key elements include:

Data governance: Implementing a data governance framework can help ensure that data is accurate, complete, and consistent. Data governance is not just about fixing data quality; it is how an organization defines its data culture, including:

  • The policies around data management

  • The roles and responsibilities that ensure accountability (e.g., for fixing and ensuring data quality) at the appropriate level of seniority and skill set

  • The standards and processes through which data is discovered, assessed, onboarded, integrated, curated and consumed

  • The types of controls required to maintain the highest level of confidence in the data throughout its lifecycle (e.g. data quality and data lineage)

In short, data governance is not merely a means to fix data quality; it is how an organization ensures a critical function like ALM has access to, and trust in, the data it uses to make mission-critical decisions.
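
To make the controls point concrete, here is a minimal sketch of what an automated data quality check might look like for a daily positions extract feeding an ALM process. The file name, column names, and thresholds are illustrative assumptions, not a prescribed standard.

```python
# Illustrative data quality controls for a daily positions extract feeding ALM.
# The file name, columns, and rules below are hypothetical examples.
import pandas as pd

REQUIRED_COLUMNS = ["instrument_id", "balance", "currency", "maturity_date"]

def run_quality_checks(path: str) -> list[str]:
    """Return a list of data quality issues found in the extract."""
    issues = []
    df = pd.read_csv(path, parse_dates=["maturity_date"])

    # Completeness: every required column must be present and populated.
    for col in REQUIRED_COLUMNS:
        if col not in df.columns:
            issues.append(f"missing column: {col}")
        elif df[col].isna().any():
            issues.append(f"{df[col].isna().sum()} null values in {col}")

    # Validity: balances should be numeric and maturities should not pre-date today.
    if "balance" in df.columns and not pd.api.types.is_numeric_dtype(df["balance"]):
        issues.append("balance column is not numeric")
    if "maturity_date" in df.columns:
        stale = (df["maturity_date"] < pd.Timestamp.today().normalize()).sum()
        if stale:
            issues.append(f"{stale} positions with maturity dates in the past")

    # Uniqueness: duplicate instrument records distort aggregated exposures.
    if df.duplicated(subset=["instrument_id"]).any():
        issues.append("duplicate instrument_id records found")

    return issues

if __name__ == "__main__":
    for issue in run_quality_checks("positions_extract.csv"):
        print("DQ issue:", issue)
```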

Data integration tools: Using data integration tools can help financial institutions integrate data from different sources. These tools can automate data extraction, transformation, and loading (ETL) processes, reducing the risk of errors and inconsistencies. The core data for liquidity or ALM management can come from many different sources, and integrating it without proper data acquisition and integration tools is a highly challenging task with many opportunities for breakdown or error. There is a plethora of modern tools in the marketplace that can seamlessly manage the flow of data throughout its lifecycle (source ingestion to staging to integration and curation to provisioning and reporting). These tools come in many forms: some are highly modular (“choose what you need”), others highly consolidated (“full-service functions”). They can be plugged into an organization's architecture with ease and wired up to source systems or existing analytical/calculation tools. Investing in data integration tools is essential to maintaining the flow of data to time-sensitive and high-visibility processes such as those inherent within ALM functions.
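
As a simple illustration of the extract-transform-load pattern described above, the sketch below pulls balances from two hypothetical source exports, standardizes them into a common schema, and loads them into a single staging table. The source file names, column mappings, and SQLite target are assumptions for illustration only, not a reference architecture.

```python
# A minimal ETL sketch: extract from two hypothetical source systems,
# standardize into a common schema, and load into a single staging table.
import sqlite3
import pandas as pd

def extract() -> tuple[pd.DataFrame, pd.DataFrame]:
    # Extract: read raw exports from (hypothetical) deposit and loan systems.
    deposits = pd.read_csv("deposits_export.csv")   # columns: acct_id, bal, ccy
    loans = pd.read_csv("loans_export.csv")         # columns: loan_id, outstanding, currency
    return deposits, loans

def transform(deposits: pd.DataFrame, loans: pd.DataFrame) -> pd.DataFrame:
    # Transform: map source-specific columns onto one common schema.
    deposits = deposits.rename(columns={"acct_id": "position_id", "bal": "balance", "ccy": "currency"})
    deposits["product_type"] = "deposit"
    loans = loans.rename(columns={"loan_id": "position_id", "outstanding": "balance"})
    loans["product_type"] = "loan"
    combined = pd.concat([deposits, loans], ignore_index=True)
    combined["balance"] = pd.to_numeric(combined["balance"], errors="coerce")
    return combined[["position_id", "product_type", "balance", "currency"]]

def load(positions: pd.DataFrame) -> None:
    # Load: write the curated dataset into a staging database for downstream ALM use.
    with sqlite3.connect("alm_staging.db") as conn:
        positions.to_sql("positions", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    deposits, loans = extract()
    load(transform(deposits, loans))
```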

Business intelligence tools: BI tools can help financial institutions analyze large volumes of data quickly and efficiently. These tools are capable of joining different streams of data from multiple sources and let users create their own datasets on the fly with little to no technical expertise. Desktop applications (such as Excel) are ubiquitous in analytics-hungry functions due to their ease of use. However, over-reliance on these applications is a problem: they are not only prone to human error (with little to no controls) but also have a knack for becoming repositories for data that are then used elsewhere in the organization, creating a fit-for-purpose challenge. BI tools are a better alternative, enabling the consolidation of large amounts of data in a desktop-like environment while facilitating custom data analysis capabilities for ALM professionals. That enables analysis of liquidity or cash flow positions in a short period of time, reducing the reliance on technology teams to build complex systems through traditional software development lifecycles.
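
The kind of self-service analysis BI tools enable can be approximated in a few lines of code. The sketch below joins hypothetical inflow and outflow cash-flow extracts and summarizes the net liquidity position by time bucket; the file names, columns, and bucket boundaries are illustrative assumptions rather than a standard methodology.

```python
# Illustrative self-service analysis: combine cash inflows and outflows from two
# hypothetical extracts and summarize the net liquidity position by time bucket.
import pandas as pd

# Hypothetical source files, each with columns: flow_date, amount
inflows = pd.read_csv("cash_inflows.csv", parse_dates=["flow_date"])
outflows = pd.read_csv("cash_outflows.csv", parse_dates=["flow_date"])
outflows["amount"] *= -1  # treat outflows as negative cash

flows = pd.concat([inflows, outflows], ignore_index=True)

# Bucket flows into ALM-style horizons (bucket boundaries are illustrative).
as_of = pd.Timestamp.today().normalize()
days_out = (flows["flow_date"] - as_of).dt.days
flows["bucket"] = pd.cut(
    days_out,
    bins=[0, 30, 90, 365, 10_000],
    labels=["0-30d", "31-90d", "91-365d", ">1y"],
)

gap = flows.groupby("bucket", observed=True)["amount"].sum()
print("Net liquidity position by bucket:")
print(gap)
print("Cumulative gap:")
print(gap.cumsum())
```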

Reporting, analytics, and visualization tools: These are similar to BI tools (and are often part of them). Many organizations have invested heavily in building or acquiring sophisticated and rich reporting tools, with functionality ranging from operational reporting to executive-level reporting for senior management and regulatory reporting. These tools rely on the availability of high-quality data from the appropriate sources but offer plug-and-play characteristics. They enable ALM functions to create many views of the data in an intuitive and increasingly visual manner, adding clarity to the understanding of a financial institution's balance sheet position. Some of the market-leading solutions offer highly interactive reporting functionality that helps senior leaders and decision makers perform dynamic analysis of the data that informs their critical decision-making.
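
Building on the bucketed view above, a reporting layer would typically render the same data visually for senior management. The sketch below charts a net and cumulative liquidity gap with matplotlib; the bucket labels and figures are made-up placeholders, not real balance sheet data or any particular vendor's output.

```python
# Illustrative executive-style view: chart net and cumulative liquidity gap by bucket.
# The bucket labels and figures are made-up placeholders.
import matplotlib.pyplot as plt

buckets = ["0-30d", "31-90d", "91-365d", ">1y"]
net_gap = [-120, 45, 210, 380]  # net cash flow per bucket (illustrative, in millions)

# Cumulative gap: running total of the per-bucket net flows.
cumulative = []
running = 0
for value in net_gap:
    running += value
    cumulative.append(running)

fig, ax = plt.subplots()
ax.bar(buckets, net_gap, label="Net gap per bucket")
ax.plot(buckets, cumulative, marker="o", color="black", label="Cumulative gap")
ax.axhline(0, linewidth=0.8)
ax.set_ylabel("Liquidity gap ($mm)")
ax.set_title("Liquidity gap by time bucket (illustrative)")
ax.legend()
plt.tight_layout()
plt.show()
```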

Data is, and will remain, the lifeblood of financial organizations and of critical functions such as ALM. It not only enables prudent management of balance sheet risk but can also create opportunities for efficiency and optimization of capital resources. It is important to note that each of these practices and tools must be adopted with careful consideration of an organization's size, complexity, and operating environment.

About the Author(s)

Kaz Kazmi

Director, Baringa

Kaz Kazmi is a director focused on advisory services in the banking and capital markets industries at Baringa. Kaz leads Baringa's Data Management offering for the Financial Services business. He specializes in helping clients manage their data, align to business strategy, and reduce risk through people, process, controls, and tools. Kaz has worked in the financial services industry for 18+ years, serving global banking and capital markets institutions and helping them across areas including capital and liquidity management, regulatory strategy, compliance, and regulatory reporting, with a focus on harnessing emerging technologies such as machine learning/AI to solve complex business problems.
