The financial industry has recently seen an escalation of liquidity-related shocks, including sustained interest rate hikes, the March 2020 market turmoil and the collapses of Silicon Valley Bank and First Republic Bank in the US. Alongside continued deficiencies in the reliability of regulatory reporting across the banking industry, banks may find themselves unprepared or ill-equipped to make effective strategic liquidity and capital management decisions.
With regulators set to enhance their liquidity supervision, and risk managers in need of more reliable risk and return
analytics, dynamic and robust asset-liability management (ALM) will become a priority to navigate the volatility of
the financial industry in the future.
ALM spans the entirety of an enterprise and, as such, is acutely impacted by technical debt from enterprise data development and management. A critical challenge for treasury is having sight of data lineage and reconciling the datasets it consumes from across the business. These data processes were often neither originally designed nor provisioned for treasury’s specific modelling and reporting processes. The downstream impacts can be dire: unreliable regulatory reporting, limited analytical capabilities and ultimately impaired strategic insights.
Ultimately, banks need to unlock the potential of their data through refining and reengineering key capabilities. The implementation of an Integrated Treasury Data Product aims to roll back this technical debt by overhauling data consumption into a centralised Treasury environment, anchored in finance and maintained through an effective end-to-end operating model. This includes refining data provisioning from across its business units and subsidiaries to lay the foundation for a group-wide, reliable analytical output which can be leveraged for future regulatory requirements including climate risk management and reporting.
Monocle’s approach focuses on five core principles: data granularity, finance integration, data lineage, data enrichment and data governance.
The design of a robust data model and logical product taxonomy is at the core of a Treasury Data Product implementation. This enforces data standardisation across the enterprise at a level that provides for agile pricing, forecasting and valuation. A Treasury data model must cater for a significant number of products, which in turn requires a comprehensive approach to determining the data attributes that enable risk modelling. Once this is complete, attributes should be grouped to form common elements that enforce standardisation.
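As a minimal illustration of grouping attributes into common elements, the sketch below defines a shared attribute set and layers product-specific attributes on top. The product classes and attribute names are hypothetical, not a prescribed standard:

```python
# Hypothetical common elements shared by every product in the taxonomy.
COMMON_ATTRIBUTES = {
    "instrument_id", "counterparty_id", "currency",
    "notional", "start_date", "maturity_date",
}

# Product-specific attributes layered on top of the common elements.
# Product names and fields are illustrative assumptions.
PRODUCT_ATTRIBUTES = {
    "term_loan": COMMON_ATTRIBUTES | {"rate_type", "repricing_date", "amortisation_schedule"},
    "term_deposit": COMMON_ATTRIBUTES | {"rate_type", "early_withdrawal_flag"},
    "interest_rate_swap": COMMON_ATTRIBUTES | {"pay_leg_rate", "receive_leg_rate", "reset_frequency"},
}

def validate_record(product: str, record: dict) -> set:
    """Return the attributes missing from a record for its product class."""
    return PRODUCT_ATTRIBUTES[product] - record.keys()
```

Checking a record against its product class then surfaces exactly which standardised attributes a source system has failed to supply.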
The data model and data provisioning platforms provide the foundation for future-proofing the solution against future regulatory requirements and evolving business uses, such as advanced analytics.
Many analytics platforms are constrained by aggregated datasets, which limit accuracy and restrict the application of the data to other BI purposes.
The various subsidiaries’ balance sheets/general ledgers form the base of the solution, ensuring that finance and risk data reconcile. Each transaction/trade is tagged with a general ledger account to anchor risk to finance, enabling drill-down to daily and monthly balance sheets. Reconciliations must be designed and implemented between the transactional source systems, the ALM model and the general ledger/financial accounts at cost centre/portfolio/general ledger account/instrument code level for daily reporting. Additionally, pre-existing processes for daily accounting and product control purposes across subsidiaries are leveraged as part of the ALM requirements.
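A reconciliation at that composite key can be sketched as follows. The field names and the break tolerance are illustrative assumptions, not the bank's actual schema:

```python
from collections import defaultdict

def reconcile(transactions, ledger_balances, tolerance=0.01):
    """Aggregate transactions by the reconciliation key
    (cost centre / portfolio / GL account / instrument code)
    and report breaks against the general ledger."""
    aggregated = defaultdict(float)
    for txn in transactions:
        key = (txn["cost_centre"], txn["portfolio"],
               txn["gl_account"], txn["instrument_code"])
        aggregated[key] += txn["amount"]

    breaks = []
    for key, gl_amount in ledger_balances.items():
        diff = aggregated.get(key, 0.0) - gl_amount
        if abs(diff) > tolerance:
            breaks.append({"key": key, "difference": round(diff, 2)})
    return breaks
```

Any non-zero differences beyond the tolerance are surfaced as breaks for investigation, which is what makes daily reporting at this granularity feasible.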
In addition to the core datasets, supplementary and enrichment datasets are included to allow for the construction of dynamic cash-flow profiles used in the calculation of net cash outflows, as well as the valuation and revaluation of instruments/products on both a mark-to-market and an accrual basis. Common enrichment data includes market data such as interest rate curves and foreign exchange rates.
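The role of such enrichment data in valuation can be shown with a minimal discounting sketch. The cash-flow profile and the flat 5% discount rate are purely illustrative; in practice the rate would come from an enriched market-data curve:

```python
def present_value(cash_flows, annual_rate):
    """Discount a cash-flow profile to present value.

    cash_flows: list of (years_from_now, amount) tuples.
    annual_rate: flat annual discount rate (illustrative stand-in
                 for a full discount curve).
    """
    return sum(amount / (1 + annual_rate) ** t for t, amount in cash_flows)

# A three-year bullet loan: 5% annual interest on a notional of 100.
profile = [(1, 5.0), (2, 5.0), (3, 105.0)]
pv = present_value(profile, 0.05)  # prices at par when coupon == discount rate
```

The same cash-flow profile feeds both mark-to-market valuation (via discounting) and net cash outflow calculations (via the undiscounted amounts per bucket).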
With a modular approach, a Treasury Data Product can be expanded with additional datasets to scale the treasury solution for future purposes, including credit risk, deposit insurance and climate risk, which require extensive product and finance data.
Through the identification of authoritative sources to meet the data requirements of the product taxonomy, a Treasury Data Product overhauls data sourcing by validating data availability and highlighting data gaps. Due to differing data requirements between front-end lending and trading systems, the ETL design and development stream will often identify data issues and will then need to enforce and drive standardisation to the agreed-upon data model.
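A simple sketch of enforcing the agreed data model at the ETL boundary is shown below. The schema is a simplified assumption, checking only field presence and type:

```python
# Simplified, assumed target schema for one product class.
SCHEMA = {
    "instrument_id": str,
    "currency": str,
    "notional": float,
    "maturity_date": str,  # ISO-8601 date string expected downstream
}

def validate(record: dict) -> list:
    """Return a list of data-quality issues for one incoming record."""
    issues = []
    for field, expected_type in SCHEMA.items():
        if field not in record:
            issues.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            issues.append(f"bad type for {field}: {type(record[field]).__name__}")
    return issues
```

Records failing validation are quarantined rather than loaded, which is how the ETL stream both highlights gaps in source systems and enforces standardisation over time.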
Technical debt will be a particular challenge for banks that have been forced to use transaction booking “workarounds” on legacy systems that do not provide functionality for more complex trades and loans.
For many banks, the use of an enterprise data warehouse (EDW) has become a critical part of their data strategy in order to break down data silos and provide a centralised gateway for downstream functions. In this case, the Treasury Data Product can integrate with the organisation’s EDW, provided it stores and maintains the required datasets.
Treasury data can only be as robust as the operating model that underpins the entire process. Therefore, the above datasets and processes must be supported by a well-defined target operating model.
Treasury data platform implementation is delivered in two phases: a data component and a treasury component.
Data Solution Scope:
This covers the entire centralised Treasury Data Product implementation from initial assessment to design, development and testing across the decentralised subsidiaries.
Treasury Solution Scope:
This covers the integration of Treasury’s risk engines, reporting processes and other analytical tools with the finalised strategic data store. Additionally, this phase includes enhancements to quantitative modelling, solution prototyping, regulatory reporting, reconciliations and controls, and analytics.