Integrated Treasury Data Product Implementation

The financial industry has seen an escalation of liquidity-related shocks in recent years, including sustained interest rate hikes, the March 2020 market turmoil and the collapse of Silicon Valley Bank and First Republic Bank in the US. Combined with persistent deficiencies in the reliability of regulatory reporting across the banking industry, these shocks leave banks at risk of being unprepared for, or incapable of, making effective strategic liquidity and capital management decisions.

With regulators set to enhance their liquidity supervision, and risk managers in need of more reliable risk and return analytics, dynamic and robust asset-liability management (ALM) will become a priority for navigating future volatility in the financial industry.

Because Treasury management oversees the fundamentals of a bank’s profitability, banking institutions cannot afford to compromise on analytical agility because of unreliable data.

Closing the Spread on Treasury Management


ALM spans the entire enterprise and, as such, is acutely affected by technical debt accumulated through enterprise data development and management. A critical challenge for treasury is gaining sight of data lineage and reconciling the datasets it consumes from across the business, because these data processes were often neither designed nor provisioned for treasury’s specific modelling and reporting needs. The downstream impacts can be dire: unreliable regulatory reporting, limited analytical capabilities and, ultimately, impaired strategic insights.


Ultimately, banks need to unlock the potential of their data by refining and reengineering key capabilities. The implementation of an Integrated Treasury Data Product aims to roll back this technical debt by consolidating data consumption into a centralised Treasury environment, anchored in finance and maintained through an effective end-to-end operating model. This includes refining data provisioning from across the bank’s business units and subsidiaries to lay the foundation for reliable, group-wide analytical output that can be leveraged for future regulatory requirements, including climate risk management and reporting.

Critical Treasury Capabilities in Modern Banking

 

 

  • Daily liquidity and funding monitoring that meets regulatory compliance reporting requirements (e.g., the liquidity coverage ratio (LCR) and net stable funding ratio (NSFR); a simplified LCR calculation is sketched after this list).
  • Balance sheet optimisation to adapt to rapidly changing regulatory requirements and market conditions, and to ensure capital is effectively allocated.
  • Agile stress testing to analyse and forecast risk, as per regulatory and business scenarios.
  • Funds transfer pricing (FTP) that provides profitability insights across the enterprise.
  • Granular, dynamic and actionable insights for front-end users (e.g., trade-level regulatory capital impact and trade-level FTP rates) to guide operational decision-making.
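
To make the first of these capabilities concrete, the sketch below shows a minimal daily LCR calculation in Python. The haircuts, run-off rates and column names are stylised assumptions for illustration only and are not drawn from any specific regulatory template or implementation.

import pandas as pd

# Stylised HQLA holdings and 30-day cash flows; values, haircuts and run-off rates are illustrative
hqla = pd.DataFrame({
    "level": ["L1", "L2A", "L2B"],
    "market_value": [800.0, 150.0, 50.0],
    "haircut": [0.00, 0.15, 0.50],
})
flows = pd.DataFrame({
    "direction": ["outflow", "outflow", "inflow"],
    "amount": [600.0, 200.0, 250.0],
    "rate": [1.00, 0.25, 0.75],                        # run-off / inflow rates
})

weighted_hqla = (hqla["market_value"] * (1 - hqla["haircut"])).sum()
outflows = flows.loc[flows["direction"] == "outflow", ["amount", "rate"]].prod(axis=1).sum()
inflows = flows.loc[flows["direction"] == "inflow", ["amount", "rate"]].prod(axis=1).sum()
net_outflows = outflows - min(inflows, 0.75 * outflows)  # Basel III caps inflows at 75% of outflows

lcr = weighted_hqla / net_outflows
print(f"LCR: {lcr:.1%}")                                 # the compliance target is at least 100%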

Treasury Data Product Implementation


Monocle’s approach focuses on five core principles: data granularity, finance integration, data enrichment, data lineage and data governance.

1. Data Granularity


The design of a robust data model and logical product taxonomy is at the core of a Treasury Data Product implementation. This enforces data standardisation across the enterprise at a level of granularity that supports agile pricing, forecasting and valuation. A Treasury data model must cater for a significant number of products, which requires a comprehensive approach to determining the data attributes that enable risk modelling. Once these attributes have been identified, they should be grouped into common elements that enforce standardisation.
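
As an illustration only, the sketch below shows one way a logical taxonomy could standardise common attributes while allowing product-specific extensions; the class and field names are assumptions rather than a prescribed data model.

from dataclasses import dataclass
from datetime import date

@dataclass
class CommonAttributes:
    # Shared elements applied across all products to enforce standardisation
    instrument_code: str
    counterparty_id: str
    general_ledger_account: str        # anchors each position to finance
    currency: str
    notional: float
    value_date: date
    maturity_date: date

@dataclass
class LoanAttributes(CommonAttributes):
    # Product-specific attributes required for risk modelling of loans
    amortisation_type: str             # e.g. "bullet" or "amortising"
    reference_rate: str
    spread_bps: float

@dataclass
class DepositAttributes(CommonAttributes):
    notice_period_days: int
    behavioural_maturity_date: date    # modelled rather than contractual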

 

The data model and data provisioning platforms provide the foundation for future-proofing the solution against evolving regulatory requirements and business uses, such as advanced analytics.

 

Many analytics platforms are constrained by aggregated datasets, which limit accuracy and restrict the reuse of data for other business intelligence (BI) purposes.

2. Finance Integration


The various subsidiaries’ balance sheets/general ledgers form the base of the solution to ensure that finance and risk data reconcile. Each transaction/trade is tagged with a general ledger account to anchor risk to finance, enabling drill-down to daily and monthly balance sheets. Reconciliations must be designed and implemented between the transactional source systems, the ALM model and the general ledger/financial accounts at cost centre/portfolio/general ledger account/instrument code level for daily reporting. Additionally, pre-existing processes for daily accounting and product control purposes across subsidiaries are leveraged as part of the ALM requirements.
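
A simplified sketch of such a reconciliation is shown below; the dataset shapes, column names and tolerance are assumptions for illustration, not the detail of any specific implementation.

import pandas as pd

GRAIN = ["cost_centre", "portfolio", "gl_account", "instrument_code"]

def reconcile(source_transactions: pd.DataFrame, general_ledger: pd.DataFrame,
              tolerance: float = 0.01) -> pd.DataFrame:
    """Return reconciliation breaks between source systems and the general ledger at the agreed grain."""
    src = source_transactions.groupby(GRAIN, as_index=False)["balance"].sum()
    gl = general_ledger.groupby(GRAIN, as_index=False)["balance"].sum()
    recon = src.merge(gl, on=GRAIN, how="outer", suffixes=("_source", "_gl")).fillna(0.0)
    recon["difference"] = recon["balance_source"] - recon["balance_gl"]
    # Breaks above tolerance are escalated to product control / finance for investigation
    return recon.loc[recon["difference"].abs() > tolerance]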

3. Data Enrichment


In addition to the core datasets, supplementary and enrichment datasets are included to allow for the construction of dynamic cash-flow profiles used in the calculation of net cash outflows, as well as the valuation and revaluation of instruments/products on both a mark-to-market and an accrual basis. Common enrichment datasets include the following (a simplified enrichment join is sketched after this list):

  • Additional product information
  • Valuation configurations
  • Static and reference information
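
For illustration, the sketch below joins core positions to hypothetical product reference and valuation configuration tables so that a valuation basis and discount curve can be resolved for each position; the table and column names are assumptions.

import pandas as pd

positions = pd.DataFrame({
    "instrument_code": ["BND001", "DEP001"],
    "notional": [100.0, 250.0],
})
product_reference = pd.DataFrame({
    "instrument_code": ["BND001", "DEP001"],
    "product_family": ["government_bond", "call_deposit"],
    "valuation_basis": ["mark_to_market", "accrual"],
})
valuation_config = pd.DataFrame({
    "product_family": ["government_bond", "call_deposit"],
    "discount_curve": ["GOVT_CURVE", "SWAP_CURVE"],
})

# Core positions enriched with reference data and valuation configuration
enriched = (positions
            .merge(product_reference, on="instrument_code", how="left")
            .merge(valuation_config, on="product_family", how="left"))
print(enriched)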

With a modular approach, a Treasury Data Product can be expanded to include additional datasets, scaling the treasury solution to future purposes, including credit risk, deposit insurance and climate risk, all of which require extensive product and finance data.

4. Data Lineage


Through the identification of authoritative sources to meet the data requirements of the product taxonomy, a Treasury Data Product overhauls data sourcing by validating data availability and highlighting data gaps. Because data requirements differ between front-end lending and trading systems, the ETL design and development stream will often identify data issues and must then enforce and drive standardisation to the agreed-upon data model.
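
One way to operationalise this data-gap analysis is to validate each source extract against the attributes the agreed data model requires, as in the sketch below; the required-attribute mapping is purely hypothetical.

import pandas as pd

# Hypothetical mapping of product families to the attributes the agreed data model requires
REQUIRED_ATTRIBUTES = {
    "loan": {"instrument_code", "gl_account", "notional", "maturity_date", "reference_rate"},
    "deposit": {"instrument_code", "gl_account", "notional", "notice_period_days"},
}

def find_gaps(extract: pd.DataFrame, product_family: str) -> set:
    """Return the required attributes missing from a source system extract."""
    return REQUIRED_ATTRIBUTES[product_family] - set(extract.columns)

loan_extract = pd.DataFrame(columns=["instrument_code", "gl_account", "notional", "maturity_date"])
print(find_gaps(loan_extract, "loan"))   # {'reference_rate'} -> raised as a data gap with the source system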


Technical debt will be a particular challenge for banks that have been forced to use transaction booking “workarounds” on legacy systems that do not provide functionality for more complex trades and loans.


For many banks, the use of an enterprise data warehouse (EDW) has become a critical part of their data strategy in order to break down data silos and provide a centralised gateway for downstream functions. In this case, the Treasury Data Product can integrate with the organisation’s EDW, provided it stores and maintains the required datasets.

5. Data Governance


Treasury data can only be as robust as the operating model that underpins the entire process. Therefore, the datasets and processes above must be supported by a well-defined target operating model that incorporates the following core elements:

 

  • Service level agreements (SLAs) and data provisioning contracts, defined with the providers of data, to ensure transferred data is complete, accurate, valid and received timeously.
  • Data quality rules to monitor the reliability and integrity of data and to identify discrepancies. These rules should be measured and reported daily through dynamic dashboards; a simple rule check is sketched after this list.
  • Reconciliations designed and implemented between transactional source systems, the asset and liability management model and the general ledger at cost centre/portfolio/general ledger account/instrument code level for daily reporting.
  • Governance boards and methodology forums to ensure alignment between various business units and the central Treasury around the valuation of specific products and instruments.
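
A minimal sketch of what such daily data quality rules might look like follows; the rule names, thresholds and columns are illustrative assumptions rather than a defined rule set.

import pandas as pd

def run_quality_checks(positions: pd.DataFrame) -> pd.DataFrame:
    """Score illustrative completeness, validity and uniqueness rules for daily dashboard reporting."""
    checks = {
        "completeness_gl_account": positions["gl_account"].notna().mean(),
        "validity_positive_notional": (positions["notional"] > 0).mean(),
        "uniqueness_instrument_code": float(positions["instrument_code"].is_unique),
    }
    results = pd.DataFrame({"rule": list(checks), "score": list(checks.values())})
    results["passed"] = results["score"] >= 0.99    # illustrative threshold
    return results    # published daily to monitoring dashboards and governance forums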

How Monocle Can Assist

 

The Treasury Data Product implementation is broken into two phases: a data component and a treasury component.

 

Data Solution Scope:


This covers the entire centralised Treasury Data Product implementation from initial assessment to design, development and testing across the decentralised subsidiaries.

Treasury Solution Scope:


This covers the integration of Treasury’s risk engines, reporting processes and other analytical tools with the finalised strategic data store. Additionally, this phase includes enhancements to quantitative modelling, solution prototyping, regulatory reporting, reconciliations and controls, and analytics.
