Implementing the Expected Credit Loss model for receivables: A case study for IFRS 9


Corporates Treasury

Many companies are struggling with the implementation of the Expected Credit Loss model under IFRS 9. Below we present some examples for the Simplified Approach for receivables from goods and services: what an implementation could look like and which aspects could be automated.

The new impairment model under IFRS 9 requires risk provisioning for expected credit losses, a change from the previous method, which only looked at credit losses actually incurred. Accounting thus moves closer to forward-looking credit-risk management; this requires a model to value credit-loss risk for all financial assets that are not measured at fair value. Today's article focuses on the implementation of the simplified approach, which is also used for receivables from goods and services as well as contract assets (revenue from contracts with customers) under IFRS 15.

IFRS 9 does not stipulate any specific requirements regarding the design of the model. In practice, however, mostly two approaches are used to determine the ECL (expected credit loss):

1. Provision matrices based on company-internal, historical default data and past-due dates (a minimal sketch of such a matrix appears at the end of step 1 below)
2. A valuation method using the probability of default

What both approaches have in common is that they rely heavily on probability-weighted outcomes and that they have to be adjusted for forward-looking macroeconomic information. Often, the historical data held by a company is not very representative, because defaults driven by economic cycles or business models are relatively rare. However, IFRS 9 does not allow a simple projection of past experience; the standard takes the view that a certain probability of default exists even for clients with good credit standing. In other cases, the records available in the enterprise resource planning (ERP) system do not allow a sufficiently granular analysis of historical defaults. For these reasons, a valuation method in which the ECL is determined from the probability of default and then applied to the receivables is a good choice. How this model is applied in three steps is discussed below using some examples.

Example: impairment model

1. Defining the model's parameters

To begin with, the company has to define the required input parameters and the availability of the necessary data. Besides the ERP, data could also be drawn from risk-management systems, e.g. in receivables management, where there is often already quite a bit of relevant data. The following information is necessary for the model:

- Book values of receivables from goods and services as well as contract assets from revenue from contracts with customers;
- Time to maturity of the contracts;
- Collateral;
- Clients' names and addresses;
- Ratings or scorings;
- Probability of default (PD).

In many cases, neither rating/scoring data nor probabilities of default applicable to the company's client base are available, especially for heterogeneous and internationally active corporations. Scoring services, rating agencies and credit insurers can provide relief here, as they usually use exactly this data to determine credit risk. Often, a quick look at Treasury's existing market data system, which offers ratings and default probabilities for many exchange-listed companies as well as at industry level, can at least serve for an initial quantification or for an individualized look at a large client.
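To make these inputs concrete, the following is a minimal sketch of how the collected data could be structured per client or receivable. The field names and example values are illustrative assumptions, not requirements of IFRS 9.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReceivableRecord:
    """Illustrative input record for the ECL model (field names are assumptions)."""
    client_id: str
    book_value: float            # gross carrying amount (basis for the EAD)
    maturity_months: float       # remaining contractual term
    collateral_value: float      # e.g. credit insurance cover or sureties
    rating: Optional[str]        # internal or external rating/scoring, if available
    pd_1y: Optional[float]       # 1-year probability of default from a data service
    region: str = ""             # optional clustering criteria
    industry: str = ""

# Example record drawn from the ERP and enriched with external data
example = ReceivableRecord(
    client_id="C-1001",
    book_value=2_500_000.0,
    maturity_months=6,
    collateral_value=500_000.0,
    rating="BBB",
    pd_1y=0.004,
    region="EMEA",
    industry="Retail",
)
```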
A structured approach also pays off because the corporation has to explain in the Notes the input data, assumptions and methods it applied in determining the loss allowance.
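Before turning to data collection, here is the promised minimal sketch of the first approach named above, a provision matrix: open receivables are grouped into ageing buckets and multiplied by loss rates derived from the company's own default history and adjusted for forward-looking information. The bucket boundaries, loss rates and portfolio below are purely illustrative assumptions.

```python
# Illustrative provision matrix (approach 1): loss rates per ageing bucket.
# Boundaries and rates are assumptions for this sketch, not values from IFRS 9.
provision_matrix = [
    # (days past due up to, expected loss rate)
    (0,    0.003),   # not yet due
    (30,   0.010),
    (60,   0.030),
    (90,   0.080),
    (None, 0.400),   # more than 90 days past due
]

def loss_rate(days_past_due: int) -> float:
    """Return the loss rate of the first bucket that covers the given ageing."""
    for upper, rate in provision_matrix:
        if upper is None or days_past_due <= upper:
            return rate
    return provision_matrix[-1][1]

# Receivables portfolio: (gross carrying amount, days past due)
portfolio = [(120_000, 0), (45_000, 25), (30_000, 70), (10_000, 150)]

ecl = sum(amount * loss_rate(dpd) for amount, dpd in portfolio)
print(f"Expected credit loss under the provision matrix: {ecl:,.0f}")
```

The remainder of the article develops the alternative, PD-based method.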

2. Specification and data collection

In a second step, the corporation actually has to collect the data, and IT has to integrate it into the model. The basis for the valuation is the risk exposure, in this case the book values of the receivables (exposure at default, EAD). Risk-mitigating collateral, such as credit insurance or Hermes sureties, can either be deducted directly from the exposure or incorporated at the end through a weighting factor. Deductibles and other clauses that could leave the corporation holding residual risks should not be ignored.

For the valuation of the risk exposure, the contractual term plays a central role. The empirical rule is: the longer the time to maturity, the higher the default risk; conversely, short maturities carry lower risk. The planned repayments of each client, the remaining term of the recognized receivables and the payment schedule are often not available ad hoc and should therefore be an integral part of the data collected early on for the model. This data may be collected as standard in the ERP or by querying the local entities' reporting packages.

For companies with a heterogeneous client base or very small clients, IFRS 9 offers the possibility of grouping them into separate risk portfolios, so-called clusters. A cluster combines similar risk characteristics, such as region, industry or historical payment behavior, thus allowing the creation of homogeneous risk portfolios which the model then values only at cluster level. Clustering also reduces the amount of data. A partially aggregated view may be acceptable as long as materiality thresholds are respected.

All of the data named so far may be gathered internally, so let's turn to the data that has to be gathered externally in a last step: the client rating/scoring and, in particular, the probability of default expressed as a percentage. The PD has to be properly allocated to each client or risk cluster, and the terms have to be calibrated accordingly. In addition, the corporation should obtain confirmation from the data service about the factors (especially forward-looking information) included in its determination, as this information will have to be disclosed in the Notes. The grouping into risk clusters is also relevant with a view to IFRS 7.

3. Implementation and recognition

Once the company-internal data on clients, receivables and collateral has been gathered and, where appropriate, grouped into suitable risk clusters, and this data has been enhanced with the external data such as ratings and probabilities of default, the information necessary to evaluate the expected credit loss is available. In practice, the following formula is usually used:

ECL = EAD * PD * LGD
(Expected Credit Loss = Exposure at Default * Probability of Default * Loss Given Default)

In this equation, the LGD (Loss Given Default), i.e. the actual loss on the receivable in the event of default, represents the portion of the exposure that is expected to be irrecoverable from the insolvency estate.

Calculation example: The corporation holds an uncovered client exposure of EUR 100m with a residual maturity of 1 year, where the probability of default over 1 year is 1% and the loss given default is assumed to be 50%. This gives expected credit losses of EUR 0.5m (ECL = 100 * 1% * 0.5). For reasons of materiality, no discounting is used in this example.
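The sketch below reproduces this worked example in code and applies the same formula across a few risk clusters, as described in step 2. The cluster names and the PDs and LGDs other than those of the worked example are illustrative assumptions, and discounting is ignored here as well.

```python
# Illustrative ECL calculation per risk cluster: ECL = EAD * PD * LGD.
# Cluster names, PDs and LGDs are assumptions for this sketch.
clusters = {
    # cluster: (EAD in EUR m, 1-year PD, LGD)
    "Large client, uncovered": (100.0, 0.01, 0.50),   # the worked example above
    "EMEA retail, insured":    (40.0,  0.02, 0.20),   # credit insurance lowers the LGD
    "APAC distributors":       (25.0,  0.03, 0.45),
}

total_ecl = 0.0
for name, (ead, pd, lgd) in clusters.items():
    ecl = ead * pd * lgd
    total_ecl += ecl
    print(f"{name:<26} ECL = {ead:>6.1f} * {pd:.1%} * {lgd:.0%} = EUR {ecl:.2f}m")

print(f"Total expected credit loss: EUR {total_ecl:.2f}m")
```

For exposures with longer maturities, the one-year PD would be replaced by a PD calibrated to the remaining term, and material long-dated balances would normally also be discounted.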
When it is calculated for the first time, the expected credit loss is expensed in the income statement via an allowance account for the relevant balance sheet item. This item is then updated at every balance sheet date. Just as under IAS 39, specific valuation allowances are still recorded whenever a loss event occurs, in addition to the ECL.

Integration into processes and systems

So that the impairment calculation does not remain purely theoretical, the implementation should also address how to integrate this model into the processes and IT systems relevant to accounting. Depending on the ERP environment and the group structure, the accounting process could include both top-side adjustments in the consolidated financial statements and a push-down into the local ERP systems. Process efficiency and the risk of error should be weighed in this decision. In practice, group accounting often collects and models the data centrally first and then distributes the risk-provisioning data using standardized ERP reports or reporting systems. Local entities can then access this data, ensuring a consistent logic is applied when they record their local allowances.

Practice shows that automation may be challenging when the calculation uses data that is not also used for the annual reports, such as client details and contractual terms. In order to avoid manual queries and elaborate data mapping, the objective should be to build on data that is already collected regularly and in a standardized manner in the ERP or in the reporting package for the financial statements and Notes. During the implementation, it is advisable to discuss not only the method but also its integration into the accounting-relevant processes, IT systems and the internal control system with your external auditor. Depending on the availability of data and interfaces to external service providers, fully automated impairment solutions may already be available and implementable, ensuring high data quality and efficient processes.

The information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. Although we endeavor to provide accurate and timely information, there can be no guarantee that such information is accurate as of the date it is received, or that it will continue to be accurate in the future. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.

© 2018 KPMG AG is a subsidiary of KPMG Holding AG, which is a member of the KPMG network of independent firms affiliated with KPMG International Cooperative ("KPMG International"), a Swiss legal entity. All rights reserved.

Implementing the Expected Credit Loss model for receivables, June 2018

Analytics in Treasury: Necessity or Mumbo-Jumbo?

Corporates Treasury

Treasury reporting as it is currently practiced is probably one of the company's most painful processes, with relatively limited gains. This is largely because so much of it is based on manual activity, but also because of the limited usefulness of the analyses. The simple data aggregation of the past is in stark contrast to today's analytics, which ranges from linear regressions extracted from simple data cubes to multi-variable algorithms that allow complex risk performance indicators to be forecast. The high concentration of quantitative skills in Treasury creates a natural affinity for analytical methods and applications, which is why this department is predestined to move into analytics.

So what is meant by (business) analytics? A possible definition could be: Business analytics (BA) is the practice of iterative, methodical exploration of an organization's data, with an emphasis on statistical analysis. BA is used by companies committed to data-driven decision-making.

BA thus supports decisions and therefore has to take place before the decision is made. However, it also comes into play after the decision, because it helps answer whether a decision or measure has produced the expected results and whether the process was effective and efficient. A daily financing status, or a liquidity plan derived from recorded receivables, commitments and corporate planning, is therefore unlikely ever to count as business analytics. Deriving measures from data becomes possible because data is turned into information, which is then interpreted and validated for its reliability (always subject to a caveat). Moreover, current and past data can be examined for patterns in order to make the future more predictable. In view of all the imponderables of the future, in many cases this is the only available method.

So what are the fields of application of business analytics?

Business analytics in Treasury

All of the functions in Treasury could benefit from business analytics, even the compliance function. Let's look at three examples:

1. Liquidity planning

One possibility for using BA in Treasury is liquidity planning. For certain business models, it is quite possible to extrapolate future scenarios by analyzing past payment streams (taking specific parameters into consideration). The foundation for this is a highly granular view of current cash flows, broken down by meaningful budget positions, group companies and value dates. Statistical extrapolations are then trained and tested on this data. Going a step further, the model can be enhanced with additional source data, which is then validated; such data, for example economic indicators, could help predict future business developments. If a predictive cash-forecasting model also separates the different currencies used by the company, it can additionally serve as a data basis for currency management and for improving hedging strategies (e.g. by reducing the hedging amount to be rolled over). At the same time, the effectiveness of past hedging measures can be analyzed and systematic errors corrected. (A minimal forecasting sketch appears in the implementation discussion below.)

2. Payment operations

Another possibility for BA is payment operations. For instance, statistical anomalies in payment processes can be identified in time and corrected accordingly. An anomaly might be flagged if unusual payments were made to a specific supplier in addition to the normal ones, perhaps exceeding a certain threshold, sent to a new account, and initiated by a different employee than the one who normally performs such payments.

3. Working capital management

Working capital management is a rather large topic, so it may make sense to involve other functions here, such as Accounting and Controlling. Working capital management often interlinks data relevant to Accounting and Controlling with Treasury-specific data. The historical development of receivables and commitments, as well as the purpose of payments, provides an understanding of the past; current order volumes and sales plans provide a basis for future needs. A common database can therefore be used for different purposes and functions. Rules-based engines, or even systems incorporating artificial intelligence, are better suited than a human being to processing all of this data simultaneously. Over time, this allows more consistent and stringent decision-making, and the impact of decisions can be better understood.

Going a step further, we are already in the world of big data. Big data analytics is the analysis of large masses of data of different kinds (big data) in order to detect hidden patterns, unknown correlations and other useful information. The liquidity planning example above already points in that direction: if one were to use big data analytics there, the data would also be enhanced with partially unstructured data from social media feeds, internet articles, forecasts of macro-economic developments or weather data. Other questions that could be answered with the help of big data include:

- Can information on the most important suppliers and clients be used to measure the impact on the company's own cash flows and thus to manage risks?
- Should global corporations use big data analyses to re-route cash and short-term investments to certain countries and certain banks, taking risk criteria into account?
- Can market signals also be derived from unstructured data in order to examine different scenarios in medium-term liquidity planning?

In all of these cases it becomes obvious that the point is less to reduce manual intervention than to gather insights that would not have been possible before, due to a lack of accessible data and evaluation options. However, current technologies may still be too cumbersome for big data to deliver a genuinely positive return.

How are analytics solutions implemented?

First of all, the problem has to be restated so that it can be described more precisely.
A precise definition including all of the relevant (and known) dependencies will help create a complete mapping to the underlying business process.
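Tying the liquidity planning example above to this implementation step, here is a minimal sketch at the simplest end of the modelling spectrum described next: an ordinary least-squares extrapolation of monthly operating cash flows from a trend and a seasonal term. The data and features are invented for illustration only; a real model would be trained and validated on the company's own categorised cash flow history.

```python
import numpy as np

# Illustrative monthly net operating cash flows (EUR m) for 24 months.
# In practice these would come from a granular, categorised cash flow history.
rng = np.random.default_rng(0)
months = np.arange(24)
seasonal = 5.0 * np.sin(2 * np.pi * months / 12)          # simple seasonality
cash_flows = 20.0 + 0.3 * months + seasonal + rng.normal(0, 1.5, 24)

def features(t: np.ndarray) -> np.ndarray:
    """Feature matrix: intercept, linear trend, and a seasonal sine/cosine pair."""
    return np.column_stack([
        np.ones_like(t, dtype=float),
        t,
        np.sin(2 * np.pi * t / 12),
        np.cos(2 * np.pi * t / 12),
    ])

# Ordinary least squares fit on the history ...
beta, *_ = np.linalg.lstsq(features(months), cash_flows, rcond=None)

# ... and extrapolation of the next six months for the liquidity plan.
future = np.arange(24, 30)
forecast = features(future) @ beta
for t, f in zip(future, forecast):
    print(f"Month +{t - 23}: forecast net cash flow EUR {f:.1f}m")
```

From there, the model can be scaled up, for example with per-currency or per-entity drivers, toward the more complex approaches discussed below.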

In a next step, an adequate database for the analytics solution has to be created. This could be data from one or several sources, brought together in a data model, which can then be used to perform the necessary analyses. The complexity of the statistical forecasting models, from simple regressions up to multivariate neural networks, is freely scalable and depends on the required accuracy of the forecast and on the data available. For instance, bank holidays, payment plans or trends can easily be reflected in a cash-flow forecast. If external data is also to be integrated into the forecasting model, however, more complex modelling and validation become necessary; commodity price indicators, for instance, can have a delayed impact on certain plan items of certain entities or even entire countries. In addition, increasing complexity makes the interdependencies less transparent and thus harder to grasp. Understanding the model, however, is essential, because ultimately it is the users as well as Management who have to trust the resulting forecasts.

Finally, historical and forecast data should be processed and visualized in an appropriate report so that statements and results are immediately recognizable. The business intelligence market offers various user-friendly applications, including ones that do not create a lifelong dependency on advisory services.

The economic benefit should be a given

Even if the sky is the limit as far as the uses of Treasury analytics are concerned, and its use is generally sensible, it should also be examined from a cost/benefit perspective. What does an implementation cost, and what quantifiable benefit does the application generate? The answers will differ greatly between, say, a retail business undergoing dramatic change and therefore needing much more precise liquidity planning, and a medium-sized food manufacturer contemplating complex commodity exposure models.

The information contained herein is of a general nature and is not intended to address the circumstances of any particular individual or entity. Although we endeavor to provide accurate and timely information, there can be no guarantee that such information is accurate as of the date it is received, or that it will continue to be accurate in the future. No one should act on such information without appropriate professional advice after a thorough examination of the particular situation.

© 2018 KPMG AG is a subsidiary of KPMG Holding AG, which is a member of the KPMG network of independent firms affiliated with KPMG International Cooperative ("KPMG International"), a Swiss legal entity. All rights reserved.