Credit Scoring: From Concept to Reality. Credit & Collections Conference, Boston: June 11th, 2007



2 Agenda
1) Developing & Launching the Credit Scoring Plan - Tom Kritzer, Navistar Financial Corporation
2) Crunching the Numbers - Tom Ware, PayNet Analytical Services
3) Making It a Reality - Bill Gillin, GE Capital Solutions

3 Developing & Launching the Credit Scoring Plan - Tom Kritzer, Navistar Financial Corporation

4 The Vision
- Understanding the Portfolio: segmentation, performance, SG&A, etc.
- Automation: the end game
- Leverage staff / better control of buying practices
- Preparation for the future: transparency, regulation, compliance

5 Pooled vs. Custom vs. Hybrid Models

Pooled-Data Scorecard
- Size of data sample for model building: Very Large
- Applicability to your unique situation: Varies from Good to Fair
- Ability to benefit from types of data only you have: No direct method (can build rules and/or matrix)
- Cost: Low
- Analogy: off-the-rack suit, unaltered

Custom Scorecard
- Size of data sample for model building: Varies from Medium to Small
- Applicability to your unique situation: Excellent
- Ability to benefit from types of data only you have: Excellent
- Cost: Medium to High
- Analogy: custom-tailored suit (but lacking the benefits of large-scale production)

Hybrid "Custom-Pooled" Scorecard
- Size of data sample for model building: Very Large (in most respects)
- Applicability to your unique situation: Excellent
- Ability to benefit from types of data only you have: Excellent
- Cost: Medium to Low
- Analogy: off-the-rack suit, altered by a tailor for optimum fit

[Chart: the three model types plotted by predictiveness vs. cost]

6 Pooled vs. Custom vs. Hybrid Models
[Diagram: the three model types plotted by breadth of lending institutions (x) and breadth of data types (y), highlighting your institution and the data unique to it (e.g. "program" data); color key: pooled = blue/green, custom = yellow/green, hybrid = all colors]

7 Choosing Your Partners: 3 Rules
Model development firm should:
1) Know your industry, having done prior models for your segments and equipment types. Credit score development is both art and science; the modeler who knows more about the industry will be better able to find meaningful and predictive segmentations and variables:
- And know to look for things like seasonal patterns in construction lending, but only in Northern states
- Or that in truck lending, more trucks is better, except that 2 or 3 is often worse than 1 (unless it's medium duty)

8 Choosing Your Partners: 3 Rules
Model development firm should:
2) Have an interactive style, taking extensive time to collaborate with your Subject Matter Experts, not just take your data file and then present you with a finished model. Your staff knows your business and can help guide the modeler to find the most meaningful and predictive nuances, if the modeler cares to listen. Fundamentally, credit scoring is all about systematically capturing and structuring knowledge; while the expected relationships need to be quantitatively verified, they first need to be identified in order to be tested.

9 Choosing Your Partners: 3 Rules
Model development firm should:
3) Be willing and able to provide a full range of score-related services and consulting to help you implement and manage your scoring, both upfront and on an ongoing basis. The credit score is great, but where should the score cut-offs be? Up to what dollar amount? What review rules should be in place as safeguards? What default and loss rates will we have, given our applicant population and score cut-offs? And how should scoring be managed over time? What monitoring is necessary? Did the score perform as expected? When is it time to rebuild?

10 Choosing Your Partners: Firm Type

Independent Score Developer
- Pros: no question of conflict of interest; broader understanding of data sources
- Cons: relationship is usually just one project

Data-Affiliated Score Developer
- Pros: better understanding of own data; better understanding of own scores; focused on a long-term relationship
- Cons: could face a conflict of interest; less understanding of the best data source

If unsure of the best data source for your institution, lean toward Independent; but if fairly confident one source is best for you, lean toward Data-Affiliated (assuming you have confidence in their integrity). Confirm the developer can work with any bureau's data.

11 Gathering & Managing the Data: For Now
- Data capture in the past was often incomplete
- Special IT processing is often required to gather data for the model build
- May require research to fully understand the nuances of past coding schemes
- May have to go to the bureaus to get past data that was not retained internally
- Any model developer will tell you: gathering one's own data takes longer than you think

12 Gathering & Managing the Data: For the Future
Model development time is the ideal time to put in place systems to capture data for the next model build. Be careful not to fall into the circular trap of "we didn't get this field in the past, so it's not in our new model, so why capture it if it's not in the model?" or you'll never get it.

13 Crunching the Numbers - Tom Ware, PayNet Analytical Services

14 Comparing Different Scores
When choosing a pooled-data score to use, or when making a preliminary decision as to which score(s) a hybrid score should be developed on, the most fundamental question is: which is better, Score A or Score B? The Lorenz/ROC curve is the ultimate measure.

15 Lorenz Curve Example
Each score is applied to the same portfolio: 100 deals, 10 of them bad, in ten equal buckets from lowest score to highest. The cutoff is set at a 70% approval rate (the lowest-scoring 30% of applications are declined).

Score A
Score   All Deals   Bad Deals   Cum% All   Cum% Bad
600     10          4           10%        40%
610     10          3           20%        70%
620     10          1           30%        80%
630     10          1           40%        90%
640     10          0           50%        90%
650     10          0           60%        90%
660     10          1           70%        100%
670     10          0           80%        100%
680     10          0           90%        100%
690     10          0           100%       100%
Total: 100 deals, 10 bad. Booked: 70 deals, 2 bad. Bad% = 2.9%

Score B
Score   All Deals   Bad Deals   Cum% All   Cum% Bad
200     10          2           10%        20%
210     10          2           20%        40%
220     10          1           30%        50%
230     10          1           40%        60%
240     10          1           50%        70%
250     10          1           60%        80%
260     10          0           70%        80%
270     10          1           80%        90%
280     10          0           90%        90%
290     10          1           100%       100%
Total: 100 deals, 10 bad. Booked: 70 deals, 5 bad. Bad% = 7.1%

Random
Decile  All Deals   Bad Deals   Cum% All   Cum% Bad
1/10th  10          1           10%        10%
(each of the ten deciles is identical: 10 deals, 1 bad)
Total: 100 deals, 10 bad. Booked: 70 deals, 7 bad. Bad% = 10.0%
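To make the arithmetic explicit, here is a minimal Python sketch (a hypothetical helper, not from the deck) that reproduces the booked bad rates above by declining the lowest-scoring buckets and counting the bads that remain:

```python
# Bad deals per score bucket, lowest score first (from the table above);
# every bucket holds 10 deals.
SCORE_A_BADS = [4, 3, 1, 1, 0, 0, 1, 0, 0, 0]
SCORE_B_BADS = [2, 2, 1, 1, 1, 1, 0, 1, 0, 1]
DEALS_PER_BUCKET = 10

def booked_bad_rate(bads_per_bucket, approval_rate):
    """Decline the lowest-scoring buckets and return the bad rate on booked deals."""
    n = len(bads_per_bucket)
    declined = round(n * (1 - approval_rate))        # buckets cut off at the bottom
    booked_bads = sum(bads_per_bucket[declined:])
    booked_deals = DEALS_PER_BUCKET * (n - declined)
    return booked_bads / booked_deals

print(f"{booked_bad_rate(SCORE_A_BADS, 0.70):.1%}")  # 2.9%
print(f"{booked_bad_rate(SCORE_B_BADS, 0.70):.1%}")  # 7.1%
```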

16 Lorenz Curve Example
[Chart: cumulative percentages of bad deals (y) vs. applications from lowest-scoring to highest-scoring (x), for Score A, Score B, and Random, with the 70% approval rate cutoff marked]
At the same 70% approval rate, defaults go from 5 to 2, a 60% reduction; alternatively, holding defaults constant, approvals go from 70 to 87, a 24% increase.

17 Lorenz Curve Example: The Perfect Score
[Chart: the Perfect Score alongside Random, on the same axes]
With the Perfect Score, the bad deals (10% of the population in this example) all have lower scores than the good deals, so a 90% approval rate will approve all the goods and decline all the bads.

18 Comparing Different Scores
Two common metrics are used to summarize these curves: ROC Area and K-S (Kolmogorov-Smirnov). While these are convenient statistics, be aware that the curves themselves are more important:
- Because the stats are just ways to summarize
- And where your institution operates on the curve (high, medium, or low approval rate) is where the lift really matters for your institution

19 Comparing Different Scores
To explain these statistics, we must first look at the slight difference between the Lorenz curve and the ROC curve.

20 Lorenz Curve vs. ROC Curve
[Two charts, both showing cumulative % of bad deals for Score A, Score B, and Random: the Lorenz curve's x-axis is all applications; the ROC curve's x-axis is good deals only]

21 Lorenz Curve vs. ROC Curve: The Perfect Score
[Two charts: the Perfect Score on the Lorenz curve (x-axis: applications) and on the ROC curve (x-axis: good deals only)]

22 The ROC Area
[Chart: Score A's curve of cumulative % of bad deals vs. good deals only; the ROC Area is the area under this curve, with the Random diagonal for reference]

23 The K-S Statistic
[Chart: on the same axes, the K-S statistic is the maximum vertical distance between Score A's curve and the Random diagonal]
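As an illustration (not the presenters' code), the sketch below computes both statistics from raw scores and bad flags, sweeping from the lowest score upward exactly as the charts do: cumulative % of bads on the y-axis against cumulative % of goods on the x-axis.

```python
import numpy as np

def roc_area_and_ks(scores, is_bad):
    """ROC area and K-S statistic; higher scores are assumed to mean lower risk."""
    scores = np.asarray(scores, dtype=float)
    is_bad = np.asarray(is_bad, dtype=bool)

    order = np.argsort(scores)   # lowest-scoring applicants first (ties broken arbitrarily)
    bad = is_bad[order]

    # Cumulative share of bads (y) and goods (x) captured below each score
    cum_bad = np.cumsum(bad) / bad.sum()
    cum_good = np.cumsum(~bad) / (~bad).sum()

    # ROC area: trapezoidal area under the (cum_good, cum_bad) curve
    x = np.concatenate(([0.0], cum_good))
    y = np.concatenate(([0.0], cum_bad))
    roc_area = float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2.0))

    # K-S: maximum vertical gap between the bad and good cumulative curves
    ks = float(np.max(cum_bad - cum_good))
    return roc_area, ks
```

A random score lands near a 0.5 area and a K-S near zero; the perfect score of slide 21 reaches an area of 1.0.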

24 Combining Different Scores
Knowing which single score is best by itself is useful, and sometimes all you need to know, but combining different scores is almost always more powerful. How much more powerful? And which combinations? The more different the data, the more powerful the combination, so adding a consumer score to a commercial score is likely to add more lift than adding a second commercial score.

25 Combining Different Scores
But it still may be worthwhile to use multiple commercial bureaus for larger and/or riskier transactions. And it certainly makes sense to build a model that tries a second commercial bureau when the first bureau is a no-hit.

26 Combining Different Scores
In building any credit score, it is very important that the build population mirror the implementation population. So if the blue score (below) is first-pull, and the red score will only be pulled when blue is a no-hit, then the blue scorecard should be built on populations A and B, but the red should be built just on C.
[Diagram: populations A, B, and C]
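A sketch of that rule, assuming hypothetical pandas columns `blue_hit` and `red_hit` (whether each bureau returned a record): the blue model's build sample is every application it would score in production, while the red model is built only on the blue no-hits it would actually see.

```python
import pandas as pd

# Hypothetical applicant file: one row per application, with hit flags per bureau
apps = pd.DataFrame({
    "app_id":   [1, 2, 3, 4, 5, 6],
    "blue_hit": [True, True, True, False, False, True],  # first-pull score
    "red_hit":  [True, False, True, True, False, True],  # pulled only on blue no-hit
})

# Blue is pulled on everyone, so build it on all blue hits (populations A and B)
blue_build = apps[apps["blue_hit"]]

# In production the red score only ever sees blue no-hits, so build it on
# that population alone (population C)
red_build = apps[~apps["blue_hit"] & apps["red_hit"]]
```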

27 Combining Different Scores
For those institutions that don't want to build a custom or hybrid model, but that do want to intelligently and scientifically combine different existing pooled-data scores, there is a fairly easy way to do it: the Joint Odds Table.

28 Combining Different Scores: Joint Odds Table
Rows: the score you currently use, with its standalone default rate. Columns: the other score to combine with the first (here, the PayNet Rating). Cell values are default rates (probability of default); blank cells are sparse corner combinations.

Your    Standalone |          The Other Score (PayNet Rating)
Score   Default %  | 100   90    80    70    60    50    40    30    20    10
100     1%         | 1%    2%    3%    5%    9%
90      2%         | 1%    2%    3%    4%    5%    10%
80      3%         | 1%    2%    3%    4%    5%    6%    11%
70      4%         | 1%    2%    3%    4%    5%    6%    7%    11%
60      5%         | 1%    2%    3%    4%    5%    6%    7%    8%    11%
50      6%         | 1%    2%    3%    4%    5%    6%    7%    8%    10%   12%
40      7%         |       2%    3%    4%    5%    6%    7%    8%    10%   12%
30      8%         |             3%    4%    5%    6%    7%    8%    10%   12%
20      10%        |                   5%    5%    6%    7%    8%    10%   12%
10      12%        |                         9%    10%   11%   11%   11%   12%
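Reading the table off in code is straightforward. The sketch below is an illustrative lookup, with only a few rows transcribed (the rest follow the same pattern); it returns the joint default rate for a pair of score bands, or None for the blank corner cells.

```python
# {your_score_band: {other_score_band: joint default rate}}; blank cells omitted
JOINT_ODDS = {
    100: {100: 0.01, 90: 0.02, 80: 0.03, 70: 0.05, 60: 0.09},
    90:  {100: 0.01, 90: 0.02, 80: 0.03, 70: 0.04, 60: 0.05, 50: 0.10},
    # ... rows 80 through 20 transcribed the same way ...
    10:  {60: 0.09, 50: 0.10, 40: 0.11, 30: 0.11, 20: 0.11, 10: 0.12},
}

def joint_default_rate(your_band, other_band):
    """Look up the combined probability of default; None where the table is blank."""
    return JOINT_ODDS.get(your_band, {}).get(other_band)

print(joint_default_rate(100, 60))  # 0.09: strong on your score, weak on the other
```

The value of the combination shows up immediately: an applicant in your top band carries a 1% standalone default rate, but 9% if the other score puts them in its 60 band.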

29 Making It a Reality - Bill Gillin, GE Capital Solutions

30 Making It a Reality (Implementation 101)
- When does implementation begin?
- What does implementation entail?
- What are some common pitfalls to avoid?

31 When does implementation begin?
- Partnering with the modelers
- Dual path with concurrent timelines
- Accelerating user buy-in early on

32 What does implementation entail?
- Managing through IT constraints
- The human factor: getting buy-in
- Roll-out & phasing in
- Ongoing score management issues

33 Managing Through IT Constraints
- Ensuring data accuracy and data derivation
- Understanding nuances between modeling and transactional data
- Right-sizing the testing effort

34 The Human Factor: Getting Buy-in
- De-mystify the modeling approach
- Don't underestimate the value of analyst acceptance
- Target buy-in from all levels of the organization
- Be open to trade-offs between statistical accuracy and user acceptance

35 Roll-Out & Phasing In
- Avoid a "big bang" approach
- Start from the ends and work towards the middle
- Establish milestones and benchmarks
- Establish clear channels for analyst feedback/questions

36 Score Management
- Near term: monitoring through-the-door activity
- Longer term: monitoring performance
- Establish feedback loops for constant improvement
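For the near-term, through-the-door piece, a common industry tool (not named in the deck, but a standard choice) is a population stability index comparing the score mix of incoming applications against the development sample; the sketch below assumes applications are already binned into score bands.

```python
import numpy as np

def psi(dev_counts, recent_counts):
    """Population Stability Index between development and recent score-band mixes.

    Common rule of thumb: < 0.10 stable, 0.10-0.25 some shift, > 0.25 investigate.
    """
    dev = np.asarray(dev_counts, dtype=float)
    rec = np.asarray(recent_counts, dtype=float)
    dev = np.clip(dev / dev.sum(), 1e-6, None)  # guard against empty bins
    rec = np.clip(rec / rec.sum(), 1e-6, None)
    return float(np.sum((rec - dev) * np.log(rec / dev)))

# Applications per score band: development sample vs. last month's intake
print(psi([100] * 10, [80, 90, 95, 100, 105, 110, 115, 100, 105, 100]))
```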

37 Avoiding Pitfalls
- Data availability
- Understanding the targeted population and segmentation assumptions
- Painting yourself into a corner
- Underestimating the scope of testing
- Not a one-time effort

38 Questions?