Session 35 PD, Predictive Modeling for Actuaries: Integrating Predictive Analytics in Assumption Setting
Moderator: David Wang, FSA, FIA, MAAA
Presenters: Guillaume Briere-Giroux, FSA, MAAA; Eileen Sheila Burns, FSA, MAAA; Elizabeth L. Olson, FSA, MAAA
Integrating Predictive Modeling in Assumption Setting Presented by Eileen S. Burns, FSA, MAAA Milliman October 27, 2014
What makes predictive modeling different?
Results: more granularity; quantifiable goodness-of-fit
Foundations: different tools; more data prep
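To make "quantifiable goodness-of-fit" concrete, here is a minimal sketch (not from the presentation) comparing two candidate lapse models by AIC and holdout log-loss. The file and column names (lapse_experience.csv, lapsed, duration_band, attained_age, account_value, year) are hypothetical.

```python
# Hypothetical sketch: quantify fit of two candidate lapse models (AIC + holdout log-loss).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("lapse_experience.csv")            # one row per policy-year of exposure
train, test = df[df.year < 2013], df[df.year >= 2013]

base = smf.glm("lapsed ~ C(duration_band)", data=train,
               family=sm.families.Binomial()).fit()
full = smf.glm("lapsed ~ C(duration_band) + attained_age + np.log(account_value)",
               data=train, family=sm.families.Binomial()).fit()

print("AIC   base:", round(base.aic, 1), " full:", round(full.aic, 1))   # lower is better

def log_loss(model, data):
    """Average negative log-likelihood of observed lapses under the model's probabilities."""
    p = model.predict(data)
    y = data["lapsed"]
    return float(-np.mean(y * np.log(p) + (1 - y) * np.log(1 - p)))

print("Holdout log-loss   base:", log_loss(base, test), " full:", log_loss(full, test))
```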
BUILDING A SOLID FOUNDATION
Human and physical capital
Decision #1: Project staff (actuary, statistician, supporting staff; q_x or x²?)
Decision #2: Program (comfort, flexibility, processing efficiency, cost, computing resources; SAS or R?)
Exploratory data analysis
Programmer, statistician, and actuary
Data visualization tools (SAS, R, Tableau)
Data visualization on raw and prepped data
How is each field distributed? Is it consistent over time?
Which variables are related to the target variable, and how?
[Charts: policy value by category and over time]
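As an illustration of these checks (not the presenters' actual code), a minimal pandas sketch that tabulates a field's distribution and its stability across quarters; the file and column names (inforce_snapshots.csv, quarter_end, policy_value, category) are assumed.

```python
# Hypothetical EDA sketch: distribution of each field, and its stability across quarters.
import pandas as pd

df = pd.read_csv("inforce_snapshots.csv", parse_dates=["quarter_end"])

# How is each field distributed?
print(df["policy_value"].describe())
print(df["category"].value_counts(dropna=False))

# Is it consistent over time? Share of each category (including NA) by quarter.
mix = (df.groupby(df["quarter_end"].dt.to_period("Q"))["category"]
         .value_counts(normalize=True, dropna=False)
         .unstack(fill_value=0))
print(mix.round(3))
```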
Example 1: Gender
[Chart: policy counts by gender (MALE, FEMALE, NA)]
Problem: An extreme number of records were missing gender, and the male and female counts were unexpected.
Solution: A code problem: the extract referenced the secondary insured by mistake (gender2 vs. gender field).
Example 2: Distribution channel
[Chart: policies by channel (A, B, C, NA) by quarter, 2008 Q2 through 2010 Q2]
Problem: A large number of policies had the distribution channel missing.
Solution: No change. The issue was discovered by looking across time; a fix is only available for policies that don't lapse, so applying it would distort experience.
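A minimal sketch of the cross-time check that surfaced this issue, under hypothetical column names (dist_channel, quarter_end): the share of policies with a missing channel, by quarter.

```python
# Hypothetical sketch: fraction of policies with a missing distribution channel, by quarter.
import pandas as pd

df = pd.read_csv("inforce_snapshots.csv", parse_dates=["quarter_end"])
na_share = (df["dist_channel"].isna()
              .groupby(df["quarter_end"].dt.to_period("Q"))
              .mean())
print(na_share)   # a sudden jump in one quarter suggests a feed change, not a real mix shift
```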
Example 3: Policy status records
[Chart: policy status counts (Active, Surrender, Death) by quarter, 2008 Q2 through 2010 Q2]
Problem: No deaths in any Q4.
Solution: No change. A data issue with no feasible solution; it only affects scaling, and only for a small percentage of policies.
Example 4: Rider prevalence
[Chart: policies with and without a rider by quarter, 2008 Q2 through 2010 Q2]
Problem: No rider features were recorded in one quarter.
Solution: Analyzed prior and subsequent quarters to identify policies with riders. Necessary to avoid modeling policies with riders as no-rider policies.
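A minimal sketch of the fix described above, under assumed column names (policy_id, quarter_end, has_rider): recover the rider indicator for the bad quarter from each policy's surrounding snapshots.

```python
# Hypothetical sketch: fill a single blank quarter's rider flag from adjacent snapshots.
import pandas as pd

df = pd.read_csv("inforce_snapshots.csv", parse_dates=["quarter_end"])
df = df.sort_values(["policy_id", "quarter_end"])

# Within each policy, carry the rider flag forward then backward across quarters,
# so the quarter with missing rider fields inherits the value seen on either side.
df["has_rider"] = (df.groupby("policy_id")["has_rider"]
                     .transform(lambda s: s.ffill().bfill()))
```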
Example 5: Assessing policy values
[Charts: log death benefit distribution; death benefit distribution relative to account value]
Relative distribution of DB/AV:
DB/AV       | % of dist
<= 1.00     | 25%
1.01-1.25   | 50%
1.26-2.00   | 10%
2.01-5.00   | 5%
5.01-9.00   | 5%
9.01+       | 5%
Problem: What is a reasonable distribution?
Solution: Check the log distribution, and check relative to a similarly-scaled field. This led to recognizing some $4 benefits, and some DB coded as the increase over AV.
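A minimal sketch of the two checks, with hypothetical column names (death_benefit, account_value): the log distribution of the death benefit and the DB/AV ratio bucketed as in the table above.

```python
# Hypothetical sketch: log distribution of DB, and DB/AV bucketed as in the table above.
import numpy as np
import pandas as pd

df = pd.read_csv("inforce_snapshots.csv")

# Log distribution: absurdly small death benefits (e.g., $4) stand out immediately.
print(np.log10(df["death_benefit"].clip(lower=1)).describe())

# Relative distribution: DB as a multiple of account value.
ratio = df["death_benefit"] / df["account_value"]
buckets = pd.cut(ratio,
                 bins=[0, 1.00, 1.25, 2.00, 5.00, 9.00, np.inf],
                 labels=["<=1.00", "1.01-1.25", "1.26-2.00", "2.01-5.00", "5.01-9.00", "9.01+"])
print(buckets.value_counts(normalize=True).sort_index())
```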
Example recap
Response variable: policy status
Factor variables: rider prevalence, gender, distribution channel
Numeric variables: policy value distributions, correlations
Other issues: missing entire files; files with misleading names (e.g. wrong valuation date); inconsistent feature descriptions (e.g. reset/ratchet); unreliable fields
Takeaway: every variable that is part of a typical experience analysis has required specific attention to ensure consistency and model validity, as have each of the additional 50+ fields.
Thank you! Eileen S. Burns, FSA, MAAA, Milliman, eileen.burns at milliman.com, 206-504-5955
Predictive Modeling Case Studies Liz Olson, FSA, MAAA October 27, 2014
Background
I am not: a statistician; a model builder; responsible for experience studies
I am: an actuary; responsible for assumption governance; someone who knew there was power in statistics, but spent years before finding the right statistician to pull it off
Experience Studies: Where We've Used Predictive Modeling
Variable Deferred Annuity Lapses
Deferred Annuity Mortality
Fixed Deferred Annuity Lapses
VUL Life Insurance Mortality
Life Insurance Renewal Premiums
Fixed Deferred Annuity Lapses
Examined fixed deferred annuity lapses using predictive modeling for the first time in 2013.
Some factors that lapse behavior varied by were: attained age, policy size, effective surrender charge, difference between credited rate and market rate, qualified vs. non-qualified, and guaranteed minimum floor rate.
These drivers also had different sensitivities depending on whether the policy was in the surrender charge period, in the shock year, or in the post-shock years.
Fixed Deferred Annuity Lapses: Typical graphs of dynamic parameters
[Charts: dynamic factor by policy size (smaller to larger) and by attained age (younger to older), each shown for the in-CDSC, shock, and post-shock periods]
Predictive modeling allows for better delineation between base and dynamic lapses.
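As an illustration only (not the presenters' actual model), a sketch of how base and dynamic lapse drivers can be separated in one logistic GLM: base terms for the surrender charge phase and policy characteristics, plus a credited-vs-market rate differential interacted with the phase. The file and column names are hypothetical.

```python
# Hypothetical sketch: one GLM with base lapse terms plus phase-specific dynamic terms.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("fda_lapse_experience.csv")    # assumed: one row per policy-year of exposure

model = smf.glm(
    "lapsed ~ C(sc_phase) + attained_age + C(policy_size_band)"
    " + rate_differential:C(sc_phase)",          # dynamic sensitivity differs by phase
    data=df,
    family=sm.families.Binomial(),
).fit()

print(model.summary())
# The C(sc_phase) and policy-characteristic terms play the role of base lapses; the
# rate_differential interactions are the dynamic adjustment, estimated separately
# for the in-CDSC, shock, and post-shock phases.
```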
Fixed Deferred Annuity Lapses: Progression of Models, Base Lapses Only
[Chart: actual lapse rates vs. model over time; model includes base lapses only]
Fixed Deferred Annuity Lapses: Progression of Models, Base and Some Dynamic Parameters
[Chart: actual lapse rates vs. model over time; model includes base lapses plus some dynamic parameters]
Fixed Deferred Annuity Lapses: Progression of Models, Final Model
[Chart: actual lapse rates vs. model over time; final model]
Fixed Deferred Annuity Lapses: Lessons Learned
Just because the model fits the past doesn't mean it's good for predicting the future. The model told us that, for one product, the difference between our credited rate and the market rate didn't matter for policy durations post shock; the study had a few years of experience, but all in a low-rate environment.
Don't rely on the model; rather, use it to inform decision making.
Involve actuarial modelers early on to ensure the dynamic parameters are implementable.
There is a language barrier between statisticians and actuaries.
VUL Life Insurance Mortality
We worked with a consulting firm to analyze our VUL mortality experience using predictive modeling.
In addition to analyzing mortality across the obvious dimensions (age, gender, underwriting category, etc.), we were also able to gather data from our P&C business or from other sources: home size, number of people in the household, automobiles, and interests of individuals in the household (e.g. boating, skiing, scuba, golf, organic foods, interior design, video games).
The consultant was able to tease out two profiles with very different mortality experience:
Profile A: more attributes lead to lower mortality
Profile B: more attributes lead to higher mortality
VUL Life Insurance Mortality: Customer Profile A vs. Customer Profile B
[Charts: relative mortality by number of Profile A attributes (0, 1, 2, 3, 4+), declining as attributes increase; relative mortality by number of Profile B attributes (0, 1, 2, 3, 4+), rising as attributes increase]
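As a hypothetical illustration of how such relativities might be estimated (not the consultant's method), a Poisson GLM on death counts with an offset for expected deaths, so the exponentiated coefficients read as mortality relativities by attribute count; all file and column names are assumed.

```python
# Hypothetical sketch: mortality relativities by number of profile attributes.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("vul_mortality_experience.csv")    # assumed: one row per policy-year
df["attr_band"] = pd.cut(df["n_profile_a_attrs"],
                         bins=[-1, 0, 1, 2, 3, np.inf],
                         labels=["0", "1", "2", "3", "4+"])

# Poisson GLM on death counts with an offset for expected deaths, so that
# exp(coefficient) is the mortality relativity versus the 0-attribute baseline.
fit = smf.glm("deaths ~ C(attr_band)", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["expected_deaths"])).fit()

print(np.exp(fit.params))
```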
Life Insurance Renewal Premiums
New project to use predictive modeling to establish renewal premium assumptions for ULSG and VUL products.
Could ultimately be used as a retention strategy to proactively notify customers who the model suggests may have a higher likelihood of stopping or reducing their premiums.
Life Insurance Renewal Premiums
Some companies may be setting this assumption in a very simplistic manner (e.g. all policies make renewal premiums such that the aggregate matches expectations). This can lead to a sizeable error on many policies.
[Chart: histogram of actual vs. predicted renewal premium error (policy count, from large error through no error to large error) under the "all policyholders pay the average" assumption]
Life Insurance Renewal Premiums
Our new approach using predictive modeling has further improved our renewal premium assumption.
[Chart: histogram of actual vs. predicted renewal premium error, comparing the predictive model, a bucketing algorithm, and the "all policyholders pay the average" assumption]
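A minimal sketch of the comparison behind these histograms, under assumed column names (actual_premium, pred_model): per-policy error when everyone is assumed to pay the average versus error against a model-predicted premium.

```python
# Hypothetical sketch: per-policy renewal premium error, simple assumption vs. model.
import pandas as pd

df = pd.read_csv("renewal_premium_experience.csv")    # assumed: one row per policy

df["pred_simple"] = df["actual_premium"].mean()        # "all policyholders pay the average"
df["err_simple"] = df["pred_simple"] - df["actual_premium"]
df["err_model"] = df["pred_model"] - df["actual_premium"]   # pred_model scored by the new model

# The aggregate is matched either way; it is the per-policy spread of error that shrinks.
print(df[["err_simple", "err_model"]].abs().describe())
```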
Other Possible Areas for Predictive Model Use
Living benefit utilization
Transfers between fixed and variable investments
Life insurance surrenders
Life underwriting decisions
401k, 457, 403b case lapses and participant deferrals
Sales distribution tendencies (could possibly identify producer abuse such as STOLI or STOVA)
Data Quality
ASOP 23 provides insights into what an actuary should do when using data: review the data for reasonableness and consistency; the actuary is not required to audit the data.
Some steps that we've taken:
Compare to ledger sources (policy count, deaths, lapses, FY premium, renewal premium, claims)
Cursory review of distributions (issue age, account value, face amount, gender, underwriting category)
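A minimal sketch of the first reconciliation step, with hypothetical file and column names: compare study-data policy counts to ledger control totals by quarter and flag material gaps.

```python
# Hypothetical sketch: reconcile study extract policy counts against ledger control totals.
import pandas as pd

study = pd.read_csv("experience_study_extract.csv")     # assumed policy-level study data
ledger = pd.read_csv("ledger_control_totals.csv")       # assumed quarterly control totals

check = pd.DataFrame({
    "study_policy_count": study.groupby("quarter")["policy_id"].nunique(),
    "ledger_policy_count": ledger.set_index("quarter")["policy_count"],
})
check["diff_pct"] = check["study_policy_count"] / check["ledger_policy_count"] - 1

# Flag quarters where the study extract is off from the ledger by more than 1%.
print(check[check["diff_pct"].abs() > 0.01])
```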
Staffing Model
Pure statisticians often don't have the product background or business intuition needed. Actuaries often don't have the statistical horsepower needed. Finding someone with both is rare.
Leaders could staff by:
Using pure statisticians to build the predictive models and actuaries to manage the projects and add insights based on their intuition and experience, or
Growing the statistical talent by sending actuaries through extensive statistical training so they can run the predictive modeling project from start to finish.
Summary
Seemingly endless possible uses of predictive modeling in experience studies
Think beyond just using traditional internal data
Garbage in = garbage out
Build the right team
Integrating Predictive Analytics in Assumption Setting Implementation and Integration in Financial Models 2014 SOA Annual Meeting & Exhibit Orlando October 27, 2014 Guillaume Briere-Giroux, FSA, MAAA, CFA 2014 Oliver Wyman
Agenda
I. How and where do predictive analytics impact assumption setting?
II. Implications for the assumption setting process
III. Challenges and solutions for financial modeling integration
IV. Key takeaways
How do predictive analytics impact assumption setting?
[Diagram: business value rises with data analytics literacy across the analytics stages; a bracket marks the scope of predictive modeling techniques]
Descriptive analytics (what happened and why?): describe / monitor, analyze / understand; enhanced experience studies
Predictive analytics (what will happen?): score / predict; enhanced assumption setting
Prescriptive analytics (what should we do?): decide / optimize / manage; enhanced model-based decisions
Where do predictive analytics impact assumption setting?
Use of predictive modeling is increasingly widespread for experience studies.
[Table: use of predictive modeling by product (VA living benefits, FIA living benefits, fixed annuities, universal life, term, long term care) and assumption type (surrenders / lapses, utilization / funding pattern, mortality, morbidity). Source: Oliver Wyman research]
We are also seeing greater use of predictive analytics in M&A.
Implications for the assumption setting process
1. More attention paid to secondary internal variables
2. Increased opportunities to test external variables
3. Additional relationships to study and understand
4. More comprehensive data-driven discussions
5. Better ability to put experience in context
In summary, using predictive analytics requires more resources dedicated to assumption setting but enables richer thinking around key assumptions.
Example: Integrated GLWB policyholder behavior cohorts
Efficient users: utilize 100% of GLWB maximum income; strong utilization feature skew; low lapse rate; more efficient dynamic lapses
Partial users: utilize less than 100% of GLWB maximum income; weaker utilization skew; higher lapse rate than efficient users; less efficient dynamic lapses
Excess users: utilize more than 100% of GLWB maximum income; very high lapse rates; least efficient dynamic lapses
Waiting users: have not yet utilized; low lapse rates; efficient dynamic lapses; waiting for rollup?
This creates four cohorts to model, i.e., a policyholder behavior scenarios dimension.
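A minimal sketch, under assumed column names and an assumed 5% tolerance band, of assigning the four cohorts from an observed utilization ratio (withdrawals taken relative to the GLWB maximum income).

```python
# Hypothetical sketch: assign GLWB behavior cohorts from observed withdrawal utilization.
import pandas as pd

def glwb_cohort(withdrawn, max_income, tol=0.05):
    """Classify a policy-year into the four behavior cohorts described above (tol is assumed)."""
    if withdrawn == 0:
        return "waiting"
    ratio = withdrawn / max_income
    if ratio > 1 + tol:
        return "excess"
    if ratio >= 1 - tol:
        return "efficient"
    return "partial"

df = pd.read_csv("glwb_experience.csv")   # assumed: withdrawals and max_glwb_income columns
df["cohort"] = [glwb_cohort(w, m) for w, m in zip(df["withdrawals"], df["max_glwb_income"])]
print(df["cohort"].value_counts())
```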
Computational implications for modeling
Run time optimization becomes a three-dimensional problem.
Accuracy functions can be specified to determine the optimal accuracy for a given run time and to generate an efficient frontier.
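As a simplified illustration of the efficient-frontier idea (only the run time vs. accuracy trade-off, with made-up data points), the sketch below keeps the model configurations that are not dominated by a cheaper, more accurate alternative.

```python
# Hypothetical sketch: Pareto frontier of (run time, accuracy) over candidate model configurations.
def efficient_frontier(points):
    """points: list of (run_time, accuracy); returns the non-dominated subset."""
    frontier = []
    best_accuracy = float("-inf")
    for run_time, accuracy in sorted(points):        # cheapest configurations first
        if accuracy > best_accuracy:                  # strictly better than anything cheaper
            frontier.append((run_time, accuracy))
            best_accuracy = accuracy
    return frontier

candidates = [(2.0, 0.90), (3.5, 0.93), (5.0, 0.92), (8.0, 0.97), (12.0, 0.975)]
print(efficient_frontier(candidates))   # (5.0, 0.92) is dominated and dropped
```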
Other considerations for modeling: how granular should the model become?
1. Materiality and certainty of the dynamic
2. Materiality of the business
3. Model purpose
4. Degree of buy-in
5. Ability to implement and validate
Model implementation approach
There is a compromise between transparency, flexibility, controls, and system performance.
[Table: parameterized formula vs. factor tables, rated against the desirable properties of transparency, flexibility of model form (high for the parameterized formula, low for factor tables), flexibility of adjustments, ease of control, auditability, and computational performance]
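A hypothetical sketch contrasting the two implementation styles for the same dynamic lapse multiplier; the functional form, buckets, and factor values are illustrative only, not a recommended assumption.

```python
# Hypothetical sketch: the same dynamic lapse multiplier as a formula and as a factor table.

# Parameterized formula: compact and transparent about the driver, harder to tweak locally.
def dynamic_factor_formula(rate_differential, slope=3.0, floor=0.5, cap=2.5):
    return min(cap, max(floor, 1.0 + slope * rate_differential))

# Factor table: easy to adjust cell by cell and to audit, but the model form is fixed.
FACTOR_TABLE = {          # rate differential bucket -> multiplier (illustrative values)
    "<0%":  0.8,
    "0-1%": 1.0,
    "1-2%": 1.4,
    ">2%":  2.0,
}

def dynamic_factor_table(rate_differential):
    if rate_differential < 0.00:
        return FACTOR_TABLE["<0%"]
    if rate_differential < 0.01:
        return FACTOR_TABLE["0-1%"]
    if rate_differential < 0.02:
        return FACTOR_TABLE["1-2%"]
    return FACTOR_TABLE[">2%"]

print(dynamic_factor_formula(0.015), dynamic_factor_table(0.015))
```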
Best practices for model implementation
Parallel testing: new assumptions are more complex to code; do single-cell testing with a replicator (e.g., Excel); an Excel replicator can also be used for extreme value testing / sensitivity testing
Internal data: data definitions between the experience study and the financial models must be consistent
External variables: the modeler must understand the sensitivity of projected behavior to external variables (how reliable are my scenarios?)
Sensitivity testing: understand the potential impact from stress testing the assumption parameters
Documentation: document the rationale for key modeling decisions and any limitations or simplifications
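A minimal sketch of the single-cell parallel test, with the replicator written in Python rather than Excel and all values as placeholders: recompute one cell's dynamic lapse rate independently and compare it to the production model's output within a tolerance.

```python
# Hypothetical sketch: single-cell parallel test against an independent replicator.
def replicator_lapse(base_rate, rate_differential, slope=3.0, floor=0.5, cap=2.5):
    """Independent re-implementation of the documented assumption for one cell (illustrative form)."""
    return base_rate * min(cap, max(floor, 1.0 + slope * rate_differential))

production_value = 0.0627     # placeholder, e.g. pulled from the financial model's cell-level output
replicated_value = replicator_lapse(base_rate=0.06, rate_differential=0.015)

tolerance = 1e-4
assert abs(production_value - replicated_value) < tolerance, \
    f"Parallel test failed: model {production_value} vs replicator {replicated_value}"
```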
Key takeaways
1. Think about the business and the environments
2. Think about the models and their end goal
3. Prioritize and make incremental improvements