
Session 39: Reviewing PBR Results
Moderator: Hye-Jin Nicole Kim, FSA, MAAA
Presenters: Hye-Jin Nicole Kim, FSA, MAAA; Sam M. Steinmann, ASA, MAAA, CERA; Rostislav Kongoun Zilber, FSA, MAAA
Nicole Kim, FSA, MAAA
August 29, 2016
AGENDA
— Background of PBR
— Audit risks
— Governance
What is the PBR calculation?
Three components to evaluate:
— Net premium reserve (NPR)
— Deterministic reserve (DR)
— Stochastic reserve (SR)
The booked reserve will be the maximum of these components (sketched in code below).
Exclusion tests:
— DR – UL without SG products may be exempt from the DR calculation if future net premiums are less than future gross premiums
— SR – products may be exempt from the SR calculation if there is little interest or equity risk
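As a rough illustration of how the components combine, here is a minimal Python sketch; the `pbr_reserve` helper and its inputs are invented for the example, and the exclusion-test results are taken as given rather than derived.

```python
from typing import Optional

def pbr_reserve(npr: float,
                dr: Optional[float],
                sr: Optional[float],
                passes_det_exclusion: bool,
                passes_stoch_exclusion: bool) -> float:
    """Booked reserve: the largest of NPR, DR, and SR, skipping excluded components."""
    components = [npr]
    if not passes_det_exclusion and dr is not None:
        components.append(dr)
    if not passes_stoch_exclusion and sr is not None:
        components.append(sr)
    return max(components)

# Example: the group passes the stochastic exclusion test, so only NPR and DR compete.
print(pbr_reserve(npr=250.0, dr=300.0, sr=None,
                  passes_det_exclusion=False, passes_stoch_exclusion=True))  # 300.0
```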
NPR, deterministic, and stochastic reserve
Net premium reserve – seriatim
— Mortality
— Interest
— Lapse
— Cash value floor
Deterministic reserve – can be grouped
— PV (benefits, expenses, and related amounts), minus
— PV (gross premiums and related amounts)
Stochastic reserve – can be grouped
— Project cash flows using stochastically generated scenarios and calculate GPVAD for each scenario
— Calculate CTE 70
— Determine any additional amount needed to capture any material risk not reflected in the cash flow model
— Reserve = CTE 70 + additional amount (sketched in code below)
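A minimal sketch of that stochastic step, assuming the scenario GPVADs have already been produced by the cash flow model; the scenario values, the CTE helper, and the additional amount below are illustrative assumptions, not output from an actual valuation.

```python
import numpy as np

def cte(values: np.ndarray, level: float = 0.70) -> float:
    """Conditional tail expectation: average of the worst (1 - level) share of outcomes."""
    ordered = np.sort(values)                  # ascending, so the largest GPVADs sit at the end
    n_tail = max(1, int(round(len(ordered) * (1.0 - level))))
    return float(ordered[-n_tail:].mean())     # mean of the worst 30% of scenarios for CTE 70

# Illustrative only: 1,000 made-up scenario GPVADs and an assumed additional amount
# for material risks not reflected in the cash flow model.
rng = np.random.default_rng(0)
gpvad = rng.normal(100_000, 40_000, size=1_000)
additional_amount = 5_000.0
stochastic_reserve = cte(gpvad, 0.70) + additional_amount
print(round(stochastic_reserve, 0))
```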
Differences from current stat
Current statutory reporting is highly prescriptive. PBR emphasizes process over prescription.
— Identifying risks
— Producing economic assumptions
— Setting assumptions
— Determining margins for assumptions
— Modeling and measuring risks
— Sensitivity testing
— Documentation
Model components
Three components to any model: inputs, model calculations, and outputs.
Inputs
— Assumptions/margins
— Policyholder data
— Product features/guarantees
— Economic scenarios
— Model point grouping
— Hedging and reinsurance
Model calculations
— NPR, DR, and SR
— Exclusion tests
— Valuation software
— Manual calculations outside the valuation model
— Reserve determination – NPR vs. DR vs. SR
— Implementation of hedging and reinsurance
— Generation of economic scenarios
Outputs
— Outputs to analyses
— Manual adjustments
— Upload to general ledger
Model risks – overall
Risks
— Not in compliance with PBR
— Inaccuracies or errors
— Lack of completeness
— Subjectivity (stochastic assumptions/methodology)
— Improperly or insufficiently documented
— Lack of internal change controls
— Available resources are not appropriately skilled or
knowledgeable
Audit risks: Inputs
Assumptions
— Incorrect coding of assumptions
— Coded assumptions are inconsistent with recent or anticipated experience
— Assumptions not in compliance with PBR; not adequately justified or documented
— Scenarios not in compliance with VM-20 recommendations; not adequately justified
— Poor choice of economic scenarios
Product coding
— Product features and guarantees not properly coded in the system
— Product parameters do not properly reflect the provisions of the underlying benefit forms
— Improper conclusions drawn from analyses due to poorly defined product coding
— Improper coding or mapping of funds in the system
Hedging
— Hedging not implemented properly
— Documentation does not meet the guideline requirements
Data
— Policyholder data is inaccurate, incomplete, or improperly coded
Audit risks: Model calculations
Home-grown model
— Lack of documentation (e.g., audit and key person risks)
— Lack of internal controls or change controls
Whether home-grown or vendor-supplied
— Calculation engine programming errors
– Formula errors
– Mapping errors (e.g., pointing to incorrect tables)
— Routines in the calculation engine use logic that deviates from what is intended or documented
— Selected economic scenarios fail to appropriately
capture tail risk or otherwise do not comply with PBR
— Improper reflection of hedging or reinsurance
— Significant non-modeled business is not handled by the calculation engine and is inadequately handled manually
External vendor model
— Can’t handle complex or unique product features
— “Black box” logic (e.g., audit and transparency of
understanding)
— Output reports are not complete, summarized
incorrectly, or otherwise misleading
— Lack of controls over manual processes
— Undiscovered errors
Audit risks: Outputs
Outputs
— Risk that calculation engine output reports are not
complete or are summarized incorrectly
— Adjustments for non-modeled business are not
appropriately reflected
— Manual errors when reformatting outputs for reporting or
analysis purposes
— Incorrect values uploaded to general ledger
Internal controls
Controls relative to the model risk inherent in the PBR process:
— Controls over choice of key non-economic assumptions, such as policyholder behavior and mortality assumptions
— Controls over choice of key economic assumptions
— Controls over model implementation and change control procedures
— Controls over end-to-end process
Sufficiency of controls and related documentation relative to:
— Current SOX requirements (to the extent applicable for statutory)
— NAIC Model Audit Regulation (MAR)
Governance for PBR
A principles-based methodology requires a robust governance framework.
— Flexibility in assumptions, methods, and models requires a higher standard of governance
— Companies are obligated to assure they are governing the process by which reserves are determined
— Regulators must obtain assurance that results are appropriate and consistent with the legal requirements
The SVL primarily points to the VM for governance, but does specifically:
— Require that PBR assumptions, methods, and models be consistent with (but not necessarily identical to) those used in other company risk assessment processes
— Require that the company annually provide to the commissioner and the board a certification of the effectiveness of internal controls with respect to the PBR valuations
VM-G provides guidance regarding responsibilities for company boards, management, and qualified actuaries under PBR.
Governance – Roles and responsibilities
Board
— Primary role is oversight
— Reviews summary results and other information on the PBR process
— Determines if additional steps are needed to rely on the PBR process of the company
— The oversight function should be commensurate with the materiality of principle-based reserves in relation to overall risks of the company
Senior management
— Provide information to the board
— Review PBR results
— Adopt internal controls over PBR valuations
— Ensure resources are adequate and competent
— Ensure PBR processes operate as intended
Qualified actuary
— Oversees the calculation of principle-based reserves
— Reviews and approves the assumptions, methods, models, and internal standards
— Provides a summary report to management and the board
— The Appointed Actuary provides an annual report on the adequacy of all reserves
— Responsible for working with auditors (internal and external) and regulators
Governance considerations
Topic
Typical findings
People
— Key model stakeholders don’t understand or agree to the risk assessment process
— Company culture does not actively support robust model governance
— Roles and responsibilities not clearly defined
Documentation
— Model documentation is not detailed or robust enough to be appropriately understood and
implemented
— Documentation does not adequately cover model inventory, validation, model issues, and
resolutions
Change management
— Effective model change procedures not in place
— Policies for acceptable practices for model development, implementation, and use not in
place
— Controls do not exist to identify changes to product features
Assumptions
— Approved assumptions are not utilized consistently across model groups (input/output)
— Accuracy of data used in experience studies is not adequately controlled
— Lack of robust documentation for actuarial and economic assumptions and scenarios
Governance considerations (continued)
Topic
Typical findings
Data
— Data used throughout the process is not certified as accurate, complete, and appropriate
by model or data owners
— Data integrity is not appropriately managed within the model processes
— Data quality is not defined and measured on a consistent basis
Internal models
— Lack of control over spreadsheet models
— Key person risk with relation to spreadsheet models
Vendor models
— Vendor models are not effectively tested for appropriateness
— Lack of understanding of functionality of vendor models (“black box” phenomenon)
Inventory
— The model inventory does not contain all relevant information to adequately manage and
prioritize new risks
— The model inventory is incomplete or inaccurate and does not include all models that are
implemented, under development, or recently retired
Governance considerations (continued)
Topic
Typical findings
Reporting
— Model validation governance processes do not include appropriate and adequate requirements for internal and external reporting, specifications for validators, and guidance to analyze the impact of findings
— Lack of formalized reporting that indicates compliance with governance requirements
Validation
— Model testing does not include a component of rigor and robustness based upon risk
— Lack of support from senior management for model validation
— Lack of a formalized policy for prioritization, scope, and frequency of validation activities
Governance and controls – Scorecard
December 2012 SOA research study on actuarial modeling controls. Rate each leading practice from 1 to 5.

Governance standards
1. We have established a formal, documented governance policy for actuarial modeling processes that is actively and periodically reviewed.
2. We regularly review models and the modeling process against the governance policy for compliance purposes.
3. We have developed a corporate culture which values and aligns with the governance policy.

The modeling process
4. We have consolidated models to a single platform or a single modeling system where feasible. Where this is not feasible, we have implemented additional controls to ensure model integrity across all modeling platforms.
5. We have established a model steward with clearly defined responsibilities for ensuring adherence to the model governance standards.

System access and change control
6. We have implemented a formal change management process for governing model code changes and model updates.
7. We periodically have internal model releases to ensure consistency of the model of record across the organization.
Governance and controls – Scorecard (continued)
December 2012 SOA research study on actuarial modeling controls. Rate each leading practice from 1 to 5.

Model assumption mgmt
8. We have automated the input of assumptions into the models.
9. We have implemented a formal sign-off process for the setting of model assumptions.
10. We analyze and document the impact of each significant assumption change as part of the formal assumption approval process.

Model input mgmt
11. We obtain model input data feeds automatically from a centralized data warehouse.
12. We have automated and standardized a set of test analytics performed to test validity of model input.

Model output mgmt
13. We have automated and standardized model output used for reporting and analysis, to ensure a consistent presentation of information.
14. We store model output in a data warehouse which can be queried to allow for additional analysis and evaluation of model results to prevent loss of data.
The information contained herein is of a general nature and is not intended to address the circumstances of any particular
individual or entity. Although we endeavor to provide accurate and timely information, there can be no guarantee that such
information is accurate as of the date it is received or that it will continue to be accurate in the future. No one should act on
such information without appropriate professional advice after a thorough examination of the particular situation.
© 2016 KPMG LLP, a Delaware limited liability partnership and the U.S. member firm of the KPMG network of independent
member firms affiliated with KPMG International Cooperative (“KPMG International”), a Swiss entity. All rights reserved.
NDPPS 594581
The KPMG name and logo are registered trademarks or trademarks of KPMG International.
Valuation Actuary Symposium
39PD - Reviewing PBR Results
Ross Zilber, FSA, CFA, FRM, CLU, ChFC, MAAA
VP, Deputy Appointed Actuary
Reviewing PBR Results
• Methodology
• Assumptions
• Margins
• Stochastic results
• Financial Reporting
Justify and Document New Methodology Choices
• VM-20 has an option to update CSO tables post issue
  • Standard Nonforfeiture Law
    • Term – current products (priced under 2001 CSO) will fail the SNFL CV test for younger issue ages at older attained ages
  • Product Taxation
    • No guidance
    • If 2001 CSO is used through the transition, what happens to 7702/7702A tests when the CSO table is updated?
• Level Term Premium Period
  • ACLI amendment to allow new generation products
• NPR for VUL with Guarantees – ULSG vs. CRVM + AG 37
• Asset model
  • New business vs. pro-rata in-force
  • Negative starting assets
Assumptions – Prescribed Elements in DR and SR
• Mortality
  • Start from credible company experience
  • Grading into industry experience with margin
  • Prescribed margin and grading mechanics based on credibility of experience
  • No mortality improvement post valuation date
• Asset Assumptions
  • Credit spreads
  • Default costs
  • Constraints on reinvestment strategy
• Prescribed interest/equity scenarios
  • Low return on equity under the prescribed deterministic scenario
• Post-Level Term
Assumptions – Experience Elements in DR and SR
• Mortality
  • Expected table development, slope of the table
• Lapses
• Premium Persistency
• Expenses
  • Acquisition
  • Maintenance
• Asset assumptions
  • Inflation
  • Returns on non-traditional asset classes
Review of Margins
• Mortality
  • VM-20 prescribes that credibility must be determined by amount
  • Buhlmann vs. Limited Fluctuation credibility
  • The experience has a sufficient data period – the last duration where there are at least 50 claims (a small code sketch follows below)

Example durations assuming 15 years or 5 years of sufficient data:

| Credibility of company data | 15 years: maximum duration where grading to the industry table must begin | 15 years: maximum duration where mortality must be 100% of the industry table (2015 VBT) | 5 years: maximum duration where grading to the industry table must begin | 5 years: maximum duration where mortality must be 100% of the industry table (2015 VBT) |
| --- | --- | --- | --- | --- |
| 0–19% | Cannot use own company experience; must use the industry table in all durations | | | |
| 20–39% | 17 | 23 | 7 | 13 |
| 40–59% | 19 | 17 | 9 | 17 |
| 60–79% | 22 | 32 | 12 | 22 |
| 80–100% | 25 | 40 | 15 | 25 |

• Margin on Dynamic Lapse function
• Asset margins
  • Spread margins on disinvestment vs. reinvestment
• Aggregate margins
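As a small illustration of the sufficient data period described above, the sketch below finds the last duration with at least 50 claims; the claim counts are invented for the example.

```python
def sufficient_data_period(claims_by_duration: list[int], min_claims: int = 50) -> int:
    """Last duration (1-based) with at least `min_claims` claims; 0 if none qualifies."""
    last = 0
    for duration, claims in enumerate(claims_by_duration, start=1):
        if claims >= min_claims:
            last = duration
    return last

# Illustrative claim counts by policy duration.
claims = [220, 180, 150, 130, 95, 80, 70, 66, 58, 52, 55, 51, 50, 48, 61, 40, 35]
print(sufficient_data_period(claims))  # 15 -> "15 years of sufficient data"
```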
Aggregation/Allocation
• Product/risk aggregation rules
• Policy Allocation and Reinsurance credit example (the allocation is sketched in code below)

| | Gross NPR | Gross DR/SR | DR/SR Excess Over NPR (Gross) | Gross PBR | Ceded PBR | Net PBR |
| --- | --- | --- | --- | --- | --- | --- |
| Policy 1 | 150 | | 30 | 180 | 55 | 125 |
| Policy 2 | 100 | | 20 | 120 | 45 | 75 |
| Aggregate | 250 | 300 | 50 | 300 | 100 | 200 |

| | Reinsurer A | Reinsurer B | Total |
| --- | --- | --- | --- |
| Reinsurance % | 15% | 25% | 40% |
| Reinsurance Credit | 37.5 | 62.5 | 100.0 |
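The sketch below reproduces the allocation figures above. The allocation bases are assumptions chosen to match the example: the aggregate DR/SR excess is spread in proportion to each policy's NPR, and the total reinsurance credit is split in proportion to each reinsurer's percentage.

```python
# Hypothetical allocation sketch; values are the illustrative figures from the tables above.
npr = {"Policy 1": 150.0, "Policy 2": 100.0}
aggregate_dr_sr = 300.0
ceded_pbr_total = 100.0
reinsurance_pct = {"Reinsurer A": 15, "Reinsurer B": 25}   # percent of business ceded

excess = aggregate_dr_sr - sum(npr.values())                # 50 of DR/SR above aggregate NPR
gross_pbr = {p: v + excess * v / sum(npr.values())          # spread excess pro rata to NPR
             for p, v in npr.items()}

total_pct = sum(reinsurance_pct.values())                    # 40%
credit = {r: ceded_pbr_total * pct / total_pct               # split credit pro rata to cession
          for r, pct in reinsurance_pct.items()}

print(gross_pbr)   # {'Policy 1': 180.0, 'Policy 2': 120.0}
print(credit)      # {'Reinsurer A': 37.5, 'Reinsurer B': 62.5}
```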
Analysis and Review of Stochastic Results
• Find the key rate driving stochastic results (e.g., the 5-year rate in 10 years)
• Review the linear fit for runs with other assumptions (e.g., zero lapses); a sketch of the fit follows below
[Chart: scenario GPVAD (0 to 300,000) plotted against the 5-year rate in 10 years (0% to 8%), with a linear trend line fitted to the GPVAD base run]
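A rough sketch of that key-rate review, assuming scenario-level GPVAD output is available; the scenario data below are simulated, and `numpy.polyfit` stands in for whatever fitting tool is actually used.

```python
import numpy as np

# Illustrative only: assumed scenario-level outputs, not real model results.
rng = np.random.default_rng(1)
rate_5y_in_10y = rng.uniform(0.00, 0.08, size=500)           # candidate key rate per scenario
gpvad = 280_000 - 2_500_000 * rate_5y_in_10y + rng.normal(0, 15_000, 500)

slope, intercept = np.polyfit(rate_5y_in_10y, gpvad, deg=1)   # linear fit (trend line)
r2 = np.corrcoef(rate_5y_in_10y, gpvad)[0, 1] ** 2            # how much the rate explains

print(f"slope={slope:,.0f}  intercept={intercept:,.0f}  R^2={r2:.2f}")
# A high R^2 suggests this rate is a key driver; repeat the fit for runs with
# other assumptions (e.g., zero lapses) to see whether the relationship holds.
```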
Financial Reporting
• Peer Review
• Governance
• Report to the Board
• Actuarial Opinion Memorandum
• Model vetting
• New Annual Exhibits, Supplements
• YoY
• Aggregation of PBR and non-PBR reported business
• Audit
Q&A
THANK YOU
2016 Valuation Actuary Symposium
Sam M. Steinmann, ASA, MAAA
Reviewing PBR Results
August 29, 2016
Disclaimer
• I am an employee of MassMutual Financial Group;
however, any statements, representations, and
expressions of opinions or views that I make are
attributable only to me and should not be construed
as representing the views of my employer.
Reviewing Reserves
• Given the assumptions, how do you review the reserves?
• Data requirements should not change greatly
• Assumption requirements are a topic in themselves
• Methods
• Processes
Reviewing Reserves - Historical
• Reserves Calculated Seriatim
• Confirm that seriatim totals tie to reported totals
• Recalculate a sample of policies to confirm individual
seriatim reserves match documented assumptions and
features
• Each policy reserve is based on one interest rate
path
• This will not work for PBR
Reviewing Reserves - PBR
• PBR Reserves are calculated in Aggregate
• Need to confirm aggregation is correctly done
• Grouping decisions can be critical drivers of results
• PBR Reserves are stochastic
• Need to confirm stochastic paths are as intended
• Assessing the path/feature interactions is a key difficulty
• PBR Reserve assumptions and policy features
• Need to confirm that these are as documented
• May wish to do a single-path recalculation to confirm
• PBR Reserves should change reasonably
• Attribution analysis, a key control
Checking Aggregation
• Questions to ask
• Is everything coming from one run of one system?
• Are the product groupings logical and appropriate?
• If there are multiple systems, are the results structured to aggregate correctly? (see the sketch below)
• Consistent scenario definitions
• Consistent scenario ordering
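One possible sketch of that multi-system check: confirm both systems produced the same scenario set in the same order before adding results. The file layout and the `scenario_id` column are assumptions for illustration.

```python
import csv

def scenario_index(path: str) -> list[str]:
    """Read the scenario identifiers, in run order, from a system's output file."""
    with open(path, newline="") as f:
        return [row["scenario_id"] for row in csv.DictReader(f)]

def check_alignment(path_a: str, path_b: str) -> None:
    a, b = scenario_index(path_a), scenario_index(path_b)
    if set(a) != set(b):
        raise ValueError("Systems were run on different scenario sets")
    if a != b:
        raise ValueError("Same scenarios, but ordering differs; re-sort before aggregating")

# check_alignment("system_a_gpvad.csv", "system_b_gpvad.csv")  # hypothetical file names
```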
Validating Assumptions in the system
• Assumption changes much more frequent
• Like FAS 97 vs FAS 60
• Increases the need for a good process for assumption updates
• Increases the need to check that assumptions are as intended
• Are the assumptions in the system the same as in the
documentation?
• Direct spot-checks of particular assumptions
• Check effects of key dynamic assumptions by using single-path/single-policy runs
• Have the assumptions in the documentation changed
since the last review?
• If so, have the valuation results changed as expected?
Checking stochastics
• Scenario generators and scenarios
• Full testing is a topic of its own
• Clear acceptance procedures for stochastics
• These are a valuable control—expect auditors to review them
• Thorough attribution analysis
• Isolate changes due to changes in stochastic scenarios
• Confirm that those changes are reasonable
Attribution Analysis
• The most general operational control
• Will detect many errors regardless of error source
• Can enhance and inform assumption setting
• Most effective if it captures values over multiple periods
• Examples on the next slides
• How many groups to analyze separately is a judgment
question
• Too many and the analysis won’t be as thorough
• Too few and the drivers of changes will not be clear
Poor Attribution Analysis Example

| Life Insurance | 2017Q3 | 2017Q4 |
| --- | --- | --- |
| Reserve | 1400 | 1600 |
| Actual Reserve for new business | | 120 |
| Change in Interest Rates | | -3 |
| Other Changes | | 83 |
Good Attribution Analysis Example

| Term Life | 2016Q4 | 2017Q1 | 2017Q2 | 2017Q3 | 2017Q4 |
| --- | --- | --- | --- | --- | --- |
| Reserve | 1000 | 1200 | 1300 | 1400 | 1600 |
| Projected Reserve prior Quarter | 980 | 1050 | 1260 | 1375 | 1470 |
| Lapses (projected - actual) | 30 | -15 | 20 | -5 | 10 |
| Death Benefits (projected - actual) | -25 | 10 | -20 | -18 | 3 |
| Actual Reserve for new business | 20 | 150 | 40 | 40 | 120 |
| Expected Reserve | 1005 | 1195 | 1300 | 1392 | 1603 |
| Variance due to Interest Rate changes | -5 | 5 | 0 | 8 | -3 |
| Projected Reserve for Next Quarter | 1050 | 1260 | 1375 | 1470 | 1680 |
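One way the quarterly roll-forward above might be assembled in code; the component names mirror the table, the figures are the same illustrative values, and the interest-rate variance is taken as the unexplained residual rather than keyed in.

```python
# Sketch of the quarterly attribution roll-forward shown above (illustrative values).
rows = {
    "Projected Reserve prior Quarter":      [980, 1050, 1260, 1375, 1470],
    "Lapses (projected - actual)":          [30,  -15,   20,   -5,   10],
    "Death Benefits (projected - actual)":  [-25,  10,  -20,  -18,    3],
    "Actual Reserve for new business":      [20,  150,   40,   40,  120],
}
actual_reserve = [1000, 1200, 1300, 1400, 1600]
quarters = ["2016Q4", "2017Q1", "2017Q2", "2017Q3", "2017Q4"]

for i, q in enumerate(quarters):
    expected = sum(component[i] for component in rows.values())
    variance_rates = actual_reserve[i] - expected      # residual attributed to rate moves
    print(f"{q}: expected={expected}, actual={actual_reserve[i]}, "
          f"interest-rate variance={variance_rates}")
# 2016Q4: expected=1005, actual=1000, interest-rate variance=-5  ... and so on;
# a large unexplained residual is the signal to investigate.
```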
Productionizing these controls
• These controls will be run regularly
• Building in the needed output will help
• Single-path, single-policy output
• Reserves for next period for current policies