
CASUALTY ACTUARIAL SOCIETY
EXAM 5
BASIC TECHNIQUES FOR RATEMAKING AND ESTIMATING CLAIM LIABILITIES
CHECKLIST AND SUMMARY NOTES
Compiled by Jessica Chen
April 2014
EXAM 5A – RATEMAKING
1. Introduction
Fundamental Insurance Equation, basic ratios
2. Rating Manuals
Four basic components, examples
3. Ratemaking Data
Policy vs claims vs accounting information
Aggregation methods: CY AY PY RY
External data
4. Exposures
Desired characteristics
Two aggregation methods
Four types of exposure
5. Premium
Four types; two aggregation methods
Three adjustments to historical premiums
Extension of exposure vs parallelogram
One-step vs two-step trending
6. Losses and ALAE
Four aggregation methods
Extraordinary losses
Catastrophe losses
Reinsurance
Change in coverage/benefit levels – WC example
Loss development
7. Other Expenses and Profit
All variable expense method
Premium based projection method
Exposure/policy based projection method
Permissible loss ratio
8. Overall Indication
Pure premium method
Loss ratio method
Derivation and equivalence
9. Traditional Risk Classification
Criteria for rating variables (4 categories)
Pure premium method
Loss ratio method
Adjusted pure premium method
10. Multivariate Classification
Univariate vs multivariate (4 benefits)
Minimum bias procedure
Generalized linear models
Diagnostics (3 tests/statistics)
Data mining techniques (5 analyses)
11. Special Classification
Territorial ratemaking (5 steps)
Increased limits (LAS and 3 other approaches)
Deductible pricing (LER)
WC expense adjustment by size
ITV and coinsurance
12. Credibility
Credibility approaches (classical, Bühlmann, Bayesian)
Desirable qualities of complement of credibility (6)
First dollar losses – 6 methods
Excess losses – 4 methods
13. Other considerations
Regulatory, operational, marketing
Asset share pricing approach
14. Implementation
Non-pricing solutions
Pricing solutions – derive base rate (2 methods)
Capping the magnitude of rate change
15. Commercial Lines Rating Mechanisms
Manual rate modification (ER, SR)
Composite rating
Large/small deductible adjustments
16. Claims Made Ratemaking
EXAM 5B – RESERVING
1. Overview (1-2)
Components of unpaid claim estimate (5)
3. Understanding the data used
Grouping & subdivision of data (4 considerations)
Data aggregation (CY AY PY RY)
4. Meeting with Management
5. The Development Triangle
6. Triangle as Diagnostic Tool
Look for trends in case O/S adequacy vs settlement
7. Development Technique
Selection of experience period
Selection of CDFs (5 considerations)
Tail factor selection (3 approaches)
8. Expected Claims Technique
Estimate initial ultimate losses
9. Bornhuetter Ferguson Technique
Unreported/unpaid claims develop based on expected claims
Show it's a credibility-weighted method
Benktander technique
10. Cape Cod Technique
Estimation of claim ratio (used-up premium)
11. Frequency-Severity Technique
Assume… at identifiable rate
Introduce exposure
Introduce disposal rate
12. Case O/S Development Technique
Assumptions… suited to CM policies
Self-insurer's method
13. Berquist Sherman Technique
Data arrangement vs data adjustment
Diagnostic test with triangles
Adjustments for case O/S adequacy and/or rate of settlement
14. Salvage & Subrogation
Development method
Ratio method
Excess of loss and stop loss
15. Evaluation of Technique
16. Estimation of ALAE Reserve
Development technique on reported/paid ALAE
Ratio method, additive or multiplicative
17. Estimation of ULAE Reserve
Dollar-based techniques
Classical
Kittel adjustment
Conger & Nolibos – generalized Kittel
Mango Allen
Count-based techniques
Triangle-based techniques
EXAM 5A – RATEMAKING
Three aspects of economic uncertainty of losses
• Occurrence
• Timing
• Financial impact
UW Expense Ratio
LAE Ratio
Loss Ratio
Operating Expense Ratio
Combined Ratio
ASOP definition of Risk
1. Random Variation from Expected Costs – should be consistent with cost of capital and be reflected in the UW profit provision
2. Systemic Variation of Estimated Costs from Expected Costs – should be reflected in the contingency provision
Rating manual – four basic components:
• Rules
• Rate pages
• Rating algorithm
• Underwriting guidelines (sometimes kept in a separate UW manual)
Four principles on P&C ratemaking
1. A rate is an estimate of the expected value of future costs
2. A rate provides for all costs associated with the transfer of risk
3. A rate provides for the cost associated with an individual risk transfer (when an individual’s risk
experience does not provide a credible basis, it is appropriate to consider an aggregate basis or
similar risks).
4. A rate is reasonable and not excessive, inadequate or unfairly discriminatory if it is an actuarially
sound estimate of the expected value of all future costs associated with an individual risk transfer
Three primary purposes of risk classification
1. Protect insurance system’s financial soundness
2. Fairness
3. Encourage availability of coverage through economic incentives
Working with data (premium, exposures, losses and expenses)
1. Approximations for data aggregation
2. Exposure selection
3. Trending the exposure data (where inflation sensitive or exposure is asset price)
4. Trending the premium (discrete rate changes by actuary) and losses (a continuous annual trend)
• Two methods
o Extension of exposure
o Parallelogram method
- Assumption that policies are written uniformly may not hold
- Different classes may have experienced different rate changes
- This method may produce a good estimate overall, but the premium for each class would be wrong, hence not suitable for ratemaking at the class level
• One-step trending: given experience period → find the average experience date → find the implied average written date (trend from) → trend to the average written date of the prospective period
o Inappropriate when average premium varies significantly year by year
o Inappropriate if historical changes differ significantly from those expected in the future
• Two-step trending (to account for change in business mix or distributional changes)
o Step 1 – adjust to the end of the experience period
o Step 2 – projected trend step, often an assumed % p.a.
Step 1 factor = 2011 Q4 Avg WP @ Current Rate Level ÷ CY 2011 Avg EP @ Current Rate Level
Step 2 factor = (1 + 2%)^1.625
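The two trend factors above can be combined in a short sketch; all premium figures below are assumed for illustration.

```python
# Two-step premium trending sketch (hypothetical figures).
# Step 1: bring CY 2011 average EP at current rate level to the latest
# observed point (2011 Q4 average WP at current rate level).
avg_ep_cy2011_crl = 1000.0   # CY 2011 avg EP @ current rate level (assumed)
avg_wp_2011q4_crl = 1040.0   # 2011 Q4 avg WP @ current rate level (assumed)
step1 = avg_wp_2011q4_crl / avg_ep_cy2011_crl

# Step 2: project from the step-1 point to the average written date of the
# prospective period at an assumed 2% annual trend over 1.625 years.
annual_trend = 0.02
trend_years = 1.625
step2 = (1 + annual_trend) ** trend_years

trended_avg_premium = avg_ep_cy2011_crl * step1 * step2
print(round(step1, 4), round(step2, 4), round(trended_avg_premium, 2))
```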
5. Limited Losses
• Comparability of historical losses.
• Apply limit adjustments where necessary
• Leverage effect on trends – inflation pressure may 1) fall solely on the portion above the limit, or 2) push losses to pierce the limit, causing increased frequency in the upper layer
6. Development to ultimate (including audits)
7. Change in benefits or coverage – adjust in step 1 of two-step trending
8. Treatment of extreme or cat losses
Comments on aggregation methods:
• Graves: Long pay-out pattern creates a problem when trying to match incurred losses with the
premium from which they arise. This task of matching incurred losses and earned premium is achieved
through the use of policy year data.
• Feldblum: Ratemaking should balance the considerations of
o Stability – PY experience being most homogeneous
o Responsiveness – CY experience being most recent
o Equity – riskier entities pay more premium
• Feldblum: Policy year premiums are subject to change after exposure audit. Hence development factors
are needed for PY premiums.
For any chosen method of aggregation,
• Earned Premium = written premium + Δ unearned premium reserve during this period
• Losses = paid losses + Δ loss reserve during this period
Criteria for exposure base
1. Directly proportional to expected losses and responsive to change
2. Practical – objective, easily obtainable, free from moral hazard
3. Consistency with historical practices and industry convention
Four types of exposures
1. Written exposures – In case of cancellation, CY written exposure may book +1 and -1 in two separate
CYs; while PY written exposure will book the cancellation to original PY.
2. Earned exposure – Most commonly CY earned exposure, allocating pro rata to CY. PY earned exposure is
always booked in full to one PY.
3. Unearned exposure = written exposure – earned exposure
4. In-force exposure
Catastrophic events: >$25m losses and affects multiple insurers.
• Non-modeled risks that occur with some regularity over time
• Modeled risks that are irregular with high severity
• Catastrophic losses are removed from ratemaking data
• Add in an average expected catastrophe loss amount in the end
There is no overlap between severity trending and loss development, because:
• Severity trending is from the midpoint of experience period to the midpoint of exposure period
• Loss development is from the midpoint of the exposure period to ultimate
Workers Comp benefit level change
• Given 1) cumulative % of workers and 2) cumulative % of wages
• Where cap/floor apply, calculate total benefits payable = (floor/cap) x (% of workers subject to
floor/cap)
• Between the cap & floor, worker is entitled to a % of his wage. Calculate total benefits payable = (%
entitled) x (cumulative % of wages applicable)
• Sum up the components to get the average benefit
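The average-benefit steps above can be sketched numerically; the compensation rate, floor/cap, and distributional percentages below are all assumed for illustration.

```python
# Hypothetical WC benefit sketch: benefit = r * wage, subject to a weekly
# floor and cap.  Inputs mirror the cumulative distributions in the notes.
r = 2.0 / 3.0                 # assumed statutory compensation rate
floor_b, cap_b = 300.0, 800.0
avg_wage = 1000.0             # overall average weekly wage (assumed)

pct_workers_at_floor = 0.10   # workers whose r*wage falls below the floor
pct_workers_at_cap = 0.20     # workers whose r*wage exceeds the cap
pct_wages_between = 0.55      # share of total wages earned between floor & cap

avg_benefit = (
    floor_b * pct_workers_at_floor         # floored workers receive the floor
    + r * (pct_wages_between * avg_wage)   # middle workers receive r * wage
    + cap_b * pct_workers_at_cap           # capped workers receive the cap
)
print(round(avg_benefit, 2))
```

Re-running this with post-change parameters (e.g. a raised cap) and taking the ratio of the two average benefits gives the benefit-level change factor.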
Impacts of benefit level change:
• Direct effect – on premium or losses solely due to law changes
• Indirect effect – due to change in human behavior
Different components of expenses:
1. Commission & brokerage – State or Countrywide expense
• % of premium
• Varies between new business and renewal business
2. Other acquisition costs - Countrywide
• Advertisement, mailing, salary to sales workforce
3. Taxes, license and fees – State
• All taxes & fees but NOT federal income tax
4. General Expenses - Countrywide
• Office overhead associated with insurance operation
Calculating UW Expense Ratio
• Most expenses including acquisition expense – incurred at outset → divide by WP
• General expense – relates to ongoing operations → divide by EP
• Add all ratios together to derive the UW Expense Ratio
Expenses in ratemaking:
1. All variable expense method
• Tends to under-charge risks with premium less than average; over-charge risks with premium
above average
• Insurer may combine this method with 1) premium discount 2) expense constant
2. Premium based projection
• Both fixed and variable expenses are expressed as ratio of premium
• Variable expense ratio with respect to EP or WP
• Potential distortions:
o Recent rate change – applying same ratio will over/under-state fixed expenses
o Distributional shifts – change in average premium
o Countrywide vs State/region – subjective allocation of expenses and could potentially
lead to inequitable rates between national and regional carriers.
o Allocate fixed VS variable proportion – subjective
3. Exposure/policy based projection method
• Variable expense treated as (2)
• Fixed expense is divided by historical exposure or policy count
• Some fixed expenses might vary, eg commission is different for new business vs renewal
Permissible Loss Ratio (PLR) = 1 – expense % - target profit %
Overall indication – TWO methods
• The fundamental insurance equation drives the rating methods
Premium = Losses + LAE + UW Expenses + UW Profit
P = L + E_L + (E_F + V × P) + Q_T × P
• Permissible Loss Ratio (PLR) = 1 – expense % - target profit %
• Goals of rate making: 1) Prospective 2) Achieve balance at aggregate as well as individual level
1. Pure Premium Method – requires exposure data
Indicated Rate = [Pure Premium (Losses & ALAE per exposure) + Fixed Expense per exposure] / VPLR
= [(L + E_L)/X + E_F/X] / (1 − V − Q_T)
In derivation, remember to divide by exposures X
2. Loss Ratio Method – requires on-level premium information
Indicated Rate Change Factor = (Loss & ALAE Ratio + Fixed Expense Ratio) / VPLR
= [(L + E_L)/P + %F] / (1 − V − Q_T)
Indicated Rate Change Factor = 1 + Indicated Rate Change
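Both indication methods can be sketched with assumed figures; with a current average rate of 100 on the same book, the two answers are equivalent.

```python
# Sketch of the two overall-indication methods (all inputs assumed).
V, Q_T = 0.25, 0.05            # variable expense %, target profit %
vplr = 1 - V - Q_T             # variable permissible loss ratio

# Pure premium method (needs exposures):
losses_alae, fixed_exp, exposures = 660_000.0, 60_000.0, 10_000.0
indicated_rate = (losses_alae / exposures + fixed_exp / exposures) / vplr

# Loss ratio method (needs on-level earned premium):
on_level_ep = 1_000_000.0
lr = losses_alae / on_level_ep         # loss & ALAE ratio
fexp_ratio = fixed_exp / on_level_ep   # fixed expense ratio
indicated_change_factor = (lr + fexp_ratio) / vplr

print(round(indicated_rate, 2), round(indicated_change_factor, 4))
```

Here the current average rate is 1,000,000 / 10,000 = 100, and indicated_rate / 100 reproduces indicated_change_factor, illustrating the equivalence of the two methods.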
Traditional Risk Classification – THREE approaches (+1 adapted)
1. Pure Premium Approach – distributional bias, as it assumes a uniform distribution of exposures across all other variables. It ignores correlation between variables.
R_I,i = [(L + E_L)_i / X_i] / [(L + E_L)_B / X_B], if all classes have the same expenses and profit provision
2. Loss Ratio Approach – requires on-level factors for premium
The LR method uses current premium to adjust for an uneven mix of business to the extent premium varies with risk, but only as an approximation.
R_I,i / R_C,i = [(L + E_L)_i / P_C,i] / [(L + E_L)_B / P_C,B] = loss ratio_i / loss ratio_B
3. Adjusted Pure Premium Approach
Multiplies exposures by the exposure-weighted average of all other rating variables' relativities to standardize the uneven mix of business; still only an approximation. Equivalent to the LR method above.
4. Iterative minimum bias – see Multivariate chapter
Given detailed exposure by variables (t1, t2) × (g1, g2) as well as earned premium, solve four equations with four unknowns, e.g. with respect to t1:
EP(t1) = $100 × Exposure(t1, g1) × t1 × g1 + $100 × Exposure(t1, g2) × t1 × g2
Substitute (t1, t2) into the original equations to get (g1, g2); then substitute (g1, g2) to get (t1, t2); iterate. Eventually converges to the GLM result.
Criteria for rating variables
1. Statistical criteria
• Statistical significance – statistically significant, stable year to year, acceptable confidence
interval
• Homogeneity – homogeneous within groups, heterogeneous between groups
• Credibility – groups should be large & stable enough to accurately estimate costs
2. Operational – objective, inexpensive, verifiable
3. Social – affordability, causality, controllability, privacy
4. Legal
“Skimming the cream” – an insurer notices a positive characteristic that is not being used in their rating variable
or that of competitors. The insurer can market to the good risks and try to write more of them. This should lead
to lower loss rate and better profitability. Competitors may experience adverse selection if they do not adapt.
Shortcomings of univariate methods:
• Pure Premium Method – does not consider exposure correlation with other rating variables
• Loss Ratio Method – uses current premium to adjust for uneven business mix, but only a proxy
• Adjusted PP Approach – multiplies exposure by the exposure weighted average of all other rating
variables before calculating the one-way relativity. However it’s only an approximation to reflect all
exposure correlations
Benefits of multivariate
• Consideration of all rating variables simultaneously and adjusts for correlations between variables
• Raw data contains systematic effects and noise. A multivariate model seeks to remove the noise and capture the signal
• Produces model diagnostics (certainty of results, goodness of fit)
• Allow interaction between two or more rating variables.
Multivariate Classification – GLM diagnostics
1. Standard Errors: ±2 SE → 95% confidence interval, to gauge whether a variable has a systematic effect…
"Standard errors are an indicator of the speed with which the log-likelihood falls from the maximum given a change in parameters"…
2. Deviance Tests: e.g. Chi-square or F tests; indicate how much fitted values differ from observations. Often used when comparing nested models – trade-off between gain in accuracy and loss of parsimony
3. Consistency over time
4. Back testing
Special Classification – Coinsurance
• Problems with underinsurance – inadequate and inequitable rates, only if partial losses are possible
• “coinsurance deficiency” is the amount of insurance shortfall compared to minimum requirement
• “coinsurance penalty” is the reduction in indemnity payment
Why larger firms tend to have better WC loss experience
• Experience at large firms receives more credibility, hence the incentive to reduce losses
• Safety measures cost money to implement and are more likely to be economically attractive to large firms
• A loss constant X may be added to small risks or all risks, as a manual adjustment:
Loss Ratio = Losses / (Earned Premium + Loss Constant X × Exposures)
Different kinds of limit:
• Single limit – per claim
• Compound limit –damage vs liability, occurrence vs aggregate
Special Ratemaking – increased limit factors
Calculating Limited Average Severity (LAS) and Increased Limit Factor (ILF)
• If ground-up losses are available (uncensored) for all policies, simply calculate ILF directly as relative
ratio of losses & ALAE
• If losses are censored, need to calculate LAS consistently and ILF = LAS(high) / LAS(low)
• For lowest base limit B, LAS(B) = total claims in layer / claim count
• Iterative process for higher limits: LAS(L_i to L_i+1) = (total claims in layer $) / (eligible (> L_i) claim counts #). This is a conditional LAS, based on policies with observed losses > L_i only
• Calculate breach probability P(X > L_i) based on claim counts
• LAS(L_i+1) = LAS(L_i) + LAS(L_i to L_i+1) × P(X > L_i), now unconditional
• When calculating LAS(L_i to L_i+1), only use data with limit > L_i; therefore when calculating P(X > L_i), only use policy counts with limit > L_i. Ignore small-limit policies
• Careful whether data is ground-up or in-layer only
• Instead of the LAS method, could use a Generalized Linear Model (GLM). GLM takes into account both the limiting of losses and behavioral differences of insureds. This can sometimes lead to counter-intuitive results (e.g. lower ILF for higher limits). GLM handles higher layers with thinner data better than the LAS method.
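The iterative LAS construction can be sketched on a toy censored data set; every claim value below is hypothetical, in $000s.

```python
# LAS/ILF sketch with censored data (all claim values hypothetical, $000s).
base, top = 100, 500

claims_by_limit = {
    100: [50, 100],            # claims from policies at the 100 limit (censored at 100)
    500: [30, 150, 500, 80],   # claims from policies at the 500 limit
}

# LAS at the base limit uses every claim, censored at the base.
all_claims = [x for cl in claims_by_limit.values() for x in cl]
las_base = sum(min(x, base) for x in all_claims) / len(all_claims)

# Layer base-to-top: only policies with limit > base are usable.
eligible = claims_by_limit[500]
piercing = [x for x in eligible if x > base]
cond_layer_las = sum(min(x, top) - base for x in piercing) / len(piercing)
p_pierce = len(piercing) / len(eligible)   # P(X > base) from eligible counts

las_top = las_base + cond_layer_las * p_pierce   # unconditional LAS(top)
ilf = las_top / las_base
print(round(las_base, 3), round(las_top, 3), round(ilf, 4))
```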
Criteria for credibility Z
• Between [0, 1]
• Should increase with number of risks
• Increase at non-increasing rate.
Desirable qualities of a complement of credibility
1. Accurate
2. Unbiased
3. Independent
4. Data availability
5. Easy to compute
6. Logical relationship
Credibility weighting methods – THREE methods
1. Classical approach
• Full credibility (i.e. Z=1)with Y observations if 𝑃(𝑆𝑌 ± 𝑘%) = 𝑝 for some small k and p
•
•
𝑧(𝑝+1)/2 2
𝑌=�
•
𝑧𝑝/2
� if all claim size is constant; 𝑌 = �
𝑘
Total claims S= X 1 +X 2 +X 3 … +X n and 𝑆~𝑋 𝑃𝑃(𝜆𝑁 )
By CLT
•
𝑘
𝑆−𝐸(𝑆)
�𝑉𝑉𝑉(𝑆)
~𝑁(0,1)
𝑘𝑘(𝑆)
�𝑉𝑉𝑉(𝑆)
𝜎2
� �1 + 𝜇𝑆2 � if claim size varies
𝑆
𝑧(𝑝+1)/2 2
)
𝑘
= 𝑧(𝑝+1)/2 )  𝑘�𝜆𝑁 = 𝑧(𝑝+1)/2 ) 𝜆𝑁 = (
𝑦
𝑌
Where less than full credibility, apply square root rule Z = �
Comments
o Commonly used and generally accepted
o Data readily available
o Computation is straight forward
o Simplifying assumption on constant claim size not true
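The classical standard and square-root rule can be sketched directly; p, k, the severity CV, and the observed claim count below are all assumed for illustration.

```python
from math import sqrt
from statistics import NormalDist

# Classical (limited fluctuation) credibility sketch: p = 90%, k = 5%.
p, k = 0.90, 0.05
z = NormalDist().inv_cdf((p + 1) / 2)     # z_(p+1)/2, approx 1.645
full_cred_claims = (z / k) ** 2           # constant claim size standard

# If claim size varies, inflate by (1 + CV^2) of severity:
cv2 = 1.5 ** 2                            # assumed severity CV of 1.5
full_cred_varying = full_cred_claims * (1 + cv2)

# Square-root rule for partial credibility with y observed claims:
y = 300
Z = min(1.0, sqrt(y / full_cred_claims))
print(round(full_cred_claims), round(full_cred_varying), round(Z, 3))
```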
2. Bühlmann Credibility (aka Least Squares Credibility)
• Objective: minimize the squared error between the estimate and the true expected value of the quantity being estimated
• Z = N / (N + K), where K = expected process variance (EVPV) / variance of hypothetical means (VHM)
• Both EVPV and VHM increase with N; also assumes that the risk process does not shift over time
Comments
o Under the simplifying assumptions that underpin the classical approach (VHM = 0, K = ∞), there is never any credibility (Z = 0), a bit of a problem
o Generally accepted
o Challenge in determining EVPV and VHM
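A minimal sketch of the Bühlmann formula, with assumed variance components, showing Z rising toward 1 at a non-increasing rate:

```python
# Buhlmann credibility sketch: Z = N / (N + K), K = EVPV / VHM (assumed).
evpv, vhm = 500.0, 20.0
K = evpv / vhm                 # = 25
for N in (25, 100, 400):
    Z = N / (N + K)
    print(N, round(Z, 3))      # credibility grows with exposure volume N
```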
3. Bayesian Analysis
• Bühlmann is the weighted least squares line associated with the Bayesian estimate (??)
• Bayesian and Bühlmann are equivalent under certain mathematical conditions
Complement of credibility
2. First dollar ratemaking – SIX methods using alternative data sources and exposure groups, each judged against the six desirable qualities (accurate, unbiased, independent, data availability, easy to compute, logical relationship):
• Loss cost of a larger group that includes the subject – independent only if the subject's data is excluded
• Loss cost of a larger related group
• Rate change of a larger group applied to present rates
• Harwayne method
• Trended present rates
• Competitor's rates – different underwriting standards limit accuracy; data availability can be an issue
Harwayne Method
1. Objective: calculate complement for state A, class 1, adjusting for distributional differences
2. Calculate L̄_A, the average pure premium of state A
3. Assuming the same class distribution as state A, calculate L̄_i, the hypothetical average loss cost for each other state i
4. Calculate the adjustment factor for each state = L̄_A / L̄_i
5. Apply the factors to class 1 loss costs in state i to get adjusted loss costs
6. Based on adjusted loss costs and exposures, calculate the average class 1 pure premium
Trended Present Rates Method
Complement = present rate × (prior indicated loss cost / prior implemented loss cost) × loss trend factor
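The six Harwayne steps can be sketched on two hypothetical comparison states; every exposure and pure premium below is assumed for illustration.

```python
# Harwayne method sketch (all figures hypothetical): complement of
# credibility for state A, class 1, built from other states adjusted
# for class-mix differences.
state_A = {"exp": {1: 100, 2: 200}, "pp": {1: 200.0, 2: 500.0}}
others = {
    "B": {"exp": {1: 300, 2: 100}, "pp": {1: 250.0, 2: 600.0}},
    "C": {"exp": {1: 100, 2: 200}, "pp": {1: 300.0, 2: 700.0}},
}

tot_exp_A = sum(state_A["exp"].values())
# Step 2: state A's overall average pure premium
L_A = sum(state_A["exp"][c] * state_A["pp"][c] for c in state_A["exp"]) / tot_exp_A

weighted, weight = 0.0, 0.0
for st in others.values():
    # Step 3: hypothetical average pure premium of state i under A's class mix
    L_i = sum(state_A["exp"][c] * st["pp"][c] for c in state_A["exp"]) / tot_exp_A
    adj_class1 = (L_A / L_i) * st["pp"][1]   # steps 4-5: adjusted class-1 cost
    weighted += st["exp"][1] * adj_class1    # weight by state i class-1 exposure
    weight += st["exp"][1]

complement = weighted / weight               # step 6
print(round(L_A, 2), round(complement, 2))
```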
3. Excess Ratemaking – FOUR methods using alternative limits, judged against the same six qualities:
• ILF method
• Lower limits analysis – even more biased than the ILF method
• Higher limits analysis – assumes the ELR does not vary by layer
• Fitted curves – loss distribution may differ by size of risk
ILF Method: C = L̄_A × (ILF_A+L − ILF_A) / ILF_A
Lower Limit Method: C = L̄_d × (ILF_A+L − ILF_A) / ILF_d
Higher Limit Method: C = L × Σ_d>A P_d × (ILF_min(d, A+L) − ILF_A) / ILF_d
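The first two formulas can be sketched with an assumed ILF table and assumed capped losses:

```python
# Excess-layer complement sketches using ILFs (all inputs assumed).
ilf = {100: 1.00, 500: 2.50, 1000: 3.20}   # ILFs by limit ($000s, assumed)
A, L = 500, 500                             # attachment A, layer width L

# ILF method: losses capped at A, scaled into the layer A to A+L.
losses_at_A = 800_000.0
C_ilf = losses_at_A * (ilf[A + L] - ilf[A]) / ilf[A]

# Lower-limits method: start instead from losses capped at a lower limit d.
d, losses_at_d = 100, 400_000.0
C_lower = losses_at_d * (ilf[A + L] - ilf[A]) / ilf[d]
print(round(C_ilf), round(C_lower))
```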
Ratios for competitive analysis
• Close Ratio = # accepted quotes / # total quotes. Fundamentally different between insurers, as some only issue one quote per insured while others may issue multiple (varying deductibles and limits etc.)
• Retention Ratio = # policies renewed / # total potential renewal policies. Renewal customers are cheaper to service and generally have more favorable losses.
• % Policy Growth = (new policies written − lost policies) / policies at onset of period. Closely related to underwriting standards.
Implementation – Rate change with caps
Adj Δ% Relativity_i = (1 + prior Δ% relativity_i) × (1 + desired overall Δ% rate change) × OBF
Off-Balance Factor = 1 / (1 + volume-weighted prior Δ% rate change)
Distribute capped premium to other classes
Implementation – pricing solutions for existing products
• Given: proposed average premium P̄_p, or a change in average premium (1 + Δ%) = P̄_p / P̄_c; rate differentials
• Find: base premium to charge – THREE methods
1. Extension of Exposures Method (effectively a Solver process)
Start with a seed base rate B_S. For each exposure, P_S = B_S × R_1 × R_2 × (1 − D_1 − D_2) + A_P
Calculate the exposure-weighted average premium P̄_S = Σ(P_S × X_i) / Σ X_i, inevitably ≠ P̄_p
Adjust the base rate: B_p = B_S × (P̄_p − A_P) / (P̄_S − A_P) = B_S × ((1 + Δ%) × P̄_c − A_P) / (P̄_S − A_P)
2a. Average Rate Differential Method
Work with exposure- (or premium-) weighted rate differentials
P̄_p = B_p × S̄_p + A_p, hence B_p = (P̄_p − A_p) / S̄_p
Just need to estimate S̄_p ≈ R̄_1 × R̄_2 × (1 − D̄_1 − D̄_2)
This method ignores the dependence of the exposure distribution between rating variables
- Distributional bias
- Mitigate by weighting by variable premium instead of exposure
2b. Work with % change
(P̄_p − A_p) / (P̄_c − A_c) = (B_p / B_c) × (S̄_p / S̄_c), hence B_p = B_c × (S̄_c / S̄_p) × (P̄_p − A_p) / (P̄_c − A_c), where OBF = S̄_c / S̄_p
Estimate (1 + Δ_S%) ≈ (1 + Δ_R1%) × (1 + Δ_R2%) × (1 + Δ_(1−D1−D2)%)
Distributional bias persists – so what value does this process add??
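The average rate differential method (2a) can be sketched with assumed average relativities, discounts, and target premium:

```python
# Base-rate derivation sketch, average rate differential method (assumed data).
P_p, A_p = 520.0, 20.0         # proposed average premium and additive fee
R1_bar, R2_bar = 1.10, 0.95    # exposure-weighted average relativities
D1_bar, D2_bar = 0.05, 0.02    # exposure-weighted average discounts

S_p = R1_bar * R2_bar * (1 - D1_bar - D2_bar)   # average rate differential
B_p = (P_p - A_p) / S_p                          # proposed base rate
print(round(S_p, 4), round(B_p, 2))
```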
When calculating the fixed expense fee, don't forget to gross up!! A_P = Ē_F / (1 − V − Q_T)
Non-pricing solutions
• Reduce expenses
• Reduce average loss cost
o Change portfolio mix (through targeted marketing etc.)
o Reduce coverage provided
o Institute better loss control procedures
Commercial Lines Ratemaking – to address heterogeneity and credibility
1. Manual rate modification
a. Experience rating
Note that the "basic limit" is applied on indemnity only; the Maximum Single Loss (MSL) is applied on basic-limit indemnity & ALAE combined.
Expected component: estimated based on the expected loss rate (manual) and exposure.
Experience component: insured's actual loss experience with adjustments.
CD = (AER − EER) / EER × Z, equivalently (1 + CD) = (1 − Z) × 1 + Z × AER / EER
AER = [reported losses (occurrence) + unreported losses (estimated & claims-made rated, detrended)] / [company subject losses (estimated & claims-made rated, detrended)]
Company Subject BL Losses & ALAE = Manual Prem × ELR × PAF_1 × PAF_2 × Detrend Factors,
• PAF_1 adjusts current company BL losses to an occurrence level…?? All10 1b p254
• PAF_2 adjusts for the experience period being CM, reflecting the CM year…??
Unreported Losses = Company Subject Losses × Detrend factor × EER × % Unreported. Remember EER!
EER is the complement of an expected deviation of the company's loss costs in the experience rating plan from the loss costs underlying the manual rate.
NCCI experience rating separates primary layer and excess layer
M = [Z_p × A_p + (1 − Z_p) × E_p + Z_e × A_e + (1 − Z_e) × E_e] / E
or equivalently
M = [A_p + w × A_e + (1 − w) × E_e + B] / (E + B), where Z_p = E / (E + B) and w = Z_e / Z_p
Standard Premium = M × Manual Premium
"Ballast value (B) is the factor of the experience rating formula that prevents the x-mod from shifting too high or too low." In practice, B is looked up in a table and it increases with expected losses…
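A sketch of the mod calculation, verifying that the two formulas agree; the ballast B and weight w would come from NCCI tables, so the values here are assumed.

```python
# NCCI experience mod sketch (all values assumed, $000s).
A_p, A_e = 80.0, 150.0   # actual primary / excess losses
E_p, E_e = 60.0, 140.0   # expected primary / excess losses
E = E_p + E_e            # total expected losses
B, w = 50.0, 0.3         # ballast and excess weight (assumed table values)

# Ballast form of the mod:
M = (A_p + w * A_e + (1 - w) * E_e + B) / (E + B)

# Equivalent credibility-weighted form, with Z_p = E/(E+B), Z_e = w*Z_p:
Z_p = E / (E + B)
Z_e = w * Z_p
M2 = (Z_p * A_p + (1 - Z_p) * E_p + Z_e * A_e + (1 - Z_e) * E_e) / E

print(round(M, 3), round(M2, 3))   # Standard Premium = M x Manual Premium
```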
b. Experience rating – alternative formula
CD = (ALR − ELR) / ELR × Z
ALR/ELR are loss ratios. Numerators are the same as AER/EER; the denominator is subject premium.
ELR may be calculated as (ground-up LR × D-ratio)
D-ratio is the loss elimination ratio at the primary loss limit
c. Schedule rating
Credits/debits based on risk characteristics are ADDITIVE: modifier = (1 − x% − y% − z%)
If both schedule rating and experience rating are adopted, make sure that credits are not
double counted.
2. Single name rating – “loss rated risks”
Used when the insured meets size requirement and its experience receives full credibility (100%)
a. Large deductible plans
Calculate applicable losses with Loss Elimination Ratio (LER)
Insurer may need to adjust premium to account for frictional costs
b. Composite rating plan – bundle together a couple of lines
Combine exposure data of a few lines of coverage and apply a composite trend rate;
Develop losses for each individual line, then add together, apply a composite trend rate;
Composite loss rate = composite losses / composite exposure
Where the heck do you come up with the composite trend rate???
c. Retrospective rating plans
Retrospective Premium = (Basic Premium + Converted Losses + Excess Loss Premium + Retro Dev Premium) × Tax Multiplier
Basic Premium = Standard Premium × (Expense Allowance − Expense provided thru LCF + Net Insurance Charge)
where Expense provided thru LCF = LR × (LCF − 1)
and Net Insurance Charge = (Insurance Charges − Insurance Savings) × LR × LCF
Basic Premium Factor < 100%
Standard Premium = Manual Premium × M, see experience rating
Retro Dev Premium is used to smooth out back & forth payments
Min/max retro premium = min/max ratio × standard premium
LCF is to adjust for LAE already included elsewhere; while Basic Premium contains broader expense + profit & contingency
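A sketch of the basic premium factor and a resulting retro premium, ignoring the excess-loss and retro-development premium terms; every input below is assumed.

```python
# Retrospective premium sketch (assumed inputs).
standard_prem = 100_000.0
expense_allowance = 0.30
LCF = 1.12                       # loss conversion factor
LR = 0.65                        # expected loss ratio
ins_charge, ins_saving = 0.20, 0.05
tax_mult = 1.03

exp_thru_lcf = LR * (LCF - 1)                        # expense provided thru LCF
net_ins_charge = (ins_charge - ins_saving) * LR * LCF
basic_factor = expense_allowance - exp_thru_lcf + net_ins_charge
basic_prem = standard_prem * basic_factor

actual_losses = 50_000.0
converted_losses = actual_losses * LCF
# Excess-loss and retro-dev premiums omitted from this sketch:
retro_prem = (basic_prem + converted_losses) * tax_mult
print(round(basic_factor, 4), round(retro_prem, 2))
```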
Five Principles of Claims Made policies
1. If claim costs are rising, CM policy should always cost less
2. If there’s a sudden unexpected change in underlying trends, CM policy priced based on prior trends
would be closer to the correct price than an occurrence policy priced on prior trend
3. If there’s a sudden unexpected change in reporting patterns, the cost of a mature CM policy will be
affected relatively little, if at all.
4. CM policies don’t have IBNR. Lower risk of reserve inadequacy.
5. Investment income from CM policies is substantially less.
Asset Share Pricing Method – components
A. Premium
B. Claims – inexperience, youth, transience & vehicle acquisition
C. Expenses – impact of policy duration on expenses
D. Persistency – termination rates (conditional YoY) vs termination probability (unconditional)
E. Discount Rates
Profitability is measured as return on premium ↔ return on equity/surplus over the life of the cohort.
Currently not very common in the P&C industry because
• Data availability issues
• Losses are uncertain in both magnitude and timing
• Casualty pricing techniques are under-developed
• Casualty policy structures are more flexible than life & health
Special Classification – Territorial Ratemaking
Challenges
1. Correlation of territory with other variables
2. Data in each territory may be sparse (a lot of noise)
First of all – estimate territorial boundaries
1. Determine geographical units
2. Calculate geographic estimator. Need to separate signal and noise. A univariate model would likely
be biased due to correlation
3. Spatial smoothing. Units in proximity should have similar risks. Distance based approach and
adjacency based approach
4. Clustering
Then calculate territorial rate relativities
Special Classification – WC Premium Discount
• Assume all expenses are variable. Given expense ratios by premium range.
• Standard premium would have been calculated based on the lowest premium range
• Allocate the standard premium to respective premium ranges.
• Benchmark against the lowest range, calculate % expense reduction for each range
• Gross up for tax & profit. %Discount = % expense reduction ÷ (1 – tax rate – profit margin)
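The discount steps above can be sketched numerically; the premium ranges, expense ratios, tax and profit loads are all assumed for illustration.

```python
# WC premium discount sketch (expense ratios and ranges assumed).
# Each tuple is (premium in range, expense ratio applicable to that range).
ranges = [(5_000, 0.25), (95_000, 0.20), (50_000, 0.15)]
tax, profit = 0.03, 0.05
base_ratio = ranges[0][1]    # lowest range's ratio, built into standard premium

discount = 0.0
for width, ratio in ranges:
    expense_reduction = base_ratio - ratio               # vs the lowest range
    discount += width * expense_reduction / (1 - tax - profit)  # gross up

print(round(discount, 2))    # dollar premium discount on this risk
```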
Appendix B – Homeowner Indication
• Projected pure premium include
o Non-cat PP (State)
Credibility weighted, divide by exposure
o Non-cat PP (regional)
o Non-modeled cat PP = selected average AIY per exposure * selected cat-to-AIY ratio * ULAE
loading
o Modeled cat PP
• Projected fixed expense – General Expense go by EP, else go by WP
• Projected net reinsurance cost
• Gross up by VPLR
Appendix D – WC Indication
• Industry loss cost + payroll adjustment + experience adjustment = projected loss cost
• Industry loss cost needs to consider: benefit level change & wage inflation
• Medical benefit component is partially subject to schedule
• Overall indication = (indemnity LR + medical LR) x (1 +ALAE ratio +ULAE ratio)
• Company proposed deviation = (1 − expected loss cost deviation) / (1 − % expense & profit)
• Company change = (proposed deviation / current deviation) × (1 + industry deviation) − 1
• If no change in profit margin: company change = [(1 − proposed LC deviation) / (1 − current LC deviation)] × (1 + industry deviation) − 1
EXAM 5B – UNPAID CLAIM ESTIMATION
Unpaid Claim Estimate – estimate of future obligations resulting from claims due to past events.
1. case outstanding on known claims
2. future development on known claims
3. re-opened claims
4. IBNR (including pure IBNR and IBNER)
5. claims in transit
Components 2–5 together form "broad IBNR".
Separate into components to test case O/S adequacy over time.
Six approaches to setting case O/S reserve
• best estimate of ultimate settlement value
• set to maximum possible loss
• seek advice of legal counsel
• set to estimated claim amount
• set to claim amount + ALAE
• set to claim amount + ALAE + ULAE
Balancing homogeneity & credibility of data – maximize predictive power.
Look for similar characteristics in the following
1. claim experience pattern (perils, laws, claims handling)
2. settlement pattern (time to settle, reopening, mitigation)
3. claims distribution (volume, severity)
4. Trends (portfolio growth, past regulation, methodology changes)
Three considerations: volume, homogeneity, growth/change
Claim related expenses include
• defense and cost containment (DCC)
• adjusting and other (A&O)
Aggregation of Claims
CY – All transactions in period
• Advantages: no future development after the CY cut-off; data readily available
• Disadvantages: not suited for loss development purposes; very few techniques are based on CY; mismatch between premium & losses
AY – All accidents in period
• Advantages: industry standard; easy and intuitive; reliable – AY is shorter than PY; AY claims are valuable for tracking changes due to major events
• Disadvantages: subject to change at subsequent valuations; potential mismatch between exposure and claims; subject to distortion due to changes in policy design or underwriting
PY – All policies written in period
• Advantages: valuable for tracking underwriting or pricing changes; particularly useful for self-insurers
• Disadvantages: extended time to gather data, hence less reliable; difficult to isolate event impact
RY – All losses reported in the period
• Advantages: particularly useful for claims-made; no future development past year-end
• Disadvantages: only works with known claims; no IBNR by construction
Common combinations:
• CY exposure (earned premium) ≈ AY claims
• PY exposure – PY claims (EXACT match)
• for self-insurers, CY exposure – AY claims (EXACT match)
Before developing estimates of unpaid claims, the actuary needs to understand the following
• classes of business written
• geographic reach of underwriting activities
• reinsurance arrangements (attachment point, limit)
• claims management (processing time, lags, staffing changes)
• legal and social circumstances
• economic environment (rates & inflation)
Choosing a large claim threshold, consider
• size of claims vs policy limit
• size of claim vs reinsurance limit
• number of claims that exceed threshold
• credibility of internal data
• availability of external data
FOUR phases to estimating unpaid claims
1. Explore data, run diagnostic tests
• reported (paid) claims / trended EP → tort reform, policy change
• paid claims $$ / reported claims $$ → speed of settlement, change in case reserve strength
• paid claims # / reported claims # → speed of settlement
• average severity (reported, paid or case O/S) → all above
Look for possible explanations:
a) change in loss ratio (due to business mix, legal circumstances, or policy terms)
b) change in case O/S adequacy → impact is smaller at higher maturities
c) change in claim processing (affecting all claims, or only particular sizes)
d) change in business volume
2. Apply appropriate techniques
• Chain Ladder
Future is based on development to-date
• Expected Claims
Future is based on expected
• Bornhuetter Ferguson (also Benktander & Cape Cod)
• Frequency Severity
IBNR relates to claims already reported
• Case O/S technique
3. Evaluate and reconcile any conflicting results of different techniques
• Berquist Sherman – adjust data if necessary
4. Monitor the evolution in subsequent time periods
Key considerations in preparing data
• observability of data
• static or dynamic in subsequent periods
• matching of exposure data & claims
• ability to isolate exogenous influences & major events
• ability to adjust for changes in policy design & underwriting
Diagnostic Triangles
                               Reserve Adequacy   Settlement Pattern
Paid / Reported Claims $$             Y                  Y
Closed / Reported Counts #                               Y
Paid Claims / EP                                         Y
Reported Claims / EP                  Y
Average claims / Case O/S             Y                  Y
“Actuarially Sound” – An actuarially sound loss reserve, for a defined group of claims as of a given valuation
date, is a provision, based on reasonable assumptions and appropriate actuarial methods, for the unpaid
amount required to settle all claims, reported or not, for which liability exists on a particular accounting date.
Typical procedure:
trend everything → analyze and select → detrend the selected
******DEVELOPMENT TECHNIQUE******
Development Technique (Chain Ladder) – assumptions
1. claims will develop in a similar manner to the past
2. the evolution through maturity is similar across AY/CY/PY
3. consistent claim processing
4. stable business mix
5. stable policy limits
6. stable reinsurance
Development Technique – mechanics
1. compile claims data in triangle
2. calculate age-to-age factors in triangle
3. calculate average of age-to-age factors
• simple average
• medial average (excluding highest and lowest)
• volume weighted average
• geometric average
• for all above, selection of data points to use is a trade-off between stability and responsiveness
4. select factor, consider
• smooth progression & steadily decreasing to 1
• stability of factors (given any maturity)
• credibility (volume, homogeneity, benchmark)
• change in patterns
• applicability of historical experience
5. select tail factor, if most mature factor still >1. Options:
a. refer to benchmark data
b. fit curve to selected factors and extrapolate (exponential function)
c. if reported claims are at ultimate, use report-to-paid ratio
6. cumulative development factors (CDF), ultimate losses and reserve
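The mechanics above can be sketched numerically. This is a minimal illustration with made-up triangle data, using volume-weighted age-to-age factors (step 3) and assuming the oldest AY is fully developed so no tail factor is needed:

```python
# Chain-ladder sketch: volume-weighted age-to-age factors, CDFs, ultimates.
# The triangle is illustrative (AY rows, cumulative reported claims by maturity).
triangle = [
    [1000, 1800, 2160, 2268],   # AY 1: assumed fully developed (tail = 1.0)
    [1100, 1980, 2376],
    [1200, 2220],
    [1300],
]
n = len(triangle)

# Volume-weighted age-to-age factors: sum of next column / sum of current
# column, using only AYs that have both maturities observed.
ata = []
for j in range(n - 1):
    num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
    den = sum(row[j] for row in triangle if len(row) > j + 1)
    ata.append(num / den)

# Cumulative development factors (CDF) from each maturity to ultimate.
cdf = [1.0] * n
for j in range(n - 2, -1, -1):
    cdf[j] = ata[j] * cdf[j + 1]

# Ultimate = latest diagonal x CDF at that maturity; reserve = ultimate - reported.
ultimates = [row[-1] * cdf[len(row) - 1] for row in triangle]
reserves = [u - row[-1] for u, row in zip(ultimates, triangle)]
```

Note the sense check from the comments below: % reported at each maturity is simply 1 / cdf[j].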
Comments on Development Technique
• Usually don’t need to trend, as trends cancel out in factors. May require BS adjustment.
• if AY data is used, reserve includes broad IBNR
• sense check with (% reported or paid) := 1 / respective CDF
• Development factors tend to increase as retention increases because
  ◦ excess business exhibits slower reporting
  ◦ shape of loss distribution changes at successive valuations → initially mostly small claims
• Works best in a stable business environment with a large volume of past data – high frequency low
  severity. Not suited to a new line of business.
• Unreliable for long tail business as CDF is highly leveraged → use Expected Claims Technique or hybrid
• Change in loss ratio → OK.
• Change in case O/S strength → reported, fail; paid, OK.
• Change in business mix → reported, fail; paid, even worse.
• Calculation based on reported claims has a shorter time frame than paid, hence more responsive.
Tail factors, choosing maturities
• Consider the age at which data becomes erratic;
• Consider the % of claims expected to close beyond the selected age.
******EXPECTED CLAIMS TECHNIQUE******
Assumption:
• A prior/initial assumption gives better estimate than claims experience.
• The prior can be based on more mature years or an industry benchmark. The prior loss ratio may also
be adjusted for regulatory or rate changes to the level of a particular AY (this can get complicated)
Expected Claims Technique – mechanics
1. Develop ultimate losses and trend all losses/exposure/ premium to current level
• Take the average of ultimate from reported & paid chain ladder
• or otherwise based on mature years or industry data
2. Calculate loss rates (loss ratio) for each AY.
3. Calculate average or otherwise select a loss rate (or loss ratio). Allow judgment.
4. Detrend LR to target AYs as appropriate
5. Multiply with target AY's earned premium or exposure to get expected ultimate claims
6. Calculate reserves
7. Calculate ultimate by combining reserve and paid claims.
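A minimal numeric sketch of steps 1–5, with illustrative prior ultimates, an assumed 5% annual claims trend, and an untrended exposure base:

```python
# Expected-claims sketch: trend each AY's prior ultimate to the latest level,
# average the loss rates, then detrend the selection back to each target AY.
loss_trend = 1.05                   # assumed annual claims trend
ays = [2011, 2012, 2013]
ultimates = [500.0, 540.0, 580.0]   # prior ultimates (e.g. from chain ladder)
exposures = [100.0, 102.0, 104.0]   # exposure base, assumed not to need trending

latest = max(ays)
trended_rates = [
    u * loss_trend ** (latest - ay) / e
    for u, e, ay in zip(ultimates, exposures, ays)
]
selected_rate = sum(trended_rates) / len(trended_rates)  # simple average

# Detrend the selected (latest-level) loss rate to each target AY, then
# expected ultimate = detrended rate x that AY's exposure.
expected_ult = {
    ay: selected_rate / loss_trend ** (latest - ay) * e
    for ay, e in zip(ays, exposures)
}
```

The detrending step mirrors the caution in the comments below: the loss rate and the exposure applied to it must be at compatible (AY-level) values.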
Expected Claims Technique
• Good for lines with long tail (where CDF is highly leveraged), a new line of business or otherwise lacking
reliable claim experience (external circumstances renders experience irrelevant)
• Tends to provide good stability over time because actual claims do not enter calculation. But also
means the technique is not very responsive.
• Note that in the computation of loss rate, all AYs were trended to current level. However, when
calculating the reserve for a particular AY, make sure to use the compatible loss rate and exposure
(detrended).
• Change in loss ratio → fail, unless manually adjusted
• Change in case O/S → OK, though indirectly influenced via the chain-ladder-based selected loss rate
• Change in business mix → fail
******BORNHUETTER FERGUSON TECHNIQUE******
Expected Ultimate Claims = (Observed Reported Claims) + (Expected Ultimate Claims × % Unreported Claims)
where the Expected Ultimate Claims on the right is the prior from the Expected Claims Technique,
and % Unreported Claims comes from the Development Technique.
Bornhuetter Ferguson (BF) Technique – mechanics
1. Obtain prior expected claims (from Expected Claims Technique)
2. Calculate % reported for each maturity based on CDF
3. Calculate IBNR as a % of prior expected claims
4. ultimate claims = reported + unreported claims
5. calculate IBNR as appropriate
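The mechanics above in a minimal sketch, with illustrative CDFs and a flat prior of 1000 per AY:

```python
# Bornhuetter-Ferguson sketch: ultimate = reported + expected x (% unreported).
reported = [950.0, 800.0, 400.0]            # reported claims to date, by AY
cdf = [1.05, 1.25, 2.5]                     # age-to-ultimate CDF at each AY's maturity
expected_claims = [1000.0, 1000.0, 1000.0]  # prior (Expected Claims Technique)

pct_unreported = [1 - 1 / c for c in cdf]   # % reported = 1 / CDF
ibnr = [e * u for e, u in zip(expected_claims, pct_unreported)]
ultimate = [r + i for r, i in zip(reported, ibnr)]
```

Note how the least mature AY (CDF of 2.5) takes 60% of its ultimate from the prior, illustrating the credibility weighting Z = 1/CDF discussed in the comments below.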
Bornhuetter Ferguson (BF) Technique – comments
• credibility weighted combination of the Development Technique and the Expected Claims Technique
• Credibility given to actual claim experience Z := 1 / CDF
• BF assumes that unreported (or unpaid) claims will develop based on expected claims, i.e. actual claims
experience adds no information
• advantage: random fluctuations early in the life of an AY do not significantly distort the projections
• Also good for short tail lines (where IBNR can be a multiple of the last few months' EP)
• problem if CDF <1, hence % reported >100%
• floor CDF at 1
• or select ultimate claims in step 1 using a different technique for the years in question
• Change in claim ratio impacts actual reported/paid claims but NOT the estimates for
unreported/unpaid claims. Paid BF is less responsive than reported BF.
• Change in case O/S under reported BF leads to erroneous IBNR through erroneous Dev Factors. The
error is not as bad as in Dev Technique due to less leverage effect. Paid BF is OK.
Benktander Method
• “Iterative BF Technique”, take ultimate claims from BF iteration 1 step 4 and use it as prior in iteration 2
step 1
• Benktander is considered a credibility weighted average of BF Technique
• Advantage: more responsive than BF, yet more stable than Dev Technique
• Eventually converge to development method
If change in business mix, Benktander responds well to changing claim ratio but NOT responsive to changes in
the underlying development patterns.
BF = Z × Dev + (1−Z) × ExpC
Benktander = Z × Dev + (1−Z) × BF = (2Z − Z²) × Dev + (1−Z)² × ExpC
See CREDIBLE CLAIMS RESERVES: THE BENKTANDER METHOD by THOMAS MACK
Criteria on how many iterations to perform…???
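The credibility identities can be verified numerically. This sketch (illustrative numbers) iterates BF once and checks both decompositions with Z = 1/CDF:

```python
# Numeric check: BF = Z*Dev + (1-Z)*ExpC, and one BF iteration (Benktander)
# equals (2Z - Z^2)*Dev + (1-Z)^2 * ExpC.
reported, cdf, exp_claims = 600.0, 2.0, 1000.0
z = 1 / cdf
dev = reported * cdf                       # Development Technique ultimate

bf = reported + exp_claims * (1 - z)       # iteration 1
benktander = reported + bf * (1 - z)       # iteration 2: BF with bf as prior

assert abs(bf - (z * dev + (1 - z) * exp_claims)) < 1e-9
assert abs(benktander - ((2*z - z**2) * dev + (1 - z)**2 * exp_claims)) < 1e-9
```

Repeated iteration multiplies the prior's weight by (1−Z) each time, which is why the estimate eventually converges to the development method.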
******CAPE COD TECHNIQUE******
Expected Ultimate Claims = (Observed Reported Claims) + (Premium × Loss Ratio) × (% Unreported Claims)
where Premium × Loss Ratio replaces the prior from the Expected Claims Technique,
and % Unreported Claims comes from the Development Technique.
Cape Cod Technique (aka SB Technique)
• Assumption: unreported/unpaid claims develop based on expected claims (same as BF technique)
• Mechanism is almost the same as BF. The difference is in how expected claims are estimated. Cape Cod
combines all AY experiences to estimate and select a loss ratio.
• Estimate LR = (Σ over AY of trended reported claims) / (Σ over AY of trended used-up premium),
  where used-up premium = premium × % reported = premium / CDF. More mature years implicitly
  receive more weighting.
• To calculate ultimate/reserves for a specific AY, detrend (!!!) the selected loss ratio to the AY required.
• Expected Claims of that AY = (original unadjusted EP) × (untrended selected loss ratio)
• The same estimated claim ratio is used in BF steps 2–5 for all AYs
• Prior is based on actual claims, hence more responsive than Expected Claims Technique and BF
• Often used by reinsurers
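A minimal sketch of the pooled loss ratio and the BF-style projection, with illustrative data; premiums are assumed to be already on-level and trended:

```python
# Cape Cod sketch: one loss ratio from all AYs, weighted by used-up premium.
reported = [950.0, 800.0, 400.0]
cdf = [1.05, 1.25, 2.5]
premium = [1100.0, 1150.0, 1200.0]

used_up = [p / c for p, c in zip(premium, cdf)]   # premium x % reported
lr = sum(reported) / sum(used_up)                 # mature years weigh more

# Then BF-style: ultimate = reported + premium x LR x % unreported.
ultimate = [r + p * lr * (1 - 1 / c)
            for r, p, c in zip(reported, premium, cdf)]
```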
******FREQUENCY-SEVERITY TECHNIQUE******
Over-arching assumptions – all occur & develop at an identifiable rate:
• claim emergence
• claim settlement
• payment growth
• accuracy of individual case estimates
Frequency-Severity #1 – Simple Development Method
• Assumptions
  ◦ Claim counts are defined consistently over the experience period
  ◦ Claim counts are reasonably homogeneous (limits etc.)
  ◦ + the standard assumptions of the Development Technique
• Mechanics
  ◦ project ultimate #claim (if required, trend all)
  ◦ project ultimate severity (if required, trend all)
  ◦ ultimate claims = (Claim Count) × (Severity), detrend as appropriate
  ◦ calculate reserve components as appropriate
• Data consistency issues
  ◦ closed/reported #claim should consistently include/exclude Closed No Payment (CNP);
    otherwise not comparable
  ◦ CNP will cause #claim to develop downwards
  ◦ all ALAE should be consistently included/excluded (with payment, without payment, case O/S)
  ◦ claim count vs. occurrence count
  ◦ how are claims < deductible recorded
  ◦ how are reopened claims treated – same old claim number or new
Frequency-Severity #2 – with consideration to exposure and trend
New to this approach, the frequency analysis compares ultimate claim counts to an exposure base.
• reasons for trending
  ◦ economic inflationary factors
  ◦ societal factors and regulatory environment
  ◦ underwriting changes (rates, limits, retention)
  ◦ business mix (geographic etc.)
• mechanics
  1. project #claim, with FREQUENCY TREND (claim count)
  2. analyze and TREND EXPOSURE (vehicle, payroll, pol count, EP)
  3. calculate frequency per unit exposure – trended ultimate
  4. project and select severity with SEVERITY TREND
  5. ultimate claims = F per exposure × S × exposure
  6. calculate reserve components as appropriate
Frequency-Severity #3 – with Disposal Rate
(Selected Severity by Maturity) × (Incremental Closed Claims #) = Incremental Closed Claims $$
→ sum to Unpaid Claims by AY (detrend as appropriate)
Disposal Rate = cumulative closed #claim / ultimate #claim
mechanics
1. analyze closed #claim (same as above) and apply development technique to select disposal rate
(1-dim by maturity)
2. apply development technique to project ultimate #claim (1-dim by AY)
3. based on the ultimate #claim and disposal rate, fill in incremental closed #claim for lower
triangle (2-dim by AY & maturity, complete the square)
4. Trend and regress to select severity. Possibly adjust for tail severity (1-dim by maturity, same as
above)
5. for each combination of AY-maturity in the lower triangle, calculate incremental paid claims =
F*S
6. sum up incremental paid claims as appropriate to calculate reserves or ultimate claims
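Steps 3–6 for a single AY of the lower triangle can be sketched as follows; disposal rates, severities, and the count are illustrative selections (severity here taken as already detrended to the AY's level):

```python
# Disposal-rate sketch: future incremental closed counts from selected
# disposal rates, times selected severity per maturity, summed to unpaid.
ult_counts = 100.0                       # projected ultimate claim count
disposal = [0.40, 0.70, 0.90, 1.00]      # cumulative closed # / ultimate #, by maturity
severity = [2.0, 3.0, 4.5, 6.0]          # selected severity by maturity
current_maturity = 1                     # this AY is currently at maturity index 1

unpaid = 0.0
for j in range(current_maturity + 1, len(disposal)):
    incr_closed = ult_counts * (disposal[j] - disposal[j - 1])
    unpaid += incr_closed * severity[j]  # incremental paid = F x S
```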
comments:
◦ implicitly assuming no significant partial payments
◦ working with paid severity helps avoid distortions from changes in claim procedures and case
  O/S strength
◦ sensitive to inflation assumption
Frequency Severity Technique Evaluation
• Advantages
1. When development methods can be unreliable for the less mature years, FS methods provide a
good alternative. Reported claim count is usually stable; severity is based on mature years.
2. Insight into claim process
3. Calculated based on paid claims not dependent on case O/S. Any changes in reserving strategy
will not distort FS method – more objective
4. FS allows inflation adjustment explicitly
• Disadvantages
1. highly sensitive to inflation assumption
2. requires more data than aggregate methods, which may not be available
3. data available may not be consistently defined or processed
4. FS may be distorted by change in business mix
Miscellaneous points of concerns
• trends typically apply uniformly from one AY to the next, so there is no need to trend when
calculating factors/ratios for either counts or severity (trends cancel out)
• When it comes to selecting ultimate estimate, trend counts/severity to latest level. Then select an
ultimate based on prescribed averaging method.
• If asked to calculate unpaid claim reserve other than the latest AY, remember to un-trend the selected
ultimate estimate back to the respective AYs
• estimating tail severity by volume weighted average = trended total paid claims / trended total closed
#claim
• FS technique is suited to primary & excess layer but not for reinsurers usually, who may not have
detailed claims data
• Most often used for long tail risks
• Negative (downward) development may be due to 1) salvage and 2) CNP
• Incorporating closed claim counts into the selection of ultimate claim counts may overstate the true
value of projected ultimate claims…??? See All10 p362
******CASE OUTSTANDING TECHNIQUE******
Case O/S (mat 1) → Case O/S (mat 2) → Case O/S (mat 3) → no case O/S at Ultimate
Incremental Paid $$ (mat 1) → (mat 2) → (mat 3) → (Ultimate)
Cumulative Paid $$ (mat 1) → (mat 2) → (mat 3) → (Ultimate)
Assumption:
• IBNR is related to claims already reported in a consistent manner
• + all standard assumptions that apply to development technique
Case Outstanding Development Technique – mechanics
1. based on paid claims $$ and case O/S data, populate two triangles:
• ratio of Incremental paid $$ / previous case O/S
• ratio of case O/S / previous case O/S
2. particular attention to the ratio at last maturity to ultimate
• For (paid / prev O/S) ratio, an ultimate ratio of 1 means all case O/S will be paid out at 100% of
the amount. The ultimate ratio may be > or <1
3. select above ratios by maturity
4. using selected ratios iteratively complete the lower triangle of paid claims table and case O/S
5. calculate ultimate claims and reserves as appropriate
Comments
• Technique is most appropriate if most claims are reported during the first period, because claims for a
given AY are known at the end of that AY.
• Commonly used for claims made policies and report year analysis
• limitations:
  ◦ the assumed relation between IBNR & reported claims limits its use
  ◦ lack of benchmark
  ◦ lack of intuition
  ◦ challenge in selecting an ultimate ratio for case O/S
Self-insurers' case O/S Technique – given industry CDFs and case O/S only, find IBNR
• assumptions:
  ◦ insurer's historical claims are not available
  ◦ but benchmark development patterns are available
• Unpaid Claims = Case O/S × (Paid CDF × Reported CDF − Reported CDF) / (Paid CDF − Reported CDF)
  (memorize, but can be derived)
• Limitations:
  ◦ Benchmark may be inaccurate for the specific risks
  ◦ Not suited to less mature years due to volatility
  ◦ Large claims can distort results
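The formula can be derived and checked numerically: with U = ultimate, reported = U / (reported CDF) and paid = U / (paid CDF), so case O/S = reported − paid and unpaid = U − paid. The sketch below verifies this on illustrative numbers:

```python
# Self-insurer sketch: unpaid claims from case O/S and benchmark CDFs only.
def unpaid_from_case_os(case_os, paid_cdf, rep_cdf):
    # Unpaid = Case O/S x (Paid CDF x Rep CDF - Rep CDF) / (Paid CDF - Rep CDF)
    return case_os * (paid_cdf * rep_cdf - rep_cdf) / (paid_cdf - rep_cdf)

# Consistency check: pick an ultimate, back out case O/S, recover unpaid.
u, rep_cdf, paid_cdf = 1000.0, 1.25, 2.0
case_os = u / rep_cdf - u / paid_cdf       # reported 800 - paid 500 = 300
unpaid = unpaid_from_case_os(case_os, paid_cdf, rep_cdf)
# unpaid should equal u * (1 - 1/paid_cdf) = 500
```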
******BERQUIST-SHERMAN TECHNIQUE******
Berquist-Sherman Techniques
1. Data selection, rearrangement
2. Data adjustment

    Situation                                  Solution
1   Change in #claim definition                SUBSTITUTE: use earned exposure
2   Change in policy limits or deductibles     SUBSTITUTE: PY data instead of AY
3   Change in how claims are reported          SUBSTITUTE: RY data instead of AY
4   Significant growth in exposure             SUB-DIVIDE: consider quarterly/monthly
5   Change in business mix                     SUB-DIVIDE: make homogeneous groups
6   Change in claim processing                 SUB-DIVIDE: separate by size of claim
Berquist-Sherman #1 – Change in case O/S adequacy
[Flowchart: Paid Losses + Trend Rate → Adj Average Case O/S; (Adj Average Case O/S × Reported
Claims #) + Paid → Adj Reported Claims $$ → Adj CDFs]
• DIAGNOSIS
  ◦ notice that in the age-to-age factors triangle, factors display a trend down the AYs
  ◦ notice significant difference in projected ultimate claims (reported vs. paid)
  ◦ diagnostic triangle with average case O/S: eye-ball for the trend down AY
  ◦ fit exponential regression to average case O/S against AY; observe annual change > 0%
  ◦ alternatively, compare % change in average case O/S with % change in average paid claims
  ◦ alternatively, compare the paid-to-reported triangle and look for a trend
• ADJUSTMENT – restate average case O/S triangle
  ◦ calculate average case O/S triangle
  ◦ last diagonal entries are at today's value so won't change
  ◦ to obtain the trend rate, regress premium or paid losses by AY
  ◦ detrend from the diagonal entries to complete the average case O/S triangle
  ◦ adjusted reported claims = adj average case O/S × #claim + unadjusted paid claims
  ◦ calculate CDFs based on adjusted reported/paid claims; apply the CDFs to the original
    unadjusted reported/paid losses
• CAUTION
  ◦ rate selection very judgmental
  ◦ reserve estimate very sensitive to selected adjustment
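The adjustment step can be sketched as follows. The triangles and the 15% severity trend are illustrative; in practice the trend selection is judgmental, as cautioned above:

```python
# Berquist-Sherman #1 sketch: restate the average case O/S triangle by
# detrending each column from its latest-diagonal entry, then rebuild
# adjusted reported claims. AY rows x maturity columns.
trend = 1.15                                   # selected annual severity trend
avg_case = [[100.0, 60.0, 20.0],               # unadjusted average case O/S
            [120.0, 70.0],
            [150.0]]
open_counts = [[40, 20, 5],
               [45, 22],
               [50]]
paid = [[500.0, 900.0, 1100.0],                # cumulative paid claims (unchanged)
        [550.0, 950.0],
        [600.0]]

n = len(avg_case)
adj_avg = [row[:] for row in avg_case]
for i in range(n):
    for j in range(len(avg_case[i])):
        diag_ay = n - 1 - j                    # AY sitting on the diagonal for column j
        # diagonal entries (i == diag_ay) stay at today's value;
        # earlier AYs in the column are restated by detrending.
        adj_avg[i][j] = avg_case[diag_ay][j] / trend ** (diag_ay - i)

# adjusted reported = adj average case O/S x open # + unadjusted paid
adj_reported = [[adj_avg[i][j] * open_counts[i][j] + paid[i][j]
                 for j in range(len(avg_case[i]))] for i in range(n)]
```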
Berquist-Sherman #2 – Rate of Settlement of Claims
[Flowchart: Reported & Closed Claims # → Ultimate Claim # (Reported) → Disposal Rates → Adj Closed
Claims #; regress unadjusted paid $$ (y) on unadjusted closed # and interpolate → Adj Paid Claims $$
→ Adj CDFs]
Assumption – a higher % of closed claim counts implies a higher % of ultimate claims paid
• DIAGNOSIS
  ◦ compute triangle of closed-to-reported claim counts
  ◦ look for a trend down the AYs. If a trend exists, paid claims data violates the assumption of
    most techniques and therefore is not fit for reserving analysis
• ADJUSTMENT – adjust closed claims # and paid claims $$
  ◦ apply the standard development technique on reported claims to project ultimate #claim
  ◦ calculate disposal rate = (cumulative closed claim #) / (AY ultimate claim #) for the diagonal
    entries. Use these as the selected disposal rates for each maturity
  ◦ for non-diagonal entries, calculate adjusted closed #claim = ultimate × disposal rate
  ◦ fit a function to (closed #, paid $$) and interpolate to find (adjusted closed #, adjusted paid $$).
    Typically pairwise linear in exam.
  ◦ apply standard development techniques on the adjusted triangles
• CAUTION
  ◦ BS technique does not recognize that changes to settlement could be related to claim size
  ◦ Thorne demonstrated an example of faster settlement of small claims and slower settlement of
    large claims. This led to % closed claims decreasing while % of paid claims increased.
  ◦ BS technique could exacerbate the error.
Wouldn't case O/S also change? Closed claims don't have case O/S…
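The adjustment for one cell can be sketched as below; the disposal rate, count, and the (closed #, paid $$) points are illustrative, using the pairwise-linear interpolation typical in the exam:

```python
# Berquist-Sherman #2 sketch for one cell: restate closed counts at the
# selected disposal rate, then interpolate paid $$ pairwise-linearly
# against unadjusted closed counts.
ult_count = 100.0
selected_disposal = 0.55                    # selected disposal rate at this maturity
adj_closed = ult_count * selected_disposal  # 55 adjusted closed claims

# Unadjusted (closed #, paid $$) points across this AY's maturities:
closed_pts = [40.0, 60.0, 80.0]
paid_pts = [200.0, 330.0, 480.0]

adj_paid = None
for k in range(len(closed_pts) - 1):
    if closed_pts[k] <= adj_closed <= closed_pts[k + 1]:
        w = (adj_closed - closed_pts[k]) / (closed_pts[k + 1] - closed_pts[k])
        adj_paid = paid_pts[k] + w * (paid_pts[k + 1] - paid_pts[k])
        break
```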
Berquist-Sherman #3 – BOTH changes in case O/S and settlement rates
[Flowchart: combine BS #1 and BS #2 – disposal rates give Adj Closed Claims # and Adj Paid Claims $$
(regress unadjusted paid $$ on unadjusted closed # and interpolate); adj average case O/S as in BS #1;
together these give Adj Reported Claims $$ and Adj CDFs]
• Need three triangles:
  ◦ average paid claims
  ◦ average case O/S
  ◦ additionally, adjusted open #claim
• Adjusted reported claims $$ = (adj average case O/S × adj open #claim) + adj paid claims
• Appropriateness of adjusted data…?
Estimating ALAE – ALAE needs to consider CNP
1. simple chain ladder on reported ALAE or paid ALAE
2. Ratio method
  ◦ apply chain ladder on the ratio of (paid ALAE)/(paid claims)
  ◦ age-to-age factors are ADDITIVE (default) instead of being multiplicative
  ◦ Ultimate ALAE = (prior ultimate claims) × (ultimate ratio)
Comments on the ratio method
 Ratio method recognizes the relationship between claims and ALAE
 Ratio method's development factors are less leveraged
 Judgment can be applied in the selection of the ratio
 Any error in the ultimate claim estimate would feed through to ultimate ALAE
 CNP claims incur large ALAE but no losses, which distorts the ratio method
 ALAE development lags behind that of losses. Need a tail factor for ALAE.
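The additive development of the ratio triangle can be sketched with illustrative data (simple-average factors, no tail, and an assumed separately estimated ultimate claims figure):

```python
# ALAE ratio-method sketch: develop (paid ALAE / paid claims) ratios with
# ADDITIVE age-to-age factors, then apply the ultimate ratio to a prior
# ultimate claims estimate.
ratios = [[0.050, 0.070, 0.080],     # paid ALAE / paid claims, AY rows
          [0.052, 0.074],
          [0.048]]

# Additive age-to-age factors: simple average of (next - current).
ata = []
for j in range(2):
    diffs = [row[j + 1] - row[j] for row in ratios if len(row) > j + 1]
    ata.append(sum(diffs) / len(diffs))

# Develop the latest AY's diagonal ratio to ultimate by ADDING factors.
ult_ratio_latest_ay = ratios[2][0] + ata[0] + ata[1]

prior_ultimate_claims = 1200.0       # from a separate claims analysis
ultimate_alae = prior_ultimate_claims * ult_ratio_latest_ay
```

Adding rather than multiplying the factors is what keeps them less leveraged, per the comments above.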
Salvage & Subrogation – similar to ALAE, but doesn't need to consider CNP
1. simple chain ladder on reported or paid recoveries
2. Ratio method
  ◦ compute ratio triangle of (received S&S)/(paid claims)
  ◦ apply chain ladder on the ratios; obtain ultimate selected ratio
  ◦ separately estimate ultimate claims gross of S&S
  ◦ ultimate S&S for each AY = (ult claims gross of S&S) × (selected S&S ratio)
  ◦ advantage: factors are less leveraged
Recovery through reinsurance:
• Analyze GROSS & CEDED experiences separately, or GROSS & NET experiences, depending on data
• Caution & checks before analysis
  ◦ check quota share % is consistent throughout the experience period, through either claims or
    premium data
  ◦ check retention/limits are consistent for excess of loss reinsurance
• Consistency checks during analysis
  ◦ assumption consistency: trends, tail factor, etc.
  ◦ selection of factors and ultimates: check the implied patterns on the implicit layer (net or ceded)
  ◦ inclusion relationships, e.g. expect gross IBNR > net IBNR, net tail factor < gross tail factor, etc.
  ◦ typically want to analyze data gross of stop-loss
Estimation of ULAE – Dollar Based Approaches
1. Classical technique
• Assumptions:
  ▪ Ratio of (Paid ULAE / Paid Claims) is stable and the same as (Ultimate ULAE / Ultimate Claims)
  ▪ Future ULAE cost is proportional to IBNR (not yet reported claims) and Case O/S (not yet closed
    claims)
  ▪ Assume that ½ of ULAE is sustained when opening a claim and the other ½ at closing a claim
• Mechanics
  1. Calculate historical ratio of (CY paid ULAE) / (CY paid claims)
  2. Review and select a ratio
  3. Apply 100% of this ratio to pure IBNR; apply 50% to (total reserve – pure IBNR)
ULAE Reserve = (Selected ULAE Ratio %) × [100% × Pure IBNR + 50% × (Case O/S + Total IBNR − Pure IBNR)]
• Comments
  ▪ Paying a claim is not the same as closing a claim
  ▪ This approach does not consider 1) claims closed & reopened in the same CY 2) claims opened
    and remaining open in the same CY
  ▪ Challenge: Pure IBNR needs to be estimated
  ▪ Best suited to short tail coverages (Johnson)
  ▪ Subject to distortion by changing business volume due to the time lag between paid ULAE and
    paid claims (Rahardjo)
  ▪ Subject to distortion by inflation

ULAE units implied by the 50/50 assumption:
  Payments on prior case O/S reserves        ½ unit
  Losses opened and remaining open           ½ unit
  Losses opened and paid during the year     1 unit
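The classical mechanics in a minimal sketch, with illustrative reserve components and a judgmentally selected 5% ratio:

```python
# Classical ULAE sketch: selected ratio applied 100% to pure IBNR and
# 50% to the remainder of the reserve (case O/S + IBNER).
paid_ulae_hist = [50.0, 55.0, 60.0]
paid_claims_hist = [1000.0, 1080.0, 1250.0]
ratios = [u / c for u, c in zip(paid_ulae_hist, paid_claims_hist)]  # review...
selected_ratio = 0.05                  # ...and select judgmentally

case_os = 400.0
total_ibnr = 300.0
pure_ibnr = 120.0                      # claims not yet reported (must be estimated)

ulae_reserve = selected_ratio * (1.00 * pure_ibnr
                                 + 0.50 * (case_os + total_ibnr - pure_ibnr))
```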
2. Kittel's Refinement
• Assumptions
  ▪ ULAE is sustained as claims are reported even if no claim payments are made
  ▪ ULAE payments for a specific CY are related to both the reporting and the payment of claims
  ▪ If "IBNR" comes up in the exam, consider it IBNER
• Mechanics
  ▪ CY incurred claims = CY paid claims + Δ case O/S + Δ IBNR
  ▪ Estimate historical ratio of (CY paid ULAE) / [½ × (CY paid claims + CY incurred claims)]
  ▪ Review and select a ULAE ratio
  ▪ As before, allocate 100% to pure IBNR and 50% to (total reserve – pure IBNR)
3. Conger & Nolibos Generalized Method
• Assumptions (weights sum to 100%)
  ▪ U1 – ULAE spent opening claims ∝ ultimate cost of claims reported
  ▪ U2 – ULAE spent maintaining claims ∝ payments made
  ▪ U3 – ULAE spent closing claims ∝ ultimate cost of claims settled
• Mechanics
  ▪ Calculate "claims basis" as the weighted average of
    1. Ultimate of CY reported claims (U1)
    2. CY paid claims (U2)
    3. Ultimate of CY closed claims (U3)
  ▪ Estimate historical ratio of (CY paid ULAE) / (CY claims basis). Review and select a ULAE ratio.
  ▪ Calculate the ULAE reserve using one of the methods
    1. Development Method:
       ULAE Reserve = (Cum Paid ULAE) × (Total AY Ultimate Claims / Total Cum Claims Basis − 1)
    2. Expected Claims Method:
       ULAE Reserve = (ULAE Ratio) × (Total AY Ultimate Claims) − (Cum Paid ULAE)
    3. BF Method:
       ULAE Reserve = (ULAE Ratio) × (Total AY Ultimate Claims − Cum Claims Basis)
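A single-period sketch with illustrative weights and data; note that when the ULAE ratio is taken straight from the same paid ULAE and claims basis, the development and expected-claims answers coincide:

```python
# Conger & Nolibos sketch: weighted claims basis, ULAE ratio against it,
# then development-style and expected-claims-style reserves.
u1, u2, u3 = 0.4, 0.5, 0.1            # opening / maintaining / closing weights (sum 100%)

cy_ult_reported = 1000.0              # ultimate cost of claims reported in CY
cy_paid = 800.0                       # CY paid claims
cy_ult_closed = 700.0                 # ultimate cost of claims closed in CY
claims_basis = u1 * cy_ult_reported + u2 * cy_paid + u3 * cy_ult_closed

cy_paid_ulae = 45.0
ulae_ratio = cy_paid_ulae / claims_basis   # review & select in practice

total_ult_claims = 1200.0             # total AY ultimate claims (ultimate basis)

# Development method: paid ULAE x (ultimate basis / basis to date - 1).
ulae_reserve_dev = cy_paid_ulae * (total_ult_claims / claims_basis - 1)

# Expected-claims method: ratio x ultimate basis - paid ULAE.
ulae_reserve_exp = ulae_ratio * total_ult_claims - cy_paid_ulae
```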
4. Conger & Nolibos Simplified
• Mechanics
  ▪ Estimate pure IBNR = x% of AY ultimate claims
  ▪ ULAE Reserve = (ULAE Ratio) × [Z × Pure IBNR + (1 − Z) × (AY Ultimate − Paid Claims)]
5. Mango Allen…
Count based methods…
Triangle based methods…