Actuarial Outpost Excess Loss Procedure

#1
06-21-2018, 10:27 AM
 Swanson_Ron Member CAS Join Date: Jan 2018 Location: Pawnee Studying for 6 Favorite beer: All Posts: 95
Excess Loss Procedure

If you don't have total limits losses, but instead have losses capped at various policy limits, is it true that the excess loss procedure will not fully reflect the true long-term average of excess losses?

I'm not quite sure how to reflect the fact that the losses are capped at various policy limits. Ideally, I'd like to apply trend and loss development factors to the losses before calculating an excess load, but that seems counterintuitive when a loss has already been capped. I'm considering building a table by policy limit, with losses capped at a new attachment point for the excess layer, then averaging the excess loads calculated for each policy limit group, weighted by exposures.
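To make the idea concrete, here's a rough numpy sketch of the table-and-weighted-average approach I have in mind (all limits, losses, and exposures are made up for illustration):

```python
import numpy as np

# Hypothetical data. Each group holds losses capped at that policy limit,
# plus the group's exposure. Only groups whose limit exceeds the
# attachment point can say anything about the excess layer.
attach = 250e3
groups = {
    "500k_limit": {"losses": np.array([40e3, 300e3, 500e3]), "exposure": 1200.0},
    "1M_limit":   {"losses": np.array([90e3, 600e3, 1.0e6]), "exposure": 800.0},
}

loads, weights = [], []
for g in groups.values():
    capped = np.minimum(g["losses"], attach)        # losses limited to the attachment
    excess = np.maximum(g["losses"] - attach, 0.0)  # portion above the attachment
    loads.append(excess.sum() / capped.sum())       # group's excess load
    weights.append(g["exposure"])

avg_load = np.average(loads, weights=weights)       # exposure-weighted average
```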

Does this seem reasonable? If anyone has any ideas, I'm open to any suggestions or readings that anyone knows of on this topic.
#2
06-22-2018, 03:24 PM
 DeepPurple Member Join Date: Jun 2004 Posts: 4,278

Quote:
 Originally Posted by Swanson_Ron If you don't have total limits losses, but you have losses capped at various policy limits then is it true that the excess loss procedure does not fully reflect the true long term average of excess losses? I'm not quite sure how to reflect the data having losses capped at various policy limits. Ideally, I'd like to apply trend and loss development factors to the losses before calculating an excess load but that seems counterintuitive if we're given a loss that's already been capped. I'm considering making a table by policy limits with losses capped at a new attachment point for the excess layer, then averaging the excess load calculated from each policy limit type based on exposures. Does this seem reasonable? If anyone has any ideas, I'm open to any suggestions or readings that anyone knows of on this topic.
Read the Dave Clark paper on Basics of Reinsurance Pricing. It is not perfect, but it is a decent starting point.

Do NOT apply development factors before slicing losses into the layer; that's wrong. Do NOT apply primary loss development factors to excess losses in a layer; that's wrong too. Excess LDFs should look uncommonly large relative to Schedule P LDFs; if they don't, they are probably too small.

Split trend into frequency and severity and use them differently. Actually, think really hard about what you select for these trends. I'd bet your first instinct is likely to be wrong.
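One reason the obvious trend selection is wrong: a fixed attachment point leverages severity trend, so the trend observed in the excess layer runs well above the ground-up severity trend. A toy calculation (illustrative numbers only) shows the effect:

```python
# Severity trend leverage on a fixed excess layer.
loss = 400e3                  # ground-up loss before trend
trend = 1.05                  # 5% ground-up severity trend
attach, width = 250e3, 250e3  # 250k xs 250k layer

def to_layer(x, attach, width):
    """Portion of a ground-up loss falling in the layer."""
    return min(max(x - attach, 0.0), width)

before = to_layer(loss, attach, width)          # 150k in the layer
after = to_layer(loss * trend, attach, width)   # 170k in the layer
layer_trend = after / before                    # ~1.133, well above 1.05
```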

Segregate your data by policy limit. You should only use data to calculate losses to a layer if the top of the layer is at or below the cap. So if you have losses in 3 policy limit groups (100k, 500k, and 1M limits), you should not use the 100k data to evaluate losses in a 250k xs 250k layer. You could, however, use the 100k losses, and indeed all three groups, for a 50k xs 50k layer.
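In code, that filtering rule is just a mask on policy limit before you cut losses to the layer. A minimal sketch with made-up data:

```python
import numpy as np

# Hypothetical capped losses with the policy limit each was written at.
losses = np.array([80e3, 100e3, 300e3, 500e3, 750e3, 1.0e6])
limits = np.array([100e3, 100e3, 500e3, 500e3, 1.0e6, 1.0e6])

def layer_losses(losses, limits, attach, width):
    """Losses in a (width xs attach) layer, using only policies whose
    limit reaches the top of the layer."""
    top = attach + width
    usable = limits >= top  # 100k-limit data can never pierce a 500k top
    x = losses[usable]
    return np.minimum(np.maximum(x - attach, 0.0), width)

# 250k xs 250k: the two 100k-limit losses are excluded from the calculation.
layer = layer_losses(losses, limits, attach=250e3, width=250e3)
```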
__________________
Come on. Let's go space truckin'. Come on!
#3
06-22-2018, 04:01 PM
 examsarehard Member CAS Join Date: May 2011 Posts: 591

If your losses are capped at policy limits, but fully developed, then I would just fit a distribution to the losses with censoring in mind. The distribution would then give you an excess load relative to your basic limits.
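For concreteness, here's one way that fit could look: a lognormal estimated by maximum likelihood, treating limit-hitting claims as right-censored, then an excess load read off the fitted curve via limited expected values. All data below is invented, and the lognormal is just one candidate family:

```python
import numpy as np
from scipy import stats, optimize, integrate

# Hypothetical data: observed losses, each censored at its policy limit.
losses = np.array([12e3, 45e3, 100e3, 8e3, 500e3, 230e3, 60e3, 100e3])
limits = np.array([1e6, 1e6, 100e3, 500e3, 500e3, 1e6, 100e3, 100e3])
censored = losses >= limits  # these claims hit the policy limit

def neg_loglik(params):
    mu, sigma = params
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    ll = dist.logpdf(losses[~censored]).sum()  # uncensored: use the density
    ll += dist.logsf(losses[censored]).sum()   # censored: use P(X >= cap)
    return -ll

res = optimize.minimize(neg_loglik, x0=[np.log(50e3), 1.5],
                        bounds=[(None, None), (1e-3, None)])
mu, sigma = res.x
fitted = stats.lognorm(s=sigma, scale=np.exp(mu))

def lev(d):
    """Limited expected value E[min(X, d)] = integral of the survival fn."""
    return integrate.quad(fitted.sf, 0, d)[0]

basic = 100e3
excess_load = (fitted.mean() - lev(basic)) / lev(basic)
```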

If your losses are also undeveloped, then it's trickier. There's an Exam 7 paper that discusses using a distribution to create adjustment factors for loss development:
https://www.casact.org/pubs/forum/10...asrabuddhe.pdf
#4
06-22-2018, 04:44 PM
 DeepPurple Member Join Date: Jun 2004 Posts: 4,278

Quote:
 Originally Posted by examsarehard If your losses are capped at policy limits, but fully developed, then I would just fit a distribution to the losses with censoring in mind. The distribution would then give you an excess load relative to your basic limits. https://www.casact.org/pubs/forum/10...asrabuddhe.pdf

You could do this, but without more information you have insufficient reason to believe the statistically extrapolated loss curve will bear any resemblance to the true uncapped loss curve. The two could very well be different.

There are ways to get loss curve parameters with censored or truncated data. But there are real world considerations that affect claim patterns that are beyond mere statistics.
__________________
Come on. Let's go space truckin'. Come on!
#5
06-25-2018, 12:23 PM
 examsarehard Member CAS Join Date: May 2011 Posts: 591

Quote:
 Originally Posted by DeepPurple You could do this, but without more information you have insufficient reason to believe the statistically extrapolated loss curve will bear resemblance to what the uncapped loss curve would look like. The two could very well be different. There are ways to get loss curve parameters with censored or truncated data. But there are real world considerations that affect claim patterns that are beyond mere statistics.
You're right that without more information I can't say how I would approach the situation.

If the OP is performing primary ratemaking with consistent limit profiles and a lot of data, then using the empirical distribution to model the uncapped excess loss load is probably a good way to go. However, if the severity distribution looks highly skewed, or if the largest policy limits have never seen a full limit loss, then relying on the empirical distribution becomes suspect.

A modeled severity distribution doesn't have to be used as-is to get the excess load. Many practical procedures use the curve only to get relativities between layers, which is common when pricing excess lines or reinsurance.
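Since expected loss in a layer is just a difference of limited expected values, layer relativities fall straight out of the fitted curve. A quick sketch with an assumed (made-up) lognormal severity:

```python
from scipy import stats, integrate

# Assumed severity curve; only its shape matters for relativities.
sev = stats.lognorm(s=1.6, scale=50e3)

def lev(d):
    """Limited expected value E[min(X, d)] = integral of the survival fn."""
    return integrate.quad(sev.sf, 0, d)[0]

layer_a = lev(500e3) - lev(250e3)   # expected loss in 250k xs 250k
layer_b = lev(250e3) - lev(100e3)   # expected loss in 150k xs 100k
relativity = layer_a / layer_b      # layer relativity from the curve alone
```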
#6
06-25-2018, 03:52 PM
 tometom Member CAS AAA Join Date: May 2004 Favorite beer: Homebrew Posts: 13,222

Agree that more information is needed. What exactly you're trying to accomplish is my primary question.
__________________
If I had some duct tape, I could fix that.