

#1
Excess Loss Procedure
If you don't have total-limits losses, but instead have losses capped at various policy limits, is it true that the excess loss procedure won't fully reflect the true long-term average of excess losses?
I'm not sure how to handle data where losses are capped at various policy limits. Ideally, I'd apply trend and loss development factors to the losses before calculating an excess load, but that seems questionable when a loss has already been capped. I'm considering building a table by policy limit, with losses capped at a new attachment point for the excess layer, then exposure-weighting the excess loads calculated for each policy-limit group. Does this seem reasonable? If anyone has any ideas, I'm open to suggestions or readings on this topic.
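The approach described above can be sketched in a few lines. This is a minimal illustration of the proposed table, not a recommendation; all names, limits, and loss amounts are hypothetical.

```python
# Sketch of the proposed approach: group losses by policy limit, cap each
# group's losses at the excess attachment point, compute an excess load per
# group, then exposure-weight the group loads. All numbers are hypothetical.

ATTACHMENT = 250_000  # hypothetical attachment point for the excess layer

# hypothetical data: {policy_limit: (exposures, [losses capped at that limit])}
data = {
    500_000:   (120.0, [40_000, 310_000, 500_000]),
    1_000_000: (80.0,  [25_000, 700_000]),
}

def excess_load(losses, attachment):
    """Losses excess of the attachment, as a ratio to the capped (primary) losses."""
    capped = sum(min(x, attachment) for x in losses)
    excess = sum(max(x - attachment, 0) for x in losses)
    return excess / capped if capped else 0.0

total_expo = sum(expo for expo, _ in data.values())
weighted_load = sum(
    (expo / total_expo) * excess_load(losses, ATTACHMENT)
    for expo, losses in data.values()
)
print(round(weighted_load, 4))
```

Note this sketch still ignores the censoring problem the replies below raise: a loss recorded at its policy limit may have been larger ground-up.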
#2
Do NOT apply development factors before cutting through to the layer. That's wrong. Do not apply primary loss development factors to excess losses in a layer either. That's also wrong. If your excess LDFs do not look uncommonly large relative to Schedule P LDFs, they are likely too small.
Split trend into frequency and severity components and apply them separately. And think really hard about what you select for these trends; I'd bet your first instinct is likely to be wrong.
Segregate your data by policy-limit cap. You should only use data to calculate losses to a layer if the top of the layer is at or below the cap. So if you have losses in three policy-limit groups (100k, 500k, and 1M), you should not use the 100k data to evaluate losses in a 250 xs 250 layer. You could use the 100k losses for a 50 xs 50 layer; in fact, you could use all the data for a 50 xs 50 layer.
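The layer arithmetic and the segregation rule described here can be sketched as follows (the function names and numbers are hypothetical, but the 100k/500k/1M groups and the two layers match the examples above):

```python
# A policy-limit group can only be used to price a layer whose top
# (attachment + width) sits at or below that group's cap.

def loss_to_layer(x, attachment, width):
    """Portion of a ground-up loss x falling in the layer `width xs attachment`."""
    return min(max(x - attachment, 0), width)

def usable_groups(limit_caps, attachment, width):
    """Limit groups whose cap reaches the top of the layer."""
    return [cap for cap in limit_caps if cap >= attachment + width]

caps = [100_000, 500_000, 1_000_000]
print(usable_groups(caps, 250_000, 250_000))  # 100k group excluded
print(usable_groups(caps, 50_000, 50_000))    # all groups usable
```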
__________________
Come on. Let's go space truckin'. Come on! 
#3
If your losses are capped at policy limits, but fully developed, then I would just fit a distribution to the losses with censoring in mind. The distribution would then give you an excess load relative to your basic limits.
If your losses are also undeveloped, then it's trickier. There's an Exam 7 paper that discusses using a distribution to create adjustment factors for loss development: https://www.casact.org/pubs/forum/10...asrabuddhe.pdf 
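The censored-fit idea can be sketched by maximum likelihood. A lognormal is assumed here purely for illustration, and all parameters, limits, and the simulated data are hypothetical; losses recorded at the policy limit contribute a survival probability rather than a density.

```python
# Fit a lognormal severity to right-censored losses by MLE, then read an
# excess load off the fitted curve via limited expected values.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
ground_up = rng.lognormal(mean=11.0, sigma=1.5, size=2000)  # simulated "true" losses
limit = 500_000
losses = np.minimum(ground_up, limit)
censored = ground_up >= limit  # True where the loss was capped at policy limit

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    z = (np.log(losses) - mu) / sigma
    # uncensored points contribute the density; censored points P(X > limit)
    ll_unc = norm.logpdf(z[~censored]) - np.log(sigma * losses[~censored])
    ll_cen = norm.logsf(z[censored])
    return -(ll_unc.sum() + ll_cen.sum())

fit = minimize(neg_loglik, x0=[10.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x

def lev(l, mu, sigma):
    """Limited expected value E[min(X, l)] for a lognormal."""
    return (np.exp(mu + sigma**2 / 2) * norm.cdf((np.log(l) - mu - sigma**2) / sigma)
            + l * norm.sf((np.log(l) - mu) / sigma))

basic = 100_000  # hypothetical basic limit
excess_load = lev(limit, mu_hat, sigma_hat) / lev(basic, mu_hat, sigma_hat) - 1.0
print(round(mu_hat, 2), round(sigma_hat, 2), round(excess_load, 3))
```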
#4
You could do this, but without more information you have insufficient reason to believe the statistically extrapolated loss curve will bear resemblance to what the uncapped loss curve would look like. The two could very well be different. There are ways to get loss curve parameters with censored or truncated data. But there are real world considerations that affect claim patterns that are beyond mere statistics.
#5
If the OP is performing primary ratemaking with consistent limit profiles and a lot of data, then using the empirical distribution to model the uncapped excess loss load is probably a good way to go. However, if the severity distribution looks highly skewed, or if the largest policy limits have never seen a full limit loss, then relying on the empirical distribution becomes suspect. A modeled severity distribution doesn't have to be used as is to get the excess load. A lot of practical procedures involve using the curve just to get relativities between different layers, which is common when pricing excess lines or reinsurance. 
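The relativity approach described in this last paragraph can be sketched as follows: price a layer you trust empirically, then scale by the ratio of modeled layer expected losses rather than using the curve's dollar amounts directly. The lognormal parameters, base price, and layers here are hypothetical.

```python
# Use a fitted severity curve only for relativities between layers.
from math import exp, log, sqrt, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def lev(l, mu, sigma):
    """Limited expected value E[min(X, l)] for a lognormal severity."""
    return (exp(mu + sigma**2 / 2) * norm_cdf((log(l) - mu - sigma**2) / sigma)
            + l * (1.0 - norm_cdf((log(l) - mu) / sigma)))

def layer_mean(attachment, width, mu, sigma):
    """Modeled expected loss in the layer `width xs attachment` per ground-up claim."""
    return lev(attachment + width, mu, sigma) - lev(attachment, mu, sigma)

mu, sigma = 11.0, 1.5      # hypothetical fitted parameters
base_price = 1_000_000     # credible empirical cost of the 250k xs 250k layer
relativity = (layer_mean(500_000, 500_000, mu, sigma)
              / layer_mean(250_000, 250_000, mu, sigma))
target_price = base_price * relativity  # indicated cost of the 500k xs 500k layer
print(round(relativity, 3))
```

The design point is that modeling error in the curve's overall level cancels in the ratio; only the shape of the tail matters.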